Volume 44, Issue 3, Mar. 2022
NIE Wei, WEN Huaizhi, XIE Liangbo, YANG Xiaolong, ZHOU Mu. Indoor Localization of UAV Using Monocular Vision[J]. Journal of Electronics & Information Technology, 2022, 44(3): 906-914. doi: 10.11999/JEIT211328

Indoor Localization of UAV Using Monocular Vision

doi: 10.11999/JEIT211328
Funds:  Chongqing Natural Science Foundation Project (cstc2019jcyj-msxmX0742), Chongqing Education Commission Science and Technology Research Project (KJQN202000630)
  • Received Date: 2021-11-21
  • Accepted Date: 2022-02-24
  • Rev Recd Date: 2022-02-23
  • Available Online: 2022-02-28
  • Publish Date: 2022-03-28
  • At present, Unmanned Aerial Vehicle (UAV) positioning relies mainly on the Global Positioning System (GPS). Indoors, however, GPS signals are weak or absent, which makes localization difficult. Traditional indoor positioning technologies fuse Bluetooth, WiFi, cellular base stations, and similar signals into a positioning system, but such methods are easily affected by the environment, usually require multiple deployed devices, and provide only range information rather than the device's pose in space. In this paper, a UAV indoor positioning system based on monocular vision is proposed. First, the images captured by the camera are processed with a combination of the feature-point method and the direct method: feature points are tracked, the direct method then matches features around these key points, and the camera position and attitude are estimated. Next, a depth filter estimates the 3D depth of the feature points and a sparse map of the current environment is built. Finally, the real environment is simulated with RVIZ, the three-dimensional visualization tool of the Robot Operating System (ROS). Simulation results show that the proposed method performs well in indoor environments, with a positioning accuracy of 0.04 m.
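The depth-filter step described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual implementation; it is a hypothetical recursive Gaussian filter (class name `DepthFilter` and all parameters are assumptions for illustration) that fuses successive noisy depth observations of one feature point, e.g. obtained by triangulating the feature across frames, until the variance is small enough to insert the 3D point into the sparse map.

```python
class DepthFilter:
    """Illustrative recursive Gaussian depth filter for one feature point.

    Each new depth observation is fused with the current estimate by the
    standard product-of-Gaussians rule, so the variance shrinks as
    observations accumulate. Hypothetical sketch, not the paper's code.
    """

    def __init__(self, prior_depth, prior_var):
        self.mu = prior_depth   # current depth estimate (metres)
        self.var = prior_var    # variance of the current estimate

    def update(self, z, meas_var):
        """Fuse one depth measurement z with measurement variance meas_var."""
        s = self.var + meas_var
        self.mu = (self.var * z + meas_var * self.mu) / s
        self.var = (self.var * meas_var) / s

    def converged(self, var_threshold=1e-2):
        """Once the variance falls below the threshold, the 3D point is
        considered reliable enough to add to the sparse map."""
        return self.var < var_threshold


# Example: start from a vague prior and fuse noisy observations of a
# point whose true depth is 2.0 m.
f = DepthFilter(prior_depth=3.0, prior_var=4.0)
for z in [2.1, 1.9, 2.05, 1.95, 2.0]:
    f.update(z, meas_var=0.04)
print(round(f.mu, 2))  # estimate pulled close to 2.0 m
```

Under this simple Gaussian model the estimate converges toward the observations regardless of the prior; practical monocular systems (e.g. SVO-style depth filters) additionally model outlier measurements, but the fusion step above captures the core idea.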
  • [1]
    FAN Bangkui, LI Yun, ZHANG Ruiyu, et al. Review on the technological development and application of UAV systems[J]. Chinese Journal of Electronics, 2020, 29(2): 199–207. doi: 10.1049/cje.2019.12.006
    [2]
    陈友鹏, 李雷, 赖刘生, 等. 多旋翼无人机的特点及应用[J]. 时代汽车, 2021(16): 20–21. doi: 10.3969/j.issn.1672-9668.2021.16.010

    CHEN Youpeng, LI Lei, LAI Liusheng, et al. Features and applications of multi-rotor UAV[J]. Auto Time, 2021(16): 20–21. doi: 10.3969/j.issn.1672-9668.2021.16.010
    [3]
    NEMRA A and AOUF N. Robust INS/GPS sensor fusion for UAV localization using SDRE nonlinear filtering[J]. IEEE Sensors Journal, 2010, 10(4): 789–798. doi: 10.1109/JSEN.2009.2034730
    [4]
    ZAFARI F, GKELIAS A, and LEUNG K K. A survey of indoor localization systems and technologies[J]. IEEE Communications Surveys & Tutorials, 2019, 21(3): 2568–2599. doi: 10.1109/COMST.2019.2911558
    [5]
    赵帅杰. 基于WiFi/蓝牙融合的室内定位算法研究[D]. [硕士论文], 桂林电子科技大学, 2020.

    ZHAO Shuaijie. Research on indoor location algorithm based on WiFi and Bluetooth fusion[D]. [Master dissertation], Guilin University of Electronic Technology, 2020.
    [6]
    TAKETOMI T, UCHIYAMA H, and IKEDA S. Visual SLAM algorithms: A survey from 2010 to 2016[J]. IPSJ Transactions on Computer Vision and Applications, 2017, 9(1): 16. doi: 10.1186/s41074-017-0027-2
    [7]
    SILVEIRA G, MALIS E, and RIVES P. An efficient direct approach to visual SLAM[J]. IEEE Transactions on Robotics, 2008, 24(5): 969–979. doi: 10.1109/TRO.2008.2004829
    [8]
    DAVISON A J, REID I D, MOLTON N D, et al. MonoSLAM: Real-time single camera SLAM[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2007, 29(6): 1052–1067. doi: 10.1109/TPAMI.2007.1049
    [9]
    KLEIN G and MURRAY D. Parallel tracking and mapping for small AR workspaces[C]. Proceedings of the 6th IEEE And ACM International Symposium on Mixed and Augmented Reality, Nara, Japan, 2007: 225–234.
    [10]
    MUR-ARTAL R, MONTIEL J M M, and TARDÓS J D. ORB-SLAM: A versatile and accurate monocular SLAM system[J]. IEEE Transactions on Robotics, 2015, 31(5): 1147–1163. doi: 10.1109/TRO.2015.2463671
    [11]
    ENGEL J, SCHÖPS T, and CREMERS D. LSD-SLAM: Large-scale direct monocular SLAM[C]. Proceedings of the 13th European Conference on Computer Vision, Zurich, Switzerland, 2014: 834–849.
    [12]
    FORSTER C, ZHANG Zichao, GASSNER M, et al. SVO: Semidirect visual odometry for monocular and multicamera systems[J]. IEEE Transactions on Robotics, 2017, 33(2): 249–265. doi: 10.1109/TRO.2016.2623335
    [13]
    BAKER S and MATTHEWS I. Lucas-kanade 20 years on: A unifying framework[J]. International Journal of Computer Vision, 2004, 56(3): 221–255. doi: 10.1023/B:VISI.0000011205.11775.fd
    [14]
    TRIGGS B, MCLAUCHLAN P F, HARTLEY R I, et al. Bundle adjustment—a modern synthesis[C]. Proceedings of the International Workshop on Vision Algorithms, Corfu, Greece, 1999: 298–372.
    [15]
    VISWANATHAN D G. Features from accelerated segment test (FAST)[C]. Proceedings of the 10th workshop on Image Analysis for Multimedia Interactive Services, London, UK, 2009: 6–8.
    [16]
    DOLLÁR P, APPEL R, BELONGIE S, et al. Fast feature pyramids for object detection[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2014, 36(8): 1532–1545. doi: 10.1109/TPAMI.2014.2300479
    [17]
    CHIU L C, CHANG T S, CHEN J Y, et al. Fast SIFT design for real-time visual feature extraction[J]. IEEE Transactions on Image Processing, 2013, 22(8): 3158–3167. doi: 10.1109/TIP.2013.2259841
    [18]
    MISTRY S and PATEL A. Image stitching using Harris feature detection[J]. International Research Journal of Engineering and Technology (IRJET) , 2016, 3(4): 1363–1369.
    [19]
    PIZZOLI M, FORSTER C, and SCARAMUZZA D. REMODE: Probabilistic, monocular dense reconstruction in real time[C]. Proceedings of 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 2014: 2609–2616.
    Figures(12)  / Tables(2)

    Article Metrics

    Article views (1090) PDF downloads(166) Cited by()
    Proportional views
    Related

    /

    DownLoad:  Full-Size Img  PowerPoint
    Return
    Return