Volume 46, Issue 7, Jul. 2024
Citation: LI Dongsheng, WANG Guoyan, LIU Jinxin, FAN Hongqi, LI Biao. Joint Internal and External Parameters Calibration of Optical Compound Eye Based on Random Noise Calibration Pattern[J]. Journal of Electronics & Information Technology, 2024, 46(7): 2898-2907. doi: 10.11999/JEIT230652

Joint Internal and External Parameters Calibration of Optical Compound Eye Based on Random Noise Calibration Pattern

doi: 10.11999/JEIT230652
Funds:  The National Natural Science Foundation of China (62303478)
  • Received Date: 2023-07-03
  • Rev Recd Date: 2024-05-01
  • Available Online: 2024-05-17
  • Publish Date: 2024-07-29
  • Abstract: In tasks such as precision guidance and obstacle-avoidance navigation based on optical compound eyes, accurate calibration of the compound eye is crucial. The classical Zhang calibration method requires each ommatidium of the compound eye to observe a complete chessboard pattern, a requirement that the complex structure of optical compound eyes makes difficult to satisfy in practice. In this paper, a joint calibration algorithm for the internal and external parameters of optical compound eyes based on a random noise calibration pattern is proposed. The algorithm exploits the local information captured by each ommatidium when imaging the random noise pattern, enabling simple and fast calibration of compound eyes with arbitrary configurations and numbers of ommatidia. To improve robustness, a multi-threshold matching mechanism is introduced to handle matching failures caused by the sparsity of feature points within an ommatidium's field of view. Moreover, an error model for the joint internal and external parameter calibration of optical compound eyes is presented to evaluate the accuracy of the proposed algorithm. Experimental comparisons with Zhang's calibration method demonstrate the robustness of the proposed algorithm, and its high accuracy is further validated on a physical optical compound eye system.
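To make the idea in the abstract concrete, the following is a minimal illustrative sketch, not the authors' implementation: each ommatidium sees only a local patch of the planar random-noise pattern, so 2D-3D correspondences are recovered by matching descriptors against the known pattern image, with the ratio-test threshold relaxed stepwise when matches are sparse (a stand-in for the paper's multi-threshold matching mechanism), after which a Zhang-style calibration is run per ommatidium. The feature detector (SIFT), the OpenCV calls, and all constants (PATTERN_MM_PER_PX, RATIO_THRESHOLDS, MIN_MATCHES) are assumptions, not details taken from the paper.

```python
import cv2
import numpy as np

PATTERN_MM_PER_PX = 0.25             # assumed metric scale of the printed noise pattern
RATIO_THRESHOLDS = (0.6, 0.7, 0.8)   # multi-threshold matching: strict first, relax if matches are sparse
MIN_MATCHES = 20                     # assumed minimum correspondences per view

sift = cv2.SIFT_create()
matcher = cv2.BFMatcher(cv2.NORM_L2)

def match_to_pattern(pattern_gray, view_gray):
    """Match one ommatidium view against the full noise pattern; return 3D/2D correspondences."""
    kp_p, des_p = sift.detectAndCompute(pattern_gray, None)
    kp_v, des_v = sift.detectAndCompute(view_gray, None)
    if des_p is None or des_v is None:
        return None
    knn = matcher.knnMatch(des_v, des_p, k=2)
    good = []
    for thr in RATIO_THRESHOLDS:      # relax the Lowe ratio test until enough matches survive
        good = [p[0] for p in knn if len(p) == 2 and p[0].distance < thr * p[1].distance]
        if len(good) >= MIN_MATCHES:
            break
    if len(good) < MIN_MATCHES:
        return None                   # this view cannot be used for calibration
    # The pattern is planar: its pixel positions map to metric plane points with z = 0.
    obj = np.array([(kp_p[m.trainIdx].pt[0] * PATTERN_MM_PER_PX,
                     kp_p[m.trainIdx].pt[1] * PATTERN_MM_PER_PX, 0.0) for m in good],
                   dtype=np.float32).reshape(-1, 1, 3)
    img = np.array([kp_v[m.queryIdx].pt for m in good],
                   dtype=np.float32).reshape(-1, 1, 2)
    return obj, img

def calibrate_ommatidium(pattern_gray, view_grays):
    """Zhang-style estimation of one ommatidium's intrinsics and per-view extrinsics
    from several partial views of the planar random-noise pattern."""
    obj_pts, img_pts = [], []
    for view in view_grays:
        corr = match_to_pattern(pattern_gray, view)
        if corr is not None:
            obj_pts.append(corr[0])
            img_pts.append(corr[1])
    h, w = view_grays[0].shape[:2]
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, (w, h), None, None)
    return rms, K, dist, rvecs, tvecs
```

Under these assumptions, the per-view rvecs/tvecs give each ommatidium's pose relative to the pattern plane, so extrinsics between ommatidia observing the same pattern placement could be chained from those poses; the paper's own error model and joint optimization are not reproduced here.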
References

[1] QI Qiming, FU Ruigang, SHAO Zhengzheng, et al. Multi-aperture optical imaging systems and their mathematical light field acquisition models[J]. Frontiers of Information Technology & Electronic Engineering, 2022, 23(6): 823–844. doi: 10.1631/FITEE.2100058.
[2] WANG Guofeng, WANG Li, LIU Yan, et al. Research of missile based on ommateum[J]. Journal of Projectiles, Missiles and Guidance, 2002(S2): 102–104.
[3] LU Liming. Studies on missile guidance using fly's ommateum technology[D]. [Ph.D. dissertation], Northwestern Polytechnical University, 2002.
[4] SONG Yanfeng, HAO Qun, CAO Jie, et al. The application of artificial compound eye in precision guided munitions for urban combat[J]. Journal of Physics: Conference Series, 2023, 2460: 012170. doi: 10.1088/1742-6596/2460/1/012170.
[5] MA Mengchao, LI Hang, GAO Xicheng, et al. Target orientation detection based on a neural network with a bionic bee-like compound eye[J]. Optics Express, 2020, 28(8): 10794–10805. doi: 10.1364/OE.388125.
[6] DENG Xinpeng, QIU Su, JIN Weiqi, et al. Three-dimensional reconstruction method for bionic compound-eye system based on MVSNet network[J]. Electronics, 2022, 11(11): 1790. doi: 10.3390/electronics11111790.
[7] MARTIN N and FRANCESCHINI N. Obstacle avoidance and speed control in a mobile vehicle equipped with a compound eye[C]. Intelligent Vehicles '94 Symposium, Paris, France, 1994: 381–386. doi: 10.1109/IVS.1994.639548.
[8] HAN Yibo, LI Xia, LI Xiaocui, et al. Recognition and detection of wide field bionic compound eye target based on cloud service network[J]. Frontiers in Bioengineering and Biotechnology, 2022, 10: 865130. doi: 10.3389/fbioe.2022.865130.
[9] LEI Weining, GUO Yunzhi, and GAO Tingting. Study on the structure of large field view detection system based on bionic compound eye[J]. Optics & Optoelectronic Technology, 2016, 14(3): 62–66.
[10] XU Huangrong, LIU Jinheng, ZHANG Yuanjie, et al. UAV-borne biomimetic curved compound-eye imaging system for velocity measurement[J]. Acta Photonica Sinica, 2021, 50(9): 0911004. doi: 10.3788/gzxb20215009.0911004.
[11] KOO G, JUNG W, and DOH N. A two-step optimization for extrinsic calibration of Multiple Camera System (MCS) using depth-weighted normalized points[J]. IEEE Robotics and Automation Letters, 2021, 6(4): 6608–6615. doi: 10.1109/LRA.2021.3094412.
[12] YUAN Zeqiang, GU Yuzhang, QIU Shoumeng, et al. Calibration and target position of bionic curved compound eye composed of multiple cameras[J]. Acta Photonica Sinica, 2021, 50(9): 0911005. doi: 10.3788/gzxb20215009.0911005.
[13] ZHAO Ziliang, ZHANG Zonghua, GAO Nan, et al. Calibration of multiple cameras based on ChArUco board[J]. Journal of Applied Optics, 2021, 42(5): 848–852. doi: 10.5768/JAO202142.0502004.
[14] GEIGER A, MOOSMANN F, CAR Ö, et al. Automatic camera and range sensor calibration using a single shot[C]. 2012 IEEE International Conference on Robotics and Automation, Saint Paul, USA, 2012: 3936–3943. doi: 10.1109/ICRA.2012.6224570.
[15] HARTLEY R and KANG S B. Parameter-free radial distortion correction with center of distortion estimation[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2007, 29(8): 1309–1321. doi: 10.1109/TPAMI.2007.1147.
[16] ZHU Chen, ZHOU Zihan, XING Ziran, et al. Robust plane-based calibration of multiple non-overlapping cameras[C]. The 2016 Fourth International Conference on 3D Vision, Stanford, USA, 2016: 658–666. doi: 10.1109/3DV.2016.73.
[17] HENG L, BÜRKI M, LEE G H, et al. Infrastructure-based calibration of a multi-camera rig[C]. Proceedings of 2014 IEEE International Conference on Robotics and Automation, Hong Kong, China, 2014: 4912–4919. doi: 10.1109/ICRA.2014.6907579.
[18] LIN Yukai, LARSSON V, GEPPERT M, et al. Infrastructure-based multi-camera calibration using radial projections[C]. The 16th European Conference on Computer Vision, Glasgow, UK, 2020: 327–344. doi: 10.1007/978-3-030-58517-4_20.
[19] HENG L, FURGALE P, and POLLEFEYS M. Leveraging image-based localization for infrastructure-based calibration of a multi-camera rig[J]. Journal of Field Robotics, 2015, 32(5): 775–802. doi: 10.1002/rob.21540.
[20] DEXHEIMER E, PELUSE P, CHEN Jianhui, et al. Information-theoretic online multi-camera extrinsic calibration[J]. IEEE Robotics and Automation Letters, 2022, 7(2): 4757–4764. doi: 10.1109/LRA.2022.3145061.
[21] ZHANG Z. A flexible new technique for camera calibration[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(11): 1330–1334. doi: 10.1109/34.888718.
[22] XING Ziran, YU Jingyi, and MA Yi. A new calibration technique for multi-camera systems of limited overlapping field-of-views[C]. 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vancouver, Canada, 2017: 5892–5899. doi: 10.1109/IROS.2017.8206482.
[23] ATCHESON B, HEIDE F, and HEIDRICH W. CALTag: High precision fiducial markers for camera calibration[C]. The 15th International Workshop on Vision, Siegen, Germany, 2010: 41–48. doi: 10.2312/PE/VMV/VMV10/041-048.
[24] FIALA M. ARTag, a fiducial marker system using digital techniques[C]. 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Diego, USA, 2005: 590–596. doi: 10.1109/CVPR.2005.74.
[25] FIALA M and SHU Chang. Self-identifying patterns for plane-based camera calibration[J]. Machine Vision and Applications, 2008, 19(4): 209–216. doi: 10.1007/s00138-007-0093-z.
[26] HEROUT A, SZENTANDRÁSI I, ZACHARIÁ M, et al. Five shades of grey for fast and reliable camera pose estimation[C]. Proceedings of 2013 IEEE Conference on Computer Vision and Pattern Recognition, Portland, USA, 2013: 1384–1390. doi: 10.1109/CVPR.2013.182.
[27] LI Bo, HENG L, KOSER K, et al. A multiple-camera system calibration toolbox using a feature descriptor-based calibration pattern[C]. 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 2013: 1301–1307. doi: 10.1109/IROS.2013.6696517.
[28] SCARAMUZZA D, MARTINELLI A, and SIEGWART R. A flexible technique for accurate omnidirectional camera calibration and structure from motion[C]. Proceedings of the Fourth IEEE International Conference on Computer Vision Systems, New York, USA, 2006: 45. doi: 10.1109/ICVS.2006.3.
[29] SCARAMUZZA D, MARTINELLI A, and SIEGWART R. A toolbox for easily calibrating omnidirectional cameras[C]. Proceedings of 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 2006: 5695–5701. doi: 10.1109/IROS.2006.282372.
[30] WANG Zhou, BOVIK A C, SHEIKH H R, et al. Image quality assessment: From error visibility to structural similarity[J]. IEEE Transactions on Image Processing, 2004, 13(4): 600–612. doi: 10.1109/TIP.2003.819861.
[31] FATMA A, KHALED K, and ZEMZEMI F. Design, construction and calibration of an omnidirectional camera[C]. 2013 International Conference on Individual and Collective Behaviors in Robotics, Sousse, Tunisia, 2013: 49–55. doi: 10.1109/ICBR.2013.6729265.
[32] MEI C and RIVES P. Single view point omnidirectional camera calibration from planar grids[C]. 2007 IEEE International Conference on Robotics and Automation, Rome, Italy, 2007: 3945–3950. doi: 10.1109/ROBOT.2007.364084.
[33] HEIKKILA J and SILVEN O. A four-step camera calibration procedure with implicit image correction[C]. IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Juan, USA, 1997: 1106–1112. doi: 10.1109/CVPR.1997.609468.
[34] RAMEAU F, PARK J, BAILO O, et al. MC-Calib: A generic and robust calibration toolbox for multi-camera systems[J]. Computer Vision and Image Understanding, 2022, 217: 103353. doi: 10.1016/j.cviu.2021.103353.
