High-accuracy self-localization of lunar rovers is a key problem in lunar surface scientific exploration. To achieve accurate localization in the featureless environments of the lunar surface, we propose a new visual-inertial Simultaneous Localization and Mapping (SLAM) algorithm that fuses visual and inertial measurements through pose-graph optimization. A quadtree-based optical flow tracking algorithm is proposed to bound the front-end visual data-association error in featureless environments; it reliably tracks robust feature points and thereby improves the accuracy of pose estimation between adjacent frames. In addition, an effective star point removal algorithm is proposed to eliminate star points at infinity, mitigating the loss of localization accuracy caused by these unstable distant landmarks. We further build a computer simulation system of the lunar surface environment together with a set of lunar visual-inertial SLAM simulation datasets, and conduct localization tests in several simulated lunar environments. The simulation results show that our algorithm is more robust and achieves better localization accuracy.
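The two front-end ideas above can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, the per-cell feature budget, and the parallax threshold are all illustrative assumptions. The quadtree recursively subdivides the image and keeps only the strongest feature per leaf cell, so tracked points stay spatially well distributed even when texture is sparse; the star-point filter drops landmarks whose image-plane motion between two frames is below a parallax threshold, since points at infinity barely move and carry no depth information.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    x: float
    y: float
    response: float  # corner strength, e.g. a Shi-Tomasi score

def quadtree_select(features, x0, y0, w, h, max_per_cell=1, min_size=64):
    """Recursively split the image region and keep at most `max_per_cell`
    strongest features per leaf cell, enforcing an even spatial spread."""
    inside = [f for f in features if x0 <= f.x < x0 + w and y0 <= f.y < y0 + h]
    if len(inside) <= max_per_cell or (w <= min_size and h <= min_size):
        return sorted(inside, key=lambda f: f.response, reverse=True)[:max_per_cell]
    hw, hh = w / 2, h / 2
    kept = []
    for cx, cy in ((x0, y0), (x0 + hw, y0), (x0, y0 + hh), (x0 + hw, y0 + hh)):
        kept += quadtree_select(inside, cx, cy, hw, hh, max_per_cell, min_size)
    return kept

def remove_star_points(tracks, min_parallax_px=1.0):
    """Drop feature tracks whose displacement across two frames is below a
    parallax threshold: landmarks at infinity (e.g. stars) show near-zero
    parallax and destabilize the pose-graph optimization."""
    kept = []
    for (x1, y1), (x2, y2) in tracks:
        if ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5 >= min_parallax_px:
            kept.append(((x1, y1), (x2, y2)))
    return kept
```

In practice the selected features would seed a pyramidal Lucas-Kanade tracker, and the parallax test would run before landmarks are triangulated and inserted into the pose graph.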
XIE Hongle, CHEN Weidong, FAN Yaxian, WANG Jingchuan. Visual-inertial SLAM in featureless environments on lunar surface[J]. Acta Aeronautica et Astronautica Sinica, 2021, 42(1): 524169.
DOI: 10.7527/S1000-6893.2020.24169