Review


A survey of resilient self-localization for UAV

  • Chunhui ZHAO,
  • Anmeng LIU,
  • Yang LYU,
  • Quan PAN
  • 1.School of Automation, Northwestern Polytechnical University, Xi’an 710072, China
    2.Key Laboratory of Information Fusion Technology, Ministry of Education, Xi’an 710072, China
E-mail: lyu.yang@nwpu.edu.cn

Received date: 2023-04-07

  Revised date: 2023-05-17

  Accepted date: 2023-06-28

  Online published: 2023-07-07

Supported by

National Natural Science Foundation of China (62073264); Key Research and Development Program of Shaanxi Province (2021ZDLGY01-01)


Cite this article

ZHAO C H, LIU A M, LYU Y, PAN Q. A survey of resilient self-localization for UAV[J]. Acta Aeronautica et Astronautica Sinica, 2024, 45(8): 28839-028839. DOI: 10.7527/S1000-6893.2023.28839

Abstract

At present, research on self-localization technology for Unmanned Aerial Vehicles (UAVs) primarily targets the localization requirements of simple tasks in sparse, friendly environments on platforms with specific hardware configurations. These technologies lack continuity, high reliability, and strong adaptability in large-scale, complex, dense environments and over long-term, demanding missions, which limits the wider-range and larger-scale application of UAVs. This paper focuses on resilient self-localization technology for UAVs. Starting from the three core elements of the self-localization system loop, namely perception, estimation, and control, and emphasizing resilience indicators such as continuity, reliability, and adaptability, the paper reviews and evaluates domestic and foreign research work from the perspectives of multi-source redundant information fusion, robust back-end estimation, and perception-aware control strategies. The limitations of current UAV self-localization technology under resilience requirements are highlighted, and the technical difficulties of integrating these methods under limited onboard resources are pointed out. Finally, development trends in UAV resilient self-localization technology are discussed.
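As a concrete illustration of two of the ideas named above, the following minimal Python sketch fuses two redundant position sources with a one-dimensional Kalman filter and applies a chi-square innovation gate so that a grossly degraded source (e.g., GNSS over a denied segment) is rejected rather than fused. This is an illustrative toy under stated assumptions, not the method of the paper or of any work it surveys; all sensor names, noise variances, and thresholds here are hypothetical.

import random

class ResilientFuser1D:
    """Toy 1-D position fuser: Kalman update plus innovation gating."""

    def __init__(self, x0=0.0, p0=1.0, q=0.01, gate=9.0):
        self.x, self.p = x0, p0   # state estimate and its variance
        self.q = q                # process noise added at each prediction
        self.gate = gate          # chi-square gate (1 dof, roughly 3-sigma)

    def predict(self, u=0.0):
        """Propagate the state with a known per-step displacement u."""
        self.x += u
        self.p += self.q

    def update(self, z, r):
        """Fuse measurement z (variance r); reject it if it fails the gate."""
        innov = z - self.x
        s = self.p + r                       # innovation variance
        if innov * innov / s > self.gate:    # robust step: gate outliers
            return False                     # source ignored this step
        k = self.p / s                       # Kalman gain
        self.x += k * innov
        self.p *= 1.0 - k
        return True

if __name__ == "__main__":
    random.seed(1)
    truth, fuser = 0.0, ResilientFuser1D()
    for step in range(50):
        truth += 0.5
        fuser.predict(u=0.5)
        # Two redundant sources: visual odometry (healthy throughout) and
        # GNSS, which becomes grossly biased after step 25 (denied segment).
        vo = truth + random.gauss(0.0, 0.2)
        gnss = truth + random.gauss(0.0, 0.5) + (30.0 if step > 25 else 0.0)
        fuser.update(vo, r=0.04)
        used = fuser.update(gnss, r=0.25)
        if step % 10 == 0:
            print(f"step {step:2d}  est {fuser.x:6.2f}  truth {truth:6.2f}  gnss_used={used}")

In a full system the same gating idea generalizes to multi-dimensional states and to switching among sensor subsets, which is the kind of multi-source redundancy the survey reviews.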
