Onboard visual-inertial relative pose and deck motion measurement for autonomous landing

  • WANG Qiu-Fu,
  • ZHANG Zhuo,
  • BI Dao-Ming,
  • SUN Xiao-Liang,
  • YU Qi-Feng
  • 1. College of Aerospace Science and Engineering, National University of Defense Technology
    2. Shenyang Aircraft Design and Research Institute, Aviation Industry Corporation of China

Received date: 2024-09-26

  Revised date: 2025-01-10

  Online published: 2025-01-10

Supported by

National Natural Science Foundation of China


Abstract

In autonomous carrier landing guidance, relative pose measurement between the aircraft and the ship based on onboard monocular vision suffers from limited accuracy, low measurement frequency, and discontinuous results, and it cannot measure the motion of the ship deck, making it difficult to meet the strict requirements of the landing control system. To address these problems, a visual-inertial fusion method for measuring the aircraft-ship relative pose and deck motion is proposed. Using the measurements of the onboard monocular camera and the inertial measurement unit, and taking the image coordinates of 2D key points as the observation vector, a tightly coupled filtering framework based on the error-state Kalman filter is constructed, which solves the aircraft-ship relative pose and the ship deck motion efficiently and accurately under constrained observation conditions such as dynamic backgrounds and a moving target platform. Considering the motion characteristics of the aircraft and the ship, an asynchronous error-state update strategy is proposed that updates the aircraft and ship motion states in separate steps, improving the estimation accuracy of the pose parameters. Experimental results show that, compared with onboard monocular pose estimation, the method improves the average relative positioning accuracy over full sequences by about 180% and reduces the average translation error to 3%; the average measurement error of the deck-motion Euler angles is about 0.1°, and the average time per filtering cycle is less than 0.2 ms. The method thus achieves efficient and high-precision measurement of the aircraft-ship relative pose and deck motion, can be combined as a back-end algorithm with various visual front ends, and provides a reliable basis for autonomous carrier landing guidance.
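As a rough illustration of the tightly coupled filtering step described above, the sketch below shows a generic error-state Kalman measurement update driven by 2D key-point image coordinates, assuming a pinhole camera model and known 3D key points on the deck. The function names, state layout, and noise model are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def project_keypoint(K, R_cd, t_cd, p_deck):
        # Project a known 3D deck key point (expressed in the deck frame) into
        # the onboard camera image with a pinhole model.
        p_cam = R_cd @ p_deck + t_cd        # deck frame -> camera frame
        uvw = K @ p_cam                     # pinhole projection
        return uvw[:2] / uvw[2]             # predicted 2D image coordinates

    def eskf_keypoint_update(P, H, residual, pixel_sigma=1.0):
        # One error-state Kalman measurement update driven by stacked 2D key-point
        # residuals (observed minus predicted image coordinates).
        #   P        : prior covariance of the error state, shape (n, n)
        #   H        : Jacobian of the stacked reprojection w.r.t. the error state, (2m, n)
        #   residual : stacked 2D residuals, shape (2m,)
        R = (pixel_sigma ** 2) * np.eye(residual.size)   # image-noise covariance
        S = H @ P @ H.T + R                              # innovation covariance
        K_gain = P @ H.T @ np.linalg.inv(S)              # Kalman gain
        dx = K_gain @ residual                           # error-state correction
        P_post = (np.eye(P.shape[0]) - K_gain @ H) @ P   # posterior covariance
        return dx, P_post                                # dx is then injected into the nominal state

Between camera frames, the nominal aircraft and ship states would be propagated with the IMU at a higher rate; the update above runs only when an image with detected key points arrives.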

Cite this article

WANG Qiu-Fu, ZHANG Zhuo, BI Dao-Ming, SUN Xiao-Liang, YU Qi-Feng. Onboard visual-inertial relative pose and deck motion measurement for autonomous landing[J]. Acta Aeronautica et Astronautica Sinica, 0: 1-0. DOI: 10.7527/S1000-6893.2024.31268

Abstract

Owing to its insufficient accuracy, low measurement frequency, discontinuous results, and inability to estimate deck motion, onboard monocular pose measurement struggles to support robust autonomous landing guidance. To address these issues, an onboard visual-inertial measurement method based on the error-state Kalman filter is put forward. The proposed framework tightly integrates 2D key points and IMU data to realize efficient and accurate relative pose and deck motion estimation under constrained observation conditions such as dynamic backgrounds and a moving target platform. Considering the motion characteristics of the aircraft and the ship, a novel asynchronous error-state updating strategy is proposed to achieve high-precision performance. The experimental results demonstrate that the average relative positioning accuracy is improved by about 180% compared with monocular methods, with the average translation error decreasing to 3% of the monocular counterpart. For deck motion estimation, the average error of the ship Euler angles is about 0.1°. A full cycle of state prediction and update takes less than 0.2 ms. The accuracy and efficiency of the relative pose and deck motion estimation enable the proposed method to be combined, as a back end, with various visual front ends to support reliable autonomous landing guidance.
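The asynchronous error-state updating strategy is only summarized above; one possible reading, sketched below, partitions the error state into an aircraft block and a ship block and corrects them in separate steps. The partitioning, index layout, and the simplified treatment of cross-covariances are illustrative assumptions, not the paper's algorithm.

    import numpy as np

    def blockwise_update(P, H, residual, R, blocks):
        # Correct the error state one sub-block at a time (e.g. aircraft block first,
        # then ship-deck block). Cross-covariances between blocks are ignored in this
        # simplified sketch, so each step touches only its own sub-state and sub-covariance.
        #   blocks : list of index arrays, one per sub-state, in update order
        dx = np.zeros(P.shape[0])
        P = P.copy()
        for idx in blocks:
            H_b = H[:, idx]                            # Jacobian restricted to this sub-state
            P_bb = P[np.ix_(idx, idx)]                 # covariance of this sub-state
            S = H_b @ P_bb @ H_b.T + R                 # innovation covariance for this step
            K = P_bb @ H_b.T @ np.linalg.inv(S)        # gain for this block only
            dx[idx] += K @ (residual - H @ dx)         # residual adjusted for earlier corrections
            P[np.ix_(idx, idx)] = (np.eye(len(idx)) - K @ H_b) @ P_bb
        return dx, P

    # Hypothetical layout: error states 0-8 describe the aircraft, 9-14 the ship deck.
    # dx, P = blockwise_update(P, H, r, R, [np.arange(0, 9), np.arange(9, 15)])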
