Onboard visual-inertial relative pose and deck motion measurement for autonomous landing

  • Qiufu WANG,
  • Daoming BI,
  • Zhuo ZHANG,
  • Xiaoliang SUN,
  • Qifeng YU
  • 1. College of Aerospace Science and Engineering, National University of Defense Technology, Changsha 410073, China
    2. Hunan Provincial Key Laboratory of Image Measurement and Vision Navigation, Changsha 410073, China
    3. Shenyang Aircraft Design and Research Institute, AVIC, Shenyang 110850, China

Received date: 2024-09-26

  Revised date: 2024-12-10

  Accepted date: 2025-01-03

  Online published: 2025-01-10

Supported by

National Natural Science Foundation of China (12272404); Science and Technology Innovation Program of Hunan Province (2023RC3023)

Abstract

Onboard monocular pose measurement alone struggles to provide robust autonomous landing guidance: its accuracy is insufficient, its output is low-frequency and discontinuous, and it cannot estimate deck motion. To address these issues, an onboard visual-inertial measurement method based on an error-state Kalman filter is put forward. The proposed framework tightly integrates 2D key points and IMU data to achieve efficient and accurate relative pose and deck motion estimation under constrained conditions such as dynamic backgrounds and moving targets. Considering the motion characteristics of the aircraft and the ship, a novel asynchronous error-state updating strategy is proposed to achieve high-precision performance. The experimental results demonstrate that the average relative positioning accuracy is improved by about 180% compared with monocular methods, with the average translation error decreasing to 3% of the monocular counterpart. For deck motion estimation, the average error of the ship's Euler angles is about 0.1°. One cycle of state prediction and update completes within 0.02 ms. This accuracy and efficiency in relative pose and deck motion estimation give the proposed method significant capacity to integrate with various visual frontends and to deliver sound autonomous landing guidance.
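The error-state Kalman filter workflow summarized in the abstract (high-rate IMU propagation of a nominal state, with slower, asynchronous visual corrections of the error state) can be sketched as follows. This is a simplified illustration only, not the authors' implementation: it keeps just position/velocity error states, omits attitude, bias, and deck-motion states, and uses made-up noise parameters.

```python
import numpy as np

class ErrorStateKF:
    """Minimal error-state KF sketch: the IMU propagates the nominal state at
    high rate, while slower visual position fixes correct the error state,
    which is then injected back into the nominal state and reset."""

    def __init__(self, p0, v0):
        self.p = np.asarray(p0, dtype=float)  # nominal position
        self.v = np.asarray(v0, dtype=float)  # nominal velocity
        self.P = np.eye(6)                    # error-state covariance [dp, dv]
        self.Q = np.eye(6) * 1e-4             # process noise (assumed value)
        self.R = np.eye(3) * 1e-2             # vision noise (assumed value)

    def predict(self, acc, dt):
        """IMU step: integrate the nominal state, propagate error covariance."""
        self.p = self.p + self.v * dt + 0.5 * acc * dt**2
        self.v = self.v + acc * dt
        F = np.eye(6)
        F[0:3, 3:6] = np.eye(3) * dt          # dp' = dp + dv * dt
        self.P = F @ self.P @ F.T + self.Q * dt

    def update(self, p_meas):
        """Vision step: correct with a relative-position fix whenever the
        (asynchronous) visual frontend produces one."""
        H = np.hstack([np.eye(3), np.zeros((3, 3))])  # observe position only
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        dx = K @ (p_meas - self.p)            # estimated error state
        self.p += dx[0:3]                     # error injection
        self.v += dx[3:6]
        self.P = (np.eye(6) - K @ H) @ self.P # error reset
```

In this toy form, `predict` would run at the IMU rate (hundreds of Hz) and `update` only when a visual pose fix arrives, which mirrors the asynchronous update strategy described in the abstract; the full method additionally maintains attitude on the manifold and estimates deck motion states.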

Cite this article

Qiufu WANG, Daoming BI, Zhuo ZHANG, Xiaoliang SUN, Qifeng YU. Onboard visual-inertial relative pose and deck motion measurement for autonomous landing[J]. Acta Aeronautica et Astronautica Sinica, 2025, 46(13): 531268-531268. DOI: 10.7527/S1000-6893.2024.31268

References

[1] 甄子洋, 王新华, 江驹, 等. 舰载机自动着舰引导与控制研究进展[J]. 航空学报, 2017, 38(2): 020435.
  ZHEN Z Y, WANG X H, JIANG J, et al. Research progress in guidance and control of automatic carrier landing of carrier-based aircraft[J]. Acta Aeronautica et Astronautica Sinica, 2017, 38(2): 020435 (in Chinese).
[2] 张志冰, 甄子洋, 江驹, 等. 舰载机自动着舰引导与控制综述[J]. 南京航空航天大学学报, 2018, 50(6): 734-744.
  ZHANG Z B, ZHEN Z Y, JIANG J, et al. Review on development in guidance and control of automatic carrier landing of carrier-based aircraft[J]. Journal of Nanjing University of Aeronautics & Astronautics, 2018, 50(6): 734-744 (in Chinese).
[3] MENG Y, WANG W, HAN H, et al. A visual/inertial integrated landing guidance method for UAV landing on the ship[J]. Aerospace Science and Technology, 2019, 85: 474-480.
[4] ZHANG Z, WANG Q F, BI D M, et al. MC-LRF based pose measurement system for shipborne aircraft automatic landing[J]. Chinese Journal of Aeronautics, 2023, 36(8): 298-312.
[5] ZHOU J X, WANG Q F, ZHANG Z, et al. Aircraft carrier pose tracking based on adaptive region in visual landing[J]. Drones, 2022, 6(7): 182.
[6] CHEN Z Y, CHEN Q R, WANG Q F, et al. Infrared lights-based monocular pose measurement for autonomous rendezvous of dual-motion platforms[C]∥Third International Conference on Machine Vision, Automatic Identification, and Detection (MVAID 2024). New York: SPIE, 2024: 48.
[7] CHEN Q R, CHEN Z Y, ZHANG Z, et al. Vision-guided autonomous landing technology for noncooperative target[C]∥Third International Conference on Machine Vision, Automatic Identification, and Detection (MVAID 2024). New York: SPIE, 2024: 46.
[8] XIN L, TANG Z M, GAI W Q, et al. Vision-based autonomous landing for the UAV: A review[J]. Aerospace, 2022, 9(11): 634.
[9] 李启军, 王擎宇, 高江炜, 等. 航母舰载机着舰导航技术发展研究[J]. 舰船科学技术, 2023, 45(23): 206-211.
  LI Q J, WANG Q Y, GAO J W, et al. Development research of carrier-borne aircraft landing navigation technology[J]. Ship Science and Technology, 2023, 45(23): 206-211 (in Chinese).
[10] 甄冲, 曲晓雷, 王翼丰, 等. 视觉引导误差对自动着舰性能影响研究[J]. 航空工程进展, 2025, 16(1): 101-107, 116.
  ZHEN C, QU X L, WANG Y F, et al. Impact of visual guidance error on automatic carrier landing performance[J]. Advances in Aeronautical Science and Engineering, 2025, 16(1): 101-107, 116 (in Chinese).
[11] 齐广峰, 吕军锋. MEMS惯性技术的发展及应用[J]. 电子设计工程, 2015, 23(1): 87-89, 92.
  QI G F, LV J F. Evolution and application of MEMS inertial technology[J]. Electronic Design Engineering, 2015, 23(1): 87-89, 92 (in Chinese).
[12] 司书斌, 赵大伟, 徐婉莹, 等. 视觉—惯性导航定位技术研究进展[J]. 中国图象图形学报, 2021, 26(6): 1470-1482.
  SI S B, ZHAO D W, XU W Y, et al. Review on visual-inertial navigation and positioning technology[J]. Journal of Image and Graphics, 2021, 26(6): 1470-1482 (in Chinese).
[13] NING Y. A comprehensive introduction of visual-inertial navigation[DB/OL]. arXiv preprint: 2307.11758, 2023.
[14] MOURIKIS A I, ROUMELIOTIS S I. A multi-state constraint Kalman filter for vision-aided inertial navigation[C]∥ Proceedings 2007 IEEE International Conference on Robotics and Automation. Piscataway: IEEE Press, 2007: 3565-3572.
[15] QIN T, LI P L, SHEN S J. VINS-Mono: A robust and versatile monocular visual-inertial state estimator[J]. IEEE Transactions on Robotics, 2018, 34(4): 1004-1020.
[16] QIU K J, QIN T, GAO W L, et al. Tracking 3-D motion of dynamic objects using monocular visual-inertial sensing[J]. IEEE Transactions on Robotics, 2019, 35(4): 799-816.
[17] HENEIN M, KENNEDY G, ILA V, et al. Simultaneous localization and mapping with dynamic rigid objects[DB/OL]. arXiv preprint: 1805.03800, 2018.
[18] ECKENHOFF K, YANG Y L, GENEVA P, et al. Tightly-coupled visual-inertial localization and 3-D rigid-body target tracking[J]. IEEE Robotics and Automation Letters, 2019, 4(2): 1541-1548.
[19] GYAGENDA N, HATILIMA J V, ROTH H, et al. A review of GNSS-independent UAV navigation techniques[J]. Robotics and Autonomous Systems, 2022, 152: 104069.
[20] BORTZ J E. A new mathematical formulation for strapdown inertial navigation[J]. IEEE Transactions on Aerospace and Electronic Systems, 1971, AES-7(1): 61-66.
[21] SOLà J. Quaternion kinematics for the error-state Kalman filter[DB/OL]. arXiv preprint: 1711.02508, 2017.
[22] 高翔. 自动驾驶与机器人中的SLAM技术: 从理论到实践[M]. 北京: 电子工业出版社, 2023: 79-80.
  GAO X. SLAM technology in autonomous driving and robot: From theory to practice[M]. Beijing: Publishing House of Electronics Industry, 2023: 79-80 (in Chinese).
[23] SOLà J, DERAY J, ATCHUTHAN D. A micro Lie theory for state estimation in robotics[DB/OL]. arXiv preprint: 1812.01537, 2018.
[24] DENNINGER M, SUNDERMEYER M, WINKELBAUER D, et al. BlenderProc[DB/OL]. arXiv preprint: 1911.01911, 2019.
[25] LEPETIT V, MORENO-NOGUER F, FUA P. EPnP: An accurate O(n) solution to the PnP problem[J]. International Journal of Computer Vision, 2009, 81(2): 155-166.
[26] FISCHLER M A, BOLLES R C. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography[J]. Communications of the ACM, 1981, 24(6): 381-395.
[27] FORSTER C, CARLONE L, DELLAERT F, et al. On-manifold preintegration for real-time visual-inertial odometry[J]. IEEE Transactions on Robotics, 2017, 33(1): 1-21.