Insufficient accuracy, low update frequency, discontinuity, and the absence of deck motion estimation make it difficult for onboard monocular pose measurement to provide robust autonomous landing guidance. To address these issues, an onboard visual-inertial measurement method based on an error-state Kalman filter is proposed. The framework tightly couples 2D key points with IMU data to achieve efficient and accurate relative pose and deck motion estimation under constrained conditions such as dynamic backgrounds and moving targets. Considering the motion characteristics of the aircraft and the ship, a novel asynchronous error-state updating strategy is proposed to achieve high-precision estimation. Experimental results show that, compared with monocular methods, the average relative positioning accuracy is improved by about 180%, with the average translation error decreasing to 3% of that of the monocular counterpart. For deck motion estimation, the average error of the ship's Euler angles is about 0.1 degree, and one cycle of state prediction and update completes within 0.2 milliseconds. This accuracy and efficiency in relative and deck motion estimation allow the proposed method to be integrated with various visual frontends to perform reliable autonomous landing guidance.
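The core loop described above, where IMU data drives prediction and 2D key-point observations drive corrections, can be sketched as a generic error-state Kalman filter. This is a minimal illustration under standard ESKF assumptions, not the authors' implementation; the class name, matrix arguments, and dimensions are all illustrative.

```python
import numpy as np

class ErrorStateKF:
    """Minimal error-state Kalman filter sketch.

    The nominal state (pose, velocity, biases) is propagated separately
    with IMU data; this filter estimates only a small error state dx,
    which is corrected by visual measurements (e.g., 2D key points)
    and injected back into the nominal state, then reset to zero.
    """

    def __init__(self, dim_err, P0, Q):
        self.dx = np.zeros(dim_err)  # error state (reset after each injection)
        self.P = P0                  # error-state covariance
        self.Q = Q                   # discrete process noise

    def predict(self, F):
        # Propagate the error covariance with the linearized transition F
        # obtained from IMU integration over one step.
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z, h, H, R):
        # z: measurement (e.g., detected key-point coordinates)
        # h: measurement predicted from the nominal state
        # H: measurement Jacobian w.r.t. the error state
        # R: measurement noise covariance
        y = z - h                            # innovation
        S = H @ self.P @ H.T + R             # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)  # Kalman gain
        self.dx = K @ y
        I = np.eye(self.P.shape[0])
        self.P = (I - K @ H) @ self.P
        return self.dx                       # to be injected into the nominal state
```

In an asynchronous scheme like the one proposed, `predict` would run at the (high) IMU rate while `update` runs only when a (lower-rate) visual measurement arrives, which is what keeps one prediction/update cycle fast.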