Indoor integrated navigation system for unmanned aerial vehicles based on neural network predictive compensation

  • GUAN Xiangzhong,
  • CAI Chenxiao,
  • ZHAI Wenhua,
  • WANG Lei,
  • SHAO Peng
  • 1. Shanghai Electro-Mechanical Engineering Institute, Shanghai 201109, China;
    2. School of Automation, Nanjing University of Science and Technology, Nanjing 210094, China

Received date: 2019-11-15

Revised date: 2019-11-28

Online published: 2020-01-10

Abstract

Aiming at the problem that the reliability of data fusion drops sharply when the environmental characteristics around an unmanned aerial vehicle change abruptly, this paper proposes a compensation algorithm based on neural network prediction. First, an extended Kalman filter and a particle filter are used to fuse the data from the laser and optical flow sensors, and a Radial Basis Function (RBF) neural network is then used to estimate the error between the position estimates before and after the particle filter. When the laser data are reliable, the RBF neural network runs in learning mode; when the laser data are interrupted or unreliable, the trained model is used to compensate the system output. Hover and trajectory-tracking experiments with an unmanned aerial vehicle in an indoor environment show that when the laser data become unreliable, the compensated position remains reliable for navigation.
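To make the mode-switching idea concrete, the following is a minimal Python sketch, not the authors' implementation. It assumes the EKF/particle-filter pipeline supplies a fused position before and after the particle-filter stage at each step, together with a flag indicating whether the laser data are currently reliable; the names (RBFCompensator, fuse_step, laser_reliable) are illustrative only. An RBF network with fixed Gaussian centers learns the filter's correction online while the laser is good and replays the learned correction when it is not.

    # Hypothetical sketch of the learning/compensation switch (not the paper's code).
    import numpy as np

    class RBFCompensator:
        """RBF network that learns the correction the particle filter applies to the
        raw fused position, so the correction can be replayed when laser data fail."""

        def __init__(self, centers, sigma=0.5, lr=0.05):
            self.centers = np.asarray(centers, dtype=float)   # fixed Gaussian centers
            self.sigma = sigma                                 # common kernel width
            self.lr = lr                                       # gradient-step size
            self.weights = np.zeros((len(self.centers), self.centers.shape[1]))

        def _phi(self, x):
            # Gaussian activations of all hidden units for input position x
            d2 = np.sum((self.centers - x) ** 2, axis=1)
            return np.exp(-d2 / (2.0 * self.sigma ** 2))

        def predict(self, x):
            # Estimated correction (post-filter minus pre-filter position)
            return self._phi(x) @ self.weights

        def train_step(self, x, target_correction):
            # One LMS-style update; run only while the laser data are reliable
            phi = self._phi(x)
            err = target_correction - phi @ self.weights
            self.weights += self.lr * np.outer(phi, err)

    def fuse_step(pre_filter_pos, post_filter_pos, laser_reliable, rbf):
        """Learning mode when the laser is good, compensation mode otherwise."""
        if laser_reliable:
            rbf.train_step(pre_filter_pos, post_filter_pos - pre_filter_pos)
            return post_filter_pos                            # trust the full EKF/PF fusion
        return pre_filter_pos + rbf.predict(pre_filter_pos)   # replay learned correction

In practice the centers would be spread over the flight volume and the kernel width and learning rate tuned to the sensor noise; the sketch only illustrates the learning/compensation switch described in the abstract.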

Cite this article

GUAN Xiangzhong, CAI Chenxiao, ZHAI Wenhua, WANG Lei, SHAO Peng. Indoor integrated navigation system for unmanned aerial vehicles based on neural network predictive compensation[J]. ACTA AERONAUTICA ET ASTRONAUTICA SINICA, 2020, 41(S1): 723790-723790. DOI: 10.7527/S1000-6893.2019.23790
