
Indoor integrated navigation system for unmanned aerial vehicles based on neural network predictive compensation
GUAN Xiangzhong1, CAI Chenxiao2, ZHAI Wenhua1, WANG Lei1, SHAO Peng1
1. Shanghai Electro-Mechanical Engineering Institute, Shanghai 201109, China;
2. School of Automation, Nanjing University of Science and Technology, Nanjing 210094, China
Abstract: When the environmental characteristics around an unmanned aerial vehicle (UAV) change abruptly, the reliability of data fusion degrades drastically. To address this problem, this paper proposes an algorithm based on neural network prediction and compensation. First, an extended Kalman filter and a particle filter fuse the data from the laser and optical-flow sensors; a Radial Basis Function (RBF) neural network then estimates the error before and after the particle filter is applied. When the laser data are reliable, the RBF neural network enters a learning mode; when the laser data are interrupted or unreliable, the trained model compensates the system. Hover and trajectory experiments with a UAV in an indoor environment show that even when the laser data are unreliable, the compensated position remains reliable for navigation.
Keywords: integrated navigation system    RBF neural network    extended Kalman filter    particle filter    predictive compensation

1 EKF/PF indoor integrated navigation system

Fig. 1 Structure of the EKF/PF integrated indoor navigation system

$p_{\mathrm{ins},t} = \boldsymbol{\varPhi} p_{\mathrm{ins},t-1}$ (1)
$\boldsymbol{P}_{t,t-1} = \boldsymbol{\varPhi} \boldsymbol{P}_{t-1} \boldsymbol{\varPhi}^{\mathrm{T}} + \boldsymbol{Q}$ (2)

$\boldsymbol{H}_t = \boldsymbol{P}_{t,t-1} \boldsymbol{C}^{\mathrm{T}} {(\boldsymbol{C} \boldsymbol{P}_{t,t-1} \boldsymbol{C}^{\mathrm{T}} + \boldsymbol{R})}^{-1}$ (3)
$p_t = p_{\mathrm{ins},t} + \boldsymbol{H}_t (p_{\mathrm{of},t} - \boldsymbol{C} p_{\mathrm{ins},t})$ (4)
$\boldsymbol{P}_t = (\boldsymbol{I} - \boldsymbol{H}_t \boldsymbol{C}) \boldsymbol{P}_{t,t-1}$ (5)
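Assuming position vectors and matching matrix dimensions, Eqs. (1)-(5) amount to one predict-correct cycle of the EKF. A minimal NumPy sketch (the function name and test matrices are illustrative, not from the paper):

```python
import numpy as np

def ekf_fuse(p_ins, P, z_of, Phi, C, Q, R):
    """One EKF cycle fusing an INS position prediction with an
    optical-flow measurement, following Eqs. (1)-(5)."""
    # Eq. (1): propagate the INS position estimate
    p_pred = Phi @ p_ins
    # Eq. (2): propagate the error covariance
    P_pred = Phi @ P @ Phi.T + Q
    # Eq. (3): Kalman gain
    H = P_pred @ C.T @ np.linalg.inv(C @ P_pred @ C.T + R)
    # Eq. (4): correct the prediction with the optical-flow residual
    p_est = p_pred + H @ (z_of - C @ p_pred)
    # Eq. (5): update the covariance
    P_est = (np.eye(len(p_pred)) - H @ C) @ P_pred
    return p_est, P_est
```

When the optical-flow measurement agrees with the prediction, the residual vanishes and the corrected position equals the propagated one, while the covariance still shrinks.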

$\left\{ \begin{array}{l} \psi_t = \psi_{t-1} + \Delta\psi_t \\ x_t = x_{t-1} + \cos\psi_t\,\Delta x_t - \sin\psi_t\,\Delta y_t \\ y_t = y_{t-1} + \sin\psi_t\,\Delta x_t + \cos\psi_t\,\Delta y_t \end{array} \right.$ (6)

$Z(t) = h(X(t))$ (7)

$\left\{ \begin{array}{l} -5^\circ \le \psi_0 \le 5^\circ \\ -2\ \mathrm{cm} \le x_0 \le 2\ \mathrm{cm} \\ -2\ \mathrm{cm} \le y_0 \le 2\ \mathrm{cm} \end{array} \right.$

$\omega_t^i = \exp\left( -\frac{1}{2}(Z_t^i - Z_t) \right)$ (8)

$\omega_t^{(i)} = \frac{\omega_t^i}{\sum\limits_{i = 1}^N \omega_t^i}$ (9)
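The particle step defined by Eqs. (6), (8) and (9) can be sketched compactly: each particle $(\psi, x, y)$ is dead-reckoned forward, weighted by its measurement discrepancy, and the weights are normalized. The weight expression follows Eq. (8) as printed; in practice the discrepancy is usually squared and scaled by the measurement noise.

```python
import numpy as np

def propagate(particles, dpsi, dx, dy):
    """Eq. (6): dead-reckon each particle; columns are (psi, x, y)."""
    psi = particles[:, 0] + dpsi
    x = particles[:, 1] + np.cos(psi) * dx - np.sin(psi) * dy
    y = particles[:, 2] + np.sin(psi) * dx + np.cos(psi) * dy
    return np.column_stack([psi, x, y])

def normalized_weights(z_particles, z_meas):
    """Eq. (8): exponential weight from each particle's predicted
    measurement; Eq. (9): normalize so the weights sum to one."""
    w = np.exp(-0.5 * (z_particles - z_meas))
    return w / w.sum()
```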

2 Indoor integrated navigation system based on RBF neural network error compensation

2.1 RBF neural network model

The RBF neural network was proposed in 1988. Compared with other neural networks, it generalizes well and has a relatively simple structure, which avoids lengthy computation. An RBF neural network consists of three layers: an input layer, a hidden layer, and an output layer; the activation functions of the hidden-layer neurons are radial basis functions [25-26]. The structure of the RBF neural network is shown in Fig. 2.

Fig. 2 Structure of the RBF neural network model

 ${h_j} = \exp \left( { - \frac{{{{(x - {c_j})}^2}}}{{2b_j^2}}} \right)$ （10）

The weights of the RBF network are

 $w = \{ {w_1},{w_2}, \cdots ,{w_m}\}$ （11）

The output of the RBF network is

 $y = {w_1}{h_1} + {w_2}{h_2} + \cdots + {w_m}{h_m}$ （12）
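Eqs. (10)-(12) together define the forward pass: Gaussian activations at the hidden layer, then a weighted sum at the output. A minimal sketch (the vector shapes are assumptions for illustration):

```python
import numpy as np

def rbf_forward(x, centers, widths, weights):
    """Forward pass of the RBF network of Eqs. (10)-(12).
    x: input vector; centers[j] = c_j, widths[j] = b_j, weights[j] = w_j."""
    # Eq. (10): Gaussian activation of each of the m hidden neurons
    h = np.exp(-np.sum((x - centers) ** 2, axis=1) / (2.0 * widths ** 2))
    # Eq. (12): output as the weighted sum with the weights of Eq. (11)
    return float(weights @ h)
```

When the input coincides with a center, that neuron's activation is exactly 1 and the output is dominated by its weight.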
2.2 Design of the navigation system based on RBF neural network compensation

Fig. 3 Block diagram of the learning mode of the parallel-structure system based on RBFNN data fusion
Fig. 4 Block diagram of the prediction mode of the parallel-structure system based on RBFNN data fusion
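The two operating modes of Figs. 3-4 reduce to a simple switch: while the laser data are reliable, the RBF network learns the correction that the laser-aided particle filter applies; when they are not, the learned correction compensates the laser-free estimate. A sketch under assumed interfaces (`rbf_train` and `rbf_predict` are placeholders, not names from the paper):

```python
def navigate(laser_ok, p_ekf, p_pf, rbf_train, rbf_predict, features):
    """Mode switch sketched from Figs. 3-4 (interfaces are illustrative).
    p_ekf: EKF position without laser; p_pf: laser-aided PF position."""
    if laser_ok:
        # Learning mode (Fig. 3): train on the PF correction
        rbf_train(features, p_pf - p_ekf)
        return p_pf
    # Prediction mode (Fig. 4): compensate with the predicted correction
    return p_ekf + rbf_predict(features)
```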

2.3 RBF neural network training method

 $e = \Delta {p_{\rm{d}}} - \Delta {p_{{\rm{rbf}}}}$ （13）

 $E = {(\Delta {p_{\rm{d}}} - \Delta {p_{{\rm{rbf}}}})^2}$ （14）

 $\left\{ {\begin{array}{*{20}{l}} {{w_j}(k) = {w_j}(k - 1) + \Delta {w_j}(k) + \alpha \Delta {w_j}(k - 1)}\\ {{c_j}(k) = {c_j}(k - 1) + \Delta {c_j}(k) + \beta \Delta {c_j}(k - 1)}\\ {{b_j}(k) = {b_j}(k - 1) + \Delta {b_j}(k) + \gamma \Delta {b_j}(k - 1)} \end{array}} \right.$ （15）
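For a scalar input, one gradient step on the squared error of Eq. (14) with the momentum update of Eq. (15) can be sketched as follows. The learning rate eta and the gradient expressions are standard RBF results, not stated in the text:

```python
import numpy as np

def rbf_train_step(x, dp_d, w, c, b, prev,
                   eta=0.05, alpha=0.3, beta=0.3, gamma=0.3):
    """One training step on E = (dp_d - dp_rbf)^2 (Eqs. (13)-(15)).
    w, c, b: weights, centers, widths; prev: previous increments."""
    h = np.exp(-((x - c) ** 2) / (2.0 * b ** 2))   # Eq. (10)
    e = dp_d - w @ h                               # Eq. (13)
    # Gradient-descent increments for weights, centers and widths
    dw = eta * e * h
    dc = eta * e * w * h * (x - c) / b ** 2
    db = eta * e * w * h * (x - c) ** 2 / b ** 3
    # Eq. (15): add momentum on the previous increments
    w_new = w + dw + alpha * prev["dw"]
    c_new = c + dc + beta * prev["dc"]
    b_new = b + db + gamma * prev["db"]
    return w_new, c_new, b_new, {"dw": dw, "dc": dc, "db": db}
```

A single step should reduce the absolute output error on a fixed training pair.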

3 Experimental results

3.1 Hover experiment results

Fig. 5 Experimental platform and main sensors

| Sensor | Specification | Value |
| --- | --- | --- |
| INS | Position accuracy/m | >5 |
| | Velocity accuracy/(m·s⁻¹) | 10 |
| px4flow optical-flow sensor | Optical-flow computation rate/Hz | 120 |
| | Maximum sensed angular rate/((°)·s⁻¹) | 2 000 |
| | Maximum data update rate/Hz | 780 |
| UTM-30LX 2-D laser scanner | Measurement range/m | 0.1–30; max. 60 (270°) |
| | Measurement accuracy/m | (0.1–10): ±0.03; (10–30): ±0.05 |
| | Angular resolution/(°) | 0.25 |

Fig. 6 Comparison of the position before and after compensation with the actual position

Fig. 7 Comparison of the position error before and after compensation

| Type | x-axis error/cm, before | x-axis error/cm, after | y-axis error/cm, before | y-axis error/cm, after |
| --- | --- | --- | --- | --- |
| Mean error | 3.6637 | 1.999 | 4.6377 | 1.1479 |
| Maximum error | 14.2702 | 5.5035 | 11.9105 | 2.8656 |
3.2 Trajectory experiment results

Fig. 8 Comparison of the trajectory before and after compensation with the actual trajectory
4 Conclusions

[1] WU X L, SHI Z Y, ZHONG Y S. Review of UAV visual navigation research[J]. Journal of System Simulation, 2010, 22(S1): 62-65 (in Chinese).
[2] HOW J P, BEHIHKE B, FRANK A, et al. Real-time indoor autonomous vehicle test environment[J]. IEEE Control Systems, 2008, 28(2): 51-64.
[3] BACHRACH A, PRENTICE S, HE R, et al. RANGE-Robust autonomous navigation in GPS-denied environments[J]. Journal of Field Robotics, 2011, 28(5): 644-666.
[4] TOURNIER G, VALENTI M, HOW J, et al. Estimation and control of a quadrotor vehicle using monocular vision and moire patterns[C]//AIAA Guidance, Navigation and Control Conference and Exhibit. Reston: AIAA, 2006: 21-24.
[5] RONDON E, GARCIA-CARRILLO L R, FANTONI I. Vision-based altitude, position and speed regulation of a quadrotor rotorcraft[C]//2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2010: 628-633.
[6] BAZIN J C, KWEON I, DEMONCEAUX C, et al. UAV attitude estimation by vanishing points in catadioptric images[C]//2008 IEEE International Conference on Robotics and Automation, 2008: 2743-2749.
[7] KANADE T, AMIDI O, KE Q. Real-time and 3D vision for autonomous small and micro air vehicles[C]//43rd IEEE Conference on Decision and Control, 2014, 2: 1655-1662.
[8] TONG W W. Synthetic vision method for UAV navigation[D]. Hangzhou: Zhejiang University, 2010 (in Chinese).
[9] STOWERS J, BAINBRIDGE-SMITH A, HAYES M, et al. Optical flow for heading estimation of a quadrotor helicopter[J]. International Journal of Micro Air Vehicles, 2009, 1(4): 229-239.
[10] VERVELD M J, CHU Q P, WAGTER C D, et al. Optic flow based state estimation for an indoor micro air vehicle[M]. 2012.
[11] SOBERS D, CHOWDHARY G, JOHNSON E N. Indoor navigation for unmanned aerial vehicles[M]. 2009.
[12] SALAZAR S, ROMERO H, GOMEZ J, et al. Real-time stereo visual servoing control of an UAV having eight-rotors[C]//2009 6th International Conference on Electrical Engineering, Computing Science and Automatic Control, 2009: 1-11.
[13] YU H, BEARD R, BYME J. Vision-based navigation frame mapping and planning for collision avoidance for miniature air vehicles[J]. Control Engineering Practice, 2012, 18(7): 824-836.
[14] AHRENS S, LEVINE D, ANDREWS G, et al. Vision-based guidance and control of a hovering vehicle in unknown, GPS-denied environments[C]//2009 IEEE International Conference on Robotics and Automation, 2009: 2643-2648.
[15] CHITRAKARAN V K, DAWSON D M, CHEN J, et al. Vision assisted autonomous landing of an unmanned aerial vehicle[C]//44th IEEE Conference on Decision and Control, 2005: 1465-1470.
[16] BILLS C, CHEN J, SAXENA A. Autonomous MAV flight in indoor environments using single image perspective cues[C]//2012 IEEE International Conference on Robotics and Automation (ICRA), 2012: 5776-5783.
[17] GREEN W E, OH P Y, BARROWS G. Flying insect inspired vision for autonomous aerial robot maneuvers in near-earth environments[C]//2014 IEEE International Conference on Robotics and Automation, 2014, 3: 2347-2352.
[18] SUN S L, DENG Z L. Multi-sensor optimal information fusion Kalman filter[J]. Automatica, 2014, 40(6): 1017-1023.
[19] SUN S. Multi-sensor optimal information fusion Kalman filters with applications[J]. Aerospace Science and Technology, 2004, 8(1): 57-62.
[20] OLFATI-SABER R. Distributed Kalman filtering for sensor networks[C]//2007 46th IEEE Conference on Decision and Control, 2007: 5492-5498.
[21] OLFATI-SABER R, JALALKAMALI P. Collaborative target tracking using distributed Kalman filtering on mobile sensor networks[C]//2011 American Control Conference, 2011: 1100-1105.
[22] HUANG X P, WANG Y, MIAO P C. Particle filtering principle and application[M]. Beijing: Electronic Industry Press, 2017 (in Chinese).
[23] WANG E S. Research on particle filter algorithm based on generalized regression neural network[J]. Journal of Shenyang University of Aeronautics and Astronautics, 2014, 31(6): 54-58 (in Chinese).
[24] LIU J K. RBF neural network adaptive control[M]. Beijing: Tsinghua University Press, 2014: 1 (in Chinese).
[25] WANG E S, LI X K, ZHANG Z X, et al. Research on particle filter algorithm based on generalized regression neural network[J]. Journal of Shenyang University of Aeronautics and Astronautics, 2014, 31(6): 54-58 (in Chinese).
[26] WANG X, JIANG A G, WANG S. Distributed sensor networks for multi-sensor data fusion in intelligent maintenance[C]//3rd International Symposium on Instrumentation Science and Technology, 2004: 587-592.
[27] WANG F, CUI J Q, CHEN B M, et al. A comprehensive UAV indoor navigation system based on vision optical flow and laser FastSLAM[J]. Acta Automatica Sinica, 2013, 39(11): 1889-1899.
[28] HANG Y J, LIU J Y, LI R B, et al. MEMS IMU/LADAR integrated navigation method based on mixed feature match[J]. Acta Aeronautica et Astronautica Sinica, 2014, 35(9): 2583-2592 (in Chinese).
[29] KONG T H, FANG Z, LI P. Indoor integrated navigation of micro aerial vehicle based on radar-scanner and inertial navigation system[J]. Control Theory & Applications, 2014, 31(5): 607-613.
[30] BAR-SHALOM Y. On the track-to-track correlation problem[J]. IEEE Transactions on Automatic Control, 1981, 26(2): 571-572.
[31] CARLSON N A. Federated square root filter for decentralized parallel processors[J]. IEEE Transactions on Aerospace and Electronic Systems, 1990, 26(3): 517-525.
[32] ABDULHAFIZ W A, KHAMIS A. Bayesian approach to multisensor data fusion with pre- and post-filtering[C]//2013 10th IEEE International Conference on Networking, Sensing and Control (ICNSC), 2013: 373-378.
[33] CHEN Z, CAI Y. Data fusion algorithm for multi-sensor dynamic system based on interacting multiple model[J]. Journal of Shanghai Jiaotong University (Science), 2015, 20(3): 265-272.

#### Article information

GUAN Xiangzhong, CAI Chenxiao, ZHAI Wenhua, WANG Lei, SHAO Peng

Indoor integrated navigation system for unmanned aerial vehicles based on neural network predictive compensation

Acta Aeronautica et Astronautica Sinica, 2020, 41(S1): 723790.
http://dx.doi.org/10.7527/S1000-6893.2019.23790