
ACTA AERONAUTICA ET ASTRONAUTICA SINICA ›› 2023, Vol. 44 ›› Issue (5): 426964-426964. doi: 10.7527/S1000-6893.2022.26964

• Material Engineering and Mechanical Manufacturing •

Indoor positioning technology of multi-rotor flying robot based on visual-inertial fusion

Huaijie ZHANG1, Jingya MA2, Haoyuan LIU1, Pin GUO1, Huichao DENG1, Kun XU1, Xilun DING1

  1. Robotics Institute, Beihang University, Beijing 100191, China
  2. Institute of Spacecraft System Engineering, CAST, Beijing 100094, China
  • Received: 2022-01-17; Revised: 2022-02-17; Accepted: 2022-03-23; Online: 2022-04-13; Published: 2022-04-12
  • Contact: Huichao DENG, E-mail: denghuichao@buaa.edu.cn
  • Supported by:
    National Natural Science Foundation of China(91748201)

Abstract:

With the development of artificial intelligence technology, the application scenarios of Unmanned Aerial Vehicles (UAVs) are becoming increasingly diverse. UAVs are no longer expected merely to fly; they are increasingly assigned the role of flying robots, which imposes higher requirements on autonomous navigation, positioning in complex environments, and intelligent cooperation. To meet the positioning requirements of indoor scenes, indoor positioning of a multi-rotor flying robot is realized by fusing visual and inertial data, and an image enhancement algorithm is added to the visual front end to improve the gray-level contrast of images. To address the drift problem in visual-inertial fusion positioning of the flying robot, a strategy of feature point extraction and image frame release based on image information is proposed to improve positioning accuracy. For the indoor autonomous tracking and landing task, an autonomous landing system based on visual positioning is designed, and a flying robot model is built in Gazebo to verify its effectiveness. The positioning algorithms are compared and evaluated on the EuRoC dataset. A flying robot platform is then built in a real indoor scene for positioning experiments, in which the task of autonomously tracking and landing on a ground platform is completed. Error analysis is carried out using the ground truth provided by a motion capture system. The results show that the proposed positioning technology can meet the requirements of autonomous tracking and landing tasks in indoor scenes.
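The abstract does not state which image enhancement algorithm is used in the visual front end. As one plausible illustration only, the minimal sketch below applies contrast-limited adaptive histogram equalization (CLAHE, via OpenCV) to raise gray-level contrast before feature extraction; the function name, parameter values, and input file are assumptions for illustration, not the authors' implementation.

    # Illustrative sketch only: CLAHE-based gray-contrast enhancement for a visual
    # front end. The paper does not specify the enhancement algorithm; clip_limit
    # and tile_grid_size below are assumed values.
    import cv2
    import numpy as np

    def enhance_frame(gray: np.ndarray,
                      clip_limit: float = 3.0,
                      tile_grid_size: tuple = (8, 8)) -> np.ndarray:
        """Improve gray-level contrast of a camera frame before feature extraction."""
        clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=tile_grid_size)
        return clahe.apply(gray)

    if __name__ == "__main__":
        # "frame.png" is a hypothetical grayscale camera frame.
        frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
        enhanced = enhance_frame(frame)
        # Feature detection (here ORB, as an example) then runs on the enhanced image.
        keypoints = cv2.ORB_create(nfeatures=500).detect(enhanced, None)
        print(f"Detected {len(keypoints)} feature points")

Stronger gray-level contrast generally yields more and better-distributed feature points in low-texture or poorly lit indoor scenes, which is the stated motivation for enhancing images at the front end.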

Key words: flying robot, indoor positioning, autonomous tracking and landing, sensor data fusion, visual navigation
