
Acta Aeronautica et Astronautica Sinica ›› 2025, Vol. 46 ›› Issue (11): 531281. doi: 10.7527/S1000-6893.2024.31281

• Articles •

UAV swarm positioning method based on monocular vision and ranging information

Kun LI1,2, Shuhui BU1,2, Jiapeng LI1,2, Juboxi WANG1,2, Pengcheng HAN1,2, Xiaohan LI1,2, Haowei LI1,2

  1. School of Aeronautics, Northwestern Polytechnical University, Xi’an 710072, China
    2. National Key Laboratory of Aircraft Configuration Design, Xi’an 710072, China
  • Received: 2024-09-27 Revised: 2024-10-17 Accepted: 2024-11-22 Online: 2024-12-12 Published: 2024-11-29
  • Contact: Shuhui BU, E-mail: bushuhui@nwpu.edu.cn
  • Supported by:
    Postdoctoral Fellowship Program of CPS (GZB20240986)

Abstract:

Unmanned Aerial Vehicle (UAV) swarms play a pivotal role in the low-altitude economy. Accurate swarm positioning information underpins mission coordination, resource optimization, and efficient scheduling among drones, thereby facilitating the sustainable advancement of the low-altitude economy. In complex environments, however, Global Navigation Satellite System (GNSS) signals may be disrupted, making it difficult for UAV swarms to obtain accurate positioning data and compromising their ability to operate collaboratively. To address this challenge in GNSS-denied environments, this paper presents a UAV swarm positioning method that integrates monocular vision and ranging information. Visual Odometry (VO) provides autonomous positioning for each UAV within the swarm. A communication framework is designed to transmit only essential data, including visual keyframes, pose frames, and map points, to the central server, thus reducing the required communication bandwidth. The concept of a pose frame is introduced to overcome the limitation that keyframes cannot be fused directly with ranging information. The central server aligns the maps from different UAVs using either the co-visibility relationships between keyframes or the constraints between pose frames and their corresponding ranging data. The server then fuses and optimizes these maps using both visual and ranging information to achieve accurate swarm positioning. After global optimization, the server sends the corrected keyframe and map point data back to each UAV’s local VO map to further enhance positioning accuracy. The proposed method is validated through simulations and experiments. Results demonstrate that the swarm positioning error is reduced to 0.49 m, outperforming current state-of-the-art visual positioning methods. In addition, the scale error is reduced to 3.2%, effectively resolving the scale ambiguity inherent in monocular visual positioning. The proposed method achieves precise UAV swarm positioning based solely on inter-UAV ranging information, without the need for shared visual features, and provides robust positioning data for UAV swarms operating in complex environments.
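The core of the approach is fusing scale-ambiguous monocular VO trajectories with inter-UAV range measurements. The Python sketch below illustrates that idea in miniature; it is not the authors’ implementation, and the two-UAV setup, trajectories, scale factors, and offset are hypothetical. It recovers each UAV’s metric scale and their relative translation from ranging constraints alone, with rotation alignment between the UAV frames omitted for brevity.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical ground-truth trajectories for two UAVs, standing in for the
# (unknown) metric paths that each UAV's monocular VO observes up to scale.
t = np.linspace(0.0, 10.0, 50)
traj_a = np.stack([t, np.sin(t), np.full_like(t, 5.0)], axis=1)
traj_b = np.stack([t + 3.0, np.cos(t), np.full_like(t, 6.0)], axis=1)

# Monocular VO output is scale-ambiguous and expressed in each UAV's own
# frame: simulate this with an unknown scale per UAV and an unknown offset
# for UAV B (rotation between frames is assumed identity for brevity).
true_scale_a, true_scale_b = 0.7, 1.3
true_offset_b = np.array([2.0, -1.0, 0.5])
vo_a = traj_a / true_scale_a
vo_b = (traj_b - true_offset_b) / true_scale_b

# Inter-UAV range measurements taken at common "pose frame" timestamps.
ranges = np.linalg.norm(traj_a - traj_b, axis=1)

def residuals(params):
    """Difference between predicted and measured inter-UAV ranges."""
    scale_a, scale_b = params[0], params[1]
    offset_b = params[2:5]
    pos_a = scale_a * vo_a              # UAV A rescaled to metric units
    pos_b = scale_b * vo_b + offset_b   # UAV B rescaled and shifted into A's frame
    return np.linalg.norm(pos_a - pos_b, axis=1) - ranges

# Anchor UAV A at its VO origin and estimate both scales plus UAV B's offset
# purely from ranging constraints -- no shared visual features are needed.
x0 = np.array([1.0, 1.0, 0.0, 0.0, 0.0])
sol = least_squares(residuals, x0)
print("estimated scales:", sol.x[:2], " true:", (true_scale_a, true_scale_b))
print("estimated offset:", sol.x[2:], " true:", true_offset_b)
```

In the paper’s pipeline such ranging constraints enter a graph optimization on the central server together with the visual keyframe constraints; the sketch only shows why ranging alone can resolve monocular scale and the relative placement of UAVs.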

Key words: UAV swarm, visual positioning, range measurement, swarm positioning, graph optimization

CLC Number: