Bulletin of Surveying and Mapping ›› 2024, Vol. 0 ›› Issue (9): 87-95.doi: 10.13474/j.cnki.11-2246.2024.0916


Visual/inertial/ultra-wideband dataset based on unmanned platform in complex scenes

LUO Haolong1,2, YANG Zidi1, LI Xueqiang1,3, ZOU Danping2, LI Jiansheng1, LI Guangyun1   

  1. School of Geospatial Information, Information Engineering University, Zhengzhou 450001, China;
    2. Shanghai Key Laboratory of Navigation and Location-based Services, Shanghai Jiao Tong University, Shanghai 200240, China;
    3. Troops 61618, Beijing 100080, China
  • Received: 2024-07-25    Published: 2024-10-09

Abstract: Navigation and SLAM technologies based on multi-sensor fusion are currently a mainstream research direction, and their study and application in complex scenes have attracted increasing attention. Nevertheless, multi-sensor datasets specifically designed for complex scenes remain relatively scarce, particularly datasets that incorporate ultra-wideband (UWB) sensors. To help users test and verify multi-source sensor fusion algorithms in complex scenes, and to explore the shortcomings and potential development directions of multi-source sensor fusion and SLAM technology, both unmanned ground vehicles and drones are used to collect visual, inertial, and UWB data in seven diverse scenes, including dynamic, non-line-of-sight, and large-scale scenes. Furthermore, a high-precision optical motion capture system provides real-time six-degree-of-freedom ground-truth positions and orientations for the dataset. Finally, four state-of-the-art open-source algorithms, namely VINS-MONO, VINS-FUSION, VIR-SLAM, and ORB-SLAM3, are used for experimental verification and analysis on all scene sequences. The results demonstrate that the data of all scene sequences are effective and usable.
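The verification step described in the abstract, comparing each algorithm's estimated trajectory against the motion-capture ground truth, is typically done with the absolute trajectory error (ATE) after rigid alignment. The paper does not specify its exact evaluation code; the following is a minimal numpy sketch of the standard approach (Umeyama/Kabsch alignment without scale), with all function names being illustrative:

```python
import numpy as np

def align_rigid(est, gt):
    """Least-squares rigid alignment (rotation R, translation t) mapping the
    estimated positions onto ground truth: gt_i ~= R @ est_i + t.
    est, gt: (N, 3) arrays of corresponding, time-synchronized positions."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g            # centered trajectories
    U, _, Vt = np.linalg.svd(G.T @ E)       # cross-covariance SVD
    S = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:           # guard against a reflection
        S[2, 2] = -1.0
    R = U @ S @ Vt
    t = mu_g - R @ mu_e
    return R, t

def ate_rmse(est, gt):
    """Root-mean-square absolute trajectory error after rigid alignment."""
    R, t = align_rigid(est, gt)
    aligned = est @ R.T + t
    return np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1)))
```

In practice the estimated and ground-truth poses must first be associated by timestamp; tools such as evo automate this, but the error metric itself reduces to the computation above.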

Key words: multi-sensor fusion, dataset, complex scenes, UWB, unmanned platform
