Bulletin of Surveying and Mapping, 2024, Vol. 0, Issue (11): 38-43. doi: 10.13474/j.cnki.11-2246.2024.1107


Airborne realistic 3D reconstruction based on visual pose correction

GAO Yunhan1,2, LIN Yili1,2, ZHANG Jinghan1,2, QIAN Wei1,2, XING Yubo1,2, SHI Hang1, XIE Yangmin1,2   

  1. School of Mechatronic Engineering and Automation, Shanghai University, Shanghai 200444, China;
  2. Shanghai Key Laboratory of Intelligent Manufacturing and Robotics, Shanghai University, Shanghai 200444, China
  Received: 2024-03-05    Published: 2024-12-05

Abstract: In urban surveying and mapping, a drone equipped with LiDAR that hovers to collect data is prone to motion distortion caused by fuselage vibration, which degrades fusion modeling performance. This paper proposes an airborne laser realistic 3D reconstruction method based on visual pose correction. The method uses video captured by a binocular camera during the LiDAR scan to estimate the trajectory of LiDAR pose changes and thereby correct the pose of the laser point cloud. The corrected point cloud is then projected into the coordinate system of a monocular camera, and color information is obtained through the collinearity equations to fuse the two data sources into a three-dimensional true-color point cloud. Evaluation criteria covering four feature dimensions, namely straightness, flatness, verticality and fusion coloring, are established for the true-color point clouds before and after visual pose correction. Experimental results show that, compared with the uncorrected result, the true-color point cloud after visual pose correction improves by up to 77.3% in straightness, 54.5% in flatness and 54.53° in verticality, and its color attachment is noticeably more accurate.
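
The coloring step summarized in the abstract (projecting the pose-corrected laser point cloud into a monocular camera frame and sampling colors via the collinearity condition) can be illustrated with a minimal sketch. The function name, parameters and the nearest-pixel color lookup below are illustrative assumptions, not the authors' implementation; it assumes the pose-correction stage has already produced the LiDAR-to-camera transform.

import numpy as np

def colorize_point_cloud(points_lidar, T_cam_lidar, K, image):
    """Project corrected LiDAR points into a camera frame (pinhole /
    collinearity model) and sample per-point RGB colors.

    points_lidar : (N, 3) points in the (pose-corrected) LiDAR frame
    T_cam_lidar  : (4, 4) homogeneous transform from LiDAR to camera frame
    K            : (3, 3) camera intrinsic matrix
    image        : (H, W, 3) RGB image from the monocular camera
    """
    # Transform points into the camera coordinate system
    pts_h = np.hstack([points_lidar, np.ones((points_lidar.shape[0], 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

    # Keep only points in front of the camera
    pts_cam = pts_cam[pts_cam[:, 2] > 0]

    # Pinhole projection (collinearity condition): [u, v, 1]^T ~ K X / Z
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]

    # Discard projections that fall outside the image
    h, w = image.shape[:2]
    u, v = uv[:, 0], uv[:, 1]
    valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)

    # Nearest-pixel color lookup for each valid point
    colors = image[v[valid].astype(int), u[valid].astype(int)]
    return np.hstack([pts_cam[valid], colors.astype(np.float64)])  # (M, 6): x, y, z, r, g, b

A true-color point cloud is then just the stack of these (x, y, z, r, g, b) rows over all camera frames; occlusion handling and sub-pixel interpolation are omitted for brevity.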

Key words: sensors, airborne mobile sensing system, visual pose correction, true color point cloud
