Bulletin of Surveying and Mapping ›› 2023, Vol. 0 ›› Issue (10): 61-66.doi: 10.13474/j.cnki.11-2246.2023.0296


Texture mapping for real-scene 3D model and real-time video image fusion

YANG Song1,2, CHEN Chongcheng1,2   

  1. Key Lab of Spatial Data Mining and Information Sharing of Ministry of Education, Fuzhou University, Fuzhou 350108, China;
    2. National Engineering Research Center of Geospatial Information Technology, Fuzhou University, Fuzhou 350108, China
  • Received: 2022-12-29; Published: 2023-10-28

Abstract: To address the problems of video texture fragmentation, obvious fusion-boundary differences, and poor real-time performance of the video stream when fusing real-time video with a real-scene 3D model, an improved fusion method for real-scene 3D models and real-time video images is proposed. First, oblique photogrammetry is used to construct a real-scene 3D model, a streaming-media service is built to transmit video images into the 3D scene in real time, and the correspondence between video pixels and the fragment vertices of the real-scene model is established. Then, a flattening mapping strategy that takes scene depth into account is introduced to eliminate texture splitting when the video is fused into the model. Finally, the transition at the fusion seam is smoothed using a distance mask. Experimental results show that the proposed mapping strategy achieves an excellent fusion effect, eliminating fragmentation of the fused image and the abruptness of the fusion boundary, while the scene frame rate stays above 50 frames/s when multiple video channels are fused. The system therefore offers good carrying capacity and real-time performance, and the results can be applied to constructing digital twin scenes with real-time virtual-real fusion.
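The two core operations the abstract describes, projecting model vertices into the video camera's image plane (perspective projection) to obtain texture coordinates, and fading the video texture near the frame border with a distance mask to soften the fusion seam, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; the intrinsic matrix `K`, rotation `R`, translation `t`, and the `falloff` width are assumed inputs.

```python
import numpy as np

def project_to_video_uv(points_world, K, R, t, width, height):
    """Perspective-project 3D model vertices into the video camera's image
    plane to obtain normalized texture coordinates (assumed pinhole model)."""
    pts_cam = (R @ points_world.T + t.reshape(3, 1)).T  # world -> camera frame
    in_front = pts_cam[:, 2] > 0                        # behind-camera points get no texture
    uv_h = (K @ pts_cam.T).T                            # homogeneous image coordinates
    uv = uv_h[:, :2] / uv_h[:, 2:3]                     # perspective divide -> pixels
    uv_norm = uv / np.array([width, height])            # normalize to [0, 1] texture space
    valid = in_front & np.all((uv_norm >= 0) & (uv_norm <= 1), axis=1)
    return uv_norm, valid

def distance_mask(uv_norm, falloff=0.1):
    """Alpha mask that ramps from 0 at the video frame border to 1 at a
    distance of `falloff`, smoothing the seam between video and model texture."""
    d = np.minimum(uv_norm, 1.0 - uv_norm).min(axis=1)  # distance to nearest frame edge
    return np.clip(d / falloff, 0.0, 1.0)
```

At render time, fragments with `valid == False` would keep the model's own oblique-photogrammetry texture, and the mask value would blend the video color over the model color near the frame edges.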

Key words: virtual and real fusion, video texture, augmented virtual environment, perspective projection, digital twins
