Bulletin of Surveying and Mapping ›› 2025, Vol. 0 ›› Issue (4): 90-95. doi: 10.13474/j.cnki.11-2246.2025.0415


Improved feature pyramid pooling for obstacle extraction in remote sensing images

SUN Kai, XU Qing, ZHANG Ruixin, SU Youneng   

  1. School of Geospatial Information, University of Information Engineering, Zhengzhou 450000, China
  • Received: 2024-09-24    Published: 2025-04-28

Abstract: The extraction of obstacles from high-resolution remote sensing images is one of the crucial bases for off-road path planning, as accurate obstacle locations can significantly reduce transit costs. Traditional surveying methods for obstacle extraction are inefficient and susceptible to human and terrain factors, making them unsuitable for complex battlefield environments. Current deep learning methods suffer from feature loss and inadequate resolution when extracting obstacles such as residential areas and water systems, and they particularly struggle to identify small-scale features precisely, so their outputs fail to meet requirements. To address these challenges, this paper proposes a feature pyramid attention network (ResT-PNet) for extracting obstacle features from remote sensing images. The network employs a feature pyramid pooling module to obtain global semantic information. Firstly, a feature fusion module is constructed to integrate feature information across different scales and enhance feature extraction. Then, spatial and channel attention mechanisms are introduced to minimize the loss of detail information and to integrate local and global features. Finally, comparative experiments and model applicability validation are conducted. The results indicate that the proposed model achieves higher accuracy and better distinguishes small-scale obstacles, thereby providing support for off-road path planning.
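Note: The abstract describes three generic building blocks (pyramid pooling for global context, multi-scale feature fusion, and spatial/channel attention). The sketch below is an illustrative PyTorch-style approximation of such blocks only; the module names, pooling bin sizes, and channel counts are assumptions for illustration and are not taken from the ResT-PNet paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class PyramidPooling(nn.Module):
    """Pool the feature map at several scales and fuse them to inject global semantic context."""
    def __init__(self, in_ch, bins=(1, 2, 3, 6)):
        super().__init__()
        out_ch = in_ch // len(bins)
        self.stages = nn.ModuleList(
            nn.Sequential(nn.AdaptiveAvgPool2d(b),
                          nn.Conv2d(in_ch, out_ch, 1, bias=False),
                          nn.ReLU(inplace=True))
            for b in bins)
        self.fuse = nn.Conv2d(in_ch + out_ch * len(bins), in_ch, 3, padding=1)

    def forward(self, x):
        h, w = x.shape[2:]
        # Upsample each pooled branch back to the input size before fusion
        feats = [x] + [F.interpolate(s(x), (h, w), mode="bilinear", align_corners=False)
                       for s in self.stages]
        return self.fuse(torch.cat(feats, dim=1))

class ChannelSpatialAttention(nn.Module):
    """Re-weight channels first, then highlight informative spatial positions to keep small-scale detail."""
    def __init__(self, in_ch, reduction=16):
        super().__init__()
        self.channel = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(in_ch, in_ch // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_ch // reduction, in_ch, 1),
            nn.Sigmoid())
        self.spatial = nn.Sequential(
            nn.Conv2d(2, 1, kernel_size=7, padding=3),
            nn.Sigmoid())

    def forward(self, x):
        x = x * self.channel(x)                       # channel attention
        avg_map = x.mean(dim=1, keepdim=True)
        max_map = x.max(dim=1, keepdim=True).values
        return x * self.spatial(torch.cat([avg_map, max_map], dim=1))  # spatial attention

if __name__ == "__main__":
    feat = torch.randn(1, 256, 32, 32)           # backbone feature map (assumed shape)
    feat = PyramidPooling(256)(feat)             # add global semantic context
    feat = ChannelSpatialAttention(256)(feat)    # recover local detail via attention
    print(feat.shape)                            # torch.Size([1, 256, 32, 32])

In such a design, pyramid pooling supplies the global scene context while the attention block counteracts the detail loss the abstract mentions; how the paper actually fuses the scales is not specified here.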

Key words: feature extraction, convolutional neural network, attention mechanism, feature pyramid, path planning
