Bulletin of Surveying and Mapping ›› 2021, Vol. 0 ›› Issue (2): 98-102,116.doi: 10.13474/j.cnki.11-2246.2021.0052


LiDAR and visual information fusion positioning system for live working robots in distribution networks

REN Qingting1, LI Shuai1, Lü Peng1, ZHANG Tong2   

  1. StateGrid Ruijia (Tianjin) Intelligent Robot Co., Ltd., Tianjin 300467, China;
    2. Beijing Guodian Futong Science and Technology Development Co., Ltd., Beijing 100070, China
  • Received: 2020-04-15  Revised: 2020-11-11  Online: 2021-02-25  Published: 2021-03-09

Abstract: This paper proposes a system that fuses a single-line LiDAR sensor with vision to detect the three-dimensional spatial coordinates of overhead wires, effectively improving the recognition and positioning accuracy of live working robots for distribution networks. The fusion positioning system computes the wire's 3D coordinates from radar and camera measurements, and the multi-sensor fusion algorithm allows the robot to work outdoors under bright sunlight. First, the LiDAR and the camera are calibrated in advance so that image pixels correspond to points in the LiDAR depth point cloud. Then, the Canny edge detector and the Hough transform are used to compute the position of the wire in the image. Finally, the robot can recognize and position the wire with high precision and efficiency under outdoor lighting.
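As a hedged illustration of the pipeline described in the abstract (not the authors' implementation), the sketch below assumes pre-calibrated camera intrinsics K and LiDAR-to-camera extrinsics (R, t). It projects LiDAR points into the image, detects the wire with Canny edges and a probabilistic Hough transform (OpenCV), and keeps the LiDAR points whose projections fall near the detected line to obtain candidate 3D wire points. All function and variable names are illustrative.

```python
# Minimal sketch under assumed calibration (K, R, t); names are hypothetical.
import cv2
import numpy as np

def project_lidar_to_image(points_lidar, K, R, t):
    """Project Nx3 LiDAR points into pixel coordinates using the
    pre-calibrated LiDAR-to-camera extrinsics (R, t) and intrinsics K."""
    pts_cam = points_lidar @ R.T + t          # transform into the camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0]      # keep points in front of the camera
    uv = pts_cam @ K.T                        # pinhole projection
    uv = uv[:, :2] / uv[:, 2:3]
    return uv, pts_cam

def detect_wire_line(image_gray):
    """Detect the dominant wire as a 2D segment with Canny + probabilistic Hough."""
    edges = cv2.Canny(image_gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, 80,
                            minLineLength=100, maxLineGap=10)
    if lines is None:
        return None
    # choose the longest segment as the wire candidate
    lengths = [np.hypot(x2 - x1, y2 - y1) for x1, y1, x2, y2 in lines[:, 0]]
    return lines[int(np.argmax(lengths)), 0]  # (x1, y1, x2, y2)

def wire_points_3d(points_lidar, image_gray, K, R, t, pixel_tol=5.0):
    """Return LiDAR points (camera frame) whose projections lie near the wire line."""
    line = detect_wire_line(image_gray)
    if line is None:
        return np.empty((0, 3))
    x1, y1, x2, y2 = map(float, line)
    uv, pts_cam = project_lidar_to_image(points_lidar, K, R, t)
    # point-to-line distance in the image plane
    d = np.abs((y2 - y1) * uv[:, 0] - (x2 - x1) * uv[:, 1]
               + x2 * y1 - y2 * x1) / np.hypot(x2 - x1, y2 - y1)
    return pts_cam[d < pixel_tol]
```

The keyword list also mentions DBSCAN; clustering the returned candidate points before estimating the wire position would be a plausible use of it, though the abstract does not spell out that step.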

Key words: LiDAR, computer vision, live working robot for distribution network, calibration, DBSCAN
