[1] MUR-ARTAL R, MONTIEL J M M, TARDÓS J D. ORB-SLAM: a versatile and accurate monocular SLAM system[J]. IEEE Transactions on Robotics, 2015, 31(5): 1147-1163.
[2] MUR-ARTAL R, TARDÓS J D. ORB-SLAM2: an open-source SLAM system for monocular, stereo, and RGB-D cameras[J]. IEEE Transactions on Robotics, 2017, 33(5): 1255-1262.
[3] CAMPOS C, ELVIRA R, RODRÍGUEZ J J G, et al. ORB-SLAM3: an accurate open-source library for visual, visual-inertial, and multimap SLAM[J]. IEEE Transactions on Robotics, 2021, 37(6): 1874-1890.
[4] MILDENHALL B, SRINIVASAN P P, TANCIK M, et al. NeRF: representing scenes as neural radiance fields for view synthesis[C]//Computer Vision-ECCV 2020. Cham: Springer, 2020: 405-421.
[5] SUCAR E, LIU Shikun, ORTIZ J, et al. iMAP: implicit mapping and positioning in real-time[C]//Proceedings of 2021 IEEE/CVF International Conference on Computer Vision (ICCV). Montreal: IEEE, 2021: 6209-6218.
[6] WANG Hengyi, WANG Jingwen, AGAPITO L. Co-SLAM: joint coordinate and sparse parametric encodings for neural real-time SLAM[C]//Proceedings of 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Vancouver: IEEE, 2023: 13293-13302.
[7] ZHU Zihan, PENG Songyou, LARSSON V, et al. NICE-SLAM: neural implicit scalable encoding for SLAM[C]//Proceedings of 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition. New Orleans: IEEE, 2022: 12776-12786.
[8] JOHARI M M, CARTA C, FLEURET F. ESLAM: efficient dense SLAM system based on hybrid representation of signed distance fields[C]//Proceedings of 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Vancouver: IEEE, 2023: 17408-17419.
[9] YANG Xingrui, LI Hai, ZHAI Hongjia, et al. Vox-Fusion: dense tracking and mapping with voxel-based neural implicit representation[C]//Proceedings of 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). Singapore: IEEE, 2022: 499-507.
[10] SANDSTRÖM E, LI Yue, VAN GOOL L, et al. Point-SLAM: dense neural point cloud-based SLAM[C]//Proceedings of 2023 IEEE/CVF International Conference on Computer Vision. Paris: IEEE, 2023: 18387-18398.
[11] TEED Z, DENG Jia. RAFT: recurrent all-pairs field transforms for optical flow[C]//Computer Vision-ECCV 2020. Cham: Springer, 2020: 402-419.
[12] GODARD C, MAC AODHA O, BROSTOW G J. Unsupervised monocular depth estimation with left-right consistency[C]//Proceedings of 2017 IEEE Conference on Computer Vision and Pattern Recognition. Honolulu: IEEE, 2017: 6602-6611.
[13] HUANG H, CHEN Y, ZHANG T, et al. 360Roam: real-time indoor roaming using geometry-aware 360° radiance fields[EB/OL]. [2025-01-19]. https://arxiv.org/abs/2208.02705.
[14] FRIDOVICH-KEIL S, YU A, TANCIK M, et al. Plenoxels: radiance fields without neural networks[C]//Proceedings of 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition. New Orleans: IEEE, 2022: 5491-5500.
[15] RUBLEE E, RABAUD V, KONOLIGE K, et al. ORB: an efficient alternative to SIFT or SURF[C]//Proceedings of 2011 International Conference on Computer Vision. Barcelona: IEEE, 2011: 2564-2571.
[16] KERBL B, KOPANAS G, LEIMKUEHLER T, et al. 3D Gaussian splatting for real-time radiance field rendering[J]. ACM Transactions on Graphics, 2023, 42(4): 1-14.
[17] CHUNG C M, TSENG Y C, HSU Y C, et al. Orbeez-SLAM: a real-time monocular visual SLAM with ORB features and NeRF-realized mapping[C]//Proceedings of 2023 IEEE International Conference on Robotics and Automation (ICRA). London: IEEE, 2023: 9400-9406.
[18] ZHANG Youmin, TOSI F, MATTOCCIA S, et al. GO-SLAM: global optimization for consistent 3D instant reconstruction[C]//Proceedings of 2023 IEEE/CVF International Conference on Computer Vision. Paris: IEEE, 2023: 3704-3714.
[19] STURM J, ENGELHARD N, ENDRES F, et al. A benchmark for the evaluation of RGB-D SLAM systems[C]//Proceedings of 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems. Vilamoura-Algarve: IEEE, 2012: 573-580.
[20] ZHANG R, ISOLA P, EFROS A A, et al. The unreasonable effectiveness of deep features as a perceptual metric[C]//Proceedings of 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Salt Lake City: IEEE, 2018: 586-595.