[1] LUO Haolong, LI Guangyun, ZOU Danping, et al. UAV navigation with monocular visual inertial odometry under GNSS-denied environment[J]. IEEE Transactions on Geoscience and Remote Sensing, 2023, 61: 1-15.
[2] LI Guangyun, FAN Baixing. Precise engineering surveying technology and its development[J]. Acta Geodaetica et Cartographica Sinica, 2017, 46(10): 1742-1751.
[3] NGUYEN T M, QIU Z, CAO M, et al. Single landmark distance-based navigation[J]. IEEE Transactions on Control Systems Technology, 2019, 28(5): 2021-2028.
[4] NGUYEN T H, NGUYEN T M, XIE Lihua. Tightly-coupled ultra-wideband-aided monocular visual SLAM with degenerate anchor configurations[J]. Autonomous Robots, 2020, 44(8): 1519-1534.
[5] LUO Haolong, LI Guangyun, OUYANG Wen, et al. TDOA positioning method based on adaptive Kalman filtering[J]. Journal of Geomatics Science and Technology, 2020, 37(3): 252-257.
[6] LI Xueqiang, LI Jiansheng, WANG Ancheng, et al. A survey of visual/inertial/UWB integrated navigation technology[J]. Science of Surveying and Mapping, 2023, 48(6): 49-58.
[7] GEIGER A, LENZ P, STILLER C, et al. Vision meets robotics: the KITTI dataset[J]. The International Journal of Robotics Research, 2013, 32(11): 1231-1237.
[8] BURRI M, NIKOLIC J, GOHL P, et al. The EuRoC micro aerial vehicle datasets[J]. The International Journal of Robotics Research, 2016, 35(10): 1157-1163.
[9] MADDERN W, PASCOE G, LINEGAR C, et al. 1 year, 1000 km: the Oxford RobotCar dataset[J]. The International Journal of Robotics Research, 2017, 36(1): 3-15.
[10] SCHUBERT D, GOLL T, DEMMEL N, et al. The TUM VI benchmark for evaluating visual-inertial odometry[C]//Proceedings of 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems. Madrid: IEEE, 2018.
[11] DELMERICO J, CIESLEWSKI T, REBECQ H, et al. Are we ready for autonomous drone racing? The UZH-FPV drone racing dataset[C]//Proceedings of 2019 International Conference on Robotics and Automation. Montreal: IEEE, 2019: 6713-6719.
[12] YIN J, LI A, LI T, et al. M2DGR: a multi-sensor and multi-scenario SLAM dataset for ground robots[J]. IEEE Robotics and Automation Letters, 2022, 7(2): 2266-2273.
[13] NGUYEN T M, YUAN S, CAO M, et al. NTU VIRAL: a visual-inertial-ranging-LiDAR dataset, from an aerial vehicle viewpoint[J]. The International Journal of Robotics Research, 2022, 41(3): 270-280.
[14] YIN Jie, JIANG Haitao, WANG Jiale, et al. A robust and efficient EKF-based GNSS-visual-inertial odometry[C]//Proceedings of 2023 IEEE International Conference on Robotics and Biomimetics (ROBIO). [S.l.]: IEEE, 2023.
[15] QIAN W, XIA Z, XIONG J, et al. Manipulation task simulation using ROS and Gazebo[C]//Proceedings of 2014 IEEE International Conference on Robotics and Biomimetics (ROBIO 2014). Bali: IEEE, 2014.
[16] ZHOU Quan, ZHANG Yang, LI Zishen, et al. Design of a ground-based observation simulation system for LEO satellite navigation augmentation constellations[J]. Bulletin of Surveying and Mapping, 2022(2): 110-115.
[17] LUO Haolong, LI Jiansheng, ZOU Danping, et al. Research on calibration methods for large-scene optical motion capture systems[J]. Acta Photonica Sinica, 2023, 52(11): 1111003.
[18] WANG J, GU P, WANG L, et al. RVIO: an effective localization algorithm for range-aided visual-inertial odometry system[J]. IEEE Transactions on Intelligent Transportation Systems, 2024, 25(2): 1476-1490.
[19] JIA S, JIAO Y, ZHANG Z, et al. FEJ-VIRO: a consistent first-estimate Jacobian visual-inertial-ranging odometry[C]//Proceedings of 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Kyoto: IEEE, 2022: 1336-1343.
[20] JIA S, XIONG R, WANG Y. Distributed initialization for visual-inertial-ranging odometry with position-unknown UWB network[C]//Proceedings of 2023 IEEE International Conference on Robotics and Automation (ICRA). London: IEEE, 2023: 6246-6252.
[21] CAO Yanjun, BELTRAME G. VIR-SLAM: visual, inertial, and ranging SLAM for single and multi-robot systems[J]. Autonomous Robots, 2021, 45(6): 905-917.
[22] FURGALE P, REHDER J, SIEGWART R. Unified temporal and spatial calibration for multi-sensor systems[C]//Proceedings of 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems. Tokyo: IEEE, 2013.
[23] QIN Tong, LI Peiliang, SHEN Shaojie. VINS-Mono: a robust and versatile monocular visual-inertial state estimator[J]. IEEE Transactions on Robotics, 2018, 34(4): 1004-1020.
[24] QIN Tong, CAO Shaozu, PAN Jie, et al. A general optimization-based framework for global pose estimation with multiple sensors[C]//Proceedings of 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). [S.l.]: IEEE, 2019.
[25] CAMPOS C, ELVIRA R, RODRIGUEZ J, et al. ORB-SLAM3: an accurate open-source library for visual, visual-inertial, and multimap SLAM[J]. IEEE Transactions on Robotics, 2021, 37(6): 1874-1890.