

Research Progress and Prospect of Multi-robot Collaborative SLAM in Complex Agricultural Scenarios

  • MA Nan ,
  • CAO Shanshan ,
  • BAI Tao ,
  • KONG Fantao ,
  • SUN Wei
  • 1.College of Computer and Information Engineering, Xinjiang Agricultural University, Urumqi 830052, China
    2.Agricultural Information Institute of CAAS, Beijing 100081, China
    3.National Agriculture Science Data Center, Beijing 100081, China
    4.Engineering Research Center of Intelligent Agriculture Ministry of Education, Urumqi 830052, China
    5.Xinjiang Agricultural Informatization Engineering Technology Research Center, Urumqi 830052, China
    6.Institute of Agricultural Economics and Development, CAAS, Beijing 100081, China
MA Nan, research interests: agricultural information technology. E-mail: 1457883572@qq.com
SUN Wei, Ph.D., professor, research interests: smart animal husbandry and spatio-temporal information analysis. E-mail: sunwei02@caas.cn

Received date: 2024-06-11

  Online published: 2024-11-21

Supported by

Science and Technology Innovation Program of the Chinese Academy of Agricultural Sciences (10-IAED-RC-09-2024); Key Research and Development Task Project of the Xinjiang Uygur Autonomous Region (2022B02049-1-3); National Key Research and Development Program of China (2023YFD200080503); Basic Research Project Fund for Universities in Xinjiang Uygur Autonomous Region (XJEDU2022J009); Major Project of the Science and Technology Innovation 2030 Initiative by the Ministry of Science and Technology (2022ZD0115800)

Copyright

© 2024 by the authors


Cite this article

MA Nan, CAO Shanshan, BAI Tao, KONG Fantao, SUN Wei. Research progress and prospect of multi-robot collaborative SLAM in complex agricultural scenarios[J]. Smart Agriculture, 2024: 1-21. DOI: 10.12133/j.smartag.SA202406005

Abstract

[Significance] The rapid development of artificial intelligence and automation has greatly expanded the scope of agricultural automation, with applications such as precision farming using unmanned machinery, robotic grazing in outdoor environments, and automated harvesting by orchard-picking robots. Collaborative operations among multiple agricultural robots enhance production efficiency and reduce labor costs, driving the development of smart agriculture. Multi-robot simultaneous localization and mapping (SLAM) plays a pivotal role by ensuring accurate mapping and localization, which are essential for the effective management of unmanned farms. Compared to single-robot SLAM, multi-robot systems offer several advantages, including higher localization accuracy, larger sensing ranges, faster response times, and improved real-time performance. These capabilities are particularly valuable for completing complex tasks efficiently. However, deploying multi-robot SLAM in agricultural settings presents significant challenges. Dynamic environmental factors, such as crop growth, changing weather patterns, and livestock movement, increase system uncertainty. Additionally, agricultural terrains vary from open fields to irregular greenhouses, requiring robots to adjust their localization and path-planning strategies based on environmental conditions. Communication constraints, such as unstable signals or limited transmission range, further complicate coordination between robots. These combined challenges make it difficult to implement multi-robot SLAM effectively in agricultural environments. To unlock the full potential of multi-robot SLAM in agriculture, it is essential to develop optimized solutions that address the specific technical demands of these scenarios. 
[Progress] Existing reviews of multi-robot SLAM mainly take a general technological perspective, summarizing development trends, the advantages and limitations of algorithms, generally applicable conditions, and the core issues of key technologies; they lack analysis specific to the characteristics of complex agricultural scenarios. This study focuses on the main features and applications of multi-robot SLAM in such scenarios. It analyzes the advantages and limitations of multi-robot SLAM, as well as its applicability and application scenarios in agriculture, around four key components: multi-sensor data fusion, collaborative localization, collaborative map building, and loop closure detection. From the perspective of collaborative operations, the study outlines the classification of SLAM frameworks into three main collaborative types: centralized, distributed, and hybrid. On this basis, it summarizes the advantages and limitations of mainstream multi-robot SLAM frameworks, along with the typical robotic agricultural operation scenarios where each is applicable. It also discusses key issues faced by multi-robot SLAM in complex agricultural scenarios, such as low mapping and localization accuracy during multi-sensor fusion, restricted communication environments during multi-robot collaborative operations, and low accuracy in relative pose estimation between robots.

[Conclusions and Prospects] To enhance the applicability and efficiency of multi-robot SLAM in complex agricultural scenarios, future research needs to address these critical technological issues. First, enhanced data fusion algorithms will improve the integration of sensor information, increasing system accuracy and robustness. Second, combining deep learning and reinforcement learning techniques is expected to help robots better interpret environmental patterns, adapt to dynamic changes, and make more effective real-time decisions. Third, large language models will enhance human-robot interaction by enabling natural-language commands, improving collaborative operations. Finally, integrating digital twin technology will support more intelligent path planning and decision-making, especially in unmanned farms and livestock management systems. The convergence of digital twin technology with SLAM is projected to yield innovative solutions for intelligent perception and to play a transformative role in agricultural automation, enhancing task efficiency and reducing reliance on manual labor.
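One recurring step in the collaborative map building discussed above is map fusion: once the relative pose between two robots has been estimated, one robot's local map can be re-expressed in the other's coordinate frame. The sketch below is an illustrative toy, not the method of any system surveyed here; the function names, poses, and landmark coordinates are invented for the example.

```python
import numpy as np

def se2(x, y, theta):
    """Homogeneous 3x3 transform for a 2D pose (x, y, heading theta)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

def express_in_frame_a(points_b, T_ab):
    """Re-express robot B's 2D map points in robot A's frame.

    T_ab is the pose of B's frame as seen from A's frame
    (e.g., the output of a relative pose estimation step).
    """
    pts = np.hstack([points_b, np.ones((len(points_b), 1))])  # homogeneous coords
    return (T_ab @ pts.T).T[:, :2]

# Invented example: B sits 2 m in front of A, rotated 90 degrees left.
T_ab = se2(2.0, 0.0, np.pi / 2)
landmarks_b = np.array([[1.0, 0.0],   # landmark 1 m ahead of B
                        [0.0, 1.0]])  # landmark 1 m to B's left
merged = express_in_frame_a(landmarks_b, T_ab)  # approx. [[2, 1], [1, 0]] in A's frame
```

Real systems fuse full pose graphs rather than raw points, but the frame-change operation above is the same building block, and errors in T_ab propagate directly into the merged map, which is why relative pose accuracy is flagged as a key issue.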
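The call for enhanced data fusion algorithms rests on a standard building block: inverse-variance (Kalman-style) fusion of independent estimates of the same quantity, where the noisier sensor receives less weight. A minimal scalar sketch, with made-up sensor readings purely for illustration:

```python
def fuse(mean_a, var_a, mean_b, var_b):
    """Fuse two independent estimates of the same scalar quantity.

    Inverse-variance weighting: this is the scalar form of the
    Kalman filter measurement-update step.
    """
    k = var_a / (var_a + var_b)       # gain: how far to move toward estimate B
    mean = mean_a + k * (mean_b - mean_a)
    var = (1.0 - k) * var_a           # fused variance is below both inputs
    return mean, var

# Hypothetical readings of a robot's x-position (metres):
# GNSS says 10.0 m (variance 4.0); lidar odometry says 10.8 m (variance 1.0).
x, var_x = fuse(10.0, 4.0, 10.8, 1.0)   # -> 10.64 m, variance 0.8
```

The fused variance (0.8) is smaller than either input variance, which is the basic reason multi-sensor platforms can out-localize any single sensor; the multi-sensor SLAM pipelines surveyed in the paper apply vector and pose-graph generalizations of this same idea.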
