
BotanicGarden: a high-quality dataset for robot navigation in unstructured natural environments
- Author
- Yuanzhi Liu, Yujia Fu, Minghui Qin, Yufeng Xu, Baoxin Xu, Fengdong Chen, Bart Goossens (UGent), Poly Z.H. Sun, Hongwei Yu, Chun Liu, Long Chen, Wei Tao and Hui Zhao
- Abstract
- The rapid development of mobile robotics and autonomous navigation over the years has largely been empowered by public datasets for testing and benchmarking, covering tasks such as sensor odometry and SLAM. Impressive demos and benchmark scores have arisen, which may suggest the maturity of existing navigation techniques. However, these results are primarily based on testing in moderately structured scenarios. When transitioning to challenging unstructured environments, especially GNSS-denied, texture-monotonous, and densely vegetated natural fields, their performance can hardly be sustained at a high level and requires further validation and improvement. To bridge this gap, we build a novel robot navigation dataset in a luxuriant botanic garden of more than 48,000 m². Comprehensive sensors are used, including grayscale and RGB stereo cameras, spinning and MEMS 3D LiDARs, and low-cost and industrial-grade IMUs, all of which are well calibrated and hardware-synchronized. An all-terrain wheeled robot is employed for data collection, traversing through thick woods, riversides, narrow trails, bridges, and grasslands, which are scarce in previous resources. This yields 33 short and long sequences, forming 17.1 km of trajectories in total. Notably, both highly accurate ego-motion and 3D map ground truth are provided, along with finely annotated vision semantics. We firmly believe that our dataset can advance robot navigation and sensor fusion research to a higher level.
- Keywords
- Artificial Intelligence, Control and Optimization, Computer Science Applications, Computer Vision and Pattern Recognition, Mechanical Engineering, Human-Computer Interaction, Biomedical Engineering, Control and Systems Engineering
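The abstract states that highly accurate ego-motion ground truth is provided for evaluating odometry and SLAM. As an illustration of how such ground truth is typically used, below is a minimal sketch that computes the absolute trajectory error (ATE) of an estimated trajectory against ground-truth poses. The TUM-style text format (timestamp tx ty tz qx qy qz qw), the file names, and the association tolerance are assumptions made for this example, not a documented interface of the BotanicGarden release.

```python
# Hypothetical sketch: ATE evaluation against ground-truth ego-motions.
# Pose-file format and file names are assumptions, not part of the dataset spec.
import numpy as np

def load_tum_positions(path):
    """Load 'timestamp tx ty tz qx qy qz qw' rows; return (N,) times and (N,3) positions."""
    data = np.loadtxt(path, comments="#")
    return data[:, 0], data[:, 1:4]

def associate(t_gt, t_est, max_dt=0.02):
    """Match each estimated stamp to the nearest ground-truth stamp within max_dt seconds."""
    idx_gt, idx_est = [], []
    for j, t in enumerate(t_est):
        i = int(np.argmin(np.abs(t_gt - t)))
        if abs(t_gt[i] - t) <= max_dt:
            idx_gt.append(i)
            idx_est.append(j)
    return np.array(idx_gt), np.array(idx_est)

def umeyama_alignment(src, dst):
    """Least-squares rigid alignment (R, t) mapping src points onto dst points."""
    mu_src, mu_dst = src.mean(0), dst.mean(0)
    cov = (dst - mu_dst).T @ (src - mu_src) / src.shape[0]
    U, _, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1  # guard against a reflection solution
    R = U @ S @ Vt
    t = mu_dst - R @ mu_src
    return R, t

def ate_rmse(gt_file, est_file):
    """Absolute Trajectory Error (RMSE, metres) after SE(3) alignment."""
    t_gt, p_gt = load_tum_positions(gt_file)
    t_est, p_est = load_tum_positions(est_file)
    ig, ie = associate(t_gt, t_est)
    R, t = umeyama_alignment(p_est[ie], p_gt[ig])
    err = (p_est[ie] @ R.T + t) - p_gt[ig]
    return float(np.sqrt((err ** 2).sum(axis=1).mean()))

if __name__ == "__main__":
    # Placeholder file names; substitute the ground-truth and estimated trajectories.
    print("ATE RMSE [m]:", ate_rmse("gt_poses.txt", "est_poses.txt"))
```

The SE(3) alignment step follows the common practice of trajectory-evaluation tools such as the TUM RGB-D benchmark scripts, so the reported error does not depend on the arbitrary starting frame of the estimate.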
Downloads
- BotanicGarden A High-Quality Dataset EarlyAccess.pdf | full text (Accepted manuscript) | open access | 4.95 MB
- (...).pdf | full text (Published version) | UGent only | 3.94 MB
Citation
Please use this URL to cite or link to this publication: http://hdl.handle.net/1854/LU-01HNHTSWYGXYRGJMBDWQB4C2AZ
- MLA
- Liu, Yuanzhi, et al. “BotanicGarden : A High-Quality Dataset for Robot Navigation in Unstructured Natural Environments.” IEEE ROBOTICS AND AUTOMATION LETTERS, vol. 9, no. 3, 2024, pp. 2798–805, doi:10.1109/lra.2024.3359548.
- APA
- Liu, Y., Fu, Y., Qin, M., Xu, Y., Xu, B., Chen, F., … Zhao, H. (2024). BotanicGarden : a high-quality dataset for robot navigation in unstructured natural environments. IEEE ROBOTICS AND AUTOMATION LETTERS, 9(3), 2798–2805. https://doi.org/10.1109/lra.2024.3359548
- Chicago author-date
- Liu, Yuanzhi, Yujia Fu, Minghui Qin, Yufeng Xu, Baoxin Xu, Fengdong Chen, Bart Goossens, et al. 2024. “BotanicGarden : A High-Quality Dataset for Robot Navigation in Unstructured Natural Environments.” IEEE ROBOTICS AND AUTOMATION LETTERS 9 (3): 2798–2805. https://doi.org/10.1109/lra.2024.3359548.
- Chicago author-date (all authors)
- Liu, Yuanzhi, Yujia Fu, Minghui Qin, Yufeng Xu, Baoxin Xu, Fengdong Chen, Bart Goossens, Poly Z.H. Sun, Hongwei Yu, Chun Liu, Long Chen, Wei Tao, and Hui Zhao. 2024. “BotanicGarden : A High-Quality Dataset for Robot Navigation in Unstructured Natural Environments.” IEEE ROBOTICS AND AUTOMATION LETTERS 9 (3): 2798–2805. doi:10.1109/lra.2024.3359548.
- Vancouver
- 1. Liu Y, Fu Y, Qin M, Xu Y, Xu B, Chen F, et al. BotanicGarden : a high-quality dataset for robot navigation in unstructured natural environments. IEEE ROBOTICS AND AUTOMATION LETTERS. 2024;9(3):2798–805.
- IEEE
- [1] Y. Liu et al., “BotanicGarden : a high-quality dataset for robot navigation in unstructured natural environments,” IEEE ROBOTICS AND AUTOMATION LETTERS, vol. 9, no. 3, pp. 2798–2805, 2024.
@article{01HNHTSWYGXYRGJMBDWQB4C2AZ,
  abstract = {{The rapid developments of mobile robotics and autonomous navigation over the years are largely empowered by public datasets for testing and upgrading, such as sensor odometry and SLAM tasks. Impressive demos and benchmark scores have arisen, which may suggest the maturity of existing navigation techniques. However, these results are primarily based on moderate structured scenario testing. When transitioning to challenging unstructured environments, especially in GNSS-denied, texture-monotonous, and dense-vegetated natural fields, their performance can hardly sustain at a high level and requires further validation and improvement. To bridge this gap, we build a novel robot navigation dataset in a luxuriant botanic garden of more than 48000m 2 . Comprehensive sensors are used, including Gray and RGB stereo cameras, spinning and MEMS 3D LiDARs, and low-cost and industrial-grade IMUs, all of which are well calibrated and hardware-synchronized. An all-terrain wheeled robot is employed for data collection, traversing through thick woods, riversides, narrow trails, bridges, and grasslands, which are scarce in previous resources. This yields 33 short and long sequences, forming 17.1km trajectories in total. Excitedly, both highly-accurate ego-motions and 3D map ground truth are provided, along with fine-annotated vision semantics. We firmly believe that our dataset can advance robot navigation and sensor fusion research to a higher level.}},
  author = {{Liu, Yuanzhi and Fu, Yujia and Qin, Minghui and Xu, Yufeng and Xu, Baoxin and Chen, Fengdong and Goossens, Bart and Sun, Poly Z.H. and Yu, Hongwei and Liu, Chun and Chen, Long and Tao, Wei and Zhao, Hui}},
  issn = {{2377-3766}},
  journal = {{IEEE ROBOTICS AND AUTOMATION LETTERS}},
  keywords = {{Artificial Intelligence, Control and Optimization, Computer Science Applications, Computer Vision and Pattern Recognition, Mechanical Engineering, Human-Computer Interaction, Biomedical Engineering, Control and Systems Engineering}},
  language = {{eng}},
  number = {{3}},
  pages = {{2798--2805}},
  title = {{BotanicGarden : a high-quality dataset for robot navigation in unstructured natural environments}},
  url = {{http://doi.org/10.1109/lra.2024.3359548}},
  volume = {{9}},
  year = {{2024}},
}