Abstract

This paper presents a method for visual-LiDAR odometry and forest mapping that leverages tree trunk detection and LiDAR localization. In environments such as dense forests, where GPS signals are unreliable, we use camera and LiDAR sensors to estimate the robot's position accurately. Forested and orchard settings introduce unique challenges, including a diverse mixture of trees, tall grass, and uneven terrain. To address these complexities, we propose a distance-based filtering method that extracts only tree-trunk returns from 2D LiDAR scans. By restoring the arc segments observed by the LiDAR to their full circular shape, we obtain the position and radius of each reference tree in the LiDAR coordinate system, and these values are stored in a tree-trunk database. Our approach runs visual SLAM and LiDAR SLAM independently and then fuses their outputs with an Extended Kalman Filter (EKF) to improve the odometry estimate. Using the fused odometry and the EKF, we generate a tree map from the observed trees. In addition, we use the mapped tree positions as landmarks to reduce the localization error of the proposed SLAM algorithm. Experimental results show that the loop-closing error ranges from 0.3 to 0.5 m. We expect the method to also be applicable to path planning and navigation.
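The paper itself does not include code, so the following is only a minimal sketch of the trunk-extraction step described in the abstract: a distance-based split of a 2D scan into trunk candidates, followed by a least-squares (Kåsa) circle fit that recovers each trunk's center and radius in the LiDAR frame. All function names and threshold values here are our assumptions, not the authors'.

```python
import numpy as np

def segment_scan(ranges, angles, max_range=10.0, jump=0.3, min_pts=5):
    """Distance-based filtering of one 2D scan into candidate trunk clusters.

    Beams beyond max_range (tall grass, far clutter) are dropped; remaining
    beams are split wherever consecutive ranges jump by more than `jump`
    metres. All thresholds are illustrative assumptions.
    """
    clusters, current, prev_r = [], [], None
    for r, a in zip(ranges, angles):
        if r >= max_range:                       # discard far returns
            if len(current) >= min_pts:
                clusters.append(np.array(current))
            current, prev_r = [], None
            continue
        if prev_r is not None and abs(r - prev_r) > jump:
            if len(current) >= min_pts:          # close cluster at a range jump
                clusters.append(np.array(current))
            current = []
        current.append((r * np.cos(a), r * np.sin(a)))
        prev_r = r
    if len(current) >= min_pts:
        clusters.append(np.array(current))
    return clusters

def fit_trunk_circle(points):
    """Kåsa least-squares circle fit to a 2D arc of LiDAR points.

    points : (N, 2) array of (x, y) hits belonging to one trunk cluster.
    Returns (cx, cy, r): trunk center and radius in the LiDAR frame.
    """
    x, y = points[:, 0], points[:, 1]
    # Linearized model: x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2)
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    (cx, cy, d), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(d + cx ** 2 + cy ** 2)
    return cx, cy, r
```

A cluster whose fitted radius falls in a plausible trunk range (say, 0.05 to 0.5 m; the paper does not state its bounds) could then be added to the tree-trunk database as a landmark.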
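Likewise, a minimal sketch of the EKF fusion step, assuming a planar pose state [x, y, θ] with LiDAR odometry increments as the motion input and visual-SLAM poses as direct measurements. The paper does not specify its state vector or noise models, so the Q and R values below are placeholders.

```python
import numpy as np

class PoseEKF:
    """Minimal planar-pose EKF (illustrative sketch, not the paper's filter).

    State: [x, y, theta]. LiDAR odometry increments drive the prediction
    step; visual-SLAM pose estimates are applied as direct measurements.
    """

    def __init__(self, q=0.01, r=0.05):
        self.x = np.zeros(3)           # pose estimate [x, y, theta]
        self.P = np.eye(3) * 1e-3      # state covariance
        self.Q = np.eye(3) * q         # process noise (assumed value)
        self.R = np.eye(3) * r         # measurement noise (assumed value)

    def predict(self, dx, dy, dtheta):
        """Propagate with a body-frame LiDAR odometry increment."""
        c, s = np.cos(self.x[2]), np.sin(self.x[2])
        self.x += np.array([c * dx - s * dy, s * dx + c * dy, dtheta])
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -s * dx - c * dy],
                      [0.0, 1.0,  c * dx - s * dy],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z):
        """Correct with an absolute pose measurement z = [x, y, theta]."""
        H = np.eye(3)                                  # measure full state
        y = z - self.x
        y[2] = (y[2] + np.pi) % (2 * np.pi) - np.pi    # wrap heading residual
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(3) - K @ H) @ self.P
```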

Details

Title
VISUAL LIDAR ODOMETRY USING TREE TRUNK DETECTION AND LIDAR LOCALIZATION
Author
Park, K W 1; Park, S Y 1

1 Graduate School of Electronic and Electrical Engineering, Kyungpook National University, Daegu 41566, South Korea
Pages
627-632
Publication year
2023
Publication date
2023
Publisher
Copernicus GmbH
ISSN
1682-1750
e-ISSN
2194-9034
Source type
Conference Paper
Language of publication
English
ProQuest document ID
2901128630
Copyright
© 2023. This work is published under https://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.