Autonomous Navigation of a Forestry Robot Equipped with a Scanning Laser
Abstract
1. Introduction
2. Forestry Robot Concept
- Tree detection algorithm discriminates between trees and non-tree objects using laser scanner data, as shown in Figure 2. Some studies [9] use terrestrial 2D laser scanners to estimate the width of tree trunks and non-tree objects; the width is determined by detecting the start and end edges of each object. Such laser scanners provide scan lines in a typically horizontal plane. In this study, the LiDAR scan data are used to detect tree trunks.
- Map generation is the process of creating a map of the robot's environment from laser scanner data. The map can be loaded manually into the robot's memory (as a graphical or matrix representation) or built progressively while the robot explores a new environment. For tree-weeding tasks, the ideal option is to estimate the positions of individual trees rather than of tree rows. Therefore, the proposed forest map is built from tree trunk detections and consists of the 2D locations of the individual trees.
- Localization estimates the robot's position and orientation with respect to its environment. In this case, the forestry robot has no prior knowledge of its environment. For accurate localization, sensor data fusion is proposed to correct the robot position by combining odometer and IMU data.
- Path planning, or motion planning, finds an optimal path between initial and final locations. Path planners can be classified as global or local. The local planner takes the path generated by the global planner and sends velocity commands to the base controller to execute it.
- Path execution, or the base controller, is the lowest level of motion control, turning the drive wheels at a desired velocity. It is usually done in two separate steps: lateral control calculates the vehicle heading, and longitudinal control determines the braking and traction torques of the driving wheels in order to follow the desired path.
- Tree weeding is a common manual task performed by professional foresters in the first years after planting. Our forestry robot is equipped with a robotic manipulator that can perform weeding tasks around the tree trunk, as shown in Figure 3. The onboard manipulator could be further improved with an interchangeable end-effector allowing different functionalities.
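The edge-based trunk detection described above can be sketched in a few lines: segment a planar scan at range discontinuities, then keep segments whose chord width is plausible for a trunk. This is an illustrative sketch under assumptions, not the paper's implementation; the function name and all threshold values (`jump`, `min_w`, `max_w`, `max_range`) are assumed here.

```python
import math

def detect_trunks(ranges, angle_increment, jump=0.5,
                  min_w=0.05, max_w=0.6, max_range=10.0):
    """Segment one planar scan at range discontinuities and return
    (x, y, width) for segments whose chord width fits a trunk."""
    trunks, start = [], 0
    for i in range(1, len(ranges) + 1):
        if i == len(ranges) or abs(ranges[i] - ranges[i - 1]) > jump:
            seg = ranges[start:i]
            if seg and min(seg) < max_range:          # ignore out-of-range background
                a0, a1 = start * angle_increment, (i - 1) * angle_increment
                x0, y0 = seg[0] * math.cos(a0), seg[0] * math.sin(a0)
                x1, y1 = seg[-1] * math.cos(a1), seg[-1] * math.sin(a1)
                width = math.hypot(x1 - x0, y1 - y0)  # chord between edge returns
                if min_w <= width <= max_w:
                    r, am = sum(seg) / len(seg), (a0 + a1) / 2.0
                    trunks.append((r * math.cos(am), r * math.sin(am), width))
            start = i
    return trunks
```

The chord between the first and last returns slightly underestimates the diameter of a round trunk; circle fitting or RANSAC (as in the cited stem-measurement work) is a common refinement.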
3. Autonomous Navigation
3.1. Localization and Mapping Algorithm
3.2. Motion Planning Algorithm
Algorithm 1: The motion planning algorithm.
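As background on how a local motion planner of this kind operates, the sketch below samples (v, ω) commands, forward-simulates one step, and scores each candidate by goal progress and obstacle clearance, in the spirit of the dynamic window approach cited in the references. It is a generic illustration, not the paper's Algorithm 1; the sampling grid, cost weights, and parameter values are all assumptions.

```python
import math

def dwa_step(pose, goal, obstacles, v_max=1.0, w_max=1.5, dt=0.5, robot_radius=0.4):
    """Sample (v, w) commands, forward-simulate one time step, and pick the
    candidate that best trades off goal progress against obstacle clearance."""
    x, y, th = pose
    best, best_score = (0.0, 0.0), -math.inf
    for v in [i * v_max / 5 for i in range(6)]:           # 0 .. v_max
        for w in [j * w_max / 5 for j in range(-5, 6)]:   # -w_max .. w_max
            nth = th + w * dt
            nx = x + v * math.cos(nth) * dt
            ny = y + v * math.sin(nth) * dt
            clearance = min((math.hypot(nx - ox, ny - oy) for ox, oy in obstacles),
                            default=10.0)
            if clearance < robot_radius:                  # would collide: discard
                continue
            progress = -math.hypot(goal[0] - nx, goal[1] - ny)  # closer is better
            score = progress + 0.2 * min(clearance, 2.0)
            if score > best_score:
                best, best_score = (v, w), score
    return best  # (linear, angular) velocity command for the base controller
```

With a clear path the planner drives straight at full speed; an obstacle on the straight line forces a slower or turning command instead.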
4. ROS Architecture and Simulation Results
4.1. Data Visualization and 3D Simulator
4.2. Coordinate Frames Transformations
- map frame represents the world map to which the robot pose is related. Its origin is located at some arbitrarily chosen point in the world. This coordinate frame is fixed in the world.
- odom frame is fixed in the world. Its origin is at the point where the robot is initialized.
- base-footprint frame moves as the robot moves. Its origin is directly under the center of the robot.
- base-link frame is rigidly attached to the mobile robot base. This frame is considered as a moving frame.
- sensor-link frame remains fixed relative to the base-link frame. This coordinate frame has its origin at the center of the onboard sensor (LiDAR, IMU).
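Resolving a pose through this frame chain amounts to composing homogeneous transforms, which is what the tf library does internally. The planar sketch below composes map → odom → base-link → sensor-link to express the sensor origin in the map frame; the numeric poses are made-up illustrative values, not data from the paper.

```python
import math

def se2(x, y, theta):
    """3x3 homogeneous transform for a planar pose (x, y, heading)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]]

def compose(a, b):
    """Matrix product of two 3x3 transforms."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Chain mirroring the tf tree: map -> odom -> base-link -> sensor-link
map_T_odom = se2(5.0, 2.0, 0.0)           # map-level correction (illustrative)
odom_T_base = se2(1.0, 0.0, math.pi / 2)  # robot pose from odometry
base_T_sensor = se2(0.2, 0.0, 0.0)        # LiDAR mounted 0.2 m ahead of base

map_T_sensor = compose(compose(map_T_odom, odom_T_base), base_T_sensor)
sx, sy = map_T_sensor[0][2], map_T_sensor[1][2]  # sensor origin in the map frame
```

Because the robot is rotated 90° in the odom frame, the 0.2 m sensor offset shows up along y in the map frame: the sensor origin lands at (6.0, 2.2).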
4.3. Navigation Stack
4.3.1. Mapping Generation
4.3.2. Localization
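As a minimal illustration of the odometry/IMU correction described in Section 2, the sketch below applies a scalar Kalman predict-update cycle to the robot heading. It is not the paper's filter (which would operate on the full pose, e.g. with an EKF); the function names and all noise values are assumptions.

```python
def predict_heading(theta, P, omega, dt, Q):
    """Predict: integrate the gyro rate; process noise grows the variance."""
    return theta + omega * dt, P + Q

def fuse_heading(theta_pred, P_pred, z_meas, R_meas):
    """Update: correct the predicted heading with an absolute
    heading measurement (e.g. from the IMU)."""
    K = P_pred / (P_pred + R_meas)      # Kalman gain
    theta = theta_pred + K * (z_meas - theta_pred)
    P = (1.0 - K) * P_pred              # variance shrinks after the update
    return theta, P
```

With equal prediction and measurement variances the gain is 0.5, so the corrected heading lands halfway between the odometric prediction and the IMU measurement.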
4.3.3. Motion Planning
4.4. Base Controller
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Ali, W.; Georgsson, F.; Hellstrom, T. Visual tree detection for autonomous navigation in forest environment. In Proceedings of the 2008 IEEE Intelligent Vehicles Symposium, Eindhoven, The Netherlands, 4–6 June 2008; pp. 560–565.
- Olofsson, K.; Holmgren, J.; Olsson, H. Tree stem and height measurements using terrestrial laser scanning and the RANSAC algorithm. Remote Sens. 2014, 6, 4323–4344.
- Safaie, A.H.; Rastiveis, H.; Shams, A.; Sarasua, W.A.; Li, J. Automated street tree inventory using mobile LiDAR point clouds based on Hough transform and active contours. ISPRS J. Photogramm. Remote Sens. 2021, 174, 19–34.
- Pfeifer, N.; Gorte, B.; Winterhalder, D. Automatic reconstruction of single trees from terrestrial laser scanner data. In Proceedings of the 20th ISPRS Congress, Istanbul, Turkey, 12–13 July 2004; pp. 114–119.
- Shalal, N.; Low, T.; McCarthy, C.; Hancock, N. Orchard mapping and mobile robot localisation using on-board camera and laser scanner data fusion—Part A: Tree detection. Comput. Electron. Agric. 2015, 119, 267–278.
- Shalev, O.; Degani, A. Canopy-based Monte Carlo localization in orchards using top-view imagery. IEEE Robot. Autom. Lett. 2020, 5, 2403–2410.
- Goebel, P. ROS by Example; Lulu: Morrisville, NC, USA, 2015.
- Koubâa, A.; Bennaceur, H.; Chaari, I.; Trigui, S.; Ammar, A.; Sriti, M.F.; Alajlan, M.; Cheikhrouhou, O.; Javed, Y. Robot Path Planning and Cooperation: Foundations, Algorithms and Experimentations; Springer: Berlin/Heidelberg, Germany, 2018.
- Shalal, N.; Low, T.; McCarthy, C.; Hancock, N. Orchard mapping and mobile robot localisation using on-board camera and laser scanner data fusion—Part B: Mapping and localisation. Comput. Electron. Agric. 2015, 119, 254–266.
- Murphy, K.P. Bayesian map learning in dynamic environments. In Proceedings of the Advances in Neural Information Processing Systems 12 (NIPS 1999), Denver, CO, USA, 29 November–4 December 1999; pp. 1015–1021.
- Grisetti, G.; Stachniss, C.; Burgard, W. Improved techniques for grid mapping with Rao-Blackwellized particle filters. IEEE Trans. Robot. 2007, 23, 34–46.
- Ajeil, F.H.; Ibraheem, I.K.; Azar, A.T.; Humaidi, A.J. Autonomous navigation and obstacle avoidance of an omnidirectional mobile robot using swarm optimization and sensors deployment. Int. J. Adv. Robot. Syst. 2020, 17, 1729881420929498.
- Muhtadin; Zanuar, R.M.; Purnama, I.K.E.; Purnomo, M.H. Autonomous navigation and obstacle avoidance for service robot. In Proceedings of the 2019 International Conference on Computer Engineering, Network, and Intelligent Multimedia (CENIM), Surabaya, Indonesia, 19–20 November 2019; pp. 1–8.
- Clearpath Robotics. Available online: https://clearpathrobotics.com/husky-unmanned-ground-vehicle-robot/ (accessed on 24 November 2022).
- Foote, T. tf: The transform library. In Proceedings of the IEEE Conference on Technologies for Practical Robot Applications, Maribor, Slovenia, 10–12 December 2013; pp. 1–6.
- Blok, P.M.; van Boheemen, K.; van Evert, F.K.; IJsselmuiden, J.; Kim, G.-H. Robot navigation in orchards with localization based on particle filter and Kalman filter. Comput. Electron. Agric. 2019, 157, 261–269.
- Terejanu, G.A. Extended Kalman Filter Tutorial; University at Buffalo: Getzville, NY, USA, 2008.
- Alatise, M.B.; Hancke, G.P. Pose estimation of a mobile robot based on fusion of IMU data and vision data using an extended Kalman filter. Sensors 2017, 17, 2164.
- Missura, M.; Bennewitz, M. Predictive collision avoidance for the dynamic window approach. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; IEEE: Piscataway, NJ, USA, 2019.
- Ye, J. Tracking control for nonholonomic mobile robots: Integrating the analog neural network into the backstepping technique. Neurocomputing 2008, 71, 3373–3378.
Technical Specifications | Value
---|---
Dimensions | mm
Weight | 50 kg
Wheel radius | 330 mm
Maximum payload | 75 kg
Maximum payload (all terrain) | 20 kg
Maximum speed | 1 m/s
Transmission | driven wheels
Maximum slope |
Drivers/API | ROS, , Python
Citation: Ben Abdallah, F.; Bouali, A.; Meausoone, P.-J. Autonomous Navigation of a Forestry Robot Equipped with a Scanning Laser. AgriEngineering 2023, 5, 1–11. https://doi.org/10.3390/agriengineering5010001