Article

The Use of Hough Transform Method and Knot-Like Turning for Motion Planning and Control of an Autonomous Agricultural Vehicle

Gokhan Bayar
1 Mechanical Engineering Department, Zonguldak Bulent Ecevit University, Zonguldak 67100, Turkey
2 Zonguldak Teknopark, Çınartepe Mah., Adnan Menderes Cad., No: 91C/16, Zonguldak 67040, Turkey
Agriculture 2023, 13(1), 92; https://doi.org/10.3390/agriculture13010092
Submission received: 20 February 2022 / Revised: 5 April 2022 / Accepted: 7 April 2022 / Published: 29 December 2022
(This article belongs to the Special Issue Sensors Applied to Agricultural Products)

Abstract

This study focuses on motion planning and reference trajectory tracking control of an autonomous agricultural vehicle to achieve precise row following and turning. The smooth time-varying feedback control method was adapted to the system to generate the required control commands. The mathematical representations for motion planning and the controllers were constructed based on the car-like robot model. An algorithm to detect trees and rows of trees of an orchard was developed using the Hough transform approach. A new type of turning procedure, called knot-like turning, was proposed to perform turning from one row to another. A simulation environment was created to test and analyze the developed system. To obtain real data from a field, the trees and rows of trees of a cherry orchard were scanned using a laser scanner rangefinder sensor. The scanned data were then moved to the simulation environment to generate the desired trajectory, which was followed by an autonomous agricultural vehicle. The simulation environment made it possible to determine the performance of the proposed motion planning, reference trajectory generation, tracking control and turning procedures. The results indicate that the proposed methodology can be used for desired trajectory tracking tasks in agricultural operations where minimal tracking errors are required in both straight and turning motions.

1. Introduction

Autonomous systems have grown rapidly and have been used in industry for the last two decades. The developments and enhancements gained in autonomous systems, equipment, methods and approaches have also been adapted to agricultural systems. Automation of agricultural systems is increasingly needed, since the world's population is growing significantly and the demand for agricultural products is rising. Agricultural applications and operations should therefore be automatically controlled in order to meet these expectations.
With the recent developments in autonomous control in agricultural operations, mechatronics systems are being used together with agricultural tools in an efficient manner. This collaboration can be expected to continue growing in the near future.
This paper presents a methodology that combines four main issues. The first issue was motion planning and desired trajectory generation for autonomous agricultural vehicle applications. The reference trajectory and motion plan, which are specified according to a task, were modeled and created based on the car-like robot approach. The second issue was the development of control strategies for the steering and driving systems of the vehicle. To generate these control commands, the smooth time-varying feedback control procedure was used and adapted to the proposed system. The third issue was the development of a methodology to detect trees and estimate rows of trees. The methodology was built by following the principles of the Hough transform method. The last issue was to build a turning strategy to perform turning from one row of trees to another. A new type of turning geometry, called knot-like turning, was proposed. The knot-like turning methodology is considered useful when there is limited spare space for turning and precise row following is the main objective. In order to test the performance of the developed algorithms and methodologies, a simulation environment was created, and a number of simulation studies were conducted.
The Mihaliccik region of Turkey (Eskisehir) (39°51′34.3″ N 31°31′35.4″ E) is a well-known place due to its cherry orchards. This region allows growers to yield high-quality Napoleon cherries, one of the most famous cherry types in the cherry family. The amount of cherry production is rapidly increasing every year. To keep pace with this rise in cherry production, growers have been using autonomous systems in their orchard operations, e.g., autonomous vehicles, robotic systems, etc. Within the scope of this study, the trees and rows of trees of one of the cherry orchards in this region were scanned using a mid-range laser scanner rangefinder. Then, the scanned data were moved to the developed simulation environment. In addition to detecting trees, estimating rows of trees and creating reference trajectories (the center line between two consecutive rows of trees), the simulation environment was used to examine a four-wheeled orchard vehicle's behavior during straight and turning motions. The simulation environment also provided opportunities for testing the performance, efficiency, robustness and repeatability of the proposed control system. The details of the mathematical background of the proposed procedures and the results of the simulation studies are presented in this paper.
This paper is organized as follows. The next section discusses related works. The Materials and Methods section includes seven subsections: the problem statement, the generation of the desired trajectory, the design steps of the feedback controller, the turning procedure proposed in this study, the cherry orchard where the trees and rows of trees were scanned using a laser scanner rangefinder, the tree detection procedure and the details of the row estimation procedure constructed based on the Hough transform method. Prior to the conclusion, the Results and Discussion section is presented.

Related Works

The problems encountered during autonomous driving in an orchard environment can be divided into the following subproblems: the control of the position and orientation of the vehicle, navigation, trajectory generation/path planning, the detection of trees and estimation of rows of trees, and building the turning path.
The control of nonholonomic wheeled robots moving on a plane was studied by the authors of [1], who used feedback control techniques for performing trajectory tracking tasks. This study can be used to model autonomous robot/vehicle applications. The modeling structure proposed in the present study was also guided by this approach. A new path-tracking control for a car-like robot was presented in [2]. The control system was constructed using a neural predictive control procedure. The study was tested only for indoor applications. The parking problem of a car-like mobile robot was studied by the authors of [3], who focused on the stabilization problem of nonholonomic systems. A simple and efficient methodology for stability analysis was introduced. Turning strategies, which are adaptable to autonomous vehicle applications, were also introduced. The motion control of wheeled mobile robots was investigated in [4]. The dynamic feedback linearization technique was used for solving trajectory tracking and set-point regulation problems. The presented system was tested using an indoor mobile robot. Nonlinear trajectory tracking control for a car-like robot was studied in [5]. The controller was constructed based on the dynamic feedback linearization method. The modeling structures presented by the authors of [4,5] can be adapted for indoor autonomous trajectory tracking applications. The details of the modeling of trajectory generation, path planning and tracking controllers for autonomous vehicles were given by the authors of [6]. Different types of approaches for controller design were introduced (the presented approaches were also taken into consideration in the modeling part of this paper). A solution procedure for the path tracking problem of a mobile robot was proposed in [7]. A robust PID controller was adapted to the system for achieving accurate path tracking. A car-like robot model was used for reference trajectory generation for an autonomous tractor in [8]. A nonlinear adaptive controller was designed both to achieve trajectory tracking and to reject sliding effects. The introduced system was not tested for real orchard applications.
An autonomous navigation system for an orchard vehicle was introduced [9]. The system uses a laser scanner rangefinder. It detects trees and estimates rows of trees using the principles of the Hough transform technique. The proposed system does not address turning from one row to another. Moreover, the turning strategy, mathematical representation and adaptation were not provided. A low-cost localization and navigation methodology was proposed for autonomous vineyard sprayer robots [10]. A data fusion technique that processed the data from various sensors was also introduced. Instead of using a laser scanner, the system uses the visual odometry based on the orchard video data. The motion model of the autonomous vehicle was constructed based on the kinematic approach. The researchers did not focus on a desired trajectory generator, feedback controller or plant detection system.
Trajectory tracking problems for autonomous farming vehicles were studied in [11]. A combination of nonlinear and sliding mode controllers was used to create a curved path tracking system. A simple U-turn method was presented, and the performance of the controller while tracking the U-shaped desired trajectory was illustrated. The authors of [12] stated that the accurate localization of autonomous vehicles in an orchard environment depends on how precisely the orchard map is built. To verify this statement, a methodology to create a local orchard map based on the combination of camera and laser scanner data was proposed. The system also introduced a simple tree trunk detection algorithm. A motion model was constructed using a prediction–correction model. U-shaped turning was used for performing turnings between rows. A laser scanning rangefinder sensor model was constructed for performing autonomous navigation of a robot [13]. The identification of the robot's surroundings was achieved using an algorithm constructed based on the particle filter. The system does not involve a feedback controller; therefore, the tracking performance was not provided. The performances of the particle filter and localization were not shown for a turning operation. An image-based particle filtering technique was developed and used to perform localization in an agricultural environment [14]. The proposed system estimated uncertainties to increase robustness in autonomous navigation. It did not involve a closed-loop control system. To achieve turning between rows of trees, a simple U-shaped turning trajectory was used. Different headland turning techniques using continuous curvature paths were proposed in [15]. The turning shapes introduced were tested for cases where the turning was coupled with the motion model, which was constructed based on a car-like robot approach. The desired trajectory generation technique and feedback control design were not presented. A new methodology for generating curvature trajectories for agricultural vehicles was introduced in [16]. The forward and backward performances of the curvature trajectory with different steering parameters were analyzed. The study focused only on the curvature trajectory performance. An automatic control algorithm based on laser scanning rangefinder data was proposed for autonomous orchard tractor applications [17]. Navigation of the autonomous tractor and its trailer's position were studied. In addition to observing the performance of straight-line motion, the performances of wide, tight and U-shaped turns were investigated.
The solution procedure proposed in this paper is similar to the optimal coverage problem, and several studies have been conducted on this problem. Specifically, a route-planning technique for autonomous orchard operations was developed [18]. The optimal area coverage method was adapted into the system and tested using a deterministic-behavior robot. A comparison between the proposed and non-optimized methods was illustrated. The introduced system does not cover a trajectory generating algorithm or a closed-loop control system. A simple wide-U-shaped turning was used as the turning procedure. Optimal control-based coordinated taxiing path planning and tracking were studied in [19]. Dubins curve methods were used to generate the reference geometries. Trajectory planning, trajectory tracking and a feedback control system for an autonomous motion control task for an off-axle hitching tractor-trailer system with a drawbar were studied in [20]. A two-layer optimal control-based method was introduced to generate a reference trajectory. A detailed investigation of carrier aircraft dispatch path planning on the deck was presented in [21]. Different modeling techniques for planning were provided.
In light of the reviewed literature, this study differentiates itself from previous works by introducing a new methodology. Specifically, reference trajectory generation, a new type of turning procedure, the detection of trees and estimation of rows of trees, and a newly developed path tracking controller are proposed to achieve precise row following and turning. The proposed methodology is expected to enhance trajectory tracking performance.

2. Materials and Methods

2.1. Problem Statement

Consider the four-wheeled autonomous orchard vehicle shown in Figure 1. The vehicle has steering and driving wheels at its front and rear, respectively. The steering angle and longitudinal velocity of the vehicle are denoted by δ and V, respectively. The length between the rear and front axles is denoted by L. The fixed reference frame used to generate the desired trajectory is shown by the (x–y) coordinate axes. The (x′, y′) coordinate system is placed at the rear-axle center of the vehicle. The orientation of the vehicle is denoted by θ. The difference between the reference and actual trajectory is shown by the error vector (e), whose components are ex and ey in the x and y directions, respectively.
The objective of this vehicle is to follow rows of trees. It detects the end of each tree row, and turns around to position itself parallel to the next tree row. This motion plan makes it possible to develop an autonomous system that can be used for cherry orchard operations such as production automation, human worker augmentation, spraying, pruning, etc. Motion planning should be able to provide precise straight and turning motions, effectively detect trees and rows of trees, accurately estimate the start and end positions of each row, and produce smooth/reliable control signals for steering and driving systems of the orchard vehicle. In order to develop an autonomous system that can meet these expectations, the following key steps were taken into consideration:
Trees were detected using the data from a laser scanning range finder.
The rows of trees were created by adapting the procedure constructed based on the Hough transform method.
A simple methodology was built to recognize the start and end positions of each row.
In order to achieve turning from one row to another, a turning geometry called knot-like turning was used.
The desired trajectory tracking was achieved using a control strategy which was developed by following the principles of the “smooth time-varying feedback control” method.
In the literature, different kinds of reference trajectory generation methods for straight and turning motions have been used. Many control strategies have also been proposed to track a reference trajectory. In this study, the main objective was to use a simple modeling structure to generate the desired trajectories for straight motion and turning. The turning geometry was designed to remain usable even when there is not enough space for the turning operation. Moreover, the trajectory tracking controller had to be easily modeled and adapted into the overall system model, the trees and rows of trees had to be estimated with precision, and the overall mathematical structures created for trajectory generation, estimation and control had to be adaptable to real systems.

2.2. Reference Trajectory Generation

A closed-loop control system block diagram is given in Figure 2 to show the details of the proposed system, which includes reference trajectory generation, knot-like turning, tree detection, row estimation and trajectory tracking control. A reference input block is also shown in this block diagram. This block obtains the data from the laser scanner sensor, detects trees and estimates the rows of trees; it is also responsible for generating the reference trajectory. The control system block diagram was used to develop the simulation environment for observing the performance of the detection of trees and estimation of the rows of trees, the desired trajectory generation, turning between rows and the generation of control signals.
A car-like robot model can be adapted to the system shown in Figure 1. In this model, the front wheels are steerable, whereas the rear wheels are actuated. A car-like robot model [1,6,22] is constructed in Equations (1)–(4). This basic model and the basic trajectory generation structure (Equations (1)–(9)) are preferred in autonomous vehicle applications, since their adaptation, real-time performance and ease of use meet expectations. The desired velocities of the vehicle in the x ($\dot{x}_d$) and y ($\dot{y}_d$) directions are given in Equations (1) and (2), respectively. The desired orientation of the vehicle is described by $\theta_d$. Note that the subscript "d" represents the desired value. The desired value of the longitudinal velocity is shown by $V_{d1}$. The desired angular velocity, $\dot{\theta}_d$, is given in Equation (3). The desired steering angle speed, $\dot{\delta}_d$, is given in Equation (4).
$$\dot{x}_d = \cos(\theta_d)\,V_{d1} \qquad (1)$$
$$\dot{y}_d = \sin(\theta_d)\,V_{d1} \qquad (2)$$
$$\dot{\theta}_d = \tan(\delta_d)\,(V_{d1}/L) \qquad (3)$$
The reference speed of the steering angle change can be defined as follows:
$$\dot{\delta}_d = V_{d2} \qquad (4)$$
Once the desired velocities in the x and y directions are defined (Equations (1) and (2)), the desired forward (longitudinal) velocity of the vehicle can be obtained as follows. Note that the ± sign indicates forward or backward motion of the vehicle.
$$V_{d1} = \pm\sqrt{\dot{x}_d^2 + \dot{y}_d^2} \qquad (5)$$
Combining Equations (1) and (2), the desired orientation of the vehicle can be obtained as given in Equation (6). Note that the atan2(y, x) solution procedure is used because it is similar to atan(y/x), except that the signs of the components of (y, x) are used to determine the quadrant of the result.
$$\theta_d = \operatorname{atan2}\!\left(\frac{\dot{y}_d}{V_{d1}},\ \frac{\dot{x}_d}{V_{d1}}\right) \qquad (6)$$
The reference angular velocity of the vehicle can also be derived by differentiating Equation (6):
$$\dot{\theta}_d = \frac{\ddot{y}_d\,\dot{x}_d - \ddot{x}_d\,\dot{y}_d}{V_{d1}^{2}} \qquad (7)$$
When the desired trajectory is defined, the desired steering angle can also be built as:
$$\delta_d = \arctan\!\left(\frac{L\,(\ddot{y}_d\,\dot{x}_d - \ddot{x}_d\,\dot{y}_d)}{V_{d1}^{3}}\right) \qquad (8)$$
Finally, the desired steering angle speed can be derived as shown below:
$$\dot{\delta}_d = V_{d2} = \frac{L\,(\dddot{y}_d\,\dot{x}_d - \dddot{x}_d\,\dot{y}_d)\,V_{d1}^{3} - 3\,L\,V_{d1}\,(\ddot{y}_d\,\dot{x}_d - \ddot{x}_d\,\dot{y}_d)(\ddot{x}_d\,\dot{x}_d + \ddot{y}_d\,\dot{y}_d)}{V_{d1}^{6} + L^{2}\,(\ddot{y}_d\,\dot{x}_d - \ddot{x}_d\,\dot{y}_d)^{2}} \qquad (9)$$
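To make the trajectory generator concrete, the following sketch shows one possible numerical implementation of Equations (5)–(9). It assumes the desired path is available as sampled coordinates (xd, yd) with sampling interval dt and nonzero forward speed, obtains the derivatives by finite differences (the paper does not state how the derivatives are computed), and uses an assumed wheelbase L; it is an illustrative Python/NumPy sketch, not the author's Matlab implementation.

```python
import numpy as np

def reference_signals(xd, yd, dt, L=1.5):
    """Compute V_d1, theta_d, theta_d_dot, delta_d and V_d2 (Eqs. (5)-(9))
    from a sampled desired path (xd, yd).  Derivatives come from finite
    differences; L is the wheelbase (value assumed here)."""
    xd_dot,   yd_dot   = np.gradient(xd, dt),      np.gradient(yd, dt)
    xd_ddot,  yd_ddot  = np.gradient(xd_dot, dt),  np.gradient(yd_dot, dt)
    xd_dddot, yd_dddot = np.gradient(xd_ddot, dt), np.gradient(yd_ddot, dt)

    Vd1 = np.hypot(xd_dot, yd_dot)                    # Eq. (5), forward motion (+)
    theta_d = np.arctan2(yd_dot / Vd1, xd_dot / Vd1)  # Eq. (6)
    curv = yd_ddot * xd_dot - xd_ddot * yd_dot
    theta_d_dot = curv / Vd1**2                       # Eq. (7)
    delta_d = np.arctan(L * curv / Vd1**3)            # Eq. (8)
    num = (L * (yd_dddot * xd_dot - xd_dddot * yd_dot) * Vd1**3
           - 3.0 * L * Vd1 * curv * (xd_ddot * xd_dot + yd_ddot * yd_dot))
    Vd2 = num / (Vd1**6 + L**2 * curv**2)             # Eq. (9)
    return Vd1, theta_d, theta_d_dot, delta_d, Vd2
```

For example, sampling a straight segment followed by a circular arc and passing it through this function yields the reference longitudinal velocity, heading, steering angle and steering rate required by the controller described in the next subsection.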

2.3. Controller Design

The control signals required for actuating the steering and driving systems of the vehicle are generated by following the principles of the smooth time-varying feedback control strategy [1,4,5,6,22,23]. The control method uses the chained system rule to decompose the solution into two design stages. In the first design stage, one control input is designed such that some of the design requirements are satisfied; then, the other controller is constructed in order to ensure the stability of the (n − 1)-dimensional system state. In the second stage, the objective is to guarantee the convergence of the system variables; in this stage, the overall closed-loop control system stability should be achieved. To set up a new system state, the following form is considered:
$$\tilde{X} = (\tilde{X}_1, \tilde{X}_2, \ldots, \tilde{X}_{n-1}, \tilde{X}_n) = (x_1, x_2, \ldots, x_n) \qquad (10)$$
The chained-form rules are used for reference trajectory generation and controller design. They are also used to define the new states and a reference system. The chained form for the system introduced here can be constructed as follows:
$$\dot{\tilde{X}}_1 = V_1,\quad \dot{\tilde{X}}_2 = \tilde{X}_3 V_1,\quad \dot{\tilde{X}}_3 = \tilde{X}_4 V_1,\quad \ldots,\quad \dot{\tilde{X}}_{n-1} = \tilde{X}_n V_1,\quad \dot{\tilde{X}}_n = V_2 \qquad (11)$$
where V1 and V2 indicate the longitudinal and steering angle velocities, respectively. Adapting the (2, 4) chained form to develop the feedback controllers for the steering and driving systems gives the following new state variables $\tilde{X}_i$, $i = 1,\ldots,4$. Note that the details are omitted here for clarity; interested readers may refer to the studies [1,4,5,6,22,23] for details.
$$\begin{aligned}
\tilde{X}_1 &= x_d - x = x_e \\
\tilde{X}_2 &= y_d - y = y_e \\
\tilde{X}_3 &= \tan(\theta_d - \theta) = \tan(\theta_e) \\
\tilde{X}_4 &= \big(\tan(\delta)\cos(\theta_d - \theta) - \tan(\delta_d)\big)\left(\frac{1}{L\cos^{3}(\theta_d - \theta)}\right) + k_2\,(y_d - y)
\end{aligned} \qquad (12)$$
where xd and yd are the x–y components of the desired trajectory, and θd and δd represent the desired orientation angle and steering angle of the vehicle moving on the desired trajectory, respectively. Tracking errors in the x and y directions are given by xe and ye, respectively, and the orientation error is denoted by θe. k2 is a control parameter which is to be defined according to the system specifications. The time derivatives of the new state variables $\dot{\tilde{X}}_i$, $i = 1,\ldots,4$, are derived in the following form:
$$\begin{aligned}
\dot{\tilde{X}}_1 &= (\tilde{X}_d/L)\tan(\delta_d)\,\tilde{X}_2 + V_1 \\
\dot{\tilde{X}}_2 &= -(\tilde{X}_d/L)\tan(\delta_d)\,\tilde{X}_1 + \tilde{X}_d\,\tilde{X}_3 + V_1\,\tilde{X}_3 \\
\dot{\tilde{X}}_3 &= \big(-k_2\,\tilde{X}_d\,\tilde{X}_2\big) + \tilde{X}_d\,\tilde{X}_4 + V_1\big(\tilde{X}_4 - k_2\,\tilde{X}_2 + (1 + \tilde{X}_3^{2})\tan(\delta_d)/L\big) \\
\dot{\tilde{X}}_4 &= V_2
\end{aligned} \qquad (13)$$
The feedback controllers, which are specified by longitudinal velocity (V1) and steering angle speed (V2), can be described in the following form:
$$\begin{aligned}
V_1 &= k_1\,|U_{d1}|\left(\tilde{X}_1 + \left(\frac{\tilde{X}_2\,\tilde{X}_4}{k_2} + \frac{\tilde{X}_2}{k_2}\,\tilde{X}_3^{2}\right)\frac{\tan(\delta_d)}{L}\right) \\
V_2 &= k_3\,|U_{d1}|\left(\tilde{X}_3\,\tilde{X}_4 - k_4\,\tilde{X}_4\,|U_{d1}|\right)
\end{aligned} \qquad (14)$$
where k1, k3 and k4 are the other control parameters to be defined. The desired (reference) control inputs Ud1 and Ud2, which appear in Equation (14), are specified as:
$$U_{d1} = \dot{x}_d, \qquad U_{d2} = \frac{\dddot{y}_d\,\dot{x}_d^{2} - \dddot{x}_d\,\dot{y}_d\,\dot{x}_d - 3\,\ddot{y}_d\,\dot{x}_d\,\ddot{x}_d + 3\,\dot{y}_d\,\ddot{x}_d^{2}}{\dot{x}_d^{4}} \qquad (15)$$
where the longitudinal and lateral velocities, accelerations and jerks in the x and y directions are specified by $(\dot{x}_d, \dot{y}_d)$, $(\ddot{x}_d, \ddot{y}_d)$ and $(\dddot{x}_d, \dddot{y}_d)$, respectively.
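To illustrate how these signals could be exercised in simulation, the sketch below propagates the car-like kinematic model of Equations (1)–(4), applied to the actual vehicle states, one step under given commands V1 and V2, and evaluates the basic tracking errors. The controller of Equation (14) and its gains k1–k4 are not reproduced here, and the wheelbase value is an assumption.

```python
import numpy as np

def vehicle_step(state, V1, V2, dt, L=1.5):
    """Propagate the car-like kinematic model one step (cf. Eqs. (1)-(4)).
    state = [x, y, theta, delta]; V1 is the longitudinal velocity command
    and V2 the steering-angle rate command.  L (wheelbase) is assumed."""
    x, y, theta, delta = state
    x     += np.cos(theta) * V1 * dt
    y     += np.sin(theta) * V1 * dt
    theta += np.tan(delta) * (V1 / L) * dt
    delta += V2 * dt
    return np.array([x, y, theta, delta])

def tracking_errors(state, xd, yd, theta_d):
    """Errors (x_e, y_e, theta_e) used by the chained-form controller."""
    x, y, theta, _ = state
    return xd - x, yd - y, theta_d - theta
```

In a closed-loop run, this step would be called at the 10 Hz control rate mentioned in Section 2.5, with V1 and V2 supplied by the feedback law of Equation (14).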

2.4. Turning Geometry

The vehicle that follows the center line of two consecutive rows must perform a turning maneuver when it comes to the end of a row. Different types of turning strategies, such as clothoid turning [24] and bulb-shape turning [25], have been developed. In both turning strategies, the turning geometry must be described in an equation form, which requires some computational effort and may be a problem in real-time operations. In cases where the turning space is not big enough for clothoid, bulb-shape, U-shape or K-shape turnings, the turning strategy introduced in this study can also be used. In this study, a new kind of turning methodology, called knot-like turning, was proposed, as presented in Figure 3. The advantage of this turning strategy is that it can be described by a combination of simple straight lines and circles; therefore, a complete environment model is not needed. The vehicle is located at the beginning of the first row, and the rest of the reference path is created via the combination of lines and circles. The use of this simple combination makes it possible to arrange the straight line and circle parameters (s1, s2, s3, R1, R2, Vs, Vc) according to the row width information (c1) in an easy way (i.e., using a look-up table). Another advantage of adapting knot-like turning in an orchard application is that there is enough time and distance once the last turn (whose radius is R2) is completed. Before entering the next row, the autonomous vehicle has time to check its surroundings, whether it is entering the correct row and whether the motion is safe (i.e., there is no risk of crashing into trees). Note that the cherry orchard where the laser sensor data were collected was professionally organized: the plantation geometries of the rows of trees, the row widths and the spare spaces for turning operations were well planned. Therefore, while the modeling structure was constructed, it was assumed that there would always be enough spare space for turning.
In Figure 3, row centers and turnings are indicated by dashed and solid lines, respectively. The velocities of the vehicle during straight and turning motions are defined by Vs and Vc, respectively. The vehicle considered in this study was equipped with a laser scanner rangefinder placed at the front-mid-center. The row widths (qi) and row lengths (RLi) were obtained using the information from this sensor. The turning geometry was generated according to the available turning space. Considering the case shown in Figure 3, the available turning space should be equal to or bigger than the value of (s1 + s3 + R2). The turning methodology introduced here only needs the row width information; the turning algorithm can then calculate the radii R1 and R2 and the lengths s1, s2 and s3. A sketch of how such a reference path can be composed from line and arc primitives is given below.
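The exact segment/arc sequence of the knot-like turn is defined by Figure 3 and the author's look-up table, neither of which is reproduced here; the following Python/NumPy sketch only illustrates the underlying idea of composing a turning reference from straight lines and circular arcs, with all numeric values chosen arbitrarily.

```python
import numpy as np

def straight(p0, heading, length, n=50):
    """Sample a straight segment of the given length starting at p0."""
    t = np.linspace(0.0, length, n)
    return np.column_stack([p0[0] + t * np.cos(heading),
                            p0[1] + t * np.sin(heading)])

def arc(center, radius, a_start, a_end, n=100):
    """Sample a circular arc (counter-clockwise when a_end > a_start)."""
    a = np.linspace(a_start, a_end, n)
    return np.column_stack([center[0] + radius * np.cos(a),
                            center[1] + radius * np.sin(a)])

# Illustrative composition: a straight run-out of length s1 followed by a
# left-hand arc of radius R1 (values arbitrary).  A full knot-like turn would
# chain further segments and arcs (s2, s3, R2) according to Figure 3.
s1, R1 = 2.0, 1.5
leg  = straight(np.array([0.0, 0.0]), heading=0.0, length=s1)
turn = arc(center=np.array([s1, R1]), radius=R1, a_start=-np.pi / 2, a_end=0.0)
reference_turn = np.vstack([leg, turn])
```

Because only lines and circles are involved, the parameters can be tabulated against the measured row width and looked up online, which is the property the paper exploits to avoid a full environment model.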

2.5. Scanning of Trees of a Cherry Orchard Using a Laser Scanner Rangefinder

Trees and rows of trees of one of the Napoleon cherry orchards located in the Mihaliccik region of Turkey (Eskisehir) were scanned using the laser scanner rangefinder shown in Figure 4. A branch with Napoleon cherries is shown in Figure 5. The laser scanner was placed at the center of the rows, and the scanning data were recorded.
The laser scanner (Figure 4), a Hokuyo UTM-30LX, has a scanning capability of up to 30 m. This property is indicated in the right image of Figure 4: the scanning area is shown by the red circle, and the maximum scan range is indicated by the radius information (R = 30 m). The scanning angle range is presented by the yellow circular arrow, which shows a scanning range of 270°. The angular resolution of the laser scanner is 0.25°, which means a full scan provides 1080 distance measurements. The power requirement for this sensor is 12 VDC with a maximum current of 1 A. The sensor sends a data package, which should be decoded using a computer or a microprocessor. In this study, the decoding process was achieved in the Matlab/Simulink environment using the Real-Time Workshop toolbox. The sensor data were handled at 10 Hz, which also determined the sampling frequency of the control action (the time interval between two control actions was set to 0.1 s).
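Once a scan is decoded into its 1080 range readings, each reading can be converted into a Cartesian point in the sensor frame before tree detection. The sketch below shows one way to do this; the angle convention (a 270° scan centered on the sensor's forward axis) is an assumption, and the paper's Matlab/Simulink implementation is not reproduced.

```python
import numpy as np

def scan_to_points(ranges, max_range=30.0):
    """Convert one Hokuyo UTM-30LX scan (1080 ranges covering 270 deg at
    0.25 deg resolution) into Cartesian points in the sensor frame."""
    ranges = np.asarray(ranges, dtype=float)
    angles = np.deg2rad(np.linspace(-135.0, 135.0, ranges.size))
    valid = (ranges > 0.0) & (ranges <= max_range)   # drop no-return readings
    x = ranges[valid] * np.cos(angles[valid])
    y = ranges[valid] * np.sin(angles[valid])
    return np.column_stack([x, y])
```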

2.6. Detection of Trees

In this study, the aim was to keep the vehicle at the row center while it was in motion. In order to achieve this objective, the right and left rows of trees should be recognized so that the error between the vehicle and the row center can be measured. Accurate line equations for the right and left rows can be created as long as the trees are detected with minimal detection errors. The principles of the detection of trees and the estimation of rows of trees are depicted in Figure 6. The test environment is presented with two rows of trees in this figure. The distance between the two tree rows was approximately 4 m, and the length of each row was about 20 m. The laser scanner scanned its surroundings, and each tree was detected using at least three laser scan measurements, as shown in the zoomed view in Figure 6. In Figure 6, the distances between the laser scanner and the detected trees are shown by LiR and LiL for the trees at the right and left of the vehicle (i = 1, 2, 3, 4, …), respectively. Note that only four of the detected trees on each side are shown in this representation. The tree locations, shown by (XiR, YiR) and (XiL, YiL), were also determined using the distance and scan angle information.
In order to test the tree detection procedure mentioned above, the recorded laser scanner data were moved to the simulation environment. The developed algorithm detected the trees, as exhibited in Figure 7 (detected trees are indicated by black-colored circles). As seen in the figure, the detected trees formed three neighboring rows of trees.
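The paper does not detail the grouping step beyond noting that each tree is seen by at least three scan returns, so the clustering sketch below is an assumption: consecutive scan points closer than a gap threshold are grouped, clusters with at least three returns are accepted, and the cluster centroid is taken as the tree position.

```python
import numpy as np

def detect_trees(points, gap=0.3, min_hits=3):
    """Group consecutive scan points (ordered by scan angle) into clusters and
    return one centre per cluster as a candidate tree trunk.  A new cluster
    starts when the spacing between neighbouring points exceeds `gap`
    (threshold assumed); clusters with fewer than `min_hits` returns are
    discarded, reflecting the 'at least three measurements per tree' remark."""
    trees, cluster = [], [points[0]]
    for p in points[1:]:
        if np.linalg.norm(p - cluster[-1]) <= gap:
            cluster.append(p)
        else:
            if len(cluster) >= min_hits:
                trees.append(np.mean(cluster, axis=0))
            cluster = [p]
    if len(cluster) >= min_hits:
        trees.append(np.mean(cluster, axis=0))
    return np.array(trees)
```

Chaining this with the scan conversion above, e.g. detect_trees(scan_to_points(ranges)), yields the tree positions (XiR, YiR) and (XiL, YiL) that feed the row estimation step.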

2.7. The Use of Hough Transform Method for Row Estimation

Trees detected using the methodology described above were used to estimate the rows of trees. (Note that in this study, it was assumed that two adjacent tree rows were parallel, since the trees in the cherry orchard where the laser data were collected were professionally planted to create straight rows of trees.) The tree lines shown by "Left Row" and "Right Row" in Figure 6 were created using the algorithm, which was developed by adapting the Hough transform method. The principles of the Hough transform can be summarized as follows: a line in a 2D plane can be described by an equation of the form y = mx + k, where m and k are the slope and intercept, respectively. When a data set contains many points (xi, yi) lying on the same line, the values of m and k fitting yi = m xi + k are unique. Equivalently, in polar form the same line can be represented as r = x cosθ + y sinθ. This representation maps each point (xi, yi) to a curve in the (θ, r) Hough space, and the unique (θ, r) location where these curves intersect identifies the line; this mapping is known as the Hough transform. The principles of the Hough transform are represented in Figure 8 using a simple example including only three points. In this example, the Hough transform takes the points (x1, y1), (x2, y2) and (x3, y3) and maps them into the Hough curves (HC), which intersect at the point (θ, r) of the Hough space. The points (xi, yi) and (θ, r) are illustrated in the left and right images of Figure 8, respectively.
When the Hough transform method was applied to the detected trees shown in Figure 7, the rows of trees were estimated as illustrated in Figure 9. In the estimation process, the Hough transform generates a line which connects the detected trees in a row; this estimation is performed for both the right and left rows of trees. The process yields the line equations rj = xj cos(θj) + yj sin(θj) (where j indicates right or left) for the right and left rows of trees. The position information of the trees (xi, yi), i = 1, 2, 3, …, n is mapped into the Hough space, creating n Hough curves whose common intersection gives the unique values of θ and r. In Figure 9, the red-colored solid lines show the estimated rows of trees. The row centers, considered as the straight parts of the reference trajectory, are indicated via blue-colored dashed lines. The approximate row width used to generate a reference turning geometry (Figure 3) was also obtained from this estimation.
Hough curves mapped into Hough space are also illustrated for the row center lines (left and right dashed lines in Figure 9) in Figure 10.
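A minimal accumulator-based implementation of the Hough transform described above is sketched below. The angular and radial resolutions are assumptions, and the routine would be run separately on the trees detected to the left and to the right of the vehicle to obtain the two row lines rj = xj cos(θj) + yj sin(θj).

```python
import numpy as np

def hough_line(points, theta_res_deg=1.0, r_res=0.05, r_max=30.0):
    """Estimate the dominant line r = x*cos(theta) + y*sin(theta) through a set
    of detected tree positions using a simple Hough accumulator."""
    thetas = np.deg2rad(np.arange(0.0, 180.0, theta_res_deg))
    r_bins = np.arange(-r_max, r_max, r_res)
    acc = np.zeros((thetas.size, r_bins.size), dtype=int)
    for x, y in points:
        r = x * np.cos(thetas) + y * np.sin(thetas)    # one Hough curve per tree
        idx = np.digitize(r, r_bins) - 1
        ok = (idx >= 0) & (idx < r_bins.size)
        acc[np.arange(thetas.size)[ok], idx[ok]] += 1  # vote along the curve
    it, ir = np.unravel_index(np.argmax(acc), acc.shape)
    return thetas[it], r_bins[ir]                      # (theta, r) of the row line
```

The midline between the two returned lines then serves as the straight part of the reference trajectory, and the spacing between them gives the approximate row width used by the turning geometry.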

3. Results and Discussion

As described above, the laser scanner data were used for detecting each tree in the cherry orchard. Then, the information of the detected trees was transferred to the algorithm developed for estimating the rows of trees. These two steps provide the necessary data used to generate the desired (reference) trajectory. In order to test the trajectory tracking performance and accuracy of the developed system, a simulation environment was created in Matlab. The reference trajectory was given to this environment as the input. In addition, the errors that occurred between the input and output trajectories were observed. Furthermore, the behaviors of the steering and driving controllers were also tested. The simulation environment also provided the opportunity to test the performance of the proposed knot-like turning method. The results obtained in the simulation studies make it possible to test the developed system before adapting it into a real system.
In the simulation studies, the row centers estimated using the Hough transform (Figure 9) were used to create the straight parts of the desired trajectory. The turning geometry introduced in Figure 3 was adapted to the system to build the turning parts of the desired trajectory. The desired trajectory and tracking results are presented in Figure 11, Figure 12 and Figure 13. The straight and curved parts of the motion are specified by the letters A–I (Figure 11) to present the results in detail. Note that in order to test the performance of the feedback controller introduced in Section 2.3, Gaussian white noise was added to the straight parts of the desired trajectory (specified by the letters A, C, E, G and I), as sketched below.
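A minimal sketch of this disturbance injection is given below; the noise standard deviation and the masking of the straight segments are assumptions, since the paper does not report the noise parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_segment_noise(path, straight_mask, sigma=0.01):
    """Add zero-mean Gaussian white noise (std. dev. assumed) to the straight
    parts of the reference trajectory only; turning parts are left untouched."""
    noisy = path.copy()
    noisy[straight_mask] += rng.normal(0.0, sigma, size=noisy[straight_mask].shape)
    return noisy
```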
Errors obtained in x and y directions are given in the left image of Figure 12 by the black and red solid lines, respectively.
The histogram plots of the x and y errors are also given in the right image of Figure 12. These results indicate that the errors occurring in the x direction were concentrated between −0.001 and 0.002 m, whereas those in the y direction fell within the band between −0.01 and 0.02 m.
Figure 13 exhibits the control signals generated for the steering angle. In order to show the variations in the steering angle, several zoomed-in views are also presented. The locations of the specified motions are shown in the right image in Figure 13; the letters in this figure specify the straight and turning parts of the reference trajectory (see Figure 11). Despite the additive Gaussian white noise in the straight parts of the reference trajectory, the controller was able to generate the required steering angle commands to achieve trajectory tracking with minimal tracking errors.

4. Conclusions

The smooth time-varying feedback control method is used for autonomous vehicle applications. It generates the necessary commands to follow a desired trajectory. In this study, this procedure was adapted to an autonomous orchard vehicle task. The vehicle’s motion and reference trajectory were developed by following the principles of a car-like robot model. The modeling structure was integrated with the feedback control strategy. A new type of turning strategy for turning from one row to another was proposed. An algorithm was developed to detect trees and estimate rows of trees. An estimation procedure was constructed based on the Hough transform method. A simulation environment was created to determine the performance of the proposed system. The trees and rows of trees of a cherry orchard were scanned using a laser scanner rangefinder. The scanned data were then transferred to the simulation environment to create the desired trajectory, and the simulations were conducted for a trajectory tracking task. The results indicate that the proposed system, including the adaptation of the car-like robot model, desired trajectory generation, turning procedure and smooth time-varying feedback control, could be effectively used for autonomous orchard applications. As a future work, the methodology introduced in this paper will be adapted into a real robot vehicle designed for performing autonomous cherry orchard applications. Conducting real experiments in cherry orchards will make it possible to verify the system presented here.

Funding

The infrastructure project of the Mechanical Engineering Department was supported by Zonguldak Bulent Ecevit University (Zonguldak, Turkey) under grant number 2013-77654622-03. The research project (BAP-2018-77654622-03) was also supported by Zonguldak Bulent Ecevit University.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Luca, A.D.; Oriolo, G.; Samson, C. Feedback control of a nonholonomic car-like robot. In Robot Motion Planning and Control; Laumond, J.-P., Ed.; Springer: Berlin/Heidelberg, Germany, 2005; Volume 229, pp. 171–253. [Google Scholar]
  2. Gu, D.; Hu, H. Neural predictive control for a car-like mobile robot. Int. J. Robot. Auton. Syst. 2002, 39, 73–86. [Google Scholar] [CrossRef]
  3. Lee, S.; Kim, M.; Youm, Y.; Chung, W. Control of a car-like mobile robot for parking problem. In Proceedings of the 1999 IEEE International Conference on Robotics and Automation (Cat. No.99CH36288C), Detroit, MI, USA, 10–15 May 1999; pp. 1–6. [Google Scholar]
  4. Oriolo, G.; Luca, A.D.; Vendittelli, M. WMR control via dynamic feedback linearization: Design, implementation, and experimental validation. IEEE Trans. Control. Syst. Technol. 2002, 10, 835–852. [Google Scholar] [CrossRef]
  5. Yang, E.; Gu, D.; Mita, T.; Hu, H. Nonlinear tracking control of a car-like mobile Robot via dynamic feedback linearization. In Proceedings of the UKACC Control Conference 2004, University of Bath, Bath, UK, 6-9 September 2004. [Google Scholar]
  6. Fahimi, F. Autonomous Robots: Modeling, Path Planning, and Control; Springer: Berlin/Heidelberg, Germany, 2009. [Google Scholar]
  7. Normey-Rico, J.E.; Alcal, I.; Gomez-Ortega, J.; Camacho, E.F. Mobile robot path tracking using a robust PID controller. Control. Eng. Pract. 2001, 9, 1209–1214. [Google Scholar] [CrossRef]
  8. Lenain, R.; Thuilot, B.; Cariou, C.; Martinet, P. Rejection of sliding effects in car like robot control: Application to farm vehicle guidance using a single RTK GPS sensor. In Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003) (Cat. No.03CH37453), Las Vegas, NV, USA, 27–31 October 2003; pp. 3811–3816. [Google Scholar]
  9. Barawid, O.; Mizushima, A.; Ishii, K.; Noguchi, N. Development of an autonomous navigation system using a two-dimensional laser scanner in an orchard application. Biosyst. Eng. 2007, 96, 139–149. [Google Scholar] [CrossRef]
  10. Zaidner, G.; Shapiro, A. A novel data fusion algorithm for low-cost localisation and navigation of autonomous vineyard sprayer robots. Biosyst. Eng. 2016, 146, 133–148. [Google Scholar] [CrossRef]
  11. Matveev, A.S.; Hoy, M.; Savkin, A.V. Mixed nonlinear sliding mode control of an unmanned farm tractor in the presence of sliding. In Proceedings of the 2010 11th International Conference on Control Automation Robotics and Vision, Singapore, 7–10 December 2010. [Google Scholar]
  12. Shalal, N.; Low, T.; McCarthy, C.; Hancock, N. Orchard mapping and mobile robot localisation using on-board camera and laser scanner data fusion—Part B: Mapping and localisation. Comput. Electron. Agric. 2015, 119, 267–278. [Google Scholar] [CrossRef]
  13. Hiremath, A.A.; Heijden, G.; Evert, F.; Stein, A.; Braak, C. Laser range finder model for autonomous navigation of a robot in a maize field using a particle filter. Comput. Electron. Agric. 2014, 100, 41–50. [Google Scholar] [CrossRef]
  14. Hiremath, S.; Evert, F.; Braak, C.; Stein, A.; Heijden, G. Image-based particle filtering for navigation in a semi-structured agricultural environment. Biosyst. Eng. 2014, 121, 85–95. [Google Scholar] [CrossRef]
  15. Sabelhaus, D.; Roben, F.; Helligen, L.; Lammers, P. Using continuous-curvature paths to generate feasible headland turn manoeuvres. Biosyst. Eng. 2013, 116, 399–409. [Google Scholar] [CrossRef]
  16. Backman, J.; Piirainen, P.; Oksanen, T. Smooth turning path generation for agricultural vehicles in headlands. Biosyst. Eng. 2015, 139, 76–86. [Google Scholar] [CrossRef]
  17. Thanpattranon, P.; Ahamed, T.; Takigawa, T. Navigation of autonomous tractor for orchards and plantations using a laser range finder: Automatic control of trailer position with tractor. Biosyst. Eng. 2016, 147, 90–103. [Google Scholar] [CrossRef]
  18. Bochtis, D.; Griepentrog, H.W.; Vougioukas, S.; Busato, P.; Berruto, R.; Zhou, K. Route planning for orchard operations. Comput. Electron. Agric. 2015, 113, 51–60. [Google Scholar] [CrossRef]
  19. Wang, X.; Peng, H.; Liu, J.; Dong, X.; Zhao, X.; Lu, C. Optimal control based coordinated taxiing path planning and tracking for multiple carrier aircraft on flight deck. Def. Technol. 2022, 18, 238–248. [Google Scholar] [CrossRef]
  20. Liu, J.; Dong, X.; Wang, J.; Lu, C.; Zhao, X.; Wang, X. A novel EPT autonomous motion control framework for an off-axle hitching tractor-trailer system with drawbar. IEEE Trans. Intell. Veh. 2021, 6, 376–385. [Google Scholar] [CrossRef]
  21. Wang, X.; Liu, J.; Su, X.; Peng, H.; Zhao, X.; Lu, C. A review on carrier aircraft dispatch path planning and control on deck. Chin. J. Aeronaut. 2020, 33, 3039–3057. [Google Scholar] [CrossRef]
  22. Morin, P.; Samson, C. Trajectory tracking for nonholonomic vehicles. In Proceedings of the 4th International Workshop on Robot Motion and Control (RoMoCo’04), Poznań, Poland, 17–20 June 2004; pp. 139–153. [Google Scholar]
  23. Samson, C. Control of chained systems: Application to path following and timevarying point-stabilization of mobile robots. IEEE Trans. Autom. Control 1995, 40, 64–77. [Google Scholar] [CrossRef]
  24. Shin, D.H.; Singh, S. Path Generation for Robot Vehicles Using Composite Clothoid Segments; Technical Report CMU-RI-TR-90-31; Carnegie Mellon University Robotics Institute: Pittsburgh, PA, USA, 1990. [Google Scholar]
  25. Hamner, B.; Koterba, S.; Shi, J.; Simmons, R.; Singh, S. Mobile robotic dynamic tracking for assembly tasks. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 10–15 October 2009. [Google Scholar]
Figure 1. Car-like robot model.
Figure 2. Block diagram of the proposed system.
Figure 3. Representation of the proposed turning geometry called “knot-like turning.” The scanning range of the laser scanner sensor used in this study was 30 m.
Figure 4. Laser scanner rangefinder used in this study.
Figure 5. Cherry orchard where trees and rows of trees were scanned.
Figure 6. The principles of tree detection and row estimation procedures.
Figure 7. The detected trees (shown by black-colored circles) forming the adjacent rows of trees.
Figure 8. Principles of Hough transform. HC means Hough curve.
Figure 9. Rows of trees and center lines created using Hough transform.
Figure 10. Hough curves. The intersection points show the (θ, r) values for the center lines of the rows shown in Figure 9. PL (181°, 1.7094), PR (181°, −1.6994).
Figure 11. Desired trajectory tracking results. The distances between the row centers (c1 and c2) were approximately 3 m. Row lengths Rw1, Rw2 and Rw3 were about 50 m.
Figure 12. (Left) Errors occurred in x and y directions; (Right) histograms.
Figure 13. (Left) Steering angles generated by the controller; (Right) letters indicate the specified motions. The letters A, E, I indicate the longitudinal motions. B and F show the small arcs in the trajectory. C and G are used to show the short lateral motions. The circular motions are demonstrated via letter D and letter H.

