Article

An Algorithm for Calculating Apple Picking Direction Based on 3D Vision

1 College of Metrology and Measurement Engineering, China Jiliang University, Hangzhou 310018, China
2 Engineering Training Center, China Jiliang University, Hangzhou 310018, China
3 Faculty of Mechanical Engineering & Automation, Zhejiang Sci-Tech University, Hangzhou 310018, China
* Author to whom correspondence should be addressed.
Agriculture 2022, 12(8), 1170; https://doi.org/10.3390/agriculture12081170
Submission received: 6 June 2022 / Revised: 1 August 2022 / Accepted: 3 August 2022 / Published: 5 August 2022
(This article belongs to the Section Digital Agriculture)

Abstract
Research into autonomous (robotic) apple picking has not yet resolved the problem of finding the optimal picking orientation, and robotic picking efficiency, in terms of picking all available apples without loss or damage, remains low. This paper proposes a method of determining the optimal picking orientation relative to the target fruit and adjacent branches from the point cloud of the apple and the surrounding space. The picking mechanism is then able to avoid branches and accurately grasp the target apple in order to pick it. The apple is first identified by the YOLOv3 target detection algorithm, and a point cloud of the fruit and the space surrounding it is obtained. The random sample consensus (RANSAC) algorithm is used for sphere fitting, and the fruit is idealized as a sphere. RANSAC is also used to idealize the branch as a straight line, fitted to the point cloud surrounding the target apple. The distance between the branch line and the fruit centroid is constrained during fitting to ensure that the branch/line closest to the apple/sphere is identified. The best apple picking orientation is determined from the positional relationship between the branch/line and the center of the apple/sphere. The performance of the algorithm was evaluated using apples with various orientations on growing trees. The average angle error between the calculated picking direction vector and the expected direction vector was 11.81°, with a standard deviation of 13.65°; 62.658% of the determinations erred by ≤10°, and 85.021% erred by ≤20°. The average time for estimating the orientation of an apple was 0.543 s. The accuracy and speed of the algorithm enable a robotic picker to operate at a speed that matches that of a human apple picker.

1. Introduction

Apple picking is the most time-consuming and laborious step in the production chain. The quality of picking directly affects the storage and transportation life of the apple and so affects its economic value [1]. However, as shown in Figure 1a, fruit trees in most orchards in China grow in a spindle shape with intricate branches, which greatly interferes with picking. Figure 1b shows the scene of our picking experiment in the orchard; it can be seen that the growth directions of the fruits are highly varied. Robotically picking an apple from a tree requires an assessment of the complex positional relationship between the apple and the tree branches to determine the direction from which the end effector should approach the apple. This direction is an important factor to be considered in the path planning of the manipulator. If the picker approaches the target apple with a fixed attitude and does not consider the pose of the apple, the gripper (end effector) may be obstructed by the branch (Figure 2a), or it may grip the apple and the branch together (Figure 2b), resulting in apple damage, tree damage, or even robot damage [2,3]. In practice, experienced human apple pickers adapt their picking actions to the apple growth pose and so avoid branches and grip the apple without damaging the apple or tree. To enable the robot to imitate human pickers, it must first perceive the apple pose.
Sensor technology has developed to the extent that there are many brands of highly precise, inexpensive sensors available for use in fruit detection and picker positioning [4,5]. Zhou Jun et al. use a particle filter to predict fruit posture, depending on whether the calyx remnant is visible, and combine multi-frame images to avoid errors caused by using a single frame [6]. However, the measurement error of this method is large when the calyx remnant is not visible.
Giefer et al. developed a convolutional neural network algorithm for automatic localization and posture determination using images of apples captured by RGB cameras [7]. The algorithm needs to collect high-definition images of fruits at close range under controlled lighting conditions to detect fruit textures, so it is not suitable for long-distance image capture in unstructured outdoor environments.
Xie et al. developed a decision-making method that combined four techniques to increase recognition accuracy: second-order center distance approximation, shortest distance, slope variance, and three-point collinearity [8]. This method is able to estimate fruit poses when the calyx remnant is not visible, but the estimated pose is only two-dimensional, and the algorithm is relatively inefficient.
Some researchers optimized the grasping posture of the harvesting robot based on the angle formed by the fruit stem and the branch in an eye-in-hand pepper picking application, thus avoiding incorrectly gripping the fruit [9,10]. This method is suitable for eye-in-hand picking systems in which the picked objects have a fixed positional relationship with their branches; it does not apply to unstructured fruit picking.
Kang et al. developed a method of fitting the fruit surface point cloud captured by the camera into a sphere using the Hough transform and taking the line from the center of the sphere to the geometric center of the surface point cloud as the reference direction for picking [11,12,13]. This method increased positioning accuracy and predicted the picking direction when a complete view of the fruit was not obtained. However, the method does not consider the distribution of branches that may hinder picking; thus, there is a risk of catching branches when picking fruit.
Lin et al. developed a method of picking guava in which the fruit and branches were separated through RGB imaging, and the three-dimensional posture of the fruit was predicted using the position of the fruit center relative to the nearest branch [14]. This method reconstructed all branches in the field of vision, but in practice, not all branches in view will interfere with picking; thus, the method wastes computing resources by unnecessarily reconstructing branches that are irrelevant to picking.
Most of the investigated methods use traditional RGB cameras for pose calculation, which introduces large errors, and the existing methods that use RGB-D cameras still involve a large amount of redundant computation. Therefore, this paper proposes a method of calculating the direction of the apple using only the image data of the area near the target apple, which improves the speed of the calculation. The idea is as follows: first, the apple centroid is determined by sphere fitting of the apple point cloud, and a line is fitted to the branch using the point cloud near the apple; then, the growth orientation of the apple on the branch is calculated from the spatial relationship between the point and the line. The main steps are as follows. First, an RGB-D camera acquires images, and the mature YOLOv3 network model recognizes the apple in the color images [15,16,17]. The point cloud inside the bounding box is then extracted, and a sphere is fitted using the RANSAC algorithm. RANSAC is used again to fit a line to the branch closest to the target apple. Finally, the line that passes through the center of the sphere and is perpendicular to the nearest branch is taken as the best picking direction for the end effector to approach the apple. These steps reduce occurrences of the gripper being obstructed by a branch or grabbing a branch together with the apple and increase the adaptability of the picking system, thus increasing the picking success rate.
The novelties of this paper are:
(1)
Spherical fitting of the fruit-surface point cloud with the RANSAC algorithm improves the accuracy of fruit localization. Directly using the centroid of the fruit-surface point cloud in place of the fruit centroid produces a large error. By exploiting the spherical shape of the apple, our algorithm removes the points belonging to the surrounding branches and leaves and fits the centroid of the fruit.
(2)
The orientation of the fruit is approximately defined as a straight line passing through the center of the apple and perpendicular to the branch closest to the apple (as shown in Figure 3b), rather than as the orientation of the fruit stem, which reduces the error in the direction calculation. Because apples are approximately ellipsoidal, using only the fruit-surface point cloud to calculate the orientation of the fruit stem results in a large error. Defining the orientation of the fruit by the positional relationship between the fruit and its nearest branch effectively reduces the error, and this orientation helps the robotic arm avoid obstacles.
(3)
Computational efficiency is improved by separately processing only the point cloud data in the vicinity of the fruit. Because the branches that most interfere with apple picking are distributed close to the fruit, restricting the calculation to the vicinity of the fruit avoids unnecessary computation.

2. Materials and Methods

2.1. Materials

Figure 3a shows the schematic of the proposed apple picking system. The system uses an eye-to-hand camera installation. The camera is fixed to the base of the six-axis manipulator in an upward-sloping orientation to minimize the obscuration of fruits and branches by leaves. The RGB-D camera was an FM811-IX-A produced by Tuyang Technology, Shanghai, China, consisting of an infrared laser, an infrared camera, and an RGB camera. The infrared laser emits a light beam onto the subject, and the reflected light is collected by the infrared camera; depth data are calculated from image phase information. The infrared image resolution is 1280 × 960 pixels, and the accuracy at a distance of 800 mm is 0.2 mm. The color image resolution is also 1280 × 960 pixels. The color map is aligned with the depth map, and the depth data are converted into point cloud data through the SDK supplied with the camera, which is convenient for subsequent calculation. The computer uses an AMD R7-5800H CPU with 16 GB RAM. The system executes in the following sequence: the mobile platform moves among the fruit trees while the camera continuously captures images and determines whether there is fruit within reach of the robotic arm. If there is a fruit that can be picked, the platform stops moving, and the algorithm described in this paper calculates the position and direction of the fruit. The robotic arm then picks the fruit along a picking path adapted to the direction of the target fruit. When there is fruit in the camera's field of view that the robotic arm cannot reach, the extra degrees of freedom of the platform can expand the action space of the arm.
The apple picking direction defined in this paper is shown in Figure 3b. Point O is the centroid of the apple, and the straight line AB is the branch closest to the target apple (AB is not necessarily the branch bearing the target apple). Point C is the foot of the shortest line from the centroid of the apple to the branch line, so line OC is perpendicular to AB. The vector OC is the picking direction of the apple, and the gripper approaches the apple along the direction of the vector OC. Although the vector OC is not the axis of the apple, it provides guidance for fruit picking and helps reduce the chance of the gripper colliding with the branch during the picking process. Figure 3b also shows the motion track of the end effector during picking by the robotic arm, which reduces the possibility of collision. The track is a polynomial function that requires the position and direction of the target apple as inputs.

2.2. The Method of Apple Orientation Determination Using 3D Vision

2.2.1. Workflow of the Method

In this section, the apple picking pose determination algorithm is introduced. The algorithm flow is shown in Figure 4.
The first step, after recording the RGB-D images with the camera, is to identify the apple using the YOLOv3 algorithm, which detects and frames the apple in the RGB data. A sphere is then fitted to the apple by segmenting the depth data in the recognition frame, converting it into point cloud data, and calculating the centroid of the apple. This conversion is governed by the internal and external camera parameters. The radius of the fitted sphere is taken to be the radius of the apple. The next step is to model the branches in the neighborhood of the target apple. Branches that will affect picking are distributed near the apple. To reduce computation, a sphere with the same center as the apple and a radius equal to three times the radius of the apple is taken as the region of interest (ROI). All points on the surface of the apple are removed from the ROI, and straight lines are fitted to the remaining points in order to identify the line closest to the apple as the nearest branch, which is assumed to be the branch bearing the apple. The next step is to calculate the line from the centroid of the apple perpendicular to the fitted branch line. This line indicates the orientation of the apple; it is not necessarily the direction of the peduncle, but it can guide automated apple picking. The approach orientation of the robotic arm can be determined from the apple direction computed by this algorithm, enabling the gripper to avoid obstructive branches and successfully grasp the apple.
Each step of the algorithm is described in more detail in the following sections.

2.2.2. Apple Identification Based on YOLOv3

The target detection rate of the vision positioning system greatly affects the overall efficiency of a robotic picker. There are many established algorithms for target recognition and segmentation [18,19], but all lack robustness in real-world environments. The target recognition of YOLOv3 is highly accurate [15,16,17]. Therefore, this study used a YOLOv3 object detection algorithm to identify apples in an orchard environment. The RGB-D camera was used to obtain sample images of the orchard, and the VOC (Visual Object Class) apple data set was created with these images. The model was then trained in order to create a network model suitable for apples.
We labeled the photos collected from the orchard and trained the network in order to produce a network model that identified apples. The network was trained and tested in an orchard, and recognition accuracy was ≥96%, which indicates that the model was sufficient for accurate apple recognition. Recognition data provided position parameters for subsequent orientation determinations. Recognition examples are shown in Figure 5. Although the color of the apples is similar to the color of the leaves and weeds in the orchard, the apples were accurately recognized by the network model.

2.2.3. Apple Centroid Calculation

Individual apples differ in morphology, and their geometry is irregular. The image obtained by a 3D camera can only contain the point cloud data of the visible apple surface. It is, therefore, infeasible to obtain the position and orientation of the apple by matching the 3D image to a standard apple model. To simplify the model, the apple was treated as an ideal sphere, and a mathematical sphere was fitted using the point cloud data of the visible apple surface. The position of the apple's bounding box was obtained from the target recognition algorithm described earlier, and the data for sphere fitting are the point cloud data inside the bounding box. The bounding box also contains non-apple surface points and noise points, so simply fitting the sphere by the least squares method would produce a large deviation. To avoid this, it is desirable to denoise the point cloud before fitting.
The random sample consensus algorithm (RANSAC) is widely used for robust estimation and is especially suitable for processing data containing noise and outliers [20]. The RANSAC algorithm was used to process the point cloud in order to screen out the points distributed on the surface of the apple. Figure 6 shows the flow chart for spherical fitting using RANSAC.
A sphere can be parameterized by the coordinates a, b, and c of its center O and its radius r; these parameters can be determined from four points in space that are assumed to lie on the surface of the sphere [21]. The input data Q in the algorithm flow chart shown in Figure 6 consist of the point cloud in the apple bounding box, which includes apple surface points, non-apple points (e.g., branches and leaves), and noise points. Four points are randomly selected from the point cloud Q, the spherical surface containing the four points is determined mathematically to obtain a candidate sphere S, and the distances from the other points in Q to the center of S are calculated. If the difference between a point's distance and the radius of S is less than a threshold value, the point is marked as an interior point, and the interior-point count is incremented. If the number of interior points for this iteration is greater than that of the best previous iteration, the result of this iteration is taken as the current optimal fitting sphere. These steps are repeated until the desired number of iterations has been completed; then the least squares method is used to correct the parameters of the optimal sphere (the sphere with the most interior points), which is taken as the best fit. Based on a large number of experiments, setting the number of iterations to 1000 keeps the calculation result stable within a specified range [22].
Randomly select four points from the 3D point set Q: $A_1(x_1, y_1, z_1)$, $A_2(x_2, y_2, z_2)$, $A_3(x_3, y_3, z_3)$, and $A_4(x_4, y_4, z_4)$. Substituting them into the sphere Equation (1), the center $O(a, b, c)$ and radius $r$ of the sphere can be found:

$$(x - a)^2 + (y - b)^2 + (z - c)^2 = r^2 \quad (1)$$
Details of the calculations can be found in [23]. After calculating the center $O(a, b, c)$ of the sphere and the radius $r$, the model sphere S for the four selected points has been parameterized and can be used to determine, for all other cloud points, whether they are interior or exterior to the sphere.
Next filter to obtain the interior points of the sphere determined by the four sample points from the point cloud Q. An interior point is defined thus:
If there is a point in the target space, and the absolute value of the difference between the distance of that point from the center O of the sphere and the radius r of the sphere is less than the threshold, then that point is an interior point of the sphere S. Considering that the apple is not an ideal sphere, the threshold was set to r/5.
That is, if the input point $P_i(x_i, y_i, z_i)$ satisfies the inequality in Equation (2), then $P_i$ is an interior point of the sphere:

$$\left| r - \sqrt{(x_i - a)^2 + (y_i - b)^2 + (z_i - c)^2} \right| < \frac{r}{5} \quad (2)$$
After that, the same steps are applied to four more randomly selected sample points. If the number of interior points counted in the current iteration is greater than in any previous iteration, the sphere parameters of this iteration are taken as the optimal sphere (S*) parameters. After enough iterations, the fluctuation of the fitted sphere radius is ≤1 mm; it can then be assumed that the number of interior points is close to its maximum, and therefore the parameters of the fitted sphere are close to accurate.
The final set of interior points produced by RANSAC will be essentially noiseless and free of outliers, and it can be assumed that almost all the remaining interior points are distributed on the surface of the apple. An optimal sphere can be fitted to all interior points by the least squares method to represent the idealized apple. The least squares method ensures that the idealized apple sphere will minimize the distance between the surface of the sphere and the set of interior points, so the parameterized sphere calculated from the point cloud data will be the optimal sphere. The method of fitting a sphere using least squares is as follows:
Expanding the parentheses in Equation (1) and letting $A = 2a$, $B = 2b$, $C = 2c$, and $D = a^2 + b^2 + c^2 - r^2$, the sphere can be represented as:

$$x^2 + y^2 + z^2 - Ax - By - Cz + D = 0 \quad (3)$$
The residual $v$ of fitting the sphere to all interior points is minimized by:

$$\min v = \min \sum_{i=1}^{n} \left( x_i^2 + y_i^2 + z_i^2 - Ax_i - By_i - Cz_i + D \right)^2 \quad (4)$$
Since the residual summation is a quadratic function, the values of A, B, C, and D can be found by the least squares method, and Equation (4) can be written in matrix form:

$$\min v = \left| \mathbf{A}\mathbf{X} - \mathbf{B} \right|^2 \quad (5)$$

where

$$\mathbf{A} = \begin{bmatrix} x_1 & y_1 & z_1 & -1 \\ x_2 & y_2 & z_2 & -1 \\ \vdots & \vdots & \vdots & \vdots \\ x_n & y_n & z_n & -1 \end{bmatrix}, \quad \mathbf{X} = \begin{bmatrix} A \\ B \\ C \\ D \end{bmatrix}, \quad \mathbf{B} = \begin{bmatrix} x_1^2 + y_1^2 + z_1^2 \\ x_2^2 + y_2^2 + z_2^2 \\ \vdots \\ x_n^2 + y_n^2 + z_n^2 \end{bmatrix}$$
Setting the partial derivative with respect to the coefficient vector $\mathbf{X}$ to zero gives:

$$\frac{\partial v}{\partial \mathbf{X}} = 2\mathbf{A}^{T}\mathbf{A}\mathbf{X} - 2\mathbf{A}^{T}\mathbf{B} = 0 \quad (6)$$

Then the optimal value of the coefficient vector is given by:

$$\mathbf{X} = \left( \mathbf{A}^{T}\mathbf{A} \right)^{-1} \mathbf{A}^{T}\mathbf{B} \quad (7)$$
After calculating the coefficients A, B, C, and D, we can obtain $a = \frac{A}{2}$, $b = \frac{B}{2}$, $c = \frac{C}{2}$, and $r = \sqrt{a^2 + b^2 + c^2 - D}$.
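A minimal NumPy sketch of this least squares refinement (the function name is ours); the design matrix carries −1 in its last column so that its product with X reproduces the +D term of Equation (4):

```python
import numpy as np

def fit_sphere_least_squares(points):
    """Algebraic least squares sphere fit of Equations (3)-(7).
    points: (n, 3) array of interior points; returns (center, radius)."""
    pts = np.asarray(points, dtype=float)
    # Rows [x, y, z, -1]: M @ [A, B, C, D] = x^2 + y^2 + z^2, matching
    # x^2 + y^2 + z^2 - Ax - By - Cz + D = 0 (Equation (3)).
    M = np.column_stack([pts, -np.ones(len(pts))])
    rhs = (pts ** 2).sum(axis=1)
    # X = (M^T M)^-1 M^T rhs (Equation (7)), computed stably with lstsq.
    X, *_ = np.linalg.lstsq(M, rhs, rcond=None)
    A, B, C, D = X
    center = np.array([A, B, C]) / 2.0
    radius = np.sqrt(center @ center - D)
    return center, radius
```

Using `lstsq` rather than forming $(\mathbf{A}^T\mathbf{A})^{-1}$ explicitly avoids the conditioning problems of the normal equations while giving the same solution.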
Because the least squares fit over all interior points selected in the final iteration lets every interior point influence the spherical surface, the fitting error is reduced. Figure 7a is the original point cloud image obtained by the camera, Figure 7b is the point cloud in the apple bounding box, and the red points in Figure 7c are the set of all interior points calculated by RANSAC; the sphere in the figure is the sphere fitted by the least squares method.

2.2.4. Line Fitting of Apple Neighborhood Branches

The branches that affect the apple picking orientation are distributed around the apple, and the branch closest to the apple has the greatest influence on picking. Only the point cloud in a neighborhood around the target apple is processed, in order to reduce computation. A straight line in space is used to model the branch closest to the apple. The detection area is the space between two spheres centered on the fitted apple sphere, with an inner radius of 1.2 times the apple radius and an outer radius of three times the apple radius. The points of the point cloud $M_{inter}$ in this space are used to fit the line to the branch. The neighborhood of the apple contains many obstacles, such as branches and leaves, so singular value decomposition cannot be used to fit the straight line directly [24]. The RANSAC algorithm was used to remove irrelevant branch and leaf data. Because there are many branches around the apple, the fitted line is not necessarily the branch closest to the apple, so it is necessary to add a distance constraint to the RANSAC iterations. The branch closest to the target apple most influences grabbing during picking, and a branch cannot pass through the inside of the apple, so attention is restricted to lines 0.5–1.5 sphere radii from the apple centroid. This allows a line to be fitted to a branch that is closest to the sphere center and has a large number of interior points. The flowchart of the method is shown in Figure 8.
Two points determine a straight line in space. Two points are randomly selected from the point cloud $M_{inter}$ (as shown in Figure 9a) to form a straight line, and the distance between the line and the fruit centroid is calculated. If the distance is between 0.5 and 1.5 times the apple radius, the line is considered close to the apple; otherwise, two points are resampled and the distance calculation is repeated. If the new line is closer to the apple than the previous line, the number of points in $M_{inter}$ whose distance from the line is less than or equal to a threshold is counted; these are the interior points of the line. The threshold is set to 5 mm because the radius of the experimental branch is 5 mm. Another pair of points is then randomly selected, and the process is repeated until another set of interior points is obtained. (An equation for the line can be determined by least squares regression, although this is unnecessary until the optimal set of interior points has been identified.) If the number of interior points in the newly generated set is greater than in the previously generated set, the equation of this line becomes the current optimal line fitted to the branch. It was experimentally determined that a relatively stable optimal line is identified after approximately 3000 iterations. The line fitted in this way models the branch closest to the apple with a high degree of fit.
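A simplified NumPy sketch of this constrained line RANSAC (Figure 8); the final least squares refinement of the line is omitted, and all function and parameter names are ours, not from the paper's implementation:

```python
import numpy as np

def point_line_distance(P, A, e):
    """Distance from point(s) P to the line X = A + t*e."""
    e = np.asarray(e, dtype=float)
    e = e / np.linalg.norm(e)
    return np.linalg.norm(np.cross(np.asarray(P, dtype=float) - A, e), axis=-1)

def ransac_branch_line(M_inter, apple_center, apple_r,
                       iterations=3000, inlier_thresh=5.0, rng=None):
    """Constrained RANSAC line fit: candidate lines must lie
    0.5-1.5 apple radii from the centroid; among those, the line with
    the most interior points (within inlier_thresh, set to the 5 mm
    branch radius) is kept. Returns (point_on_line, unit_direction)."""
    rng = rng or np.random.default_rng()
    pts = np.asarray(M_inter, dtype=float)
    apple_center = np.asarray(apple_center, dtype=float)
    best = None  # (n_inliers, point_on_line, unit_direction)
    for _ in range(iterations):
        i, j = rng.choice(len(pts), 2, replace=False)
        A, e = pts[i], pts[j] - pts[i]
        if np.linalg.norm(e) < 1e-9:
            continue
        # Distance constraint: keep only lines passing near the apple.
        d_center = point_line_distance(apple_center, A, e)
        if not (0.5 * apple_r <= d_center <= 1.5 * apple_r):
            continue
        n_in = int((point_line_distance(pts, A, e) < inlier_thresh).sum())
        if best is None or n_in > best[0]:
            best = (n_in, A, e / np.linalg.norm(e))
    return None if best is None else (best[1], best[2])
```

The distance constraint is what steers the sampler toward the branch nearest the apple instead of any dense linear structure elsewhere in $M_{inter}$.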
The 3D equation of the optimal line is determined as follows. Suppose there is a point $A(x_0, y_0, z_0)$ on the line, and a direction vector of the line is $\mathbf{e} = (l, m, n)$. Then the equation of the straight line in space can be written in the following vector form:

$$\mathbf{X} = \mathbf{A} + t\mathbf{e} \quad (8)$$

The distance from a point $P$ to the line is:

$$d = \frac{\left| \vec{PQ} \times \mathbf{e} \right|}{\left| \mathbf{e} \right|} \quad (9)$$

where $Q$ is any point on the line.
During iteration, the fit of the calculated straight line gradually improves. The final line fitting the branch closest to the sphere is shown in Figure 9b, in which straight line AB fits the branch closest to the apple and the point O is the apple centroid. Figure 9b shows that RANSAC with distance constraints can indeed filter out the branches close to the apple that may interfere with picking.

2.2.5. Calculation of the Position of the Apple Relative to the Branch and Calculation of Picking Attitude

When the sphere representing the apple and the line representing the branch bearing the apple have been identified, the line from the center of the sphere perpendicular to the straight line (branch) is considered to indicate the pose of the apple on the branch. This assumed orientation enables the robotic arm to position itself to pick the apple by approaching along this line, which reduces the likelihood of the gripper hitting the apple and damaging it during picking. Before calculating the picking attitude shown in Figure 3b, it is necessary to calculate the direction vector from the sphere center perpendicular to the branch line and to convert this direction vector into the Euler angles of the end gripper in the robotic arm coordinate system. The calculation is as follows.
In Figure 10a, let $O(x_o, y_o, z_o)$ be the center of the apple fitting sphere and $\mathbf{X} = \mathbf{A} + t\mathbf{e}$ the straight line fitted to the branch, where $A(x_l, y_l, z_l)$ is a point on the line and $\mathbf{e} = (l, m, n)$ is a direction vector of the line. The vector $\vec{OC}$ is perpendicular to the line at the intersection point $C$, so $\vec{OC}$ can be found by solving for the coordinates of $C$. Because the two vectors are perpendicular, their dot product is zero: $\vec{OC} \cdot \mathbf{e} = 0$. Combining this with the line Equation (8), the following equations are obtained:

$$\begin{cases} x = lt + x_l \\ y = mt + y_l \\ z = nt + z_l \\ (x - x_o)l + (y - y_o)m + (z - z_o)n = 0 \end{cases} \quad (10)$$
After solving for the point $C(x, y, z)$, the desired vector $\vec{OC}$ can be obtained; the calculation result is shown in Figure 10.
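Substituting the parametric line into the perpendicularity condition $\vec{OC} \cdot \mathbf{e} = 0$ gives the closed form $t = ((O - A) \cdot \mathbf{e}) / (\mathbf{e} \cdot \mathbf{e})$. A short NumPy sketch (the function name is ours):

```python
import numpy as np

def picking_direction(O, A, e):
    """Foot of the perpendicular C from the apple centroid O to the
    branch line X = A + t*e, and the picking vector OC along which
    the gripper approaches the apple."""
    O, A, e = (np.asarray(v, dtype=float) for v in (O, A, e))
    t = (O - A) @ e / (e @ e)   # from OC . e = 0
    C = A + t * e
    return C, C - O
```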
To obtain the Euler angles of the end effector, first convert the sphere center coordinates $O(x_o, y_o, z_o)$ and the direction vector $\vec{OC}$ to the end effector coordinate system. The robotic arm approaches the centroid along the transformed direction vector $\vec{OC} = (x_c, y_c, z_c)$, which is the gripping direction shown in Figure 3b. The transformation is as follows.
Define the X direction of the robot base coordinate system to be the original direction of the arm end effector. As shown in Figure 11, the coordinate system of the gripper is rotated to the desired apple direction vector $\vec{OC} = (x_c, y_c, z_c)$ in the sequence Z–Y–X. The Euler angles of the gripper are then $(R_X, R_Y, R_Z)$.
The apple can be approximated as a body of revolution, so the rotation angle around the x axis can be set to zero. The Euler angles are then calculated by Equation (11):

$$\begin{cases} R_X = 0 \\ R_Y = \operatorname{atan2}\!\left( z_c, \sqrt{x_c^2 + y_c^2} \right) \\ R_Z = \operatorname{atan2}\left( y_c, x_c \right) \end{cases} \quad (11)$$
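Equation (11) translates directly into code; a small sketch using Python's math module (the function name is ours):

```python
import math

def gripper_euler_angles(oc):
    """Z-Y-X Euler angles of Equation (11) for the picking vector
    OC = (xc, yc, zc); RX is fixed to zero because the apple is
    treated as a body of revolution."""
    xc, yc, zc = oc
    rx = 0.0
    ry = math.atan2(zc, math.hypot(xc, yc))
    rz = math.atan2(yc, xc)
    return rx, ry, rz
```

Using `atan2` rather than `atan` keeps the angles in the correct quadrant for picking vectors pointing in any direction.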

2.2.6. Algorithm Testing Method Design

An experimental platform was built in the laboratory to test whether the algorithm correctly calculates the apple picking orientation. The camera was a Tuyang FM811-IX-A with a measurement range of 700–3500 mm and an accuracy of 1–5 mm. The model orchard used in the experiment consisted of a single apple with a radius of 25.78 mm connected rigidly to a single tree branch with a diameter of 5 mm. The camera position and attitude were unchanged during the experiment. The camera coordinate system was the frame of reference: the optical center of the camera was the origin, the optical axis was the z axis, the y axis pointed vertically downward, and the x axis pointed horizontally rightward. The experiment was divided into three parts; in each part, the apple was rotated around a different axis, and the accuracy of the algorithm was tested at different fruit angles. The apple orientations used in the experiment are shown in Figure 12.
The experiment metrics were the actual radius of the apple, the calculated radius of the fitted sphere, and the angle $\theta$ between the actual direction vector $\mathbf{a}$ of the apple and the calculated direction vector $\mathbf{b}$. The angle $\theta$ was calculated by:

$$\cos\theta = \frac{\mathbf{a} \cdot \mathbf{b}}{\left| \mathbf{a} \right| \left| \mathbf{b} \right|} \quad (12)$$
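For reference, this angle metric can be computed as follows (with clipping to guard against floating-point values just outside [−1, 1]; the function name is ours):

```python
import numpy as np

def angle_between(a, b):
    """Angle in degrees between direction vectors a and b."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    cos_t = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))
```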

3. Results and Discussion

The data obtained from the experiment are shown in Table 1, Table 2 and Table 3.
The gray points in all point clouds in Figure 12 are the external points removed by RANSAC, the green points are the points of the branches closest to the apple, and the pink points are the surface points of the apple. The center of the fitted sphere is very close to the apple centroid. However, the straight line fitted by the branch point cloud is not the rotation axis of the branch because the camera cannot record the full point cloud of the branch. The fitted line is, therefore, closer than the branch rotation axis to the camera optical center, and this positioning introduces a small error to the determination.
It can be seen in Figure 12a that as the apple is rotated around the x axis, it is partially obstructed by the branch, especially when it is rotated by 90°. At this point, the point cloud morphology of the apple surface collected by the camera is quite different from that at 0°. The radius of the fitted sphere is therefore different, and this difference influences the subsequent attitude measurement. Table 1 shows that the angle error is large when the rotation angle is large.
Figure 12b shows that when the rotation about the y axis was 60° or −60°, there were few branch points in the neighborhood space of the apple, so the experimental range was set to [−60°, 60°] to ensure there were enough points to fit the straight line to the branch. During rotation around the y axis, the surface morphology of the apple captured by the camera was almost unchanged, so the radius of the fitted sphere was also stable, and the angle error changed little.
Figure 12c shows that as the apple rotated around the z axis, the apple morphology changed little, so the radius of the fitted sphere also changed little. However, when the rotation angle was −45°, the angle error was 10.08°. In the corresponding point cloud image, there were considerably fewer points on the branch surface than in the 0° image, resulting in a discrepancy between the fitted branch centerline and the actual branch line. A large discrepancy produces a correspondingly large error in the calculated apple orientation. The degree of error depends on the accuracy of the camera and the environment around the apple.
To test the reliability of the algorithm and its real-time performance, the test was repeated 474 times. The peduncle of the apple used in the experiment bore some leaves to act as interference; Figure 13a shows the point cloud of the experimental apple. The calculated apple radius, the calculated position of the apple centroid O, and the calculated point C where the pedicel connects to the branch were recorded for each test. In addition, 1000 tests were performed in which the time spent on detection, sphere fitting to the apple, line fitting to the branch, and orientation determination was recorded for each detection.
The plotted test results are shown in Figure 13b. The red points in the figure are the fitted sphere centers O. It can be seen that the distribution of calculated apple centroids was concentrated. The average sphere radius was 42.64 mm; the measured apple radius was 41.67 mm. The standard deviation of the sphere radius was 1.79 mm, and the standard deviation of the coordinates of the sphere center was ≤2.5 mm. These statistics indicate that the algorithm was consistent and that sphere fitting was accurate.
The green points in Figure 13b are the feet C of the perpendiculars dropped from the centroid to the nearest fitted branch line. The distribution of C is relatively scattered; the standard deviation of its coordinates is 15.60 mm. The blue arrows in the figure are the fitted direction vectors OC. The angle errors between each fitted vector and the expected vector were counted, where the expected vector is shown as the red arrow OC in Figure 13a; its starting point is the mean coordinate of all centroids O, and its end point is the mean coordinate of all points C.
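The point C used here is the foot of the perpendicular from the centroid O to the fitted branch line. That projection can be sketched as follows (the function name is illustrative; the sketch assumes O does not lie on the line):

```python
import math

def picking_vector(o, p, d):
    """Project centroid o onto the line through point p with direction d,
    returning the foot C and the unit picking-direction vector OC
    (by the paper's convention, OC points from the apple toward the branch)."""
    dd = sum(x * x for x in d)
    t = sum((oi - pi) * di for oi, pi, di in zip(o, p, d)) / dd
    c = tuple(pi + t * di for pi, di in zip(p, d))
    oc = tuple(ci - oi for ci, oi in zip(c, o))
    n = math.sqrt(sum(x * x for x in oc))  # zero only if o lies on the line
    return c, tuple(x / n for x in oc)

# Apple centroid at the origin; branch line parallel to the y axis through (50, -10, 0) mm.
c, oc = picking_vector((0.0, 0.0, 0.0), (50.0, -10.0, 0.0), (0.0, 1.0, 0.0))
print(c, oc)  # → (50.0, 0.0, 0.0) (1.0, 0.0, 0.0)
```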
The histogram and cumulative distribution of the angle errors are shown in Figure 14 and Figure 15, respectively. The average angle error was 11.81°, the standard deviation was 13.65°, the proportion of angle errors within 10° was 62.658%, and the proportion within 20° was 85.021%. Errors of this size are small enough to prevent collision between the robotic arm and the branch during actual operation. However, 3.376% of the angle errors were ≥50°. Figure 13 shows that some of the results with large angle errors point toward leaves, because leaves around the apple affect the straight-line fitting of the branches; more leaves have a greater effect on the calculated direction. The camera must therefore shoot upward at a low angle to reduce the number of leaves in the image, or the apple trees must be thinned before picking to reduce the effect of leaves on the calculations.
Figure 16 shows the images used to determine the orientation of multiple target apples. Figure 16a1,b1 are RGB images in which the blue boxes are the apple bounding boxes; the apple recognition algorithm accurately detects apples in shadow and partially occluded apples. Figure 16a2,b2 are the point cloud images of the fruit trees and apples, in which the red points are the apple surface points segmented by the RANSAC algorithm. The method can segment apples blocked by leaves as well as densely distributed apples. The green point clouds in Figure 16a3,b3 are the branches found in the fruit neighborhood space by the method in this paper; adjacent fruits may share some branch points because they grow on the same branch. The red arrows are the picking directions OC of the fruit as defined in this paper. The yellow curves are the picking paths of the end effector determined under the OC constraint, which reduces the probability of collision. Calculating the apple picking direction OC is therefore of great significance to automatic apple picking.
The computation time for orientation determination must be limited so that it does not delay the picking operation, so the execution time of the algorithm was tested. In the tests, the number of iterations was 1000 for sphere fitting and 3000 for branch fitting. The computer and camera were used to carry out 1000 determination experiments on apples; Table 4 presents the statistical results. Recognition time was approximately the same for each apple, sphere fitting time was influenced by the number of cloud points around the apple, and the total time for a single orientation determination was 0.543 s, which does not significantly slow apple picking. Reducing the number of iterations would increase the overall speed of the operation but would also reduce the accuracy of the determination.
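Per-stage timings of the kind reported in Table 4 can be collected with a simple wrapper; this is a generic sketch, not the authors' harness, and the stage names and placeholder workloads are illustrative:

```python
import time
from collections import defaultdict

timings = defaultdict(list)

def timed(stage, fn, *args, **kwargs):
    """Run one pipeline stage and record its wall-clock duration under `stage`."""
    t0 = time.perf_counter()
    result = fn(*args, **kwargs)
    timings[stage].append(time.perf_counter() - t0)
    return result

# Placeholder workloads standing in for detection / sphere fitting / line fitting:
for _ in range(10):
    timed("apple detection", lambda: sum(range(1000)))
    timed("sphere fitting", lambda: sorted(range(1000, 0, -1)))

for stage, ts in timings.items():
    print(f"{stage}: mean {sum(ts) / len(ts):.6f} s over {len(ts)} runs")
```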

4. Conclusions

In order to reduce interference between automatic picking equipment and branches during picking, this paper proposes an algorithm that uses the positional relationship between the fruit and its adjacent branches to determine the picking direction. The algorithm was tested in experiments that rotated the apple and branch around the three coordinate axes. The results showed that the algorithm accurately fitted a parameterized sphere to the apple and a parameterized line to the branch closest to the apple. The approach vector calculated by the algorithm served as the picking direction, and the calculated orientation of the apple was consistent with the actual orientation.
The dependability of the algorithm was tested. The standard deviation of the sphere radius fitting the apple was 1.79 mm, and the standard deviations of the center coordinates of the sphere were ≤2.5 mm. The average angle error between the calculated picking direction vector and the expected direction vector was 11.81°, and the standard deviation was 13.65°; 62.658% of the determinations erred by ≤10°, and 85.021% erred by ≤20°. The average time for estimating the orientation of an apple was 0.543 s.
The accuracy and speed of the algorithm suggest that it can meet the normal working requirements of a robotic apple picker. The apple orientation determination algorithm used an RGB-D camera to provide 3D images of apple orientations; this capability helps ensure that the robot avoids grabbing branches or otherwise misgrabbing the apple. Future work will focus mainly on synthesizing the information of all objects around the apple to improve the determination of apple orientation and thus make the robot more effective.

Author Contributions

Conceptualization, R.G. and Q.Z.; methodology, R.G., S.C. and Q.Z.; investigation, Q.Z.; data curation, R.G., S.C. and Q.J.; formal analysis, Q.Z. and Q.J.; writing—original draft, R.G.; writing—review and editing, S.C., Q.Z. and Q.J.; project administration, R.G.; funding acquisition, Q.J. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Zhejiang University Science and Technology Innovation Activity Plan (grant no. 2021R409041).

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

The authors would like to thank the Key Laboratory of Transplanting Equipment and Technology of Zhejiang Province for its support. The authors also thank Haiyao Xia, Zeqiang Sun, and Chuang Li for their help.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. Pictures of the growth of apple trees in an orchard in Wendeng District, Weihai City, Shandong Province, China: (a) picture of orchard condition; (b) picture of fruit growing on tree; (c) picture of apple picking experiment.
Figure 2. Cases of picking failure because the fruit growth orientation has not been considered: (a) the gripper cannot grasp the fruit because of interference from a branch; (b) fruit and branches are gripped at the same time.
Figure 3. (a) Robotic apple picking system; (b) picking orientation (A and B are two points on the branch, O is the center of mass of the apple, C is a point on the branch, and AB is perpendicular to OC, and OC is the picking direction); (c) robot picking track.
Figure 4. Flowchart of apple orientation determination algorithm.
Figure 5. Apple recognition renderings using the YOLOv3 neural network.
Figure 6. Flowchart of sphere fitting algorithm using RANSAC.
Figure 7. Fitting a sphere to the apple: (a) global point cloud of the detected apple and tree within the camera image; (b) point cloud in the bounding box; (c) spherical fitting results. The pink point cloud in (c) is the RANSAC apple surface point cloud; the sphere is fitted by the least squares method.
Figure 8. Flowchart of spatial line detection algorithm using RANSAC.
Figure 9. The straight line fitted to the point cloud in the neighborhood of the target apple: (a) point cloud Minter in the branch detection space (i.e., with the sphere points removed); (b) straight line fitting. The green point cloud is the RANSAC-fitted point cloud of the branch closest to the centroid O.
Figure 10. Calculation of the position of the apple relative to the branch: (a) OC is the direction vector from the centroid to the line; (b) result graph showing useful points and nonuseful points; (c) result graph showing only useful points. The red arrow indicates the calculated apple picking direction; the point cloud of the apple surface is pink, the point cloud of the branch nearest the apple is green, and gray represents other point clouds. Note that the plane containing A, B, C, and O is not necessarily the plane of the page. (This coloring is used in subsequent figures.)
Figure 11. Euler angle diagram: (a) end effector origin coordinate system; (b) coordinate system after rotating RZ around z axis; (c) coordinate system after rotating RY around y axis; (d) the x axis of the end effector coincides with the calculated direction.
Figure 12. (a) Rotation through −90°, −45°, 0°, 45°, and 90° around the x axis. (b) Rotation through −60°, −30°, 0°, 30°, and 60° around the y axis. (c) Rotation through −90°, −45°, 0°, 45°, and 90° around the z axis.
Figure 13. Distribution of apple orientation determinations for 474 experiments: (a) average of all results; (b) distribution of all calculated vectors OC.
Figure 14. Histogram of angle error between calculated direction and measured direction.
Figure 15. Normalized cumulative distribution of angle errors.
Figure 16. Fruit and environment modeling on orchard RGB-D images. (a1,b1) are fruit identification through RGB images; (a2,b2) are the extracted fruit point clouds; (a3,b3) are graphs for calculating picking directions. The apple and grasping direction are indicated by red dots and red arrows. Branches and leaves are represented by green dots. The yellow curve is the picking path of the end effector.
Table 1. Experimental data for rotation around the x axis. Direction vector components are dimensionless.

| Rotation Angle (°) | Fitted Sphere Radius (mm) | Actual Direction Vector (x, y, z) | Calculated Direction Vector (x, y, z) | Angle Error (°) |
|---|---|---|---|---|
| −90 | 24.744 | (0, 0, 1) | (−0.026, −0.655, 0.964) | 5.23 |
| −45 | 27.780 | (0, −0.707, 0.707) | (−0.012, −0.707, 0.708) | 0.71 |
| 0 | 26.700 | (0, −1, 0) | (0.000, −0.999, −0.033) | 1.87 |
| 45 | 25.735 | (0, −0.707, −0.707) | (0.040, −0.710, −0.704) | 2.30 |
| 90 | 21.022 | (0, 0, −1) | (−0.012, −0.119, −0.993) | 6.85 |
Table 2. Experimental data for rotation around the y axis. Direction vector components are dimensionless.

| Rotation Angle (°) | Fitted Sphere Radius (mm) | Actual Direction Vector (x, y, z) | Calculated Direction Vector (x, y, z) | Angle Error (°) |
|---|---|---|---|---|
| −60 | 26.618 | (0, −1, 0) | (−0.076, −0.971, 0.000) | 4.45 |
| −30 | 26.733 | (0, −1, 0) | (−0.013, −0.993, −0.119) | 6.89 |
| 0 | 26.544 | (0, −1, 0) | (0.029, −0.999, −0.032) | 2.51 |
| 30 | 27.106 | (0, −1, 0) | (0.010, −0.989, −0.041) | 2.43 |
| 60 | 26.999 | (0, −1, 0) | (0.014, −0.999, −0.037) | 2.27 |
Table 3. Experimental data for rotation around the z axis. Direction vector components are dimensionless.

| Rotation Angle (°) | Fitted Sphere Radius (mm) | Actual Direction Vector (x, y, z) | Calculated Direction Vector (x, y, z) | Angle Error (°) |
|---|---|---|---|---|
| −90 | 27.411 | (1, 0, 0) | (0.998, 0.045, 0.035) | 3.27 |
| −45 | 27.705 | (0.707, −0.707, 0) | (0.721, −0.671, −0.171) | 10.08 |
| 0 | 26.712 | (0, −1, 0) | (0.000, −0.996, −0.090) | 5.15 |
| 45 | 26.669 | (−0.707, −0.707, 0) | (−0.706, −0.698, −0.116) | 6.65 |
| 90 | 26.959 | (−1, 0, 0) | (−0.992, −0.045, −0.119) | 7.304 |
Table 4. Efficiency analysis of the apple orientation determination algorithm.

| Subtask | Average Time (s) | Standard Deviation (s) |
|---|---|---|
| Apple detection | 0.042 | 0.004 |
| Apple sphere fitting | 0.273 | 0.061 |
| Straight line fitting and orientation determination | 0.228 | 0.009 |
| Total | 0.543 | 0.063 |
