Article

Research on a Map-Based Cooperative Navigation System for Spraying–Dosing Robot Group

Jifeng Qin, Wang Wang, Wenju Mao, Minxin Yuan, Heng Liu, Zhigang Ren, Shuaiqi Shi and Fuzeng Yang
1 College of Mechanical and Electronic Engineering, Northwest A&F University, Yangling 712100, China
2 Ministry of Agriculture and Rural Affairs Apple Full Mechanization Research Base, Yangling 712100, China
3 Scientific Observation and Experimental Station of Agricultural Equipment for the Northern China Ministry of Agriculture and Rural Affairs, Yangling 712100, China
4 State Key Laboratory of Soil Erosion and Dryland Agriculture on Loess Plateau, Yangling 712100, China
* Author to whom correspondence should be addressed.
Agronomy 2022, 12(12), 3114; https://doi.org/10.3390/agronomy12123114
Submission received: 21 October 2022 / Revised: 3 December 2022 / Accepted: 4 December 2022 / Published: 8 December 2022
(This article belongs to the Special Issue Agricultural Environment and Intelligent Plant Protection Equipment)

Abstract: To address the problem of a spraying robot running out of pesticide before its field spraying task is complete, we developed a spraying–dosing robot group and proposed a collaborative navigation system based on an orchard map. First, we constructed a 3D orchard point cloud map and set navigation path points on the projected map. Second, we developed a master–slave, command-based cooperative navigation strategy in which the spraying robot is the master and the dosing robot is the slave. Finally, the spraying robot and the dosing robot completed cooperative navigation on the constructed map using the pure pursuit algorithm and the D-A control algorithm, respectively. To validate the cooperative navigation system, we conducted separate field tests of the communication system and the navigation control. The communication experiments demonstrated a packet loss rate below 5%, which satisfies the communication requirements. The navigation control experiments demonstrated a maximum absolute lateral error of 24.9 cm for the spraying robot and 29.7 cm for the dosing robot. The collaborative navigation system proposed in this research meets the automatic navigation requirements of the spraying–dosing robot group for collaborative tasks in traditional orchards.

1. Introduction

With the vigorous promotion of smart agriculture, agricultural robots have been widely used in production practice. However, a single robot cannot keep up with labor-intensive seasonal work, so research on collaborative agricultural robots is steadily increasing. Juan et al. developed a collaborative robot for avocado harvesting and investigated human–robot interaction strategies [1,2,3]. Thibault et al. studied a group of spraying robots in a vineyard and proposed a co-localization method based on Ultra-Wide Band technology [4].
China is the world’s largest fruit producer and consumer. Its fruit production has reached 286.92 million tons from 12.64 million hectares of orchards, of which about 75% are traditional orchards [5]. Fruit trees in traditional orchards are densely planted, with crossing branches and leaves, so the operating space is extremely limited and only small machinery can enter [6,7]. Orchard plant protection is a necessary part of orchard management, and at present spraying chemical pesticides for pest control is an important means of ensuring fruit quality [8]. Spraying tasks are carried out as many as 8–15 times a year, and their workload accounts for about 30% of the total orchard management workload [9].
Small-sized spraying robots have been widely used in traditional orchards to replace manual operations [10]. However, the capacity of a single spraying robot is extremely limited, and the pesticide it carries is not sufficient to complete an entire spraying task. Water sources are usually far from orchards in arid areas, so transporting water back and forth is particularly onerous. It is therefore necessary to use a dosing robot to shuttle the water needed by the spraying robots. Moreover, collaboration between the spraying robot and the dosing robot not only reduces labor but also greatly increases productivity. Collaborative navigation is one of the key technologies enabling this collaborative task.
At this stage, most robot groups rely on global navigation satellite systems (GNSS) for collaborative navigation. With the promotion and application of satellite positioning technology, Jo et al. used the difference in GNSS data between outdoor robots to calculate the inter-robot distance and realize cooperative positioning of a robot group [11]. However, the high planting density of a tulip orchard makes it a huge workload to manually measure the GNSS position of each tree and use this information as target points to assist the robot group in cooperative localization. Tim et al. developed a ground-based agricultural robot for high-throughput crop phenotyping that navigated autonomously by tracking paths between GPS waypoints; however, as the season progressed, this capability was lost once the sorghum grew taller than the GPS antenna [12]. Liang Zhang et al. developed an integrated BDS/IMU automatic navigation system for orchard spraying robots [13]. The system loosely coupled BDS and IMU, with path planning based on the terrain features of the orchard. Although this navigation system lets the spraying robot track a preset path smoothly and stably, it cannot support the cooperative navigation of spraying and dosing. Xu Aigong et al. proposed a cooperative vehicle localization method based on a BDS/UWB combination [14]: the combined BDS/UWB observation equation was constructed, the position parameters were solved with an extended Kalman filter (EKF), and the cooperative vehicle positioning result was obtained. Although the positioning accuracy of this method is significantly better than that of a single BDS system, it is not applicable to traditional orchards, where navigation satellite signals are easily lost [15].
Currently, visual navigation and laser navigation are also widely used in orchards [16,17]. Although vision sensors are low-cost and information-rich, they are susceptible to changing illumination and cannot navigate steadily in the orchard [18,19]. Laser navigation is more stable in a traditional orchard, and LiDAR-built maps are used by some groups for collaborative navigation [20,21,22]. Cheein et al. proposed an extended information filter simultaneous localization and mapping (EIF-SLAM) method using fruit trees as environmental references [23]. The method used olive tree trunk information recognized by the robot’s on-board 2D LiDAR and monocular vision and reconstructed it into a feature map; experiments demonstrated that the resulting map was consistent with the real scene. However, trunk information could not be obtained when people or similar objects interfered next to a trunk, which led to mapping errors. To reduce the relative errors that occur in laser or visual measurements, Gimenez et al. proposed a map construction method based on solving optimization problems with nonlinear constraints [24]. The method measured the polar angles of fruit trees with vehicle-mounted 2D LiDAR and then optimized these measurements with the absolute GPS positions of the fruit trees in the four corners of the orchard; however, it is not suitable for identifying trunks obscured by the canopy or by branches and leaves. To identify lower trunks, Shalal et al. used vehicle-mounted 2D LiDAR and vision to detect fruit tree trunks [25], and GPS information from multiple known points within the tree rows was used to correct the detected trunk information and generate an orchard map. However, this map contains only the location of the center point of each fruit tree, and the vision sensors carried by different robots have different measurement errors; whether an orchard map constructed in this way can provide cooperative positioning information for a robot group requires further experimentation.
In this paper, a map-based collaborative navigation system for a spraying–dosing robot group was developed to meet the production needs of spraying and dosing in traditional orchards in arid regions. A 3D orchard point cloud map was constructed first, then paths and path points were set on the 2D projection map, and finally the spraying and dosing robots achieved collaborative navigation on the map according to the developed collaborative navigation strategy and motion control algorithms. Coordinating spraying and dosing robots in traditional orchards can not only alleviate labor shortages but also improve production efficiency, and research on cooperative navigation systems is an important foundation for the intelligent development of multi-robot systems in orchards.
This paper is organized as follows: Section 2 describes the system components and research methodology of the spraying–dosing robot group; Section 3 describes the specific experimental protocol and experimental results; Section 4 discusses the experimental results and Section 5 presents the conclusions of this article.

2. Materials and Methods

2.1. Structure and Design

2.1.1. Navigation System

The spraying–dosing robot group consists of a spraying robot and a dosing robot, and its hardware system is shown in Figure 1. The spraying robot consists of two main parts, the traction robot and the sprayer, which are connected by pins. The spraying robot and the dosing robot communicate through the wireless routers and receive each other’s navigation information for cooperative navigation.
The central control unit of the spraying robot is a UNO-2484G computer; the bottom control unit includes a KYDBL2450-2E motor controller and two 48 V DC brushless motors; the centimeter-level RTK-GNSS positioning unit consists of an M600 mobile station and a T300 base station produced by Sinan; and the LiDAR is a Velodyne VLP-16. The central control unit of the dosing robot is a BOXER-8150AI computer; the bottom control unit includes a VSY100D60-2 motor controller and two 48 V DC brushless motors; the positioning unit consists of a CGI-410 mobile station and an i70 base station produced by Sinan; and the LiDAR is an RS-LiDAR-16.
The software system of the spraying–dosing robot group is shown in Figure 2. The information interaction layer handles the information exchange between the spraying robot and the dosing robot under the Ubuntu system; the information processing layer is the cooperative navigation program based on the Robot Operating System (ROS); and the execution layer is the mobile platform control, which outputs speed commands to the underlying servo controllers and finally realizes the autonomous navigation of both robots.

2.1.2. Communication System Design

A stable and reliable wireless communication system is a prerequisite for the collaborative navigation of the spraying–dosing robot group; with it, the robots can co-locate and navigate based on exchanged position and velocity information [26,27]. To save development time, the wireless communication system is based on Wi-Fi, using the existing IEEE 802.11 series of communication standards and the free public 2.4 GHz band. The data frames for robot information interaction use the frame format developed in our team's previous research [28,29].
The two robots communicate point-to-point in a local area network through a wireless router (ADV1), whose Internet Protocol (IP) address is 192.168.62.1. The IP address of the spraying robot's communication system is set to 192.168.62.2 with a /24 subnet mask, and its gateway address is the router address. Similarly, the IP address of the dosing robot is set to 192.168.62.20, with the remaining settings the same as the spraying robot's. After configuration, whether the wireless network card address was modified successfully can be checked from a terminal.
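For illustration, the following minimal Python sketch shows how the two robots could exchange a navigation status frame over this LAN via UDP sockets. The port number and the comma-separated payload are assumptions for this sketch; the actual data frame format follows our team's earlier work [28,29].

```python
import socket

SPRAY_IP = "192.168.62.2"   # spraying robot (master), from Section 2.1.2
DOSE_IP = "192.168.62.20"   # dosing robot (slave)
PORT = 9000                 # assumed application port

def send_status(sock: socket.socket, x: float, y: float, v: float) -> None:
    """Send a simple comma-separated position/velocity frame (illustrative format only)."""
    sock.sendto(f"{x:.3f},{y:.3f},{v:.3f}".encode(), (DOSE_IP, PORT))

# Master side: publish the current pose and speed to the slave
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_status(tx, 12.400, 3.150, 0.75)

# Slave side would bind to (DOSE_IP, PORT), call recvfrom(), and parse the frame.
```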

2.2. Map Construction

In this section, based on the structure and design of the robot, we construct a point cloud map of the orchard that provides a priori environmental information for collaborative navigation.
First, we remotely control the spraying robot body to drive continuously through the orchard while recording point cloud data. Second, we process the collected data and perform point cloud stitching by finding the neighboring points in two adjacent frames; to reduce the amount of data during stitching, we draw on the lightweight and ground-optimized LiDAR mapping algorithm (LeGO-LOAM), which balances mapping accuracy and efficiency well [30,31,32]. Third, we use the adjacent points in the point cloud to compute the attitude estimate between two adjacent frames and generate LiDAR odometry information. Finally, we optimize the LiDAR odometry using real-time GNSS pose information as a constraint and construct a point cloud map of the orchard, as shown in Figure 3.

2.2.1. Feature Classification

We establish the local coordinate system $XYZ$ with the center point of the spraying robot body as the origin (the LiDAR is mounted at this center point). Since the LiDAR was calibrated before use, the calibrated point cloud is projected frame by frame directly onto the $XOY$ plane. In the current frame point cloud $p^t$, the value of a single point $p_i^t$ is its distance to the LiDAR. As shown in Figure 4, any point with horizontal scan angle $\alpha_i$ and vertical scan angle $\beta_i$ in the $XYZ$ spatial coordinate system has coordinates $(x_i^t, y_i^t, z_i^t)$. The local coordinate system $XYZ$ is rotated around the $X$-, $Y$- and $Z$-axes by the angles $\theta_{roll}$, $\theta_{pitch}$ and $\theta_{yaw}$, respectively.
To distinguish the projected points on the $XOY$ plane matrix and to speed up point cloud searches, we label each point of $p^t$ in the point cloud matrix according to Equation (1). The main labels are the row index $r_i$, the column index $c_i$ and the range value $v_{(t,i)}^r$ of the plane in which the point lies; each point is uniquely identified by its indices and value.
$$r_i = \frac{\arctan\!\left( \frac{z_i^t}{\sqrt{(x_i^t)^2 + (y_i^t)^2}} \right) \cdot \frac{180}{\pi} + 15}{2}, \qquad c_i = \frac{\arctan\!\left( \frac{x_i^t}{y_i^t} \right) \cdot \frac{180}{\pi} - 90}{0.2} + \frac{360}{2 \times 0.2}, \qquad v_{(t,i)}^r = \sqrt{(x_i^t)^2 + (y_i^t)^2 + (z_i^t)^2} \qquad (1)$$
In Equation (1), $x_i^t$, $y_i^t$ and $z_i^t$ are the horizontal, vertical and height coordinates of point $p_i^t$ in the current frame point cloud.
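As a concrete illustration of Equation (1), the following Python sketch maps one LiDAR return to its row index, column index and range value. The 2° vertical spacing and 15° offset correspond to the 16-line LiDAR used here; the wrap-around handling of the column index is an implementation assumption.

```python
import numpy as np

VERT_RES_DEG = 2.0     # vertical channel spacing of a 16-line LiDAR
VERT_OFFSET_DEG = 15.0 # lowest beam angle
HORIZ_RES_DEG = 0.2    # horizontal angular resolution from Equation (1)

def project_point(x: float, y: float, z: float):
    """Map one LiDAR return to its (row, col, range) labels per Equation (1)."""
    vert_deg = np.degrees(np.arctan2(z, np.hypot(x, y)))
    row = int((vert_deg + VERT_OFFSET_DEG) / VERT_RES_DEG)  # r_i: 0..15
    horiz_deg = np.degrees(np.arctan2(x, y))
    col = int((horiz_deg - 90.0) / HORIZ_RES_DEG + 360.0 / (2 * HORIZ_RES_DEG))  # c_i
    rng = float(np.sqrt(x * x + y * y + z * z))             # v_(t,i)^r
    return row, col % int(360.0 / HORIZ_RES_DEG), rng

# Example: a return 5 m ahead and slightly above the sensor
print(project_point(0.2, 5.0, 0.1))
```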
After marking the points of a single frame by reprojection, the ground points and non-ground points of $p^t$ can be quickly distinguished: points below 1° of the LiDAR's vertical scanning angle are ground points, and the rest are non-ground points. For non-ground points, the breadth-first search (BFS) algorithm searches the neighborhoods adjacent to any point $p_i^t$ according to its indices, as shown in Figure 5. If point $p_i^t$ has row index $r_i$ and column index $c_i$, its neighbors have row index $r_i - 1$ or $r_i + 1$ and column index $c_i - 1$ or $c_i + 1$. According to Equation (2), $p_i^t$ and a neighboring point are judged to lie in the same plane when the angle between them is less than 60°, and in different planes otherwise [19]. After the neighbors of $p_i^t$ are traversed one by one, the neighbors of those neighbors, such as point $p_j^t$, are searched in turn until the whole single-frame point cloud is traversed.
$$\delta = \arctan\!\left( \frac{v_{(t,j)}^r \sin\gamma}{v_{(t,i)}^r - v_{(t,j)}^r \cos\gamma} \right), \qquad j \in (r \pm 1,\ c \pm 1) \qquad (2)$$
In Equation (2), $\delta$ is the angle between any point and its adjacent point, $v_{(t,j)}^r$ is the Euclidean distance from the adjacent point to the LiDAR, $v_{(t,i)}^r$ is the Euclidean distance from the point itself to the LiDAR, and $\gamma$ is the angular difference between the two points.
In the orchard environment there are many weeds, rugged ground, and leaves blown by the wind, which produce scattered points in the current frame's point cloud, and these scattered points strongly affect the accuracy of feature classification. To obtain stable and reliable point cloud features, any non-ground cluster of $p^t$ containing fewer than 30 points is removed directly. After point cloud clustering, only large objects such as tree trunks, the ground and signage remain, as shown in Figure 6: Figure 6a shows the original point cloud and Figure 6b the point cloud after clustering. The number of points after clustering is dramatically reduced compared with the original point cloud, and only stable features are retained; green marks ground point clouds and black marks non-ground object features.
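The following Python sketch illustrates the BFS clustering step on a range image, applying the angle criterion of Equation (2) and discarding clusters smaller than 30 points. Note that the threshold direction here (a larger δ indicating a flatter, connected surface) follows the LeGO-LOAM reference implementation [31]; the function and parameter names are our own.

```python
import numpy as np
from collections import deque

SEG_ANGLE_DEG = 60.0  # segmentation angle threshold from Equation (2)
MIN_CLUSTER = 30      # clusters smaller than this are discarded as noise

def segment(range_img: np.ndarray, res_deg: tuple) -> np.ndarray:
    """BFS clustering of a range image (rows x cols of v_(t,i)^r; 0 = no return).

    res_deg = (vertical, horizontal) angular step between neighbours, in degrees.
    Returns a label image: positive labels are kept clusters, -1 marks discarded points.
    """
    rows, cols = range_img.shape
    labels = np.zeros((rows, cols), dtype=int)
    next_label = 1
    for r0 in range(rows):
        for c0 in range(cols):
            if labels[r0, c0] != 0 or range_img[r0, c0] == 0:
                continue
            queue, cluster = deque([(r0, c0)]), [(r0, c0)]
            labels[r0, c0] = next_label
            while queue:
                r, c = queue.popleft()
                for dr, dc, gamma in ((1, 0, res_deg[0]), (-1, 0, res_deg[0]),
                                      (0, 1, res_deg[1]), (0, -1, res_deg[1])):
                    rn, cn = r + dr, (c + dc) % cols  # columns wrap around 360 degrees
                    if not (0 <= rn < rows) or labels[rn, cn] != 0 or range_img[rn, cn] == 0:
                        continue
                    d1 = max(range_img[r, c], range_img[rn, cn])
                    d2 = min(range_img[r, c], range_img[rn, cn])
                    g = np.radians(gamma)
                    delta = np.degrees(np.arctan2(d2 * np.sin(g), d1 - d2 * np.cos(g)))
                    if delta > SEG_ANGLE_DEG:        # flat enough: same surface
                        labels[rn, cn] = next_label
                        cluster.append((rn, cn))
                        queue.append((rn, cn))
            if len(cluster) < MIN_CLUSTER:           # drop scattered points (weeds, leaves)
                for r, c in cluster:
                    labels[r, c] = -1
            else:
                next_label += 1
    return labels
```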
Then the clustered planar features are divided equally into six regions according to the LiDAR horizontal scanning angle (0–360°), and within each region the points sharing the same row index are regarded as a subset. Drawing on the BFS search method, the five points adjacent to the left and right of a point $p_i^t$ in the same subset are taken for comparison each time. The smoothness $S_i$ of each point in the subset is calculated by substituting the distances between the neighboring points $p_j^t$ and $p_i^t$ into Equation (3). When $S_i$ is greater than the set threshold $S_{th}$ (set to $5 \times 10^{-3}$), the point is an edge feature; otherwise it is a surface feature.
$$S_i = \frac{1}{|P| \cdot v_{(t,i)}^r} \left| \sum_{j \in P,\ j \neq i} \left( v_{(t,j)}^r - v_{(t,i)}^r \right) \right| \qquad (3)$$
In Equation (3), $P$ is the continuous subset of adjacent points (five on each side of $p_i^t$) on the projection plane of the current frame.
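A minimal sketch of the smoothness computation of Equation (3) for one scan-row subset, using the $5 \times 10^{-3}$ threshold given above; the windowing details are assumptions.

```python
import numpy as np

S_TH = 5e-3   # smoothness threshold S_th from the text
HALF_WIN = 5  # five neighbouring points on each side

def classify_row(ranges: np.ndarray):
    """Split one scan-row subset into edge and surface feature indices via Equation (3)."""
    edges, surfaces = [], []
    for i in range(HALF_WIN, len(ranges) - HALF_WIN):
        window = np.concatenate([ranges[i - HALF_WIN:i], ranges[i + 1:i + 1 + HALF_WIN]])
        s_i = abs(np.sum(window - ranges[i])) / (len(window) * ranges[i])  # smoothness S_i
        (edges if s_i > S_TH else surfaces).append(i)
    return edges, surfaces

# Example: a sharp range jump produces edge features, flat stretches produce surfaces
edges, surfaces = classify_row(np.array([5.0] * 7 + [2.0] + [5.0] * 7))
```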
Within each subset of a region, the non-ground points with the maximum smoothness $S_{max}$ are selected as edge features $n\mathbb{F}_e^t$, and the points with the minimum smoothness $S_{min}$ are selected as surface features $n\mathbb{F}_s^t$; the $\mathbb{F}_e^t$ and $\mathbb{F}_s^t$ of the six regions form the edge and surface feature collections of the current frame. As shown in Figure 7a, the red points are edge-feature point clouds and the green points are surface-feature point clouds of the current frame. From these, the non-ground points with maximum smoothness in each region form the distinctive edge feature set $nF_e^t$, and the ground points with minimum smoothness in each region form the distinctive surface feature set $nF_s^t$; over the six regions these yield $F_e^t$ and $F_s^t$, the sets of distinctive non-ground edge features and ground surface features of the current frame, with $F_e^t \subset \mathbb{F}_e^t$ and $F_s^t \subset \mathbb{F}_s^t$. Overlapping points are removed by the smoothness calculation. As shown in Figure 7b, the yellow points are the distinctive non-ground edge features and the blue points the distinctive ground surface features of the current frame.
After downsampling the point cloud through feature classification, the surface and edge features of the current frame are obtained, which greatly reduces the number of points per frame; the contents of each feature set are given in Equation (4).
$$\begin{aligned} \mathbb{F}_s^t &= \left\{ [p_1^t, \ldots, p_i^t, \ldots, p_{n_1}^t]^T \mid 1 \le i \le n_1 \right\} \\ \mathbb{F}_e^t &= \left\{ [p_1^t, \ldots, p_i^t, \ldots, p_{n_2}^t]^T \mid 1 \le i \le n_2 \right\} \\ F_s^t &= \left\{ [p_1^t, \ldots, p_i^t, \ldots, p_{n_3}^t]^T \mid 1 \le i \le n_3 \right\} \\ F_e^t &= \left\{ [p_1^t, \ldots, p_i^t, \ldots, p_{n_4}^t]^T \mid 1 \le i \le n_4 \right\} \end{aligned} \qquad (4)$$
In Equation (4), $n_1$, $n_2$, $n_3$ and $n_4$ are the numbers of points in the current frame belonging to $\mathbb{F}_s^t$, $\mathbb{F}_e^t$, $F_s^t$ and $F_e^t$, respectively.

2.2.2. Posture Estimation

We perform a correlation calculation on the surface feature point clouds to obtain the points with the same surface features in two adjacent frames. Then, surface feature Iterative Closest Point (ICP) matching is performed on these common points (neighboring points) to obtain the corresponding plane rotation matrix. Under the constraint of this rotation matrix, we perform a correlation calculation on the edge feature point clouds to obtain the points with the same edge features in the two adjacent frames, and then perform ICP matching of the edge features to obtain the corresponding edge rotation matrix. Unifying the two matrices gives the pose estimate of the single-frame point cloud, that is, the pose of the LiDAR sensor relative to the previous frame. The steps of point cloud pose estimation are shown in Figure 8: the point clouds of two adjacent frames are feature-matched to obtain the transformation from a single point to an edge or surface point cloud, and the coordinate system of the original point cloud is then transformed accordingly to obtain the pose estimate.
The covariance is used to determine the correlation between the point clouds of two adjacent frames, as shown in Equation (5). By default the calculation starts from the second frame; the first frame is used only to record time information. If the correlation coefficient $\rho_{F_s^{t-1}F_s^t}$ between points in the current frame set $F_s^t$ and points in the previous frame set $F_s^{t-1}$ is greater than 0, the points in the surface features of the two frames are correlated; the correlated points in $F_s^{t-1}$ are recorded as ground points and saved, otherwise they are regarded as non-ground points and discarded. Similarly, if the correlation coefficient $\rho_{F_e^{t-1}F_e^t}$ between points in $F_e^t$ and $F_e^{t-1}$ is greater than 0, the points in the edge features of the two frames are correlated, and the correlated points in $F_e^{t-1}$ are recorded as the same edge and saved according to their point labels.
$$\rho_{F_s^{t-1}F_s^t} = \frac{\mathrm{cov}(F_s^{t-1}, F_s^t)}{\sigma_{F_s^{t-1}}\, \sigma_{F_s^t}}, \qquad \rho_{F_e^{t-1}F_e^t} = \frac{\mathrm{cov}(F_e^{t-1}, F_e^t)}{\sigma_{F_e^{t-1}}\, \sigma_{F_e^t}} \qquad (5)$$
In Equation (5), $\rho_{F_s^{t-1}F_s^t}$ is the correlation coefficient indicating that $p_i^t$ lies in the same plane as $p_i^{t-1}$ in two adjacent frames, $\rho_{F_e^{t-1}F_e^t}$ is the correlation coefficient indicating that $p_i^t$ and $p_i^{t-1}$ lie on the same edge, $\mathrm{cov}(F_s^{t-1}, F_s^t)$ is the covariance between $p_i^t$ and $p_i^{t-1}$ in the surface feature point clouds of the two frames, and $\mathrm{cov}(F_e^{t-1}, F_e^t)$ is the covariance between $p_i^t$ and $p_i^{t-1}$ in the edge feature point clouds of the two frames.
The points with correlation coefficients $\rho_{F_s^{t-1}F_s^t} > 0$ and $\rho_{F_e^{t-1}F_e^t} > 0$ are substituted into Equation (6) to find the surface feature vector $\lambda_{jlm}$ corresponding to the feature maximum $A_s$ of the current frame surface feature point $p_i^t$ and the previous frame surface feature points $p_j^{t-1}$, $p_l^{t-1}$ and $p_m^{t-1}$. Similarly, the edge feature vector $\lambda_{jl}$ corresponding to the feature maximum $A_e$ of the current frame edge feature point $p_i^t$ and the previous frame edge feature points $p_j^{t-1}$ and $p_l^{t-1}$ can be found.
$$A_s \lambda_{jlm} - \hat{\Sigma}(F_s^{t-1}, F_s^t)\, x_{p_i^t} = 0, \qquad A_e \lambda_{jl} - \hat{\Sigma}(F_e^{t-1}, F_e^t)\, x_{p_i^t} = 0 \qquad (6)$$
In Equation (6), $x_{p_i^t}$ is the value of the current frame feature point $p_i^t$ in the local coordinate system $XYZ$, $\hat{\Sigma}(F_s^{t-1}, F_s^t)$ is the covariance matrix of the correlated surface feature points, and $\hat{\Sigma}(F_e^{t-1}, F_e^t)$ is the covariance matrix of the correlated edge feature points.
According to the space quadrilateral area method, the distance $d_\varepsilon$ from a current frame surface feature point $p_i^t$ to the previous frame plane feature vector $\lambda_{jlm}$ can be found; according to the parallelogram area method, the distance $d_H$ from any current frame edge feature point to the previous frame edge feature vector $\lambda_{jl}$ can be found. In Figure 9, $s_{jl}$ is the direction vector between the previous frame edge feature points $p_j^{t-1}$ and $p_l^{t-1}$, and $s_{jm}$ is the direction vector between $p_j^{t-1}$ and $p_m^{t-1}$.
$$d_H = \frac{\lambda_{jl}}{|s_{jl}|}, \qquad d_\varepsilon = \frac{\lambda_{jlm}}{|s_{jl} \times s_{jm}|} \qquad (7)$$
The smaller the values of $d_\varepsilon$ and $d_H$, the shorter the distance between the current frame feature point $p_i^t$ and the previous frame surface feature set $F_s^{t-1}$ or edge feature set $F_e^{t-1}$, as shown in Figure 10. When the connecting line from $p_i^t$ is perpendicular to the previous frame feature, the distance $\min(d_H)$ between $p_i^t$ and the corresponding point $p_i^{t-1}$ of the previous frame is shortest, and this vector value provides the fastest estimate of the inter-frame point cloud pose. The initial frame point cloud is used as the origin to establish the map coordinate system $X_M Y_M Z_M$. The distance $D$ between the initial frame point $p_i^0$ and $p_i^{t-1}$ is calculated, and this distance yields the pose estimate of the current frame point cloud relative to the map coordinate system.
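For illustration, both distances can be computed directly from concrete feature points using the parallelogram-area and triple-product methods described above. This sketch works with explicit previous-frame points rather than the $\lambda$ notation, so the function names are ours.

```python
import numpy as np

def point_to_edge_distance(p, pj, pl):
    """d_H: distance from point p to the edge through pj, pl (parallelogram-area method)."""
    edge = pl - pj
    return np.linalg.norm(np.cross(p - pj, p - pl)) / np.linalg.norm(edge)

def point_to_plane_distance(p, pj, pl, pm):
    """d_eps: distance from point p to the plane through pj, pl, pm (volume / base area)."""
    normal = np.cross(pl - pj, pm - pj)
    return abs(np.dot(p - pj, normal)) / np.linalg.norm(normal)

# Example: a current-frame point checked against previous-frame feature points
p = np.array([1.0, 2.0, 0.5])
d_h = point_to_edge_distance(p, np.array([0.0, 0.0, 0.0]), np.array([0.0, 4.0, 0.0]))
d_e = point_to_plane_distance(p, np.array([0.0, 0.0, 0.0]),
                              np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]))
```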
The rotation of a coordinate system is the inverse of the rotation of a vector. Following Euler's formula with the clockwise rotation convention, the relationship between a current frame point $p_i^t$ in the $XYZ$ coordinate system and the corresponding point $p_M$ in the $X_M Y_M Z_M$ coordinate system is obtained, as shown in Equation (8).
$$p_M = Q(p_M, p_i^t) = R_y R_x R_z\, p_i^t + T = R\, p_i^t + T \qquad (8)$$
In Equation (8), $Q$ is the transformation function between the current frame point $p_i^t$ in the $XYZ$ coordinate system and the point $p_M$ in the $X_M Y_M Z_M$ coordinate system, $R$ is the rotation matrix between $p_i^t$ and $p_M$, and $T$ is the translation matrix between $p_i^t$ and $p_M$.
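A minimal sketch of Equation (8), composing $R = R_y R_x R_z$ from the Euler angles of Section 2.2.1 and applying $p_M = R\,p + T$ to a batch of points. The angle conventions (roll about $X$, pitch about $Y$, yaw about $Z$) follow the text; the matrix layout details are assumptions.

```python
import numpy as np

def euler_to_rotation(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Compose R = R_y R_x R_z from Euler angles in radians, matching Equation (8)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about Z
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about X
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about Y
    return Ry @ Rx @ Rz

def to_map_frame(points: np.ndarray, pose) -> np.ndarray:
    """Transform Nx3 local points into the map frame X_M Y_M Z_M: p_M = R p + T."""
    tx, ty, tz, roll, pitch, yaw = pose
    R = euler_to_rotation(roll, pitch, yaw)
    return points @ R.T + np.array([tx, ty, tz])

# Example: move one frame of points by a small pose increment
frame_in_map = to_map_frame(np.array([[1.0, 0.0, 0.0]]), (0.5, 0.0, 0.0, 0.0, 0.0, 0.1))
```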
There is a certain error between the value of $p_i^t$ transformed by Equation (8) and the true value, as shown in Equation (9).
$$f(X) = D(p_M, M) = D(R\,p_i^t + T,\ M) \qquad (9)$$
In Equation (9), $f(X)$ is the error function and $M$ is the true value of $p_i^t$ in the $X_M Y_M Z_M$ coordinate system.
The initial value of $f(X)$ is substituted into Equation (10) and iterated until the conversion error is minimized (below $10^{-6}$) or the iteration limit (set to 500) is reached. The pose estimate $\xi_i = [t_x, t_y, t_z, \theta_{roll}, \theta_{pitch}, \theta_{yaw}]$ of the local single-frame point cloud in the map coordinate system is thus obtained, and the single-frame point cloud is converted into the point cloud map by substituting it into Equation (8).
$$J^T J\, \Delta x_{lm} = -J^T f(X), \qquad J = \left[ \frac{\partial d_H}{\partial \theta_{roll}},\ \frac{\partial d_H}{\partial \theta_{pitch}},\ \frac{\partial d_\varepsilon}{\partial \theta_{yaw}},\ \frac{\partial d_\varepsilon}{\partial t_x},\ \frac{\partial d_\varepsilon}{\partial t_y},\ \frac{\partial d_H}{\partial t_z} \right]^T \qquad (10)$$
In Equation (10), $\Delta x_{lm}$ is the iteration step and $J$ is the Jacobian matrix of the single-frame feature point observations.
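A compact Gauss–Newton loop matching Equation (10) and the stopping criteria above ($10^{-6}$ tolerance, 500 iterations). Here `residual_fn` and `jacobian_fn` are placeholders for the stacked feature distances and their Jacobian; a Levenberg–Marquardt damping term could be added to $J^T J$, as the $\Delta x_{lm}$ notation suggests.

```python
import numpy as np

MAX_ITERS, TOL = 500, 1e-6  # iteration limits from the text

def gauss_newton(residual_fn, jacobian_fn, x0: np.ndarray) -> np.ndarray:
    """Solve for the pose xi = [tx, ty, tz, roll, pitch, yaw] by iterating Equation (10)."""
    x = x0.astype(float).copy()
    for _ in range(MAX_ITERS):
        f = residual_fn(x)                       # stacked d_H / d_eps residuals
        J = jacobian_fn(x)                       # Jacobian of the residuals
        dx = np.linalg.solve(J.T @ J, -J.T @ f)  # J^T J dx = -J^T f(X)
        x += dx
        if np.linalg.norm(dx) < TOL:             # conversion error minimized
            break
    return x

# Tiny self-check with a linear residual: converges to the target in one step
target = np.array([1.0, 2.0, 3.0, 0.0, 0.0, 0.0])
xi = gauss_newton(lambda x: x - target, lambda x: np.eye(6), np.zeros(6))
```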

2.2.3. Map Optimization

LiDAR odometry drifts when the robot turns too many times or through too large an angle, which increases the error of the point cloud map. To control this drift and improve map accuracy, this section adopts a graph-based GNSS optimization method, as shown in Figure 11. GNSS information at a fixed frequency (2 Hz) is used to repeatedly constrain the LiDAR pose estimates and obtain an orchard map anchored to GNSS information. In Figure 11, the solid black line is the robot's trajectory, the green dots represent the current frame LiDAR point clouds with continuously updated pose estimates, the red pentagrams represent GNSS fixes $(X_G, Y_G, Z_G)$, and the red dots are the robot positions aligned with the GNSS frequency, with coordinates $(X_m, Y_m, Z_m, \theta_{roll}^m, \theta_{pitch}^m, \theta_{yaw}^m)$, where $X_m, Y_m, Z_m$ are the robot's coordinates in the cooperative coordinate system and $\theta_{roll}^m, \theta_{pitch}^m, \theta_{yaw}^m$ are the roll, pitch and yaw angles of the position points.
According to Equation (10), the covariance matrix $\Sigma_{r_i}$ of the point cloud pose $\xi_i$ of each current frame can be obtained. Substituting $\Sigma_{r_i}$ and the covariance matrix $\Sigma_{m_j}$ of the current GNSS position data into Equation (11) integrates the data from the beginning to the present and yields the robot's position.
$$A_i = \sum_{i=1}^{n} \left( \Sigma_{r_i}^{-1} + \Sigma_{m_j}^{-1} \right), \qquad i \in [1, 2, 3, \ldots, n] \qquad (11)$$
In Equation (11), $A_i$ is the accuracy estimation matrix integrating the point cloud poses and the GNSS position data.
$$A_{r_i} = A_i^{-1}\Sigma_{r_i}^{-1}, \qquad A_{m_j} = A_i^{-1}\Sigma_{m_j}^{-1}, \qquad \xi_{m_i} = A_{r_i} T_r + A_{m_j} m_j \qquad (12)$$
In Equation (12), $T_r$ is the robot's current pose parameters and $\xi_{m_i}$ the GNSS-corrected pose parameters.
According to Equation (12), the pose parameters $\xi_{m_i}$ can be obtained and a point cloud map optimized with GNSS information can be generated, as shown in Figure 12.
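A single-step sketch of the information-weighted fusion of Equations (11) and (12): the inverse covariances of the LiDAR pose and the GNSS fix are combined into $A$, and the corrected pose is their weighted sum. Accumulating $A$ over all frames, as Equation (11) does, is omitted for brevity.

```python
import numpy as np

def fuse_pose(T_r: np.ndarray, Sigma_r: np.ndarray,
              m_j: np.ndarray, Sigma_m: np.ndarray) -> np.ndarray:
    """One information-weighted correction step per Equations (11)-(12)."""
    A = np.linalg.inv(Sigma_r) + np.linalg.inv(Sigma_m)  # combined information matrix A_i
    A_r = np.linalg.solve(A, np.linalg.inv(Sigma_r))     # weight on the LiDAR pose
    A_m = np.linalg.solve(A, np.linalg.inv(Sigma_m))     # weight on the GNSS fix
    return A_r @ T_r + A_m @ m_j                         # corrected pose xi_mi

# A precise GNSS fix (small covariance) pulls the fused pose toward the GNSS value
xi = fuse_pose(np.zeros(3), np.eye(3) * 1.0, np.ones(3), np.eye(3) * 0.01)
```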

2.3. Motion Control

To achieve the best path tracking results for the spraying robot and the dosing robot in their respective navigation methods, pure pursuit control based on the trajectory motion model is used for the spraying robot, and D-A control based on RTK-GNSS is used for the dosing robot.

2.3.1. Pure Pursuit Control

Based on our team's previous research, we adopt a pure pursuit algorithm based on the robot kinematic model for the spraying robot [28]. Pure pursuit is a widely applicable geometric method for low-speed motion control, with few parameters, high predictability, and accurate straight-line tracking in narrow scenes [33]. Its principle is to regulate the lateral deviation between the robot's current position and the straight-line trajectory through the magnitude of the look-ahead distance, thereby achieving good tracking.
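A minimal pure pursuit sketch for a differential-drive platform: the look-ahead point's lateral offset in the robot frame gives the tracking curvature $\kappa = 2y_r/L^2$. The 2 m look-ahead in the example is an assumed value; the text does not report the tuned distance.

```python
import numpy as np

def pure_pursuit_angular_velocity(pose, goal, lookahead: float, v: float) -> float:
    """Pure pursuit: angular velocity toward a look-ahead point on the path.

    pose = (x, y, heading in rad); goal is the path point roughly `lookahead` away.
    """
    x, y, theta = pose
    dx, dy = goal[0] - x, goal[1] - y
    # Lateral offset of the look-ahead point in the robot frame
    y_r = -np.sin(theta) * dx + np.cos(theta) * dy
    curvature = 2.0 * y_r / (lookahead ** 2)  # kappa = 2 y_r / L^2
    return v * curvature                      # omega = v * kappa

# Example at 0.75 m/s (the field-test speed), with an assumed 2 m look-ahead
omega = pure_pursuit_angular_velocity((0.0, 0.0, 0.0), (2.0, 0.3), 2.0, 0.75)
```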

2.3.2. D-A Control

The dosing robot takes the target point as its tracking object and adopts the Distance-Angle (D-A) tracking control method [34]. The latitude and longitude of the target point, measured statically by RTK-GNSS, are input first; the heading difference and distance between the current point and the target point are then calculated and compared with set thresholds to decide the dosing robot's next steering and movement action. In addition, to prevent collisions between the two robots during cooperative movement, each robot uses its LiDAR to detect obstacles in the direction of travel and performs an emergency stop when one is found.
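A hedged sketch of one D-A decision step: compute the great-circle distance and bearing from the current RTK-GNSS fix to the target point, then compare them with thresholds. The 0.3 m and 5° thresholds are assumptions; the paper does not list its values.

```python
import math

DIST_TOL_M = 0.3     # assumed arrival threshold
ANGLE_TOL_DEG = 5.0  # assumed heading threshold

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) between two WGS-84 fixes."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    bearing = math.degrees(math.atan2(math.sin(dlon) * math.cos(p2),
                                      math.cos(p1) * math.sin(p2) -
                                      math.sin(p1) * math.cos(p2) * math.cos(dlon)))
    return dist, bearing % 360.0

def da_step(cur, heading_deg, target):
    """One D-A control decision: turn toward the target, drive forward, or stop on arrival."""
    dist, bearing = distance_and_bearing(cur[0], cur[1], target[0], target[1])
    if dist < DIST_TOL_M:
        return "stop"
    err = (bearing - heading_deg + 180.0) % 360.0 - 180.0  # signed heading error
    return "turn" if abs(err) > ANGLE_TOL_DEG else "forward"
```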

2.4. Collaborative Navigation Strategy

Based on the constructed 3D orchard point cloud map, a master–slave, command-based collaborative navigation strategy was developed, as shown in Figure 13. Cooperative navigation consists of two parts: the spraying robot navigating in the orchard and the dosing robot navigating on the road. As the master, the spraying robot sends the cooperative navigation command to the dosing robot and publishes the cooperative dosing point. The dosing robot receives and responds to the request and turns on its fixed-point navigation function: it first goes to the water source to fill up, then moves to the cooperative dosing point published by the spraying robot, feeding navigation status information back to the spraying robot along the way. After arriving at the cooperative dosing point, it refills the spraying robot within the waiting time. Once dosing is finished, the spraying robot continues its spraying task, and the dosing robot returns to the water source to await a new cooperative navigation command with the next cooperative dosing point.
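The slave-side logic of this strategy can be summarized as a small state machine. The state and event names below are our own shorthand for the steps in Figure 13, not identifiers from the authors' software.

```python
from enum import Enum, auto

class DosingState(Enum):
    IDLE = auto()       # waiting at the water source
    FILLING = auto()    # filling the tank
    EN_ROUTE = auto()   # navigating to the cooperative dosing point
    DOSING = auto()     # refilling the spraying robot
    RETURNING = auto()  # driving back to the water source

def dosing_step(state: DosingState, event: str) -> DosingState:
    """Slave-side transition table for the master-slave strategy of Figure 13."""
    transitions = {
        (DosingState.IDLE, "coop_request"): DosingState.FILLING,
        (DosingState.FILLING, "tank_full"): DosingState.EN_ROUTE,
        (DosingState.EN_ROUTE, "at_dosing_point"): DosingState.DOSING,
        (DosingState.DOSING, "dosing_done"): DosingState.RETURNING,
        (DosingState.RETURNING, "at_water_source"): DosingState.IDLE,
    }
    return transitions.get((state, event), state)  # ignore events that do not apply
```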

3. Results

To test the performance of the cooperative navigation system of the spraying–dosing robot group, the communication system and the navigation control were tested separately. The experiment site was the National Persimmon Germplasm Resource Park of Northwest A&F University, whose persimmon orchards are all traditional orchards with rows of fruit trees spaced about 2.4 m apart, as shown in Figure 14.
The experimental orchard environment in which the spraying–dosing robot group work consists of a garage, an orchard, a house, a road and a water source. Figure 15a shows a schematic diagram of the orchard environment and Figure 15b shows a physical diagram of the orchard environment.

3.1. Communication Experiment

To verify the reliability of the information interaction between the spraying robot and the dosing robot, the communication system of the cooperative navigation system was tested. The dosing robot was docked at point D near the water source, and the spraying robot was docked at points A, B and C in the orchard, as shown in Figure 16. To minimize the communication distance within the robot group, the wireless router was deployed in the middle of the orchard. The deployment height is constrained by the orchard environment: if the router is mounted too high, the wireless signal is easily blocked by the low branches of the fruit trees; if too low, weeds aggravate signal propagation effects such as reflection and diffraction [29]. Therefore, to ensure the wireless signal strength of the communication system, the router was mounted at the fork of a fruit tree's main trunk.
Before the experiment, the wireless router ADV1 was turned on to set up a LAN for the two robots, and the dosing robot was then connected to the spraying robot via SSH commands and IP addresses. In each run, 500 packets were sent from the spraying robot to the dosing robot; the IP connectivity between the robots was checked, the number of packets received by the dosing robot was counted, and the packet loss rate was calculated. The experiment was repeated three times independently, and the average packet loss rate at each experiment point was computed.
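For illustration, the packet loss measurement can be sketched as numbered UDP probes counted at the receiver; the port, payload and pacing here are assumptions.

```python
import socket
import time

N_PACKETS = 500  # packets per run, as in Section 3.1

def send_probes(dest_ip: str, port: int = 9000) -> None:
    """Spraying-robot side: send numbered UDP probes (port and payload are assumed)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for seq in range(N_PACKETS):
        sock.sendto(str(seq).encode(), (dest_ip, port))
        time.sleep(0.01)  # pace the probes
    sock.close()

def loss_rate(received_seqs: set) -> float:
    """Dosing-robot side: loss % from the distinct sequence numbers that arrived."""
    return (N_PACKETS - len(received_seqs)) / N_PACKETS * 100.0

# e.g. 494 of 500 probes received -> 1.2% loss, the order of magnitude seen in Table 1
print(loss_rate(set(range(494))))
```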
The communication data loss rates between the spraying robot and the dosing robot in the experimental orchard environment are shown in Table 1.

3.2. Cooperative Navigation Control Experiment

To test the effect of motion control on the cooperative navigation system, a cooperative navigation control experiment was carried out with the spraying–dosing robot group. Since the spraying robot and the dosing robot use different models of RTK-GNSS base station in their positioning modules, the positioning points of the two base stations must be matched to keep the positioning information consistent. First, the two base stations were each set up on a fixed tripod; second, the spraying robot's base station was used to measure and record the position of the dosing robot's base station; finally, this position was input into the dosing robot's base station.
The 3D point cloud map needs to be projected into a 2D point cloud map before the experiment. Then, the navigation preset paths and points of the spraying–dosing robot group are set on the 2D point cloud map. In Figure 17, the red pentagram is the navigation starting point of the spraying robot, the green line is the preset path of the spraying robot, the yellow line is the preset path of the dosing robot, and the blue dot is the cooperative dosing point. The spraying robot travels through the orchard in the direction of the red arrow, and the dosing robot goes from the garage to the water source, where it finishes filling up and heads to the collaborative dosing point in the orchard.
In Figure 18, a preset point is set at the midpoint of each end of a row, and the two points are connected as the preset path for spraying between the rows. The spraying robot adopts smooth steering when turning at the headland, so two path points are needed at the headland to assist the turn [35].
Based on the results of several previous orchard experiments, the travel speed of both the spraying robot and the dosing robot was set to 0.75 m/s [7,28]. During the experiment, the spraying–dosing robot group moved along the preset paths on the map while the GNSS coordinates of the driving trajectories were recorded. The experiment was repeated three times, with the spraying robot tested in continuous operation across five rows. The experimental scenarios are shown in Figure 19.
To verify the path-tracking performance of the spraying–dosing robot group, its motion trajectories during operation are plotted in Figure 20. When the spraying robot sprays between rows, the actual trajectory roughly matches the preset path, though some segments deviate. When the spraying robot turns at the headland, the actual trajectory deviates more from the preset path. The trajectories of the dosing robot moving to the water source and returning to the orchard deviate less.
To analyze the performance of the cooperative navigation system more accurately, the lateral deviation and heading deviation of the spraying robot and the dosing robot from the known trajectory were calculated, respectively, as shown in Figure 21a–d.
When spraying between rows in the orchard, the lateral deviation of the spraying robot was within 15 cm and the heading deviation was about 10°. During headland turns, however, the deviations changed significantly: the maximum lateral deviation was less than 52 cm and the maximum heading deviation was less than 45°. Since the spraying robot does not spray during headland turns and no obstacles block the headland, these deviations still meet the demands of cooperative headland navigation. For the dosing robot, the maximum lateral deviation was 30.1 cm, the maximum absolute lateral error was 29.7 cm, the average lateral deviation was 5.7 cm, and the lateral deviation during linear motion was about 5 cm; the heading deviation stayed below 90° at maximum, with a minimum above 10°.

4. Discussion

The above experimental results demonstrated that the developed map-based cooperative navigation system for the spraying–dosing robot group can meet the navigation requirements of spraying and dosing tasks in traditional orchards.
(1) In the communication experiments, the maximum communication distance of the spraying–dosing robot group was 91.3 m and the maximum packet loss rate was 3.8%. Although the packet loss rate increased with communication distance, it remained below 5% in all cases, demonstrating that the robot group can achieve point-to-point messaging in a traditional orchard. The Wi-Fi-based spraying–dosing robot communication system therefore meets the communication needs of cooperative operation in orchards.
(2) The cooperative navigation control experiments demonstrated that the average lateral deviation of the spraying robot was 6.7 cm when traveling at 0.75 m/s between orchard rows and 19.8 cm during headland turns, while the average lateral deviation of the dosing robot was 5.7 cm. This is because the canopy shades the dosing robot's path less than the spraying robot's; in addition, the dosing robot runs on a hard concrete road, whereas the orchard ground on which the spraying robot works is soft and may cause slippage. The actual trajectory of the dosing robot was approximately the same as the preset path, with an average lateral deviation of 5.7 cm, indicating that the D-A control method keeps the dosing robot following the path points in a relatively stable manner. The spraying robot's average lateral deviation was 6.4 cm, but its lateral and heading deviations increased to 51.7 cm and 43.1°, respectively, during turns; to reduce these headland deviations, the pure pursuit algorithm needs to be optimized in the future.
As the master robot, the spraying robot in this article achieved a 17% reduction in maximum lateral deviation compared with a spraying robot based on a fuzzy control algorithm [28]; as the slave robot, the dosing robot reduced the maximum lateral deviation by 25.2% compared with an orchard transport robot based on following navigation [7].

5. Conclusions

In this study, a map-based collaborative navigation system for a spraying–dosing robot group was developed and field experiments were conducted in orchards. The specific findings are as follows.
(1) The results of the navigation communication experiments demonstrated that the packet loss rate during communication was less than 5%, which proved that the communication system constructed in this paper met the communication needs of traditional orchards.
(2) The results of the navigation control experiments showed that the maximum value of the absolute lateral error in the row for the spraying robot was 24.9 cm, and the maximum value of the absolute lateral error for the dosing robot was 29.7 cm. The results proved that the cooperative map-based navigation system developed in this paper met the navigation accuracy requirements for spraying and dosing operations.
(3) In future work, the cooperative navigation strategy needs to be optimized and the pure pursuit control algorithm improved to reduce the deviation during headland turns.

Author Contributions

Conceptualization, F.Y.; methodology, W.M.; software, J.Q.; validation, J.Q., W.W. and W.M.; formal analysis, M.Y.; investigation, H.L.; resources, F.Y.; data curation, W.W.; writing—original draft preparation, J.Q.; writing—review, F.Y. and J.Q.; writing—editing, J.Q.; visualization, Z.R. and S.S.; supervision, F.Y.; project administration, F.Y.; funding acquisition, F.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Shaanxi Province Key Research and Development - Key Industry Innovation Chain Agricultural Field Project (Program No. S2022-YF-ZDCXL-ZDLNY-0128) and the Major Science and Technology Project of Shaanxi Province of China (Program No. 2020zdzx03-04-01).

Data Availability Statement

Not applicable.

Acknowledgments

We thank the anonymous reviewers for their critical comments and suggestions, which improved the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Vasconez, J.P.; Kantor, G.A.; Cheein, F.A.A. Human–robot interaction in agriculture: A survey and current challenges. Biosyst. Eng. 2019, 179, 35–48. [Google Scholar] [CrossRef]
  2. Vásconez, J.P.; Cheein, F.A.A. Workload and production assessment in the avocado harvesting process using human-robot collaborative strategies. Biosyst. Eng. 2022, 223, 56–77. [Google Scholar] [CrossRef]
  3. Vasconez, J.P.; Guevara, L.; Cheein, F.A. Social robot navigation based on HRI non-verbal communication: A case study on avocado harvesting. In Proceedings of the 34th ACM/SIGAPP Symposium on Applied Computing, Limassol, Cyprus, 8–12 April 2019; pp. 957–960. [Google Scholar]
  4. Tourrette, T.; Deremetz, M.; Naud, O.; Lenain, R.; Laneurit, J.; De Rudnicki, V. Close coordination of mobile robots using radio beacons: A new concept aimed at smart spraying in agriculture. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 7727–7734. [Google Scholar]
  5. National Bureau of Statistics of China. China Statistical Yearbook; National Bureau of Statistics of China: Beijing, China, 2021.
  6. Zheng, Y.; Jiang, S.; Chen, B.; Lü, H.; Wan, C.; Kang, F. Review on technology and equipment of mechanization in hilly orchard. Trans. Chin. Soc. Agric. 2020, 51, 1–20. [Google Scholar]
  7. Liu, Z.; Wang, X.; Ren, Z.; Mao, W.; Yang, F. Crawler tractor navigation path tracking control algorithm based on virtual radar model. Trans. Chin. Soc. Agric. Mach. 2021, 52, 375–385. [Google Scholar]
  8. Gu, C.; Wang, X.; Wang, X.; Yang, F.; Zhai, C. Research progress on variable-rate spraying technology in orchards. Appl. Eng. Agric. 2020, 36, 927–942. [Google Scholar] [CrossRef]
  9. Zhai, C.Y.; Zhao, C.J.; Ning, W.; John, L.; Wang, X.; Paul, W.; Zhang, H.H. Research progress on precision control methods of air-assisted spraying in orchards. Trans. Chin. Soc. Agric. Eng. 2018, 34, 1–15. [Google Scholar]
  10. Liu, L.; Liu, Y.; He, X.; Liu, W. Precision Variable-Rate Spraying Robot by Using Single 3D LIDAR in Orchards. Agronomy 2022, 12, 2509. [Google Scholar] [CrossRef]
  11. Jo, K.; Lee, J.; Kim, J. Cooperative multi-robot localization using differential position data. In Proceedings of the 2007 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Zurich, Switzerland, 4–7 September 2007. [Google Scholar]
  12. Mueller-Sim, T.; Jenkins, M.; Abel, J.; Kantor, G. The Robotanist: A Ground-based Agricultural Robot for High-throughput Crop Phenotyping. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 3634–3639. [Google Scholar]
  13. Zhang, L.; Zhu, X.; Huang, J.; Huang, J.; Xie, J.; Xiao, X.; Yin, G.; Wang, X.; Li, M.; Fang, K. BDS/IMU Integrated Auto-Navigation System of Orchard Spraying Robot. Appl. Sci. 2022, 12, 8173. [Google Scholar] [CrossRef]
  14. Xu, A.; Cao, N.; Sui, X.; Wang, C.; Gao, S. Cooperative vehicle positioning method based on BDS/UWB. Sci. Surv. Mapp. 2020, 45, 1–8. [Google Scholar]
  15. Li, M.; Imou, K.; Wakabayashi, K.; Yokoyama, S. Review of research on agricultural vehicle autonomous guidance. Int. J. Agric. Biol. Eng. 2009, 2, 1–16. [Google Scholar]
  16. Radcliffe, J.; Cox, J.; Bulanon, D.M. Machine vision for orchard navigation. Comput. Ind. 2018, 98, 165–171. [Google Scholar] [CrossRef]
  17. Zhang, S.; Guo, C.; Gao, Z.; Sugirbay, A.; Chen, J.; Chen, Y. Research on 2D Laser Automatic Navigation Control for Standardized Orchard. Appl. Sci. 2020, 10, 2763. [Google Scholar] [CrossRef]
  18. Zhang, M.; Ji, Y.; Li, S.; Cao, R.; Xu, H.; Zhan, Z. Research Progress of Agricultural Machinery Navigation Technology. Trans. Chin. Soc. Agric. Mach. 2020, 51, 1–18. [Google Scholar]
  19. Yan, C.; Xu, L.; Yuan, Q.; Ma, S.; Niu, C.; Zhao, S. Design and experiments of vineyard variable spraying control system based on binocular vision. Trans. Chin. Soc. Agric. Eng. 2021, 37, 13–22. [Google Scholar]
  20. Liu, W.; He, X.; Liu, Y.; Wu, Z.; Yuan, C.; Liu, L.; Qi, P.; Li, T. Navigation method between rows for orchard based on 3D LiDAR. Trans. Chin. Soc. Agric. Eng. 2021, 37, 165–174. [Google Scholar]
  21. Santos, L.C.; Aguiar, A.S.; Santos, F.N.; Valente, A.; Ventura, J.B.; Sousa, A.J. Navigation Stack for Robots Working in Steep Slope Vineyard. In Advances in Intelligent Systems and Computing, Proceedings of the Intelligent Systems and Applications, London, UK, 3–4 September 2020; Arai, K., Kapoor, S., Bhatia, R., Eds.; Springer: Cham, Switzerland, 2021; Volume 1250. [Google Scholar]
  22. Xue, X.; Xu, X.; Li, Z.; Hong, T.; Xie, J.; Chen, J.; Song, S. Design and test of variable spray model based on leaf wall area in orchards. Trans. Chin. Soc. Agric. Eng. 2020, 36, 16–22. [Google Scholar]
  23. Cheein, F.A.; Steiner, G.; Paina, G.P.; Carelli, R. Optimized EIF-SLAM algorithm for precision agriculture mapping based on stems detection. Comput. Electron. Agric. 2011, 78, 195–207. [Google Scholar] [CrossRef]
  24. Gimenez, J.; Herrera, D.; Tosetti, S.; Carelli, R. Optimization methodology to fruit grove mapping in precision agriculture. Comput. Electron. Agric. 2015, 116, 88–100. [Google Scholar] [CrossRef]
  25. Shalal, N.; Low, T.; McCarthy, C.; Hancock, N. Orchard mapping and mobile robot localisation using on-board camera and laser scanner data fusion—Part A, Tree detection. Comput. Electron. Agric. 2015, 119, 254–266. [Google Scholar] [CrossRef]
  26. Mao, W.; Liu, H.; Wang, D.; Yang, F.; Liu, Z. An Improved AODV Routing Protocol for Multi-Robot Communication in Orchard. Smart Agric. 2021, 3, 96. [Google Scholar]
  27. Mao, W.; Liu, Z.; Liu, H.; Yang, F.; Wang, M. Research Progress on Synergistic Technologies of Agricultural Multi-Robots. Appl. Sci. 2021, 11, 1448. [Google Scholar] [CrossRef]
  28. Mao, W.; Liu, H.; Hao, W.; Yang, F.; Liu, Z. Development of a Combined Orchard Harvesting Robot Navigation System. Remote Sens. 2022, 14, 675. [Google Scholar] [CrossRef]
  29. Liu, Z.; Liu, H.; Mao, W.; Yang, F.; Wang, W.; Qin, J. Research on Wireless Signal Propagation Characteristics of Traditional Apple Orchard for Multi-robot. Trans. Chin. Soc. Agric. Mach. 2022, 53, 283–293. [Google Scholar]
  30. Zhou, Z.; Cao, J.; Di, S. Overview of 3D Lidar SLAM algorithms. Chin. J. Sci. Instrum. 2021, 42, 13–27. [Google Scholar]
  31. Shan, T.; Englot, B. LeGO-LOAM: Lightweight and ground-optimized lidar odometry and mapping on variable terrain. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 4758–4765. [Google Scholar]
  32. Shan, T.; Englot, B.; Meyers, D.; Wang, W.; Ratti, C.; Rus, D. LIO-SAM: Tightly-coupled lidar inertial odometry via smoothing and mapping. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020–24 January 2021; pp. 5135–5142. [Google Scholar]
  33. Chai, S.; Yao, L.; Xu, L.; Chen, Q.; Xu, T.; Yang, Y. Research on greenhouse agricultural machinery path tracking based on dynamic look ahead distance pure pursuit model. J. Chin. Agric. Mech. 2021, 42, 58–64. [Google Scholar]
  34. Mao, W.; Liu, H.; Wang, X.; Yang, F.; Liu, Z.; Wang, Z. Design and experiment of a dual navigation mode orchard transport robot. Trans. Chin. Soc. Agric. Mach. 2022, 53, 27–39. [Google Scholar]
  35. Zhou, J.; He, Y. Research Progress on Navigation Path Planning of Agricultural Machinery. Trans. Chin. Soc. Agric. Mach. 2021, 52, 1–14. [Google Scholar]
Figure 1. The hardware structure of the spraying–dosing robot cooperative navigation system.
Figure 2. Software system structure of the spraying–dosing robot.
Figure 3. Construction process of the orchard point cloud map.
Figure 4. Point cloud reprojection.
Figure 5. Point cloud search.
Figure 6. Results of point cloud clustering: (a) raw point cloud; (b) the point cloud after clustering.
Figure 7. Point cloud feature classification: (a) surface feature point cloud and edge feature point cloud; (b) non-ground edge feature point cloud and ground plane feature point cloud.
Figure 8. The point cloud pose estimation.
Figure 9. Principle of eigenvector solving based on the area method.
Figure 10. Principle of point cloud pose estimation.
Figure 11. Principle of map optimization based on GNSS information.
Figure 12. Orchard point cloud map.
Figure 13. Spraying–dosing robot cooperative navigation strategy.
Figure 14. Traditional orchard environment.
Figure 15. Map of the orchard environment: (a) diagram of the orchard environment; (b) physical picture of the orchard environment.
Figure 16. Communication experiment schematic.
Figure 17. Pre-set path for the spraying–dosing robot group.
Figure 18. Pre-set paths for the spraying robot.
Figure 19. Collaborative navigation trials in orchards.
Figure 20. Movement trajectory of the spraying–dosing robot group.
Figure 21. Results of cooperative navigation control experiments with the spraying–dosing robot group: (a) lateral error of spraying robot; (b) heading error of spraying robot; (c) lateral error of dosing robot; (d) heading error of dosing robot.
Table 1. Packet loss in robotic communication systems.

| Location | Distance/m | Group Number | Received Packets/PCS | Packet Loss/% | Average Packet Loss/% |
|----------|------------|--------------|----------------------|---------------|-----------------------|
| A        | 84.4       | 1            | 494                  | 1.2           | 1.7                   |
|          |            | 2            | 490                  | 2.0           |                       |
|          |            | 3            | 491                  | 1.8           |                       |
| B        | 88.1       | 1            | 491                  | 1.8           | 2.1                   |
|          |            | 2            | 488                  | 2.4           |                       |
|          |            | 3            | 489                  | 2.2           |                       |
| C        | 91.3       | 1            | 485                  | 3.0           | 3.4                   |
|          |            | 2            | 483                  | 3.4           |                       |
|          |            | 3            | 481                  | 3.8           |                       |

The average values of the packet loss rate of the dosing robot at A, B and C are 1.7%, 2.1% and 3.4%, respectively.

