Review

Application Progress of UAV-LARS in Identification of Crop Diseases and Pests

1 College of Engineering, South China Agricultural University/National Center for International Collaboration Research on Precision Agricultural Aviation Pesticides Spraying Technology, Guangzhou 510642, China
2 Guangdong Laboratory for Lingnan Modern Agriculture, Guangzhou 510642, China
3 College of Electronic Engineering, South China Agricultural University/National Center for International Collaboration Research on Precision Agricultural Aviation Pesticides Spraying Technology, Guangzhou 510642, China
* Author to whom correspondence should be addressed.
Agronomy 2023, 13(9), 2232; https://doi.org/10.3390/agronomy13092232
Submission received: 18 July 2023 / Revised: 11 August 2023 / Accepted: 24 August 2023 / Published: 26 August 2023
(This article belongs to the Section Precision and Digital Agriculture)

Abstract: Disease and pest stress is one of the major threats to crop growth and development, severely reducing crop yield and quality every year and even causing total crop failure. Currently, pesticide spraying by plant protection unmanned aerial vehicles (UAVs) is the most effective means of controlling crop diseases and pests. However, diseases and pests typically occur in “point-like” or “patchy” areas, so the full-coverage spraying used by UAVs wastes a great deal of pesticide. There is therefore an urgent need for methods that identify disease- and pest-stressed areas so that precise targeted spraying can reduce pesticide use and improve its utilization. By analyzing low-altitude remote sensing (RS) images of crop fields captured by UAVs, real-time pesticide spraying prescription maps can be generated to meet the demand for precise targeted spraying. This review focuses on the practical needs of precise targeted spraying by plant protection UAVs. Firstly, the RS monitoring mechanism for crop diseases and pests by UAVs is examined. Secondly, a comprehensive survey of the literature on UAV Low-altitude Remote Sensing (UAV-LARS) technology for monitoring and identifying crop diseases and pests is conducted, summarizing research progress, particularly for wheat, cotton, and rice. Finally, the key issues to be addressed and future development directions for UAV-LARS monitoring of crop diseases and pests are proposed.

1. Introduction

Disease and pest stress is one of the major threats to crop growth and development, causing significant losses in crop yield and quality and even leading to total crop failure. According to statistics from the Global Crop Disease and Pest Monitoring System, the losses caused by diseases and pests in rice, corn, and wheat in 2019 were 30%, 22.6%, and 21.5%, respectively [1,2]. Reasonable pesticide spraying helps improve crop productivity and quality. Plant protection unmanned aerial vehicles (UAVs) offer high work efficiency, few restrictions from crop growth stage or terrain, good work quality, and low cost, and they play a crucial role in disease and pest prevention and control and in ensuring national food production security [3,4,5]. In recent years, with the widespread application of UAVs in agricultural production, the scope of plant protection UAVs in disease and pest control has expanded considerably, and spraying pesticides with plant protection UAVs has become one of the most effective means of controlling crop diseases and pests. Plant protection UAVs have now achieved “uniform spraying” with a consistent amount of pesticide per unit area, largely solving the problems of “over-spraying” and “missed spraying” and improving pesticide utilization to a certain extent [6,7]. However, uniform spraying covers the whole field, while crop diseases and pests typically occur in “point” or “patch” forms, inevitably wasting pesticide. This waste reduces resource utilization efficiency, pollutes the environment, and compromises crop quality and safety, and therefore cannot meet the requirements of modern agricultural development [8,9,10].
If the areas where crop diseases and pests occur can be identified intelligently, then zone-specific, precisely targeted spraying will greatly reduce pesticide use and improve utilization efficiency.
The UAV Low-altitude Remote Sensing (UAV-LARS) system, equipped with imaging or non-imaging sensors, has shown great potential in crop disease and pest monitoring and identification due to its low cost, precision, speed, and non-contact operation [11,12,13]. By analyzing the image data obtained by UAV-LARS, the occurrence and status of diseases and pests can be monitored in real time, enabling precise targeted spraying. This review focuses on the practical need for precise targeted spraying by plant protection UAVs. Firstly, it examines the monitoring mechanism of low-altitude remote sensing for crop diseases and pests. Secondly, it comprehensively surveys the literature on UAV-LARS monitoring and identification of crop diseases and pests, summarizing research progress and monitoring methods for wheat, cotton, and rice. Finally, the key problems that urgently need to be solved and the future development directions of UAV-LARS monitoring of crop diseases and pests are put forward.

2. UAV-LARS System

Obtaining timely, accurate, and effective farming information from high-altitude RS, low-altitude RS, and ground-based RS is a prerequisite for precise targeted spraying operations; farming operations, however, are strongly seasonal and regional. Although satellite RS can monitor large crop areas and its image resolution has greatly improved (ultra-high-resolution images finer than 1 m can now be obtained), it is vulnerable to weather factors such as wind and cloud as well as terrain, and its long revisit cycle prevents it from obtaining disease and pest information in a timely and effective manner. Satellite RS is therefore mostly used to assess crop losses and yields caused by biotic or abiotic stresses [14,15,16]. Ground-based RS refers to systems that carry a field spectrometer or various imaging or non-imaging sensors on platforms such as towers, vehicles, or ships. Ground-based RS has poor mobility and a small monitoring range, cannot monitor multiple plots simultaneously, and is seriously affected by terrain, so manual operation is still required in areas with difficult terrain and production efficiency remains low. UAVs, by contrast, are flexible, mobile, and capable of real-time operation, and the images they acquire have higher spatial, temporal, and spectral resolution, compensating for the shortcomings of traditional monitoring methods and providing an important means of obtaining real-time, effective farming information [11,17].
With the development of UAV and sensor technology, UAV-LARS, equipped with imaging sensors such as visible light (RGB), multi/hyperspectral, thermal infrared, and laser radar (LiDAR), has become an effective means to obtain timely and accurate farming information. The UAV-LARS system is mainly composed of the flight system, the mission payload system, and the ground control system, as shown in Figure 1. The acquired RS images are processed using UAV application software, including radiometric calibration, geometric correction, matching and stitching, feature extraction, and monitoring and identification processes, to ultimately achieve farming information monitoring [18]. As an early warning system, the UAV-LARS system allows farmers to intervene in crops as early as possible, enabling them to spray pesticides before diseases and pests spread widely.
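The radiometric calibration step in this processing chain can be illustrated with the empirical line method, a common approach for UAV imagery (not detailed in the source) in which a reference panel of known reflectance converts raw digital numbers (DNs) to reflectance. The sketch below is illustrative only; the DN values are hypothetical.

```python
def empirical_line_calibration(dn, dark_dn, panel_dn, panel_reflectance):
    """Convert a raw digital number (DN) to reflectance using a dark
    reference and a calibration panel of known reflectance."""
    if panel_dn <= dark_dn:
        raise ValueError("panel DN must exceed dark DN")
    # Linear gain relating DN above the dark level to reflectance.
    gain = panel_reflectance / (panel_dn - dark_dn)
    return gain * (dn - dark_dn)

# Hypothetical values: a 50% reflectance panel reads DN 2000, dark level DN 100;
# a crop pixel reading DN 1050 then corresponds to 25% reflectance.
refl = empirical_line_calibration(1050, 100, 2000, 0.50)  # -> 0.25
```

In practice, each spectral band is calibrated separately, since sensor gain and panel reflectance vary with wavelength.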

2.1. RGB Imaging RS System

The physical and physiological characteristics of crops change when they are stressed by diseases and pests. True-color images of the visible spectral region captured by an RGB camera allow the extraction of physical information such as the color, shape, and size of disease and pest symptoms. Machine vision techniques can then be used to monitor and identify crop pests and diseases.
RGB sensors are cost-effective, lightweight, and easy to install on UAV platforms for agricultural data collection. Traditional machine learning (ML) or deep learning (DL) methods can be applied to extract visual features such as color, gradients, textures, and shapes from images for monitoring and identifying crop pests and diseases, providing guidance for pesticide spraying [19]. By comparing the spectral characteristics of stressed crops with those of healthy crops, changes in the physiological characteristics of stressed crops can be monitored, allowing intervention before visible symptoms appear. For example, in the identification of rice sheath blight (ShB), the accuracy of disease and pest monitoring models based on crop physical changes increases as the severity of ShB infection increases. However, in the case of mild stress, the accuracy is low and cannot provide timely and effective guidance for spraying [20,21]. Research by Dehkordi et al. [22] showed that wheat stripe rust (WSR) and leaf rust (WLR) cause significant changes in spectral reflectance in the green and red channels of plant leaves. The severity of WSR and WLR infection can be identified based on different combinations of green, red, and blue spectral bands.
Currently, UAV-based RGB imaging systems are relatively mature for obtaining farming information and are widely applied in agricultural natural-disaster assessment, monitoring and identification of crop diseases, pests, and weeds, water and fertilizer monitoring, and growth monitoring [23,24,25,26,27,28]. However, because RGB imaging systems provide a limited amount of information, the accuracy of farming information monitoring is relatively low.

2.2. Thermal Imaging RS System

The electromagnetic spectrum used for thermal RS lies in the infrared region, at wavelengths of 0.7–100 μm. Wavelengths of 0.7–3.0 μm are generally called reflective infrared (RIR), and wavelengths of 3–100 μm thermal infrared (TIR). In the TIR band, the thermal radiation energy of objects exceeds the solar energy they reflect, so thermal RS derives farming information mainly from differences in the thermal radiation emitted by ground objects. When crops are subject to biotic or abiotic stresses, leaf surface temperature changes. Thermal RS transforms the invisible infrared energy emitted by crops into visible thermal images, from which changes in crop leaf temperature can be monitored, enabling real-time monitoring of crop stress [29].
When a crop has sufficient water, transpiration is strong, energy is mainly consumed as latent heat flux, and the difference between canopy temperature and air temperature is small. When water is insufficient, the crop closes part of its leaf stomata to reduce water consumption; latent heat flux decreases and sensible heat flux increases, raising the canopy temperature and widening its difference with the air temperature. Crop water stress can therefore be inferred from the canopy–air temperature difference. Thermal RS is sensitive to the small temperature changes caused by water stress and can quickly and practically evaluate crop water status via the crop water stress index (CWSI). Consequently, UAV-LARS thermal imaging systems are mostly used to monitor crop water stress, allowing irrigation at the initial stage of stress to prevent yield loss [30,31,32,33].
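The CWSI is commonly computed (e.g., in the empirical Idso-style formulation, which the source does not spell out) by normalizing the measured canopy–air temperature difference between a well-watered lower baseline and a non-transpiring upper baseline. A minimal sketch with hypothetical temperatures:

```python
def cwsi(canopy_temp, air_temp, dt_wet, dt_dry):
    """Empirical Crop Water Stress Index (Idso-style).

    dt_wet: canopy-air temperature difference for a fully transpiring,
            well-watered crop (lower baseline, often negative).
    dt_dry: difference for a non-transpiring crop (upper baseline).
    Returns ~0 for no stress and ~1 for maximum stress."""
    dt = canopy_temp - air_temp
    return (dt - dt_wet) / (dt_dry - dt_wet)

# Hypothetical reading: canopy 30.0 C, air 28.0 C, baselines -2.0 C and +5.0 C.
stress = cwsi(30.0, 28.0, -2.0, 5.0)  # (2 - (-2)) / (5 - (-2)) = 4/7, about 0.57
```

The baselines themselves are crop- and climate-specific and must be established empirically, which is one reason thermal monitoring requires careful field calibration.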
Thermal RS is strongly affected by weather conditions such as cloud, rain, and air temperature, under which it is difficult to collect high-resolution thermal data; subsequent data processing is also complex, with low accuracy and high cost [34,35]. Moreover, crop diseases and pests are numerous and their symptoms complex, so a single thermal imaging system struggles to identify them accurately and effectively; research on thermal monitoring of crop diseases and pests is therefore limited.

2.3. Multispectral Imaging RS System

Multispectral imaging sensors typically have three to six bands [36]. Figure 2 shows the DJI Phantom 4 Multispectral camera lens, which has six bands and can capture complex information such as damage to crop cell structure or pigment systems. Based on the spectral differences between the infrared and red regions of stressed and healthy crops, a mathematical model built from the Normalized Difference Vegetation Index (NDVI) and the crop disease index (DI) can evaluate the level of disease and pest stress, allowing farmers to intervene in stressed areas in advance. Zhao et al. [20] used UAVs equipped with visible light and multispectral sensors to obtain vegetation indices (VIs) sensitive to rice ShB, established a model for disease severity, and compared the monitoring results of the two sensors. The two sensors performed similarly in areas severely affected by rice ShB, but the multispectral sensor was more accurate in areas with milder disease. Liu et al. [37] found that spectral bands near 410, 470, 490, 570, 625, 665, and 720 nm were sensitive for detecting rice leaf roller damage, providing a basis for designing multispectral channels to identify rice leaf roller infection.
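The NDVI used in such models is simply the normalized difference between near-infrared and red reflectance. The sketch below also shows a purely illustrative thresholding rule of the kind a spraying prescription map might apply; the 0.5 cutoff and the reflectance values are hypothetical, not taken from the reviewed studies.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def flag_for_spraying(ndvi_value, threshold=0.5):
    """Illustrative rule: flag a pixel for the spraying prescription map
    when its NDVI falls below a hypothetical stress threshold."""
    return ndvi_value < threshold

# Healthy vegetation reflects strongly in NIR, so its NDVI is high:
healthy = ndvi(0.60, 0.08)   # about 0.76 -> not flagged
stressed = ndvi(0.35, 0.20)  # about 0.27 -> flagged
```

Real prescription maps would combine such indices with a calibrated disease index rather than a fixed cutoff, but the normalized-difference form is the common building block.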
The premise of monitoring crop disease and pest stress using sensitive VIs is obtaining sufficient spectral bands. Although multispectral and RGB sensors offer low cost, flexible operation, simplicity, and convenient data analysis, they do not provide continuous spectral information, and the bands they capture are limited. In contrast, combining thermal RS, RGB, and multispectral sensors can provide representative spectral band information across the 0.4–15 μm range, which essentially covers the VIs required for agricultural monitoring [16,31,38]. For example, Elarab et al. [39] evaluated the chlorophyll content of oat plants using thermal and multispectral images captured by UAV-LARS. Maimaitijiang et al. [40] obtained RS images using UAVs equipped with RGB, multispectral, and thermal sensors to predict the biochemical and biophysical parameters of soybeans. Zheng et al. [41] captured early-stage rice growth images using RGB, near-infrared, and multispectral cameras mounted on a multi-rotor UAV and proposed a new decision tree combining texture and spectral features for rice plant detection, which can accurately distinguish rice plants from the background.

2.4. Hyperspectral Imaging RS System

Hyperspectral imaging systems can provide hundreds or even thousands of contiguous bands within the visible and near-infrared (NIR) spectral range, enabling the acquisition of more useful information. Bohnenkamp et al. [42] conducted the first experiment on WSR detection using hyperspectral imaging on both ground vehicles and UAVs; the UAV-based hyperspectral monitoring results were consistent with ground visual assessment, laying a foundation for UAV-LARS monitoring of diseases and pests. Zhu et al. [43] compared the spectral response differences in the wheat canopy under different levels of the wheat take-all disease index, based on hyperspectral data obtained from an ASD FieldSpec 4 and UAV hyperspectral RS images, and constructed an RS monitoring model for wheat take-all with a classification accuracy of 90.35%. Mahlein, Chen, and other researchers [44,45] found that variations in the 500~533 nm range are related to changes in carotenoids caused by crop diseases; variations in the 560~675 nm and 682~733 nm ranges relate to chlorophyll content in crop leaves; and variations in the 927~931 nm range relate to leaf water content. Different spectral bands can thus be used to predict differences in the cell structures of healthy and diseased crops, enabling monitoring and identification of stressed crops, including the type and severity of the stress. Liang et al. [46] collected high-resolution images (500~900 nm) of the corn canopy at multiple growth stages and built 13 monitoring models for corn leaf spot based on 12 sensitive bands; eight of the models reached a highly significant correlation with measured disease values. Abdulridha et al. [47] used a UAV-based hyperspectral imaging system to detect powdery mildew (PM) in pumpkins and classify its severity.
Within the spectral range of 380–1020 nm, significant bands (388, 591, 646, 975, and 1012 nm) were extracted that effectively distinguish healthy crops from those in the early, intermediate, and late stages of disease development. A spectral VI model constructed from these bands achieved classification accuracies of 89% and 96% for the early and late stages of PM development, respectively.
The hundreds or thousands of contiguous bands provided by hyperspectral sensors are valuable for researchers to extract sensitive bands related to crop stress, offering more possibilities for real-time and effective agricultural monitoring. However, due to their high cost and complex data processing, hyperspectral sensors are mainly used in laboratory or greenhouse conditions and deployed on near-ground RS platforms for studies on predicting crop diseases and pests, as well as predicting phenotypic traits such as nitrogen content, biomass, and chlorophyll content. There is relatively limited research on the use of hyperspectral sensors mounted on UAV platforms for crop disease and pest monitoring and identification [42,48,49].

2.5. LiDAR Imaging RS System

Laser radar (LiDAR) offers high accuracy and can operate at night. It can obtain detailed crop canopy information and reconstruct the 3D canopy structure of crops. However, due to its high price, complex data analysis, and weak correlation with disease and pest symptoms, LiDAR is currently used mainly for forest canopy coverage estimation and crop breeding research [50,51,52]. LiDAR can estimate tree canopy coverage, providing a reference for forest inventory, sustainable management, and ecosystem assessment, and the 3D data it acquires can quickly, objectively, and non-destructively estimate crop above-ground biomass (AGB) and canopy height (CH), giving it great potential in crop breeding [53,54].

3. Application of UAV-LARS in Monitoring of Crop Diseases and Pests

3.1. Data Source and Processing

To analyze the current research status and trends in using UAV-LARS to monitor and identify crop diseases and pests, we conducted a comprehensive literature review and statistical analysis using the Web of Science (WOS) and CNKI databases. We searched both databases for the period 2010–2022 using keywords such as UAV-LARS, diseases or pests, and monitoring or identification, across titles, abstracts, and keywords, restricted to articles and reviews. After manually checking the results and eliminating irrelevant and duplicate sources, we obtained 139 English and 72 Chinese papers.
Figure 3 shows the interannual changes in the number of publications in the two databases. Since 2012, research on UAV-LARS monitoring and identification of crop diseases and pests has shown overall growth in publication counts both in China and abroad. In recent years especially, with the development of sensor technology, UAV-LARS platforms carrying NIR, multispectral, and hyperspectral sensors have developed rapidly for identifying crop diseases and pests. The number of publications in WOS peaked in 2020, and in CNKI in 2021. In addition, although the annual number of publications in WOS is generally higher than in CNKI, more than 40% of the WOS publications are by Chinese researchers.
Figure 4 shows the crops and the main diseases and pests covered in the related literature. Panel (a) shows that the crops studied are mainly the grain crops wheat, corn, and rice and the cash crop cotton, together accounting for 56% of the literature. Wheat-related research is the most common, at 25%; cotton, corn, and rice account for 11%, 11%, and 9%, respectively, much lower than wheat. Citrus, soybean, and grape accounted for less than half of the total. Accordingly, the crop diseases and pests covered are mainly those of wheat, corn, cotton, and rice, as shown in panel (b). Common wheat diseases include stripe rust (WSR), powdery mildew (PM), and fusarium head blight (FHB), which account for the largest number of publications. Common cotton diseases and pests include root rot (CRR), aphids, and spider mites. There are fewer publications on identifying corn and rice diseases and pests, although many kinds occur: corn diseases include leaf blight, leaf spot, and stripe, collectively referred to as corn diseases, with armyworms as the main pest; rice is mainly affected by rice blast, streak, Cnaphalocrocis medinalis, and sheath blight (ShB). In addition, the typical citrus disease is Huanglongbing, and the main soybean problems are leaf diseases and aphids; although publications on these two crops are few, each of their single diseases or pests is second only to WSR in publication count.
Figure 5 shows the statistics of the RS technologies used in the related literature. Many studies monitor and identify crop diseases and pests with RGB, multispectral, and hyperspectral imaging sensors mounted on UAVs; these three account for 35%, 31%, and 20% of the literature, respectively, while thermal imaging and NIR imaging account for 3% and 2%. RGB imaging can obtain high-resolution images at low cost, making it a good choice for identifying crop diseases and pests. However, RGB images reveal only the physical symptoms caused by disease and pest stress and struggle to capture physiological changes such as chlorophyll and water content. With the development of spectral technology, multispectral imaging has been applied to monitoring and identifying crop diseases and pests; because it captures spectral bands beyond visible light, it has good applicability for this task. Compared with multispectral imaging, hyperspectral imaging covers more continuous spectral bands and contains more information, capturing weak or inconspicuous spectral changes. This is very important for monitoring crop diseases and pests at early stages or in complex scenes, but its high cost means its application range is not as wide as RGB and multispectral imaging [55,56]. Changes in respiration and photosynthesis caused by crop disease and pest stress alter the heat emission of leaves, which thermal imaging can capture [57,58,59]; in practice, however, thermal RS is mostly used to monitor crop water stress, evaluate yield, and study phenotypes [30,32,60,61]. The NIR spectral band is narrow, while crop diseases and pests are numerous: the same disease or pest can cause different symptoms, and multiple diseases and pests can cause similar symptoms, which demands more spectral bands and more information, so NIR alone has significant limitations for monitoring and identification [62]. Therefore, thermal and NIR imaging are usually combined with other imaging sensors such as hyperspectral or multispectral cameras, which is expected to improve monitoring and identification accuracy [63,64,65,66,67].
Currently, UAV-LARS systems are mainly equipped with RGB, multispectral, or hyperspectral imaging sensors and are primarily used to monitor and identify diseases and pests in three main crops: wheat [68,69], cotton [70], and corn. There has been relatively little monitoring of other crops such as rice, citrus, and soybean. Wheat and rice, as staple crops for humans, are difficult to monitor because of the wide variety of their diseases and pests. Rice in particular is not only hard to differentiate among disease and pest types but also suffers greater annual losses worldwide than other crops. Cotton is an economically important crop globally and holds an irreplaceable position in the economic development of China and the world. Therefore, this review focuses on the RS technologies and identification methods used for monitoring and identifying diseases and pests in wheat, cotton, and rice. The findings provide a reference and basis for the application of UAV-LARS in precise targeted spraying.

3.2. Application Progress in Monitoring Wheat Diseases and Pests

Wheat is one of the oldest cultivated crops in the world and a major staple crop in northern China [71]. Wheat is susceptible to a wide range of diseases and pests, which severely impact its yield and quality. Under disease and pest stress, wheat undergoes changes in its physiological and biochemical characteristics, producing physical symptoms such as pustules and scabs on leaf surfaces as well as physiological changes such as pigment destruction and cell structural damage; these changes produce different spectral reflectance responses [44,72,73]. After image processing techniques such as denoising, enhancement, and filtering are applied, diseases and pests can be identified more accurately by extracting color or shape features. The image data can be obtained with low-cost RGB cameras mounted on UAVs, a cheap and easy-to-operate method for monitoring and identifying wheat diseases and pests that has been widely applied. Li et al. [74] used a UAV to collect field images of wheat take-all and, based on digital image processing and fuzzy-clustering image segmentation, designed and implemented a take-all monitoring system using Java web technology. Liu et al. [75] found a significant correlation between image color feature parameters and the wheat disease index (DI) and established an estimation model for WSR based on these parameters. Although its monitoring performance is lower than that of near-ground hyperspectral monitoring, it provides a basis for large-scale monitoring of wheat diseases and pests with UAV-LARS. However, monitoring and identification through image processing techniques are better suited to severe cases, as accuracy drops significantly during the early and intermediate stages of infestation. Based on differing spectral reflectance responses, the occurrence of diseases and pests can instead be predicted before the physical symptoms of stressed plants appear.
Therefore, using characteristic spectra or VIs derived from images of wheat under stress is an effective method for monitoring and identifying wheat diseases and pests. Fu et al. [76] used image processing and spectral analysis techniques to automatically evaluate the severity of wheat take-all from RS images obtained by UAV. Bhandari et al. [77] calculated VIs from RGB images of the wheat canopy acquired by UAVs and demonstrated a good correlation between the indices and WSR infection coefficients. Guo et al. [78] constructed a regression model between spectral indices and the disease index (DI) using UAV hyperspectral image data, inverted the disease index of winter wheat, and created a spatial distribution map of wheat common bunt. Khan et al. [79] extracted normalized difference texture indices (NDTIs) and VIs from hyperspectral images to monitor wheat PM, achieving high overall accuracy (OA) and offering the possibility of detecting early-stage crop diseases.
Combining the color, shape, texture, spectral, and other characteristics of disease and pest images with ML, and establishing monitoring models based on algorithms such as partial least squares regression (PLSR), random forest (RF), and support vector machine (SVM), can improve identification accuracy [80,81]; this has been a research hotspot in crop disease and pest detection in recent years. Guo et al. [82] extracted sensitive VIs and texture features (TFs) from WSR hyperspectral images and established PLSR models for different infection periods, finding that the combined VI-TF monitoring model identified WSR with high accuracy at all infection stages. Xiao et al. [83] proposed a method to select the optimal window size for the gray-level co-occurrence matrix (GLCM), extracted sensitive spectral and texture features from the optimal window, and established a logistic model for monitoring FHB with an accuracy of 90%. Su et al. [84] used RF and the shuffled frog leaping algorithm (SFLA) to select three VIs sensitive to WSR severity from multiple candidates and achieved accurate detection of WSR stress. Su et al. [85,86] developed a low-cost, high-precision automatic WSR detection system for field scales using different wheat canopy VIs combined with ML techniques. Hu et al. [87] established monitoring models for mild and severe FHB infection based on the original spectral features, VIs, and dual-time-point VIs of winter wheat using SVM, RF, and Extremely Randomized Trees (Extra Trees) classification algorithms, with an OA of 94%. Continuous wavelet transform (CWT) can decompose the original spectral data into components of different amplitudes and frequencies, capturing subtle spectral absorption features; CWT-based spectral features are robust across different disease and pest stress stages [88,89]. Ma et al. [90] extracted and optimized three kinds of features sensitive to FHB, namely spectral bands (SBs), VIs, and wavelet features (WFs), from hyperspectral images of wheat FHB; combining the traditional SBs and VIs with WFs, they established an SVM-based FHB detection model that effectively improved FHB detection accuracy.
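The GLCM texture features used in these studies can be sketched in a few lines. The minimal pure-Python version below computes a normalized co-occurrence matrix for a single pixel offset and the contrast feature; it deliberately omits the window-size optimization of Xiao et al. and uses a toy 2×2 patch, so it is an illustrative sketch rather than any study's implementation.

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Normalized gray-level co-occurrence matrix for one pixel offset
    (dx, dy); `image` is a 2D list of gray levels in [0, levels)."""
    counts = [[0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    total = 0
    for y in range(rows):
        for x in range(cols):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < rows and 0 <= x2 < cols:
                counts[image[y][x]][image[y2][x2]] += 1
                total += 1
    return [[c / total for c in row] for row in counts]

def glcm_contrast(p):
    """Contrast texture feature: sum over i, j of p(i, j) * (i - j)^2."""
    n = len(p)
    return sum(p[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))

# A uniform patch has zero contrast; an alternating patch has high contrast.
uniform = [[1, 1], [1, 1]]
checker = [[0, 3], [3, 0]]
low = glcm_contrast(glcm(uniform))   # 0.0
high = glcm_contrast(glcm(checker))  # 9.0
```

Features such as contrast, energy, and homogeneity computed this way over sliding windows are what get paired with spectral VIs in the combined VI-TF monitoring models.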
With the rapid development of information technology, DL has extended the network “depth” of traditional ML with more complex structures and has shown great potential in identifying crop diseases, pests, and weeds, as well as in yield estimation, fruit detection, and grading [91,92,93,94]. Zhang [95,96] improved the U-Net algorithm and proposed the Ir-UNet and DF-UNet models, which achieve high recognition accuracy while reducing computational load, enable automatic and efficient detection of WSR severity, and can provide a basis for regional-scale monitoring and identification of diseases and pests. Huang et al. [97] used a convolutional neural network (CNN) to evaluate the severity of leaf spot stress from image datasets collected by UAVs, with an OA of 91.43% and a standard error of 0.83%. Sun [49] acquired wheat canopy RGB and near-infrared images using UAVs and, based on the correlation between the original bands and DIs, established a backpropagation neural network (BPNN) prediction model for WLR that yields more reliable inversion values. Liu et al. [98] extracted original spectral bands, VIs, and TFs from hyperspectral images of wheat infected with FHB at the filling stage and used an improved BPNN to establish an FHB monitoring model with an OA of 98%.
Many studies have applied UAV-LARS to wheat disease monitoring and identification, with good results for WSR, WLR, take-all, and FHB. Table 1 summarizes the RS equipment, sensitive features, and algorithms used to monitor and identify wheat diseases, as well as the ground sampling distance (GSD) suited to different stresses. UAV-LARS systems often use DJI-series UAVs with built-in multispectral or RGB sensors, or other sensors such as the RedEdge camera (MicaSense Corp., Seattle, WA, USA) for multispectral data and the UHD 185 hyperspectral imaging system (Cubert GmbH, Ulm, Baden-Württemberg, Germany) for hyperspectral data. In terms of feature and algorithm selection, bands such as red, NIR, red-edge, 478 nm, 650 nm, and 702 nm, characteristic indices (CIs) such as VEG, VARI, MGRVI, and NDRDI, and texture indices all play an important role in wheat disease classification and severity estimation. Combining these features with algorithms such as PLSR, logistic regression, RF, and SVM can yield favorable results. Regarding data acquisition, the GSD has a significant impact on accuracy, with optimal resolutions for RGB, multispectral, and hyperspectral imagery of ≤1 cm/pixel, ≤2.5 cm/pixel, and ≤4 cm/pixel, respectively. Summarizing these aspects provides a valuable reference for the efficient and rapid identification of wheat diseases. However, crop growth is periodic and continuous, and monitoring models built from single-date RS images often show low accuracy in practical applications. In addition, experimental disease and pest plots are often created by providing suitable living environments for diseases and pests and inoculating pathogenic strains [42,75,77,79,82,85,86], rather than arising naturally.
The actual field conditions of crop diseases and pests are complex and influenced by factors such as crop variety, habitat, and pathogen sources. The resulting monitoring models therefore lack universality and remain some distance from practical application. It is necessary to consider multiple data sources, continuously acquire RS data on stressed wheat, and monitor and identify wheat diseases and pests over time to improve model generalization. This will provide a basis for the practical application of UAV-LARS in wheat.
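Several of the indices named above can be computed directly from band reflectances. The sketch below uses the widely published formulas for NDVI, VARI, and MGRVI (VEG and NDRDI are omitted because their definitions vary across studies); it illustrates the index arithmetic, not any cited pipeline:

```python
def vegetation_indices(b, g, r, nir):
    """Widely published VI formulas; inputs are band reflectances
    (scalars or NumPy arrays, applied elementwise)."""
    ndvi = (nir - r) / (nir + r)            # normalized difference VI
    vari = (g - r) / (g + r - b)            # visible atmospherically resistant index
    mgrvi = (g**2 - r**2) / (g**2 + r**2)   # modified green-red VI
    return ndvi, vari, mgrvi
```

Because each index is a per-pixel ratio, the same function maps whole multispectral rasters to index maps when the inputs are arrays, which is how the per-plot features in Table 1 are typically derived.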

3.3. Application Progress in Monitoring Cotton Diseases and Pests

Cotton is an important economic crop in China, but frequent diseases and pests cause significant losses to cotton cultivation. Exploiting the spatial and temporal resolution advantages of UAV-LARS for real-time, dynamic monitoring of field-scale farming information is therefore of great significance for the precise targeted spraying of cotton diseases and pests [99,100]. UAV-LARS is a regional-scale observation method. Researchers at home and abroad have extracted characteristic spectra, VIs, and other features related to cotton diseases and pests, and have constructed regional-scale monitoring models based on algorithms such as SVM and logistic regression, enabling real-time, dynamic monitoring of cotton field information. Huang et al. [101] used SVM and CNN to monitor and classify UAV multispectral images of cotton mites, demonstrating the potential of UAV-LARS for mite monitoring. Guo et al. [102] obtained spectral bands sensitive to aphid stress from cotton spectral images and constructed a model for estimating aphid infestation severity based on spectral reflectance and ratio-derivative spectral values, which can provide a field-scale spatial distribution map of aphid severity. Xavier et al. [36] monitored the level of cotton leaf blight infection with a UAV-mounted three-band spectral camera at different resolutions, achieving an OA of 79%; the monitoring performance was limited, but it provided a basis for monitoring cotton leaf blight with a low-cost multispectral system. Song [103] derived cotton canopy spectral reflectance from UAV multispectral RS images, selected bands and VIs sensitive to cotton Verticillium wilt, and constructed a severity monitoring model, achieving recognition at the regional scale. Cui et al.
[104] selected the optimal VIs based on multi-spectral data of cotton mite damage and constructed a regression model with a classification accuracy of 95%. They also established a monitoring model for the temporal variation of mite damage, enabling dynamic monitoring of the affected area. Wang et al. [105,106] developed an unsupervised plant-by-plant (PBP) classification algorithm for mapping the area affected by CRR at the individual plant level, and tested the algorithm using five-band spectral data obtained by UAVs, achieving a classification OA of 95.94%.
The above studies focus on monitoring and identifying a single cotton disease or pest. The cotton diseases include Ramularia leaf blight (RLB), Helminthosporium leaf blotch (HLB), Verticillium wilt (VW), and CRR; the pests include aphids, spider mites, and cotton bollworms. As shown in Table 2, summarizing the identification methods for cotton diseases and pests can help researchers make full use of high-resolution RS images to extract disease- and pest-sensitive features and combine them with appropriate classification algorithms to quickly tailor monitoring and identification schemes to specific diseases or pests. For example, the SVM algorithm has been used to monitor and identify RLB based on green, red, and NIR spectral reflectance obtained from a multispectral camera with a GSD of approximately 25 cm [36]. For CRR, supervised, unsupervised, and semi-supervised models can be constructed within a GSD range of 2.4–7.64 cm, achieving good identification results [100,105,106]. Cotton spider mites can be identified using a transferred AlexNet model with multispectral images, achieving an accuracy of up to 95.4% [101]. For HLB, features such as the color histogram (CH), local binary pattern histogram (LBPH), and VIs can be extracted from RGB images, and SVM or CNN models can accurately identify the disease [97]. However, in natural environments there may be more than one disease or pest in the same field. To address this, Dilixiti et al. [107] extracted the soil-adjusted vegetation index (SAVI) and NDVI from multispectral RS images of cotton and constructed a logistic model to identify three pests, namely cotton aphids, cotton red spider mites, and bollworms, in cotton fields. That study identified only three pests during a single growth stage of cotton and did not dynamically monitor the occurrence and spread of pests.
The results can therefore only provide guidance for pesticide application when these pests occur during the corresponding cotton growth period.
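To make the band-reflectance-plus-SVM pattern concrete, the following sketch trains an RBF-kernel SVM on synthetic green/red/NIR reflectances. The class means encode a typical stress response (lower NIR, slightly higher red) and are assumptions, not data from the cited cotton studies:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# Hypothetical green/red/NIR canopy reflectances; infected plants are
# assumed to reflect less NIR and more red, a common stress response,
# not a measurement from the cited studies.
healthy = rng.normal([0.10, 0.06, 0.45], 0.03, size=(150, 3))
infected = rng.normal([0.10, 0.10, 0.30], 0.03, size=(150, 3))
X = np.vstack([healthy, infected])
y = np.array([0] * 150 + [1] * 150)

# RBF-kernel SVM, evaluated with 5-fold cross-validation.
scores = cross_val_score(SVC(kernel="rbf", C=1.0), X, y, cv=5)
```

In an operational pipeline the same classifier would be applied per pixel or per plot over the orthomosaic, and the predicted labels aggregated into an infection map.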

3.4. Application Progress in Monitoring Rice Diseases and Pests

Research on monitoring and identifying rice diseases and pests based on UAV-LARS often focuses on constructing identification models from spectral features or VIs. Machine-vision-based modeling methods are mainly used for leaf-level monitoring and identification, and there is less research at the regional scale of UAV imagery.
Research on identifying rice diseases and pests using spectral indices or VIs includes the following examples. Liu et al. [37] used a UAV-mounted hyperspectral sensor to acquire canopy spectral images of rice at the heading stage and developed spectral indices that effectively estimate leaf curling and serve as universal spectral indicators for detecting rice leaf roller damage. Ding et al. [108] extracted characteristic bands from UAV hyperspectral data and used SVM to classify the severity of rice bacterial leaf streak; they found that stress severity was highly correlated with the green and near-infrared regions, achieving good monitoring results. Zhao et al. [20] used a DJI Phantom consumer-grade UAV equipped with RGB and multispectral sensors to obtain RS images of rice sheath blight (ShB). They processed and analyzed the images to obtain VIs under ShB stress and compared them with NDVI values measured by agricultural experts and a Trimble handheld NDVI meter. The results showed that RGB and multispectral low-altitude RS can assess rice ShB. Lee et al. [109] analyzed UAV images of three rice varieties from 2017 to 2018 to study changes in growth factors such as plant height, dry weight, and leaf area index (LAI), as well as disease occurrence, and developed a UAV-based multispectral image evaluation technology for assessing rice yield and the severity of rice bacterial leaf blight.
Research on identifying rice diseases and pests based on machine vision includes the following examples: Wang et al. [110] used a multi-rotor UAV to capture RGB images and extracted Haar-like features from images of rice white heads. They trained an AdaBoost algorithm for white head recognition, achieving an accuracy of 93.62%. Shen et al. [111] obtained the rice ShB image dataset by UAVs and utilized image processing and analysis techniques along with shallow neural networks to diagnose, monitor, and classify rice ShB. Wei et al. [112] employed RS images from UAVs and the YOLOv4 object detection deep neural network to detect the severity of rice ShB, generate rice density prescription maps, and compare the results with traditional image segmentation methods. The results showed that CNN has good robustness.
Rice is susceptible to diseases and pest infestations throughout its growth and development, with more than six common diseases and pests often affecting the crop. Although research on monitoring and identifying rice diseases and pests is extensive, most of it focuses on leaf-level identification under laboratory conditions [113,114]. Reliable, specific low-altitude RS data are difficult to obtain for analysis in practical research, and this lack of data is the main reason for the low accuracy of rice disease and pest monitoring. Moreover, the diversity of diseases and pests complicates monitoring and identification because of the phenomena of "same object with different spectrum" and "different objects with same spectrum". Currently, the literature on monitoring and identifying rice diseases and pests based on UAV-LARS is limited; most studies focus on one or two diseases or pests during a specific growth stage and do not compare and analyze similar features caused by other diseases, pests, or moisture stress, so accuracy needs improvement. On the other hand, there is a relatively large and mature body of research on using UAV-LARS for rice weed identification, which can provide some basis for identifying rice diseases and pests. For example, Stroppiana et al. [115] used images acquired by a UAV-mounted multispectral sensor and an unsupervised clustering algorithm to label weed and non-weed areas, achieving an OA of 96.5% for early-stage rice weed mapping. Huang, Lan, et al. [116,117] utilized RGB images and DL to identify rice weeds, achieving high recognition rates.
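The unsupervised-clustering approach used for rice weed mapping can be sketched on a hypothetical one-band NDVI map. The cover-type means below are illustrative assumptions, and KMeans stands in for whichever clustering algorithm a given study used:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)

# Hypothetical per-pixel NDVI values for three cover types; the assumed
# means (soil 0.15, weeds 0.55, rice 0.75) are illustrative, not measured.
soil = rng.normal(0.15, 0.04, 400)
weeds = rng.normal(0.55, 0.04, 200)
rice = rng.normal(0.75, 0.04, 400)
ndvi = np.concatenate([soil, weeds, rice]).reshape(-1, 1)

# Unsupervised clustering needs no labeled training data; an analyst then
# maps each cluster to a cover class by inspecting its mean NDVI.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(ndvi)
centers = sorted(km.cluster_centers_.ravel())
```

The appeal of the unsupervised route is that no ground-truth labels are needed at mapping time; its weakness, as noted above, is that spectrally similar stresses can fall into the same cluster.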

4. Discussion on Methods for Monitoring and Identifying Crop Diseases and Pests

By analyzing the literature on using UAV-LARS to identify wheat, cotton, and rice diseases and pests, it can be seen that crop disease and pest identification typically involves acquiring image data with UAV-mounted RGB, hyperspectral, or multispectral imaging sensors. The data are then preprocessed through augmentation, normalization, denoising, and other techniques. Subsequently, identification models are constructed using two main approaches, as shown in Figure 6.
Monitoring and identifying crop diseases and pests based on spectral features or VIs corresponds to the left branch of Figure 6. Spectral reflectance reflects the internal physiological and biochemical state of stressed crops; combining spectral features with VIs yields more comprehensive disease and pest descriptors, improving identification accuracy and the ability to distinguish different diseases and pests and their severity. This type of model requires selecting the most significant features from multiple spectra sensitive to diseases and pests, so the image data are usually acquired with UAV-mounted multispectral or hyperspectral cameras. Monitoring and identifying crop diseases and pests based on machine vision corresponds to the right branch of Figure 6. The accuracy of monitoring based on the physical symptoms exhibited by stressed crops correlates strongly with symptom severity. DL expands the network depth of traditional ML models, is robust in learning complex patterns and recognizing important features, and is widely applied in agricultural detection tasks such as plant recognition and detection, crop disease and pest monitoring and identification, and fruit detection and grading. In disease and pest identification, DL architectures such as CNN, VGGNet, ResNet, and SSD automatically extract features and perform recognition on training and testing sets obtained by dividing the image data; the identification model is then validated on a validation set to reach the desired accuracy. In summary, machine vision mainly builds identification models from the morphology, color, texture, and other features of disease spots on stressed crops. It achieves high accuracy at the leaf scale, but for UAV images at the regional scale the detection effect is often unsatisfactory.
Changes in spectral bands reflect changes in crop chlorophyll and moisture content and can thus indicate the presence of diseases and pests, including early-stage infestations in complex scenes. Therefore, selecting characteristic spectral indices or VIs and building a correlation model between these indices and DIs offers better prospects for monitoring and identifying stressed crops [118,119].
In conclusion, crop diseases and pests have long durations and strong reproductive and spreading abilities, posing a significant threat to crop growth and development. Traditional pesticide spraying based on manual experience not only leads to severe pesticide overuse but also fails to effectively mitigate disease and pest losses and, to some extent, reduces the quality of agricultural products. Current UAV-LARS research data are mainly obtained from naturally infested fields or artificially inoculated experimental fields. Naturally infested fields have complex environments in which multiple disease and pest stresses may occur simultaneously, exhibiting sudden, complex, and variable pathological features. UAV-LARS images mainly capture the canopy of stressed crops, but the early canopy symptoms of some diseases and pests are not obvious: there may be little discernible difference in canopy texture and spectral features between stressed and healthy crops, making it difficult to accurately monitor and identify early symptoms and stress types. In contrast, experimental fields with artificially inoculated or induced stresses have a simpler structure; they exclude other complex influencing factors and do not consider meteorological data such as temperature and humidity or historical plant protection records related to the actual occurrence of diseases and pests. Monitoring models based on color, texture features, or VIs can obtain relatively satisfactory results in such fields, but these results only provide a basis for monitoring the same type of stress during the corresponding crop growth stages and lack universality in practical applications.
Integrating multi-source, multi-scale, and multi-temporal RS data can combine complementary advantages and yield more comprehensive information on disease and pest stress. Constructing monitoring models for different crop growth and development stages can improve the universality of algorithms and models to some extent [120,121], as shown in Figure 7. Zhang et al. [122] integrated four meteorological features and two RS features related to crop characteristics and habitat traits to build a wheat powdery mildew (PM) prediction model; compared with a model built on meteorological data alone, it achieved a 9% increase in accuracy. Lin et al. [123] extracted crop VIs and environmental characteristics from two satellite RS datasets with different resolutions and, by monitoring the habitat environment of crop diseases and pests within the region, achieved accurate inversion of their spatial distribution.
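The benefit of fusing RS and non-RS features can be illustrated with a toy comparison in the spirit of (but not reproducing) [122]: a classifier trained on meteorological features alone versus one that also receives RS features. All variable names, the infection rule, and the data are assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 400

# Hypothetical features: two meteorological (relative humidity, temperature)
# and two remote-sensing (NDVI, GLCM contrast) variables. The infection
# rule below is an illustrative assumption, not the cited models.
humidity = rng.uniform(40, 95, n)
temp = rng.uniform(10, 28, n)
ndvi = rng.uniform(0.3, 0.9, n)
contrast = rng.uniform(0.0, 1.0, n)
latent = 0.02 * humidity + 0.03 * temp - 5.0 * ndvi + 2.0 * contrast
label = (latent + rng.normal(0, 0.3, n) > np.median(latent)).astype(int)

met_only = np.column_stack([humidity, temp])
combined = np.column_stack([humidity, temp, ndvi, contrast])

rf = RandomForestClassifier(random_state=0)
acc_met = cross_val_score(rf, met_only, label, cv=5).mean()
acc_all = cross_val_score(rf, combined, label, cv=5).mean()  # fused features
```

With the fixed seed, the fused-feature model is expected to score noticeably higher in cross-validation than the meteorology-only model, mirroring the accuracy gains reported for multi-source fusion.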

5. Existing Problems and Outlook

Combining the advantages of ground-based and satellite RS, UAV-LARS has made significant progress in monitoring crop diseases and pests. However, due to the diverse types of crop diseases, the camouflage and migration of pests, as well as the complexity of field environments, the recognition accuracy still needs to be improved. With the continuous improvement of RS platforms and onboard sensors, the development and updates of data processing software have made RS data processing simpler and faster, providing strong support for the efficient development of precision agriculture. However, there are also some challenges to be addressed. Analyzing the issues of monitoring and identifying crop diseases and pests using UAV-LARS can contribute to the rapid implementation of precision targeted spraying technology.

5.1. Insufficient Development of UAV-LARS

The endurance time of UAVs is usually within 30 min, which greatly limits the coverage area of a single flight; because battery capacity and power are limited, the payload also has a significant impact on UAV stability and image quality [124,125,126]. UAV images are also heavily influenced by weather conditions, and cloudy weather increases the complexity of subsequent image interpretation, adversely affecting monitoring effectiveness. In addition, UAV images have certain limitations in depicting fine detail and require onboard sensors with higher spatial and spectral resolution. However, sensors that meet this requirement, such as hyperspectral sensors, are expensive and require complex data processing, placing higher demands on user expertise.
Therefore, improving the stability of UAVs, addressing the issue of short endurance time, and developing low-cost, lightweight, and modular onboard sensors are of great importance to the development of UAV-LARS in monitoring crop diseases and pests. In addition, the complex field environment and non-standard operation of UAV operators during the monitoring of crop diseases and pests can lead to economic losses and potential safety hazards. UAV regulatory authorities should improve the regulatory system, and operators should strictly adhere to flight rules to maximize the safety of UAV operations [127].

5.2. Complex Processing of UAV RS Data

The processing and analysis of RS data are crucial for successfully generating prescription maps. Various software tools are available for processing UAV RS data, but no single tool can simultaneously control image acquisition, stitching, preprocessing, and analysis of UAV images; multiple tools are required to complete the full workflow. Software currently used for UAV RS image processing includes DJI Terra (v3.0.2) [128], ENVI (Exelis Visual Information Solutions, Boulder, CO, USA) [21], ArcGIS 9.1 (Esri, Redlands, CA, USA) [21,112], Agisoft PhotoScan Professional (Agisoft LLC, St. Petersburg, Russia) [112], QGIS (Open Source Geospatial Foundation, Chicago, IL, USA, https://www.qgis.org/) [84], and Pix4Dmapper (Pix4D Inc., Lausanne, Switzerland, http://www.pix4d.com/) [21,84]. The four main software tools are shown in Figure 8. Current processing workflows are mostly limited to replicating generic image-processing methods, and there is a lack of inversion model libraries for crop diseases and pests.
In the future, it is necessary to design a dedicated monitoring platform that integrates RS data processing and analysis with inversion models for different crop diseases and pests. By integrating RS data processing and analysis, spectral and image feature extraction algorithms, and the construction of different training models into a unified development platform, researchers can openly share crop disease and pest RS data and reproduce previous research results. This will help improve the accuracy with which UAV RS data are used to monitor and identify crop diseases and pests, and accelerate the realization of precise targeted spraying operations.

5.3. Poor Universality of Disease and Pest Monitoring Models

When crops are subjected to disease and pest stress, the symptoms may resemble those caused by abiotic stresses such as water or nutrient deficiency, and the spectral features are also extremely similar, a phenomenon known as "different body with same spectrum". Conversely, when the same crop is subjected to the same disease or pest stress, different spectral features may be observed, known as "same body with different spectrum". Moreover, crop planting density, variety, and growth period also strongly affect the spectral response under stress, which increases the difficulty of monitoring and identifying crop diseases and pests. In recent years, with the continuous development of UAV-LARS technology, researchers at home and abroad have conducted a large amount of research on monitoring and identifying crop diseases and pests based on UAV-LARS. However, most studies focus on monitoring a single disease or pest, and comparative discrimination among two or more types of disease and pest stress is lacking, even though in actual fields the same crop often suffers from two or more diseases and pests simultaneously. Most studies have also constructed monitoring models from data obtained in experimental fields with a single artificially inoculated stress, without considering factors relevant to the actual occurrence of diseases and pests, such as temperature, humidity, and other meteorological conditions, or historical plant protection data. Therefore, the universality of these models in practical applications is poor.
In the future, it is possible to integrate multi-source and multi-temporal RS information with non-RS information such as meteorological data, soil conditions, and habitat conditions of diseases and pests to achieve continuous monitoring of crop diseases and pests and obtain more comprehensive information on crop stress. By combining RS data from UAV and satellite, spectral and image information for different crops under different disease and pest conditions at different growth stages can be obtained. This can be used to build a monitoring and recognition model with good universality, generate real-time and effective spray prescription maps, and provide a basis for precise targeted spraying of crops. This has significant implications for the realization of precision agriculture.

5.4. Ineffective Results of Disease and Pest Monitoring

The transmission and the complex processing and analysis of the massive image data acquired by UAV-LARS require a significant amount of time, making it difficult to deliver prescription maps in real time and to carry out timely targeted spraying operations. As a result, monitoring results lose their timeliness.
In future research, if it is possible to extract exclusive spectral response characteristics for different crops and different diseases or pests, a UAV-LARS spectral library for diseases and pests can be established. When constructing disease and pest monitoring and identification models, incorporating big data and artificial intelligence can quickly identify disease and pest characteristic spectra, and build more applicable monitoring models, greatly reducing data processing time. In addition, with the transmission speed of 5G network communication technology reaching 20 Gbps, stable transmission can be achieved even in complex scenarios, which is crucial for agricultural environments with relatively weak signal strength. The integration of UAV-LARS technology with 5G is expected to solve the time-consuming issue of image data transmission and achieve rapid data feedback from UAVs.
Although UAV-LARS is still in its early stages of development, it has shown great potential and application value in the monitoring and identification of crop diseases and pests. To fully unleash its potential, interdisciplinary researchers need to work together to improve the performance of UAV-LARS systems. This includes building a dedicated feature spectrum library for pests and diseases, improving the analysis and processing technology for RS images, and reducing the overall time required to obtain spraying prescription maps. These efforts will guide plant protection UAVs to achieve real-time, efficient, and accurate targeted spraying operations.

Funding

This research was funded by the Guangdong Modern Agricultural Industry Generic Key Technology Research and Development Innovation Team Project, grant number 2023KJ133, and the Laboratory of Lingnan Modern Agriculture Project, grant number NT2021009. The APC was funded by [1960 francs].

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Carvajal, Y.M.; Cardwell, K.; Nelson, A.; Garrett, K.A.; Giovani, B.; Saunders, D.G.O.; Kamoun, S.; Legg, J.P.; Verdier, V.; Lessel, J. A global surveillance system for crop diseases. Science 2019, 364, 1237–1239. [Google Scholar] [CrossRef] [PubMed]
  2. Huang, Y.H.; Li, Z.H.; Zhu, H.T. The use of UAV remote sensing technology to identify crop stress: A review. J. Geo-Inf. Sci. 2019, 21, 512–523. [Google Scholar] [CrossRef]
  3. Lin, J.Y. Development status of agricultural UAV and its application in rice production. Fujian Agric. Mach. 2020, 2, 9. [Google Scholar]
  4. Yan, L.J. Discussion on Mechanization Technology of UAV Plant Protection. Agric. Technol. Equip. 2020, 12, 52–53. [Google Scholar]
  5. Lan, Y.B.; Chen, S.D.; Deng, J.Z.; Zhou, Z.Y.; Ou, Y.F. Development Situation and Problem Analysis of Plant Protection Unmanned Aerial Vehicle in China. J. South China Agric. Univ. 2019, 40, 217–225. [Google Scholar]
  6. Chen, H.B.; Lan, Y.B.; Fritz, B.K.; Hoffmann, W.C.; Liu, S.B. Review of Agricultural Spraying Technologies for Plant Protection Using Unmanned Aerial Vehicle (UAV). Int. J. Agric. Biol. Eng. 2021, 14, 38–49. [Google Scholar] [CrossRef]
  7. Zhang, K.; Zhao, L.; Cui, J.Y.; Mao, P.J.; Yuan, B.H.; Liu, Y.Y. Design and Implementation of Evaluation Method for Spraying Coverage Region of Plant Protection UAV. Agronomy 2023, 13, 1631. [Google Scholar] [CrossRef]
  8. He, X.K. Research Progress and Developmental Recommendations on Precision Spraying Technology and Equipment in China. Smart Agric. 2020, 2, 133–146. [Google Scholar]
  9. Zheng, J.Q.; Xun, Y.L. Development and Prospect in Environment-friendly Pesticide Sprayers. Trans. Chin. Soc. Agric. Mach. 2021, 52, 1–16. [Google Scholar]
  10. Zhou, Z.Y.; Zhang, Y.; Luo, X.W.; Lan, Y.B.; Xue, X.Y. Technology Innovation Development Strategy on Agricultural Aviation Industry for Plant Protection in China. Trans. Chin. Soc. Agric. Eng. (Trans. CSAE) 2013, 29, 1–10, (In Chinese with English Abstract). [Google Scholar]
  11. Radoglou-Grammatikis, P.; Sarigiannidis, P.; Lagkas, T.; Moscholios, I. A compilation of UAV applications for precision agriculture. Comput. Netw. 2020, 172, 107148. [Google Scholar] [CrossRef]
  12. Sishodla, R.P.; Ray, R.L.; Singh, S.K. Applications of remote sensing in precision agriculture: A review. Remote Sens. 2020, 12, 3136. [Google Scholar] [CrossRef]
  13. Francesconi, S.; Harfouche, A.; Maesano, M.; Balestra, G.M. UAV-Based Thermal, RGB Imaging and Gene Expression Analysis Allowed Detection of Fusarium Head Blight and Gave New Insights into the Physiological Responses to the Disease in Durum Wheat. Front. Plant Sci. 2021, 12, 628575. [Google Scholar]
  14. McCabe, M.F.; Houborg, R.; Lucieer, A. High-resolution sensing for precision agriculture: From Earth-observing satellites to unmanned aerial vehicles. Remote Sens. Agric. Ecosyst. Hydrol. 2016, 9998, 999811. [Google Scholar]
  15. Khaliq, A.; Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Chiaberge, M.; Gay, P. Comparison of satellite and UAV-based multispectral imagery for vineyard variability assessment. Remote Sens. 2019, 11, 436. [Google Scholar] [CrossRef]
  16. Weiss, M.; Jacob, F.; Duveiller, G. Remote sensing for agricultural applications: A meta-review. Remote Sens. Environ. 2020, 236, 111402. [Google Scholar]
  17. Araus, J.L.; Cairns, J.E. Field high-throughput phenotyping: The new crop breeding frontier. Trends Plant Sci. 2014, 19, 52–61. [Google Scholar]
  18. Zhang, H.D.; Wang, L.Q.; Tian, T.; Yin, J.H. A Review of Unmanned Aerial Vehicle Low-Altitude Remote Sensing (UAV-LARS) Use in Agricultural Monitoring in China. Remote Sens. 2021, 13, 1221. [Google Scholar] [CrossRef]
  19. Xie, C.Q.; Yang, C. A review on plant high-throughput phenotyping traits using UAV-based sensors. Comput. Electron. Agric. 2020, 178, 105731. [Google Scholar] [CrossRef]
  20. Zhao, X.Y.; Zhang, J.; Zhang, D.Y.; Zhou, X.G.; Liu, X.H.; Xie, J. Comparison between the Effects of Visible Light and Multispectral Sensor Based on Low-Altitude Remote Sensing Platform in the Evaluation of Rice Sheath Blight. Spectrosc. Spectr. Anal. 2019, 39, 1192–1198. [Google Scholar]
  21. Zhang, D.Y.; Zhou, X.G.; Zhang, J.; Lan, Y.B.; Xu, C.; Dong, L. Detection of rice sheath blight using an unmanned aerial system with high-resolution color and multispectral imaging. PLoS ONE 2018, 13, e0187470. [Google Scholar] [CrossRef]
  22. Dehkordi, R.H.; El Jarroudi, M.; Kouadio, L.; Meersmans, J.; Beyer, M. Monitoring Wheat Leaf Rust and Stripe Rust in Winter Wheat Using High-Resolution UAV-Based Red-Green-Blue Imagery. Remote Sens. 2020, 12, 3696. [Google Scholar] [CrossRef]
  23. Gao, D.M.; Sun, Q.; Hu, B.; Zhang, S. A Framework for Agricultural Pest and Disease Monitoring Based on Internet-of-Things and Unmanned Aerial Vehicles. Sensors 2020, 20, 1487. [Google Scholar] [CrossRef]
  24. Dong, J.H.; Yang, X.D.; Gao, L.; Wang, B.S.; Wang, L. Information extraction of winter wheat lodging area based on UAV remote sensing image. Heilongjiang Agric. Sci. 2016, 10, 147–152. [Google Scholar]
  25. Zheng, E.G.; Tian, Y.F.; Chen, T. Region extraction of corn lodging in UAV images based on deep learning. J. Henan Agric. Sci. 2018, 47, 155–160. [Google Scholar]
  26. Sugiura, R.; Tsuda, S.; Tamiya, S.; Itoh, A.; Nishiwaki, K.; Murakami, N.; Shibuya, Y.; Hirafuji, M.; Nuske, S. Field phenotyping system for the assessment of potato late blight resistance using RGB imagery from an unmanned aerial vehicle. Biosyst. Eng. 2016, 148, 1–10. [Google Scholar] [CrossRef]
  27. Tetila, E.C.; Machado, B.B.; Belete, N.A.D.; Guimaraes, D.A.; Pistori, H. Identification of soybean foliar diseases using unmanned aerial vehicle images. IEEE Geosci. Remote Sens. Lett. 2017, 14, 2190–2194. [Google Scholar] [CrossRef]
  28. Liu, Z.; Wan, W.; Huang, J.Y.; Han, Y.W.; Wang, J.Y. Progress on key parameters inversion of crop growth based on unmanned aerial vehicle remote sensing. Trans. Chin. Soc. Agric. Eng. 2018, 34, 60–71. [Google Scholar]
  29. Messina, G.; Modica, G. Applications of UAV Thermal Imagery in Precision Agriculture: State of the Art and Future Research Outlook. Remote Sens. 2020, 12, 1491. [Google Scholar]
  30. Gerhards, M.; Schlerf, M.; Rascher, U.; Udelhoven, T.; Juszczak, R.; Alberti, G.; Miglietta, F.; Inoue, Y. Analysis of airborne optical and thermal imagery for detection of water stress symptoms. Remote Sens. 2018, 10, 1139. [Google Scholar] [CrossRef]
  31. Bian, J.; Zhang, Z.; Chen, J.; Chen, H.; Cui, C.; Li, X.; Chen, S.; Fu, Q. Simplified Evaluation of Cotton Water Stress Using High Resolution Unmanned Aerial Vehicle Thermal Imagery. Remote Sens. 2019, 11, 267. [Google Scholar] [CrossRef]
  32. Crusiol, L.G.T.; Nanni, M.R.; Furlanetto, R.H.; Sibaldelli, R.N.R.; Cezar, E.; Mertz-Henning, L.M.; Nepomuceno, A.L.; Neumaier, N.; Farias, J.R.B. UAV-based thermal imaging in the assessment of water status of soybean plants. Int. J. Remote Sens. 2020, 41, 3243–3265. [Google Scholar]
  33. Parisi, E.I.; Suma, M.; Güleç Korumaz, A.; Rosina, E.; Tucci, G. Aerial Platforms (UAV) Surveys in the VIS and TIR Range: Applications in Archaeology and Agriculture. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 945–952. [Google Scholar]
  34. Bellvert, J.; Marsal, J.; Girona, J.; Gonzalez-Dugo, V.; Fereres, E.; Ustin, S.L.; Zarco-Tejada, P.J. Airborne thermal imagery to detect the seasonal evolution of crop water status in peach, nectarine and Saturn peach orchards. Remote Sens. 2016, 8, 39. [Google Scholar] [CrossRef]
  35. Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A review on UAV-based applications for precision agriculture. Information 2019, 10, 349. [Google Scholar] [CrossRef]
  36. Xavier, T.W.F.; Souto, R.N.V.; Statella, T.; Galbieri, R.; Santos, E.S.; Suli, G.S.; Zeilhofer, P. Identification of Ramularia Leaf Blight Cotton Disease Infection Levels by Multispectral, Multiscale UAV Imagery. Drones 2019, 3, 33. [Google Scholar] [CrossRef]
  37. Liu, T.; Shi, T.Z.; Zhang, H.; Wu, C. Detection of Rice Damage by Leaf Folder (Cnaphalocrocis medinalis) Using Unmanned Aerial Vehicle Based Hyperspectral Data. Sustainability 2020, 12, 9343. [Google Scholar] [CrossRef]
  38. Maes, W.H.; Steppe, K. Perspectives for Remote Sensing with Unmanned Aerial Vehicles in Precision Agriculture. Trends Plant Sci. 2019, 24, 152–164. [Google Scholar]
  39. Elarab, M.; Ticlavilca, A.M.; Torres-Rua, A.F.; Maslova, I.; McKee, M. Estimating chlorophyll with thermal and broadband multispectral high resolution imagery from an unmanned aerial system using relevance vector machines for precision agriculture. Int. J. Appl. Earth Obs. Geoinf. 2015, 43, 32–42. [Google Scholar]
  40. Maimaitijiang, M.; Ghulam, A.; Sidike, P.; Hartling, S.; Maimaitiyiming, M.; Peterson, K.; Shavers, E.; Fishman, J.; Peterson, J.; Kadam, S.; et al. Unmanned aerial system (UAS)-based phenotyping of soybean using multi-sensor data fusion and extreme learning machine. ISPRS J. Photogramm. Remote Sens. 2017, 134, 43–58. [Google Scholar]
  41. Zheng, H.B.; Zhou, X.; He, J.Y.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Early Season Detection of Rice Plants Using RGB, NIR-G-B and Multispectral Images from Unmanned Aerial Vehicle (UAV). Comput. Electron. Agric. 2020, 169, 105223. [Google Scholar] [CrossRef]
  42. Bohnenkamp, D.; Behmann, J.; Mahlein, A. In-Field Detection of Yellow Rust in Wheat on the Ground Canopy and UAV Scale. Remote Sens. 2019, 11, 2495. [Google Scholar] [CrossRef]
  43. Zhu, Y.H. Monitoring and Classification of Wheat Take-All in Field Based on UAV Hyperspectral Image. Master’s Thesis, Henan Agricultural University, Zhengzhou, China, 2018. [Google Scholar]
  44. Mahlein, A.; Rumpf, T.; Welke, P.; Dehne, H.; Plümer, L.; Steiner, U.; Oerke, E. Development of spectral indices for detecting and identifying plant diseases. Remote Sens. Environ. 2013, 128, 21–30. [Google Scholar] [CrossRef]
  45. Chen, T.T.; Yang, W.G.; Zhang, H.J.; Zhu, B.Y.; Zeng, R.E.; Wang, X.Y.; Wang, S.B.; Wang, L.D.; Qi, H.X.; Lan, Y.B.; et al. Early detection of bacterial wilt in peanut plants through leaf-level hyperspectral and unmanned aerial vehicle data. Comput. Electron. Agric. 2020, 177, 105708. [Google Scholar] [CrossRef]
  46. Liang, H.; He, J.; Lei, J.J. Monitoring of Corn Canopy Blight Disease Based on UAV Hyperspectral Method. Spectrosc. Spectr. Anal. 2020, 40, 1965–1972. [Google Scholar]
  47. Abdulridha, J.; Ampatzidis, Y.; Roberts, P.; Kakarla, S.C. Detecting powdery mildew disease in squash at different stages using UAV-based hyperspectral imaging and artificial intelligence. Biosyst. Eng. 2020, 197, 135–148. [Google Scholar] [CrossRef]
  48. Yang, L.L.; Wang, Z.P.; Wu, C.C. Research on Large-scale Monitoring of Spider Mite Infestation in Xinjiang Cotton Field Based on Multi-source Data. Spectrosc. Spectr. Anal. 2021, 41, 3949–3956. [Google Scholar]
  49. Sun, R.L. Research on Remote Sensing Monitoring of Wheat Leaf Rust Based on Ground Hyperspectral and UAV Images. Master’s Thesis, Yangzhou University, Yangzhou, China, 2021. [Google Scholar]
  50. Yang, Q.; Su, Y.; Jin, S.; Kelly, M.; Hu, T.; Ma, Q.; Li, Y.; Song, S.; Zhang, J.; Xu, G.; et al. The Influence of Vegetation Characteristics on Individual Tree Segmentation Methods with Airborne LiDAR Data. Remote Sens. 2019, 11, 2880. [Google Scholar] [CrossRef]
  51. Li, P.; Shen, X.; Dai, J.; Cao, L. Comparisons and Accuracy Assessments of LiDAR-Based Tree Segmentation Approaches in Planted Forests. Sci. Silvae Sin. 2018, 54, 127–136. [Google Scholar]
  52. Liao, L.H.; Cao, L.; Xie, Y.J.; Luo, J.Z.; Wang, G.B. Phenotypic Traits Extraction and Genetic Characteristics Assessment of Eucalyptus Trials Based on UAV-Borne LiDAR and RGB Images. Remote Sens. 2022, 14, 765. [Google Scholar] [CrossRef]
  53. Camarretta, N.; Harrison, P.A.; Lucieer, A.; Potts, B.M.; Davidson, N.; Hunt, M. From Drones to Phenotype: Using UAV-LiDAR to Detect Species and Provenance Variation in Tree Productivity and Structure. Remote Sens. 2020, 12, 3184. [Google Scholar] [CrossRef]
  54. Walter, J.D.C.; Edwards, J.; McDonald, G.; Kuchel, H. Estimating Biomass and Canopy Height with LiDAR for Field Crop Breeding. Front. Plant Sci. 2019, 10, 1145. [Google Scholar] [CrossRef] [PubMed]
  55. Thomas, S.; Kuska, M.T.; Bohnenkamp, D.; Brugger, A.; Alisaac, E.; Wahabzada, M.; Behmann, J.; Mahlein, A.K. Benefits of hyperspectral imaging for plant disease detection and plant protection: A technical perspective. J. Plant Dis. Protect. 2018, 125, 5–20. [Google Scholar]
  56. Vanegas, F.; Bratanov, D.; Powell, K.; Weiss, J.; Gonzalez, F. A Novel Methodology for Improving Plant Pest Surveillance in Vineyards and Crops Using UAV-Based Hyperspectral and Spatial Data. Sensors 2018, 18, 260. [Google Scholar] [CrossRef] [PubMed]
  57. Raza, S.E.; Prince, G.; Clarkson, J.P.; Rajpoot, N.M. Automatic Detection of Diseased Tomato Plants Using Thermal and Stereo Visible Light Images. PLoS ONE 2015, 10, e0123262. [Google Scholar] [CrossRef] [PubMed]
  58. Hu, J.; Liu, X.; Liu, L.; Guan, L. Evaluating the Performance of the SCOPE Model in Simulating Canopy Solar-induced Chlorophyll Fluorescence. Remote Sens. 2018, 10, 250. [Google Scholar] [CrossRef]
  59. Han, Y.; Tarakey, B.A.; Hong, S.J.; Kim, S.Y.; Kim, E.; Lee, C.H.; Kim, G. Calibration and Image Processing of Aerial Thermal Image for UAV Application in Crop Water Stress Estimation. J. Sens. 2021, 2021, 14. [Google Scholar] [CrossRef]
  60. Liu, T.; Li, R.; Zhong, X.C.; Jiang, M.; Jin, X.L.; Zhou, P.; Liu, S.P.; Sun, C.M.; Guo, W.S. Estimates of Rice Lodging Using Indices Derived from UAV Visible and Thermal Infrared Images. Agric. For. Meteorol. 2018, 252, 144–154. [Google Scholar] [CrossRef]
  61. Sagan, V.; Maimaitijiang, M.; Sidike, P.; Eblimit, K.; Peterson, K.T.; Hartling, S.; Esposito, F.; Khanal, K.; Newcomb, M.; Pauli, D.; et al. UAV-Based High Resolution Thermal Imaging for Vegetation Monitoring, and Plant Phenotyping Using ICI 8640 P, FLIR Vue Pro R 640, and thermoMap Cameras. Remote Sens. 2019, 11, 330. [Google Scholar] [CrossRef]
  62. Shi, Y.; Huang, W.J.; Gonzalez-Moreno, P.; Luke, B.; Dong, Y.Y.; Zheng, Q.; Ma, H.Q.; Liu, L.Y. Wavelet-based rust spectral feature set (WRSFs): A novel spectral feature set based on continuous wavelet transformation for tracking progressive host-pathogen interaction of yellow rust on wheat. Remote Sens. 2018, 10, 525. [Google Scholar] [CrossRef]
  63. Calderon, R.; Navas-Cortes, J.A.; Lucena, C.; Zarco-Tejada, P.J. High-resolution Airborne Hyperspectral and Thermal Imagery for Early, Detection of Verticillium Wilt of Olive Using Fluorescence, Temperature and Narrow-band Spectral Indices. Remote Sens. Environ. 2013, 139, 231–245. [Google Scholar] [CrossRef]
  64. Calderon, R.; Montes-Borrego, M.; Landa, B.B.; Navas-Cortes, J.A.; Zarco-Tejada, P.J. Detection of Downy Mildew of Opium Poppy Using High-resolution Multi-spectral and Thermal Imagery Acquired with an Unmanned Aerial Vehicle. Precis. Agric. 2014, 15, 639–661. [Google Scholar] [CrossRef]
  65. Chen, X.X. Detection of Sclerotinia Stem Rot of Oilseed Rape by Using Image-Based Low-Altitude Remote Sensing Technology. Master’s Thesis, Zhejiang University, Hangzhou, China, 2017. [Google Scholar]
  66. Cao, F.; Liu, F.; Guo, H.; Kong, W.W.; Zhang, C.; He, Y. Fast Detection of Sclerotinia sclerotiorum on Oilseed Rape Leaves Using Low-Altitude Remote Sensing Technology. Sensors 2018, 18, 4464. [Google Scholar] [CrossRef] [PubMed]
  67. Dang, L.M.; Wang, H.X.; Li, Y.F.; Min, K.; Kwak, J.T.; Lee, O.N.; Park, H.; Moon, H. Fusarium wilt of Radish Detection Using RGB and near Infrared Images from Unmanned Aerial Vehicles. Remote Sens. 2020, 12, 2863. [Google Scholar] [CrossRef]
  68. Dammer, K.H.; Garz, A.; Hobart, M.; Schirrmann, M. Combined UAV-and Tractor-based Stripe Rust Monitoring in Winter Wheat under Field Conditions. Agron. J. 2022, 114, 651–661. [Google Scholar] [CrossRef]
  69. Yang, C.; Yan, G.; Du, S.M.; Li, X.D. Application Review of Unmanned Aerial Vehicle Remote Sensing Technology in Wheat Production. Henan Sci. 2021, 39, 1598–1602. [Google Scholar]
  70. Yang, G.F.; He, Y.; Feng, X.P.; Li, X.Y.; Zhang, J.N.; Yu, Z.Y. Methods and New Research Progress of Remote Sensing Monitoring of Crop Disease and Pest Stress Using Unmanned Aerial Vehicle. Smart Agric. 2022, 4, 1–16. [Google Scholar]
  71. Shi, Y.; Huang, W.J.; Luo, J.H.; Huang, L.S.; Zhou, X.F. Detection and Discrimination of Pests and Diseases in Winter Wheat Based on Spectral Indices and Kernel Discriminant Analysis. Comput. Electron. Agric. 2017, 141, 171–180. [Google Scholar] [CrossRef]
  72. Xie, Y.P.; Chen, F.N.; Zhang, J.C.; Zhou, B.; Wang, H.J.; Wu, K.H. Study on Monitoring of Common Diseases of Crops Based on Hyperspectral Technology. Spectrosc. Spectr. Anal. 2018, 38, 2233–2240. [Google Scholar]
  73. Zhang, J.C.; Huang, Y.B.; Pu, R.L.; Gonzalez-Moreno, P.; Yuan, L.; Wu, K.; Huang, W. Monitoring plant diseases and pests through Remote Sensing technology: A review. Comput. Electron. Agric. 2019, 165, 104943. [Google Scholar] [CrossRef]
  74. Li, Z.X. Design and Implementation of Monitoring System for Wheat Take-All Disease Based on UAV Remote Sensing. Master’s Thesis, Henan Agricultural University, Zhengzhou, China, 2016. [Google Scholar]
  75. Liu, W.; Yang, G.Q.; Xu, F.; Qiao, H.B.; Fan, J.R.; Song, Y.L. Comparisons of Detection of Wheat Stripe Rust Using Hyper-spectrometer and UAV Aerial Photography. Acta Phytopathol. Sin. 2018, 48, 223–227. [Google Scholar] [CrossRef]
  76. Fu, W. Research of Wheat Take-all-disease Based on UAV Remote Sensing Monitoring. Master’s Thesis, Henan Agricultural University, Zhengzhou, China, 2015. [Google Scholar]
  77. Bhandari, M.; Ibrahim, A.M.H.; Xue, Q.W.; Jung, J.H.; Chang, A.J.; Rudd, J.C.; Maeda, M.; Rajan, N.; Neely, H.; Landivar, J. Assessing winter wheat foliage disease severity using aerial imagery acquired from small Unmanned Aerial Vehicle (UAV). Comput. Electron. Agric. 2020, 176, 105665. [Google Scholar] [CrossRef]
  78. Guo, W.; Zhu, Y.H.; Wang, H.F.; Zhang, J.; Dong, P.; Qiao, H.B. Monitoring Model of Winter Wheat Take-all Based on UAV Hyperspectral Imaging. Trans. Chin. Soc. Agric. Mach. 2019, 50, 162–169. [Google Scholar]
  79. Khan, I.H.; Liu, H.Y.; Li, W.; Cao, A.Z.; Wang, X.; Liu, H.Y.; Cheng, T.; Tian, Y.C.; Zhu, Y.; Cao, W.X.; et al. Early Detection of Powdery Mildew Disease and Accurate Quantification of Its Severity Using Hyperspectral Images in Wheat. Remote Sens. 2021, 13, 3612. [Google Scholar] [CrossRef]
  80. Liakos, K.G.; Busato, P.; Moshou, D.; Pearson, S. Machine Learning in Agriculture: A Review. Sensors 2018, 18, 2674. [Google Scholar] [CrossRef]
  81. Gewali, U.B.; Monteiro, S.T.; Saber, E. Machine learning based hyperspectral image analysis: A survey. arXiv 2018, arXiv:1802.08701. [Google Scholar]
  82. Guo, A.T.; Huang, W.J.; Dong, Y.Y.; Ye, H.C.; Ma, H.Q.; Liu, B.; Wu, W.B.; Ren, Y.; Ruan, C.; Geng, Y. Wheat Yellow Rust Detection Using UAV-Based Hyperspectral Technology. Remote Sens. 2021, 13, 123. [Google Scholar] [CrossRef]
  83. Xiao, Y.X.; Dong, Y.Y.; Huang, W.J.; Liu, L.Y.; Ma, H.Q. Wheat Fusarium Head Blight Detection Using UAV-Based Spectral and Texture Features in Optimal Window Size. Remote Sens. 2021, 13, 2437. [Google Scholar] [CrossRef]
  84. Su, B.F.; Liu, Y.L.; Huang, Y.C.; Wei, R.; Cao, X.F.; Han, D.J. Analysis for Stripe Rust Dynamics in Wheat Population Using UAV Remote Sensing. Trans. Chin. Soc. Agric. Eng. (Trans. CSAE) 2021, 37, 127–135. [Google Scholar]
  85. Su, J.Y.; Liu, C.J.; Coombes, M.; Hu, X.P.; Wang, C.H.; Xu, X.M.; Li, Q.D.; Guo, L.; Chen, W.H. Wheat Yellow Rust Monitoring by Learning from Multispectral UAV Aerial Imagery. Comput. Electron. Agric. 2018, 155, 157–166. [Google Scholar] [CrossRef]
  86. Su, J.Y.; Liu, C.J.; Hu, X.P.; Xu, X.M.; Guo, L.; Chen, W.H. Spatio-temporal monitoring of wheat yellow rust using UAV multispectral imagery. Comput. Electron. Agric. 2019, 167, 105035. [Google Scholar] [CrossRef]
  87. Hu, Y.G. Monitoring of Wheat Scab Based on Multi-Source Remote Sensing Data. Master’s Thesis, Anhui University, Hefei, China, 2021. [Google Scholar] [CrossRef]
  88. Shi, Y.; Huang, W.J.; Zhou, X.F. Evaluation of Wavelet Spectral Features in Pathological Detection and Discrimination of Yellow Rust and Powdery Mildew in Winter Wheat with Hyperspectral Reflectance Data. J. Appl. Remote Sens. 2017, 11, 026025. [Google Scholar] [CrossRef]
  89. Liang, D.; Yang, Q.Y.; Huang, W.J.; Peng, D.L.; Zhao, J.L.; Huang, L.S.; Zhang, D.Y.; Song, X.Y. Estimation of Leaf Area Index Based on Wavelet Transform and Support Vector Machine Regression in Winter Wheat. Infrared Laser Eng. 2015, 44, 335–340. [Google Scholar]
  90. Ma, H.Q.; Huang, W.J.; Dong, Y.Y.; Liu, L.Y.; Guo, A.T. Using UAV-Based Hyperspectral Imagery to Detect Winter Wheat Fusarium Head Blight. Remote Sens. 2021, 13, 3024. [Google Scholar] [CrossRef]
  91. Zhu, N.Y.; Liu, X.; Liu, Z.Q.; Hu, K.; Wang, Y.K.; Tan, J.L.; Huang, M.; Zhu, Q.B.; Ji, X.S.; Jiang, Y.N.; et al. Deep Learning for Smart Agriculture: Concepts, Tools, Applications, and Opportunities. Int. J. Agric. Biol. Eng. 2018, 11, 32–44. [Google Scholar] [CrossRef]
  92. Lv, S.P.; Li, D.H.; Xian, R.H. Research Status of Deep Learning in Agriculture of China. Comput. Eng. Appl. 2019, 55, 24–33. [Google Scholar]
  93. Paoletti, M.E.; Haut, J.M.; Plaza, J.; Plaza, A. Deep Learning Classifiers for Hyperspectral Imaging: A Review. ISPRS J. Photogramm. Remote Sens. 2019, 158, 279–317. [Google Scholar] [CrossRef]
  94. Ma, L.; Liu, Y.; Zhang, X.L.; Ye, Y.X.; Yin, G.F.; Johnson, B.A. Deep learning in Remote Sensing applications: A Meta-analysis and Review. ISPRS J. Photogramm. Remote Sens. 2019, 152, 166–177. [Google Scholar] [CrossRef]
  95. Zhang, T.X.; Xu, Z.Y.; Su, J.Y.; Yang, Z.F.; Liu, C.J.; Chen, W.H.; Li, J.Y. Ir-UNet: Irregular Segmentation U-Shape Network for Wheat Yellow Rust Detection by UAV Multispectral Imagery. Remote Sens. 2021, 13, 3892. [Google Scholar] [CrossRef]
  96. Zhang, T.X.; Yang, Z.F.; Xu, Z.Y.; Li, J.Y. Wheat Yellow Rust Severity Detection by Efficient DF-UNet and UAV Multispectral Imagery. IEEE Sens. J. 2022, 22, 9057–9068. [Google Scholar] [CrossRef]
  97. Huang, H.S.; Deng, J.Z.; Lan, Y.B.; Yang, A.Q.; Zhang, L.; Wen, S.; Zhang, H.H.; Zhang, Y.L.; Deng, Y.S. Detection of Helminthosporium Leaf Blotch Disease Based on UAV Imagery. Appl. Sci. 2019, 9, 558. [Google Scholar] [CrossRef]
  98. Liu, L.Y.; Dong, Y.Y.; Huang, W.J.; Du, X.P.; Ma, H.Q. Monitoring Wheat Fusarium Head Blight Using Unmanned Aerial Vehicle Hyperspectral Imagery. Remote Sens. 2020, 12, 3811. [Google Scholar] [CrossRef]
  99. Cao, Z.; Wang, G.B. Research Progress of the Application of Unmanned Aerial Vehicle in Cotton Field Management. China South. Agric. Mach. 2021, 52, 83–84. [Google Scholar]
  100. Wang, T.Y.; Thomasson, J.A.; Yang, C.H.; Isakeit, T.; Nichols, R.L. Automatic Classification of Cotton Root Rot Disease Based on UAV Remote Sensing. Remote Sens. 2020, 12, 1310. [Google Scholar] [CrossRef]
  101. Huang, H.S.; Deng, J.Z.; Lan, Y.B.; Yang, A.; Deng, X.L.; Zhang, L.; Wen, S.; Jiang, Y.; Suo, G.Y.; Chen, P.C. A two-stage Classification Approach for the Detection of Spider Mite-infested Cotton Using UAV Multispectral Imagery. Remote Sens. Lett. 2018, 9, 933–941. [Google Scholar] [CrossRef]
  102. Guo, W.; Qiao, H.B.; Zhao, H.Q.; Zhang, J.J.; Pei, P.C.; Liu, Z.L. Cotton Aphid Damage Monitoring Using UAV Hyperspectral Data Based on Derivative of Ratio Spectroscopy. Spectrosc. Spectr. Anal. 2021, 41, 1543–1550. [Google Scholar]
  103. Song, Y. Identification of Verticillium Wilt in Cotton and Estimation of Yield Loss Based on UAV Multi-spectral Remote Sensing. Master’s Thesis, Shihezi University, Shihezi, China, 2021. [Google Scholar]
  104. Cui, M.N.; Dai, J.G.; Wang, S.H.; Zhang, G.S.; Xue, J.L. Research on Identification Method of Mite Infection Cotton Based on UAV Multi-spectral Image. Xinjiang Agric. Sci. 2018, 55, 1457–1466. [Google Scholar]
  105. Wang, T.; Thomasson, J.A.; Isakeit, T.; Yang, C.H.; Nichols, R.L. A Plant-by-Plant Method to Identify and Treat Cotton Root Rot Based on UAV Remote Sensing. Remote Sens. 2020, 12, 2453. [Google Scholar] [CrossRef]
  106. Wang, T.Y.; Thomasson, J.A.; Yang, C.H.; Isakeit, T.; Nichols, R.L.; Collett, R.M.; Han, X.Z.; Bagnall, C. Unmanned aerial vehicle Remote Sensing to delineate cotton root rot. J. Appl. Remote Sens. 2020, 14, 034522. [Google Scholar] [CrossRef]
  107. Dilixiti, Y.M.M.; Zhou, J.P.; Xu, Y.; Fan, X.P.; Yalikun, S.W.T. Cotton Pest Monitoring Based on Logistic Algorithm and Remote Sensing Image. J. South China Agric. Univ. 2022, 43, 87–95. [Google Scholar]
  108. Ding, Y.; Zhang, Y.; Wang, A.F.; Meng, M.; Shen, H. Monitoring of Bacterial Streak Disease in Rice Based on UAV Hyperspectral Method. Geomat. Spat. Inf. Technol. 2022, 45, 44–47. [Google Scholar]
  109. Lee, K.D.; Kim, S.M.; An, H.Y.; Park, C.W.; Hong, S.Y.; So, K.H.; Na, S.I. Yearly Estimation of Rice Growth and Bacterial Leaf Blight Inoculation Effect Using UAV Imagery. J. Korean Soc. Agric. Eng. 2020, 62, 75–86. [Google Scholar]
  110. Wang, Z.; Chu, G.K.; Zhang, H.J.; Liu, S.X.; Huang, X.C.; Gao, F.R.; Zhang, C.Q.; Wang, J.X. Identification of Diseased Empty Rice Panicles Based on Haar-like Feature of UAV Optical Image. Trans. Chin. Soc. Agric. Eng. (Trans. CSAE) 2018, 34, 73–82. [Google Scholar]
  111. Shen, M.; Yu, X. Research and Implementation of Unmanned Aerial Vehicle Search for Crop Diseases Based on Machine Vision: Taking Rice Sheath Blight as an Example. Wirel. Internet Technol. 2019, 16, 112–114. [Google Scholar]
  112. Wei, L.L.; Luo, Y.S.; Xu, L.Z.; Zhang, Q.; Cai, Q.B.; Shen, M.J. Deep Convolutional Neural Network for Rice Density Prescription Map at Ripening Stage Using Unmanned Aerial Vehicle-Based Remotely Sensed Images. Remote Sens. 2022, 14, 46. [Google Scholar] [CrossRef]
  113. Prajapati, H.B.; Shah, J.P.; Dabhi, V.K. Detection and Classification of Rice Plant Diseases. Intell. Decis. Technol. 2018, 11, 357–373. [Google Scholar]
  114. Rahman, C.R.; Arko, P.S.; Ali, M.E.; Khan, M.A.I.; Apon, S.H.; Nowrin, F.; Abu, W. Identification and recognition of rice diseases and pests using convolutional neural networks. Biosyst. Eng. 2020, 194, 112–120. [Google Scholar] [CrossRef]
  115. Stroppiana, D.; Villa, P.; Sona, G.; Ronchetti, G.; Candiani, G.; Pepe, M.; Busetto, L.; Migliazzi, M.; Boschetti, M. Early season weed mapping in rice crops using multi-spectral UAV data. Int. J. Remote Sens. 2018, 39, 5432–5452. [Google Scholar] [CrossRef]
  116. Huang, H.S.; Deng, J.Z.; Lan, Y.B.; Yang, A.Q.; Deng, X.L.; Wen, S.; Zhang, H.H.; Zhang, Y.L. Accurate Weed Mapping and Prescription Map Generation Based on Fully Convolutional Networks Using UAV Imagery. Sensors 2018, 18, 3299. [Google Scholar] [CrossRef]
  117. Lan, Y.B.; Huang, K.H.; Yang, C.; Lei, L.C.; Ye, J.H.; Zhang, J.L.; Zeng, W.; Zhang, Y.L.; Deng, J.Z. Real-Time Identification of Rice Weeds by UAV Low-Altitude Remote Sensing Based on Improved Semantic Segmentation Model. Remote Sens. 2021, 13, 4370. [Google Scholar] [CrossRef]
  118. Yang, Y. The Key Diagnosis Technology of Rice Blast Based on Hyperspectral Image. Ph.D. Thesis, Zhejiang University, Hangzhou, China, 2012. [Google Scholar]
  119. Verma, T.; Dubey, S. Impact of Color Spaces and Feature Sets in Automated Plant Diseases Classifier: A Comprehensive Review Based on Rice Plant Images. Arch. Comput. Methods Eng. 2020, 27, 1611–1632. [Google Scholar] [CrossRef]
  120. Liu, L.Y. Research on the Methods of Wheat Fusarium Head Blight and Powdery Mildew Monitoring Using Remote Sensing Technology at Different Scales. Ph.D. Thesis, University of the Chinese Academy of Sciences, Beijing, China, 2020. [Google Scholar] [CrossRef]
  121. Ma, H.Q. Dynamic Monitoring of Major Wheat Diseases Based on Multi-source and Multitemporal Remote Sensing Analysis. Ph.D. Thesis, Nanjing University of Information Engineering, Nanjing, China, 2020. [Google Scholar] [CrossRef]
  122. Zhang, J.C.; Pu, R.L.; Lin, Y.; Huang, W.J.; Nie, C.W.; Yang, G.J. Integrating Remotely Sensed and Meteorological Observations to Forecast Wheat Powdery Mildew at a Regional Scale. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 7, 4328–4339. [Google Scholar] [CrossRef]
  123. Lin, Y.B.; Bao, Z.Y.; Zhang, H.B.; Zhang, Y.T.; Liang, X. Habitat monitoring to evaluate crop disease and pest distributions based on multisource satellite remote sensing imagery. Optik 2017, 145, 66–73. [Google Scholar]
  124. Lan, Y.; Deng, X.; Zeng, G. Advances in diagnosis of crop diseases, pests and weeds by UAV remote sensing. Smart Agric. 2019, 1, 1–19. [Google Scholar]
  125. Yao, H.; Qin, R.; Chen, X. Unmanned aerial vehicle for remote sensing applications—A review. Remote Sens. 2019, 11, 1443. [Google Scholar] [CrossRef]
  126. Awais, M.; Li, W.; Cheema, M.J.M.; Hussain, S.; AlGarni, T.S.; Liu, C.C.; Ali, A. Remotely sensed identification of canopy characteristics using UAV-based imagery under unstable environmental conditions. Environ. Technol. Innov. 2021, 22, 101465. [Google Scholar] [CrossRef]
  127. Rubio, H.J.; Gupta, A.; Ong, Y.S. Data-driven risk Assessment and Multicriteria Optimization of UAV Operations. Aerosp. Sci. Technol. 2018, 77, 510–523. [Google Scholar] [CrossRef]
  128. Wang, C.W.; Chen, Y.C.; Xiao, Z.P.; Zeng, X.M.; Tang, S.H.; Lin, F.; Zhang, L.X.; Meng, X.L.; Liu, S.Q. Cotton Blight Identification with Ground Framed Canopy Photo-assisted Multispectral UAV Images. Agronomy 2023, 13, 1222. [Google Scholar] [CrossRef]
Figure 1. Composition of UAV-LARS system.
Figure 2. The multispectral lens of DJI Phantom 4.
Figure 3. Statistics on the number of articles in the literature related to UAV-LARS monitoring and identification of crop diseases and pests.
Figure 4. Statistics of crops (a) and main diseases and pests (b) related to UAV-LARS monitoring and identification of crop diseases and pests.
Figure 5. Statistics of RS technology related to UAV-LARS monitoring and identification of crop diseases and pests.
Figure 6. Model structure for monitoring and identifying crop diseases and pests using UAV-LARS.
Figure 7. A model for monitoring and identifying crop diseases and pests based on multi-source and multi-scale RS analysis.
Figure 8. Main software required for UAV RS data processing: (a) DJI Terra; (b) Pix4D; (c) Agisoft PhotoScan Pro; (d) ENVI.
Table 1. Methods for monitoring and identifying wheat diseases and pests using UAV-LARS.

| Stress Type | UAV-LARS System | Sensitive Features | Optimal Algorithm | GSD (cm) | References |
|---|---|---|---|---|---|
| WLR | DJI Inspire 1; DJI Inspire 2; DJI Phantom 4 Pro | Bands: Red. CIs: VEG, VARI, MGRVI, NDRDI, GLI, GI, NDI, NDVI | BPNN, LR | <1 | [22,49,77] |
| Take-all | UAV + CCD digital camera; AZUP-T8 + UHD 185 | CIs: DSI, RSI, NDSI. TFs: first and second moments of HSV | LR, SVM, RBF, PLSR | ≤1 | [76,78] |
| WSR | M100 + RedEdge camera; S1000 + RedEdge camera; UAV + UHD 185 | Bands: Red, NIR, Red-edge. CIs: SIPI, PRI, PSRI, MSR, DVIRE, GVI, NDVI, RVI, NIDVI, OSAVI. TFs: VAR2, CON2, … | LR, PLSR, RF, SVM, UNet | 1–2.5 | [75,82,84,85,95] |
| FHB | M600 + Cubero S185 Firefly SE + RedEdge camera; S1000 + UHD 185 | Bands: 478 nm, 650 nm, 702 nm, … CIs: PSRI, ARI, NRI, MCARI, PRI, PhRI, RVSI. TFs: mean, variance, homogeneity, … | LR, SVM, RF, ETC, BPNN | 4 | [83,87,90,98] |
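Most of the colour indices (CIs) listed in Table 1, such as NDVI and OSAVI, are simple per-pixel arithmetic combinations of band reflectances. The sketch below is illustrative only: it is not code from any of the reviewed studies, and the reflectance values are hypothetical.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

def osavi(nir, red, soil_adjust=0.16):
    """Optimized Soil-Adjusted Vegetation Index, with the usual 0.16 soil term."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + soil_adjust)

# Hypothetical reflectance values for one healthy and one stressed pixel.
nir = np.array([0.60, 0.35])
red = np.array([0.08, 0.20])
print(ndvi(nir, red))   # higher for the healthy pixel
print(osavi(nir, red))
```

In practice the inputs would be co-registered red and near-infrared bands of an orthomosaic exported from software such as Pix4D or DJI Terra (Figure 8), and the resulting index maps would feed classifiers of the kind listed in the "Optimal Algorithm" column.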
Table 2. Methods for monitoring and identifying cotton diseases and pests using UAV-LARS.

| Stress Type | UAV-LARS System | Sensitive Features | Optimal Algorithm | GSD (cm) | References |
|---|---|---|---|---|---|
| HLB, RLB, VM | DJI Phantom 4; MK Okto XL UAV + TetraCam ADC camera | Bands: R550, R656, R800. CIs: NGRDI, NGBDI, ExG, ExR, ExGR, GLI, DVI, … TFs: color histogram, LBPH. Others: DCnor values | Logistic regression, RF, SVM, CNN, multiple LR | 3.4–25 | [36,97,103] |
| CRR | Lancaster UAV + 1J3 camera; Tuffwing Mapper UAV + RedEdge camera; DJI M100 + RedEdge camera | Bands: Green, Red, NIR | Logistic regression, KMSVM, KMSEG, maximum likelihood | 2.5–7.64 | [100,105,106] |
| Pest stress | DJI M100 + ADC-Lite camera; DJI S1000 + Micro MCA12 Snap camera; DJI M600 + RedEdge camera; AZUP-T8 + UHD 185 | Bands: NIR, Red, Green, R514, R566, R698. CIs: NDVI, EVI, GNDVI, DVI, RGI, ACI, MACI, GRVI, TVI, RDVI, SAVI. TFs: LBPH. Others: DR514, DR566, DR698 | Logistic regression, SVM, CNN, transferred AlexNet, simple LR, PLSR | 1–4 | [101,102,104,107] |

Share and Cite
Zhao, G.; Zhang, Y.; Lan, Y.; Deng, J.; Zhang, Q.; Zhang, Z.; Li, Z.; Liu, L.; Huang, X.; Ma, J. Application Progress of UAV-LARS in Identification of Crop Diseases and Pests. Agronomy 2023, 13, 2232. https://doi.org/10.3390/agronomy13092232
