

UAV Imagery for Precision Agriculture

A special issue of Remote Sensing (ISSN 2072-4292).

Deadline for manuscript submissions: closed (31 March 2022) | Viewed by 85726

Special Issue Editor


Dr. Michael Schirrmann
Guest Editor
Leibniz Institute for Agricultural Engineering and Bioeconomy (ATB), Potsdam, Germany
Interests: precision agriculture; proximal sensing; UAV applications; spatial variability; crop monitoring; image analysis

Special Issue Information

Dear Colleagues,

Unmanned aerial vehicles (UAVs) are a potentially disruptive remote sensing technology that may strongly shape the future of farming, and specifically the implementation of precision agriculture. Because precision agriculture is foremost concerned with managing the spatio-temporal variability within fields or orchards, it sets high standards for monitoring crops over the season. In many ways, UAV imagery can fulfill these demands even better than satellite-based remote sensing or proximal sensing, because image acquisition is not limited by factors such as cloud cover or traffic lanes. Flight planning and camera sensor equipment can be closely adapted to the needs of the application. Low-altitude flight campaigns yield ultra-high-resolution imagery that resolves individual plants, species, or pests, and 3D canopy information can be estimated from overlapping imagery with structure from motion. This opens up possibilities for site-specific and selective crop treatment unprecedented in agriculture.

This Special Issue of Remote Sensing invites papers on new technological advancements in the application of UAV imagery for precision agriculture. The following topics are suggested:

  • Ultra-high resolution: Mapping on individual plant and species level with deep convolutional neural networks (e.g., weed mapping, crop disease detection).
  • Intelligent sensing: Integration of camera sensors and embedded systems on UAV platforms for online crop monitoring with optimized on-board image processing.
  • Bridging the scales: Connecting UAV imagery with proximal sensing from ground vehicle platforms or satellite-based remote sensing.
  • Third dimension: Investigating tree and field crops with 3D point cloud data estimated from UAV imagery.
  • Testing out the flexibility: Individual approaches to flight patterns, oblique imagery, novel camera sensors.
  • Lessons learned: Problems and their solutions when generating orthophotos and/or 3D point cloud data with structure from motion from UAV imagery of crop canopies.
  • Novel approaches of delineating basic agronomic parameters from UAV imagery (e.g., leaf nitrogen, leaf area index, or biomass).

Dr. Michael Schirrmann
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • UAV application
  • crop monitoring
  • photogrammetry (SfM)
  • 3D point cloud
  • deep learning
  • ultra-high resolution

Published Papers (19 papers)


Research

Jump to: Review

21 pages, 12123 KiB  
Article
Crop Monitoring Using Sentinel-2 and UAV Multispectral Imagery: A Comparison Case Study in Northeastern Germany
by Minhui Li, Redmond R. Shamshiri, Cornelia Weltzien and Michael Schirrmann
Remote Sens. 2022, 14(17), 4426; https://doi.org/10.3390/rs14174426 - 05 Sep 2022
Cited by 17 | Viewed by 4860
Abstract
Monitoring within-field crop variability at fine spatial and temporal resolution can assist farmers in making reliable decisions during their agricultural management; however, it traditionally involves a labor-intensive and time-consuming pointwise manual process. To the best of our knowledge, few studies have compared Sentinel-2 with UAV data for crop monitoring in the context of precision agriculture. Therefore, prospects of crop monitoring for characterizing biophysical plant parameters and leaf nitrogen of wheat and barley crops were evaluated from a practical viewpoint close to agricultural routines. Multispectral UAV and Sentinel-2 imagery was collected on three dates in the season and compared with reference data collected at 20 sample points for plant leaf nitrogen (N), maximum plant height, mean plant height, leaf area index (LAI), and fresh biomass. On average, UAV data correlated more strongly with the agronomic parameters than Sentinel-2 data, with a percentage increase of 6.3% for wheat and 22.2% for barley. In this regard, vegetation indices (VIs) calculated from spectral bands in the visible part of the spectrum performed worse for Sentinel-2 than for the UAV data. In addition, large-scale patterns, formed by the influence of an old riverbed on plant growth, were recognizable even in the Sentinel-2 imagery despite its much lower spatial resolution. Interestingly, smaller features, such as the tramlines from controlled traffic farming (CTF), also influenced the Sentinel-2 data and showed a systematic pattern that even affected semivariogram calculation. In conclusion, Sentinel-2 imagery is able to capture the same large-scale patterns as the more detailed UAV imagery; however, it is at the same time influenced by management-driven features such as tramlines, which cannot be accurately georeferenced. Consequently, agronomic parameters were better correlated with UAV than with Sentinel-2 data.
Crop growers as well as data providers from remote sensing services may take advantage of this knowledge, and we recommend the use of UAV data, as it gives additional information about management-driven features. Looking ahead, we advise fusing UAV imagery with Sentinel-2 imagery taken early in the season, as this can integrate the effect of agricultural management in the subsequent absence of high-spatial-resolution data, helping to improve crop monitoring for the farmer and to reduce costs. Full article
(This article belongs to the Special Issue UAV Imagery for Precision Agriculture)
Graphical abstract
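Many of the comparisons in this issue, including the study above, rest on per-pixel vegetation indices computed from multispectral reflectance. As a minimal illustrative sketch (not code from the paper), NDVI is the normalized difference of near-infrared and red reflectance:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red).

    Inputs are surface reflectances in [0, 1]; the output lies in [-1, 1].
    """
    denom = nir + red
    if denom == 0:
        return 0.0  # convention for zero-reflectance pixels
    return (nir - red) / denom

# Dense green canopy reflects strongly in NIR and absorbs red:
print(ndvi(0.50, 0.05))  # high NDVI: vigorous vegetation
print(ndvi(0.30, 0.25))  # low NDVI: sparse cover or bare soil
```

In practice the same arithmetic is applied band-wise to whole UAV orthomosaics or Sentinel-2 tiles rather than to single values.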

45 pages, 32522 KiB  
Article
Herbage Mass, N Concentration, and N Uptake of Temperate Grasslands Can Adequately Be Estimated from UAV-Based Image Data Using Machine Learning
by Ulrike Lussem, Andreas Bolten, Ireneusz Kleppert, Jörg Jasper, Martin Leon Gnyp, Jürgen Schellberg and Georg Bareth
Remote Sens. 2022, 14(13), 3066; https://doi.org/10.3390/rs14133066 - 26 Jun 2022
Cited by 11 | Viewed by 2760
Abstract
Precise and timely information on biomass yield and nitrogen uptake in intensively managed grasslands is essential for sustainable management decisions. Imaging sensors mounted on unmanned aerial vehicles (UAVs) along with photogrammetric structure-from-motion processing can provide timely data on crop traits rapidly and non-destructively with a high spatial resolution. The aim of this multi-temporal field study is to estimate aboveground dry matter yield (DMY), nitrogen concentration (N%) and uptake (Nup) of temperate grasslands from UAV-based image data using machine learning (ML) algorithms. The study is based on a two-year dataset from an experimental grassland trial. The experimental setup regarding climate conditions, N fertilizer treatments and slope yielded substantial variations in the dataset, covering a considerable amount of naturally occurring differences in the biomass and N status of grasslands in temperate regions with similar management strategies. Linear regression models and three ML algorithms, namely, random forest (RF), support vector machine (SVM), and partial least squares (PLS) regression were compared with and without a combination of both structural (sward height; SH) and spectral (vegetation indices and single bands) features. Prediction accuracy was quantified using a 10-fold 5-repeat cross-validation (CV) procedure. The results show a significant improvement in prediction accuracy when all structural and spectral features are combined, regardless of the algorithm. The PLS models were outperformed by their respective RF and SVM counterparts. At best, DMY was predicted with a median RMSECV of 197 kg ha−1, N% with a median RMSECV of 0.32%, and Nup with a median RMSECV of 7 kg ha−1. Furthermore, computationally less expensive models incorporating, e.g., only the single multispectral camera bands and SH metrics, or selected features based on variable importance achieved comparable results to the overall best models. Full article
(This article belongs to the Special Issue UAV Imagery for Precision Agriculture)
Figure 1
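The 10-fold, 5-repeat cross-validation used above to obtain median RMSECV values can be sketched in plain Python. The mean-only toy model and synthetic data below are placeholders for illustration, not the study's RF/SVM/PLS pipelines:

```python
import random
import statistics

def repeated_kfold_rmse(xs, ys, fit, predict, k=10, repeats=5, seed=0):
    """Median RMSE over repeated k-fold cross-validation (RMSECV).

    `fit(train_pairs)` returns a model; `predict(model, x)` returns a prediction.
    """
    rng = random.Random(seed)
    idx = list(range(len(xs)))
    rmses = []
    for _ in range(repeats):
        rng.shuffle(idx)                       # fresh split each repeat
        folds = [idx[i::k] for i in range(k)]  # k roughly equal folds
        for test in folds:
            held_out = set(test)
            train = [i for i in idx if i not in held_out]
            model = fit([(xs[i], ys[i]) for i in train])
            errs = [(predict(model, xs[i]) - ys[i]) ** 2 for i in test]
            rmses.append((sum(errs) / len(errs)) ** 0.5)
    return statistics.median(rmses)

# Toy example: a mean-only "model" on synthetic data (stand-in for RF/SVM/PLS).
data_x = list(range(40))
data_y = [100 + (x % 7) for x in data_x]
fit = lambda pairs: statistics.mean(y for _, y in pairs)
predict = lambda model, x: model
print(repeated_kfold_rmse(data_x, data_y, fit, predict))
```

With real estimators, `fit` and `predict` would wrap the regression library of choice; the median over the 50 fold-level RMSE values is what the abstract reports.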

18 pages, 3298 KiB  
Article
Comparative Sensitivity of Vegetation Indices Measured via Proximal and Aerial Sensors for Assessing N Status and Predicting Grain Yield in Rice Cropping Systems
by Telha H. Rehman, Mark E. Lundy and Bruce A. Linquist
Remote Sens. 2022, 14(12), 2770; https://doi.org/10.3390/rs14122770 - 09 Jun 2022
Cited by 9 | Viewed by 6773
Abstract
Reflectance-based vegetation indices can be valuable for assessing crop nitrogen (N) status and predicting grain yield. While proximal sensors have been widely studied in agriculture, there is increasing interest in utilizing aerial sensors. Given that few studies have compared aerial and proximal sensors, the objective of this study was to quantitatively compare the sensitivity of aerially sensed Normalized Difference Vegetation Index (NDVI) and Normalized Difference Red-Edge Index (NDRE) and proximally sensed NDVI for assessing total N uptake at panicle initiation (PI-NUP) and predicting grain yield in rice. Nitrogen response trials were established over a 3-year period (10 site-years) at various locations throughout the Sacramento Valley rice growing region of California. At PI, a multispectral unmanned aircraft system (UAS) was used to measure NDVIUAS and NDREUAS (average ground sampling distance: 3.7 cm pixel−1), and a proximal GreenSeeker (GS) sensor was used to record NDVIGS. To enable direct comparisons across the different indices on an equivalent numeric scale, each index was normalized by calculating the Sufficiency-Index (SI) relative to a non-N-limiting plot. Kernel density distributions indicated that NDVIUAS had a narrower range of values that were poorly differentiated compared to NDVIGS and NDREUAS. The critical PI-NUP where yields did not increase with higher PI-NUP averaged 109 kg N ha−1 (±4 kg N ha−1). The relationship between SI and PI-NUP for the NDVIUAS saturated lower than this critical PI-NUP (96 kg N ha−1), whereas NDVIGS and NDREUAS saturated at 111 and 130 kg N ha−1, respectively. This indicates that NDVIUAS was less suitable for making N management decisions at this crop stage than NDVIGS and NDREUAS. Linear mixed effects models were developed to evaluate how well each SI measured at PI was able to predict grain yield. 
The NDVIUAS was least sensitive to variation in yields as reflected by having the highest slope (2.4 Mg ha−1 per 0.1 SI). In contrast, the slopes for NDVIGS and NDREUAS were 0.9 and 1.1 Mg ha−1 per 0.1 SI, respectively, indicating greater sensitivity to yields. Altogether, these results indicate that the ability of vegetation indices to inform crop management decisions depends on the index and the measurement platform used. Both NDVIGS and NDREUAS produced measurements sensitive enough to inform N fertilizer management in this system, whereas NDVIUAS was more limited. Full article
(This article belongs to the Special Issue UAV Imagery for Precision Agriculture)
Graphical abstract
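The Sufficiency-Index normalization described above, with each index expressed relative to a non-N-limiting reference plot, is a simple ratio. A brief sketch (values are illustrative, not from the paper):

```python
def sufficiency_index(vi_plot: float, vi_reference: float) -> float:
    """Sufficiency Index: a plot's VI relative to a non-N-limiting reference.

    SI near 1.0 means the plot matches the well-fertilized reference plot;
    lower values suggest nitrogen limitation.
    """
    if vi_reference <= 0:
        raise ValueError("reference VI must be positive")
    return vi_plot / vi_reference

print(sufficiency_index(0.72, 0.80))  # plot NDVI at 90% of the reference
```

This rescaling is what puts NDVIUAS, NDREUAS, and NDVIGS on an equivalent numeric scale so their sensitivities can be compared directly.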

24 pages, 2926 KiB  
Article
Hyperspectral Indices for Predicting Nitrogen Use Efficiency in Maize Hybrids
by Monica B. Olson, Melba M. Crawford and Tony J. Vyn
Remote Sens. 2022, 14(7), 1721; https://doi.org/10.3390/rs14071721 - 02 Apr 2022
Cited by 5 | Viewed by 2468
Abstract
Enhancing the nitrogen (N) efficiency of maize hybrids is a common goal of researchers, but involves repeated field and laboratory measurements that are laborious and costly. Hyperspectral remote sensing has recently been investigated for measuring and predicting biomass, N content, and grain yield in maize. We hypothesized that vegetation indices (HSI) obtained mid-season through hyperspectral remote sensing could predict whole-plant biomass per unit of N taken up by plants (i.e., N conversion efficiency: NCE) and grain yield per unit of plant N (i.e., N internal efficiency: NIE). Our objectives were to identify the best mid-season HSI for predicting end-of-season NCE and NIE, rank hybrids by the selected HSI, and evaluate the effect of decreased spatial resolution on the HSI values and hybrid rankings. Analysis of 20 hyperspectral indices from imaging at V16/18 and R1/R2 by manned aircraft and UAVs over three site-years using mixed models showed that two indices, HBSI1 and HBS2, were predictive of NCE, and two indices, HBCI8 and HBCI9, were predictive of NIE for actual data collected from five to nine hybrids at maturity. Statistical differentiation of hybrids in their NCE or NIE performance was possible based on the models with the greatest accuracy obtained for NIE. Lastly, decreasing the spatial resolution changed the HSI values, but an effect on hybrid differentiation was not evident. Full article
(This article belongs to the Special Issue UAV Imagery for Precision Agriculture)
Figure 1

18 pages, 5387 KiB  
Article
UAV-Assisted Thermal Infrared and Multispectral Imaging of Weed Canopies for Glyphosate Resistance Detection
by Austin Eide, Cengiz Koparan, Yu Zhang, Michael Ostlie, Kirk Howatt and Xin Sun
Remote Sens. 2021, 13(22), 4606; https://doi.org/10.3390/rs13224606 - 16 Nov 2021
Cited by 12 | Viewed by 3453
Abstract
The foundation of contemporary weed management practices in many parts of the world is glyphosate. However, dependency on the effectiveness of herbicide practices has led to overuse through continuous growth of crops resistant to a single mode of action. In order to provide a cost-effective weed management strategy that does not promote glyphosate-resistant weed biotypes, differences between resistant and susceptible biotypes have to be identified accurately under field conditions. Unmanned Aerial Vehicle (UAV)-assisted thermal and multispectral remote sensing has potential for detecting biophysical characteristics of weed biotypes during the growing season, including distinguishing glyphosate-susceptible and glyphosate-resistant weed populations based on canopy temperature and deep-learning-driven weed identification algorithms. The objective of this study was to identify herbicide resistance after glyphosate application in true field conditions by analyzing the UAV-acquired thermal and multispectral response of kochia, waterhemp, redroot pigweed, and common ragweed. The data were processed in ArcGIS for raster classification as well as spectral comparison of glyphosate-resistant and glyphosate-susceptible weeds. Classification accuracy was compared across the sensors and the classification methods maximum likelihood, random trees, and Support Vector Machine (SVM). The random trees classifier performed best at 4 days after application (DAA) for kochia, with 62.9% accuracy. The maximum likelihood classifier provided the highest-performing result of all classification methods, with an accuracy of 75.2%. A commendable classification was made at 8 DAA, where the random trees classifier attained an accuracy of 87.2%. However, thermal reflectance measurements as a predictor for glyphosate resistance within weed populations under field conditions were unreliable due to their susceptibility to environmental conditions.
The Normalized Difference Vegetation Index (NDVI) and a composite reflectance of the 842 nm, 705 nm, and 740 nm wavelengths provided better classification results than the thermal data in most cases. Full article
(This article belongs to the Special Issue UAV Imagery for Precision Agriculture)
Figure 1

21 pages, 53405 KiB  
Article
Real-Time Identification of Rice Weeds by UAV Low-Altitude Remote Sensing Based on Improved Semantic Segmentation Model
by Yubin Lan, Kanghua Huang, Chang Yang, Luocheng Lei, Jiahang Ye, Jianling Zhang, Wen Zeng, Yali Zhang and Jizhong Deng
Remote Sens. 2021, 13(21), 4370; https://doi.org/10.3390/rs13214370 - 30 Oct 2021
Cited by 26 | Viewed by 3366
Abstract
Real-time analysis of UAV low-altitude remote sensing images at airborne terminals facilitates the timely monitoring of weeds in farmland. Aiming at the real-time identification of rice weeds by UAV low-altitude remote sensing, two improved identification models, MobileNetV2-UNet and FFB-BiSeNetV2, were proposed based on the semantic segmentation models U-Net and BiSeNetV2, respectively. The MobileNetV2-UNet model focuses on reducing the computational load of the original model, and the FFB-BiSeNetV2 model focuses on improving the segmentation accuracy of the original model. In this study, we first tested and compared the segmentation accuracy and operating efficiency of the models before and after the improvement on a computer platform, then transplanted the improved models to the embedded hardware platform Jetson AGX Xavier and used TensorRT to optimize the model structure to improve inference speed. Finally, the real-time segmentation performance of the two improved models on rice weeds was further verified on collected low-altitude remote sensing video data. The results show that, on the computer platform, the MobileNetV2-UNet model reduced the number of network parameters, model size, and floating-point calculations by 89.12%, 86.16%, and 92.6%, respectively, and increased inference speed by a factor of 2.77 compared with the U-Net model. The FFB-BiSeNetV2 model improved segmentation accuracy compared with the BiSeNetV2 model, achieving the highest pixel accuracy and mean Intersection over Union of 93.09% and 80.28%, respectively. On the embedded hardware platform, the optimized MobileNetV2-UNet and FFB-BiSeNetV2 models achieved inference speeds of 45.05 FPS and 40.16 FPS per image at FP16 weight precision, respectively, both meeting the performance requirements of real-time identification.
The two methods proposed in this study realize the real-time identification of rice weeds under UAV low-altitude remote sensing and provide a reference for the subsequent integration of real-time rice weed identification and precision spraying on plant protection drones. Full article
(This article belongs to the Special Issue UAV Imagery for Precision Agriculture)
Figure 1
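Pixel accuracy and mean Intersection over Union, the two segmentation metrics reported above, can both be derived from a confusion matrix. A small illustrative implementation (not the authors' evaluation code):

```python
def pixel_accuracy_and_miou(y_true, y_pred, n_classes):
    """Pixel accuracy and mean IoU from flat per-pixel label lists."""
    # Confusion matrix: rows = true class, columns = predicted class.
    cm = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        cm[t][p] += 1
    correct = sum(cm[c][c] for c in range(n_classes))
    ious = []
    for c in range(n_classes):
        tp = cm[c][c]
        fp = sum(cm[r][c] for r in range(n_classes)) - tp  # predicted c, wrong
        fn = sum(cm[c]) - tp                               # true c, missed
        union = tp + fp + fn
        if union:
            ious.append(tp / union)
    return correct / len(y_true), sum(ious) / len(ious)

# Toy two-class (background / weed) example on six pixels:
truth = [0, 0, 0, 1, 1, 1]
pred = [0, 0, 1, 1, 1, 0]
acc, miou = pixel_accuracy_and_miou(truth, pred, 2)
print(acc, miou)
```

For real segmentation maps, the label lists would be the flattened ground-truth and predicted masks of each image.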

18 pages, 6827 KiB  
Article
A Comparative Estimation of Maize Leaf Water Content Using Machine Learning Techniques and Unmanned Aerial Vehicle (UAV)-Based Proximal and Remotely Sensed Data
by Helen S. Ndlovu, John Odindi, Mbulisi Sibanda, Onisimo Mutanga, Alistair Clulow, Vimbayi G. P. Chimonyo and Tafadzwanashe Mabhaudhi
Remote Sens. 2021, 13(20), 4091; https://doi.org/10.3390/rs13204091 - 13 Oct 2021
Cited by 36 | Viewed by 4032
Abstract
Determining maize water content variability is necessary for crop monitoring and in developing early warning systems to optimise agricultural production in smallholder farms. However, spatially explicit information on maize water content, particularly in Southern Africa, remains elementary due to the shortage of efficient and affordable primary sources of suitable spatial data at a local scale. Unmanned Aerial Vehicles (UAVs), equipped with light-weight multispectral sensors, provide spatially explicit, near-real-time information for determining the maize crop water status at farm scale. Therefore, this study evaluated the utility of UAV-derived multispectral imagery and machine learning techniques in estimating maize leaf water indicators: equivalent water thickness (EWT), fuel moisture content (FMC), and specific leaf area (SLA). The results illustrated that both NIR and red-edge derived spectral variables were critical in characterising the maize water indicators on smallholder farms. Furthermore, the best models for estimating EWT, FMC, and SLA were derived from the random forest regression (RFR) algorithm with an rRMSE of 3.13%, 1%, and 3.48%, respectively. Additionally, EWT and FMC yielded the highest predictive performance and were the most optimal indicators of maize leaf water content. The findings are critical towards developing a robust and spatially explicit monitoring framework of maize water status and serve as a proxy of crop health and the overall productivity of smallholder maize farms. Full article
(This article belongs to the Special Issue UAV Imagery for Precision Agriculture)
Graphical abstract
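The rRMSE values reported above express RMSE as a percentage of the mean of the observed values, which makes accuracies comparable across water indicators with different units. A brief sketch with toy data (not the study's):

```python
def rrmse(observed, predicted):
    """Relative RMSE: RMSE as a percentage of the observed mean."""
    n = len(observed)
    rmse = (sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n) ** 0.5
    return 100.0 * rmse / (sum(observed) / n)

# Hypothetical observed vs. predicted leaf-water values:
obs = [10.0, 12.0, 11.0, 9.0]
pred = [10.5, 11.5, 11.0, 9.5]
print(rrmse(obs, pred))  # percentage error relative to the mean observation
```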

15 pages, 3220 KiB  
Article
Use of Oblique RGB Imagery and Apparent Surface Area of Plants for Early Estimation of Above-Ground Corn Biomass
by Kosal Khun, Nicolas Tremblay, Bernard Panneton, Philippe Vigneault, Etienne Lord, François Cavayas and Claude Codjia
Remote Sens. 2021, 13(20), 4032; https://doi.org/10.3390/rs13204032 - 09 Oct 2021
Cited by 7 | Viewed by 2044
Abstract
Estimating above-ground biomass in the context of fertilization management requires the monitoring of crops at early stages. Conventional remote sensing techniques make use of vegetation indices such as the normalized difference vegetation index (NDVI), but they do not exploit the high spatial resolution (ground sampling distance < 5 mm) now achievable with the introduction of unmanned aerial vehicles (UAVs) in agriculture. The aim of this study was to compare image mosaics to single images for the estimation of corn biomass and the influence of viewing angles in this estimation. Nadir imagery was captured by a high spatial resolution camera mounted on a UAV to generate orthomosaics of corn plots at different growth stages (from V2 to V7). Nadir and oblique images (30° and 45° with respect to the vertical) were also acquired from a zip line platform and processed as single images. Image segmentation was performed using the difference color index Excess Green-Excess Red, allowing for the discrimination between vegetation and background pixels. The apparent surface area of plants was then extracted and compared to biomass measured in situ. An asymptotic total least squares regression was performed and showed a strong relationship between the apparent surface area of plants and both dry and fresh biomass. Mosaics tended to underestimate the apparent surface area in comparison to single images because of radiometric degradation. It is therefore conceivable to process only single images instead of investing time and effort in acquiring and processing data for orthomosaic generation. When comparing oblique photography, an angle of 30° yielded the best results in estimating corn biomass, with a low residual standard error of orthogonal distance (RSEOD = 0.031 for fresh biomass, RSEOD = 0.034 for dry biomass). 
Since oblique imagery provides more flexibility in data acquisition with fewer constraints on logistics, this approach might be an efficient way to monitor crop biomass at early stages. Full article
(This article belongs to the Special Issue UAV Imagery for Precision Agriculture)
Graphical abstract
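The Excess Green minus Excess Red segmentation used above is commonly defined on normalized chromatic coordinates as ExG = 2g − r − b and ExR = 1.4r − g, with ExG − ExR > 0 taken as vegetation. The sketch below assumes that common formulation, which may differ in detail from the paper's implementation:

```python
def exg_minus_exr(r: float, g: float, b: float) -> float:
    """Excess Green minus Excess Red on normalized chromatic coordinates.

    ExG = 2g - r - b, ExR = 1.4r - g; pixels with a positive result are
    commonly classified as vegetation, the rest as background.
    """
    s = r + g + b
    if s == 0:
        return 0.0  # black pixel: treat as background
    r, g, b = r / s, g / s, b / s  # chromatic coordinates
    exg = 2 * g - r - b
    exr = 1.4 * r - g
    return exg - exr

print(exg_minus_exr(60, 120, 40) > 0)   # green pixel: vegetation
print(exg_minus_exr(120, 90, 70) > 0)   # brownish soil pixel: background
```

Summing the vegetation pixels of a plant mask gives the "apparent surface area" that the study regresses against biomass.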

27 pages, 5476 KiB  
Article
Improving Biomass and Grain Yield Prediction of Wheat Genotypes on Sodic Soil Using Integrated High-Resolution Multispectral, Hyperspectral, 3D Point Cloud, and Machine Learning Techniques
by Malini Roy Choudhury, Sumanta Das, Jack Christopher, Armando Apan, Scott Chapman, Neal W. Menzies and Yash P. Dang
Remote Sens. 2021, 13(17), 3482; https://doi.org/10.3390/rs13173482 - 02 Sep 2021
Cited by 25 | Viewed by 5649
Abstract
Sodic soils adversely affect crop production over extensive areas of rain-fed cropping worldwide, with particularly large areas in Australia. Crop phenotyping may assist in identifying cultivars tolerant to soil sodicity. However, studies to identify the most appropriate traits and reliable tools to assist crop phenotyping on sodic soil are limited. Hence, this study evaluated the ability of multispectral, hyperspectral, 3D point cloud, and machine learning techniques to improve estimation of biomass and grain yield of wheat genotypes grown on moderately sodic (MS) and highly sodic (HS) soil sites in northeastern Australia. While a number of studies have reported using different remote sensing approaches and crop traits to quantify crop growth, stress, and yield variation, few have combined these techniques with machine learning to improve estimation of genotypic biomass and yield, especially in constrained sodic soil environments. Close to flowering, unmanned aerial vehicle (UAV) and ground-based proximal sensing were used to obtain remote and/or proximal sensing data, while biomass yield and crop heights were also manually measured in the field. Grain yield was machine-harvested at maturity. UAV- and proximal sensing-derived spectral vegetation indices (VIs), such as the normalized difference vegetation index, optimized soil-adjusted vegetation index, and enhanced vegetation index, and crop height corresponded closely to wheat genotypic biomass and grain yields. UAV multispectral VIs were more closely associated with biomass and grain yields than the proximal sensing data. The red-green-blue (RGB) 3D point cloud technique was effective in determining crop height, which correlated slightly better with genotypic biomass and grain yield than ground-measured crop height data.
These remote sensing-derived crop traits (VIs and crop height) were further used to model wheat biomass and grain yields with machine learning algorithms (multitarget linear regression, support vector machine regression, Gaussian process regression, and artificial neural network) with different kernels to improve estimation of biomass and grain yield. The artificial neural network predicted biomass yield (R2 = 0.89; RMSE = 34.8 g/m2 for the MS and R2 = 0.82; RMSE = 26.4 g/m2 for the HS site) and grain yield (R2 = 0.88; RMSE = 11.8 g/m2 for the MS and R2 = 0.74; RMSE = 16.1 g/m2 for the HS site) with slightly less error than the others. The wheat genotypes Mitch, Corack, Mace, Trojan, Lancer, and Bremer were identified as more tolerant to sodic soil constraints than Emu Rock, Janz, Flanker, and Gladius. The study improves our ability to select appropriate traits and techniques for accurate estimation of wheat genotypic biomass and grain yields on sodic soils. This will also assist farmers in identifying cultivars tolerant to sodic soil constraints. Full article
(This article belongs to the Special Issue UAV Imagery for Precision Agriculture)
Graphical abstract

19 pages, 6293 KiB  
Article
Extraction of Areas of Rice False Smut Infection Using UAV Hyperspectral Data
by Gangqiang An, Minfeng Xing, Binbin He, Haiqi Kang, Jiali Shang, Chunhua Liao, Xiaodong Huang and Hongguo Zhang
Remote Sens. 2021, 13(16), 3185; https://doi.org/10.3390/rs13163185 - 11 Aug 2021
Cited by 9 | Viewed by 2565
Abstract
Rice false smut (RFS), caused by Ustilaginoidea virens, is a significant grain disease in rice that can reduce yield and quality. To obtain spatiotemporal change information, multitemporal hyperspectral UAV data were used in this study to determine the wavebands sensitive for RFS identification, 665–685 and 705–880 nm. Two methods were then used to extract rice false smut-infected areas, one based on spectral similarity analysis and one based on spectral and temporal characteristics. The final overall accuracies of the two methods were 74.23% and 85.19%, respectively, showing that the second method had better prediction accuracy. In addition, the classification results of both methods show that the infected areas expanded over time, which is consistent with the natural development of rice false smut and supports the validity of both methods. Full article

25 pages, 8349 KiB  
Article
Diurnal and Seasonal Mapping of Water Deficit Index and Evapotranspiration by an Unmanned Aerial System: A Case Study for Winter Wheat in Denmark
by Vita Antoniuk, Kiril Manevski, Kirsten Kørup, Rene Larsen, Inge Sandholt, Xiying Zhang and Mathias Neumann Andersen
Remote Sens. 2021, 13(15), 2998; https://doi.org/10.3390/rs13152998 - 30 Jul 2021
Cited by 4 | Viewed by 2320
Abstract
Precision irrigation is a promising method to mitigate the impacts of drought stress on crop production with the optimal use of water resources. However, the reliable assessment of plant water status has not been adequately demonstrated, and unmanned aerial systems (UAS) offer great potential for spatiotemporal improvements. This study utilized UAS equipped with multispectral and thermal sensors to detect and quantify drought stress in winter wheat (Triticum aestivum L.) using the Water Deficit Index (WDI). Biennial field experiments were conducted on coarse sand soil in Denmark and analyses were performed at both diurnal and seasonal timescales. The WDI was significantly correlated with leaf stomatal conductance (R2 = 0.61–0.73), and the correlation was weaker with leaf water potential (R2 = 0.39–0.56) and topsoil water status (the highest R2 of 0.68). A semi-physical model depicting the relationship between WDI and fraction of transpirable soil water (FTSW) in the root zone was derived with R2 = 0.74. Moreover, WDI estimates were improved using an energy balance model with an iterative scheme to estimate the net radiation and land surface temperature, as well as the dual crop coefficient. The diurnal variation in WDI revealed a pattern of the ratio of actual to potential evapotranspiration, being higher in the morning, decreasing at noon hours and ‘recovering’ in the afternoon. Future work should investigate the temporal upscaling of evapotranspiration, which may be used to develop methods for site-specific irrigation recommendations. Full article
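The WDI used above places each pixel's canopy-air temperature difference between a "wet" (fully transpiring, WDI = 0) and "dry" (non-transpiring, WDI = 1) edge of the vegetation-index/temperature trapezoid, so that WDI approximates 1 − ETa/ETp. A minimal sketch, assuming linear edges with made-up parameters:

```python
import numpy as np

def wdi(dT, vi, wet_edge, dry_edge):
    """
    Water Deficit Index: position of the measured canopy-air temperature
    difference dT between the wet (WDI=0) and dry (WDI=1) edges. The edges
    are modelled here as linear functions of a vegetation index, which is
    an assumption of this sketch, not the paper's fitted model.
    """
    dT_wet = wet_edge[0] + wet_edge[1] * vi
    dT_dry = dry_edge[0] + dry_edge[1] * vi
    return float(np.clip((dT - dT_wet) / (dT_dry - dT_wet), 0.0, 1.0))

wet = (-4.0, 1.0)    # hypothetical edge parameters (intercept, slope per VI unit)
dry = (8.0, -2.0)

vi = 0.7
dT_wet_here = wet[0] + wet[1] * vi    # -3.3 degC at this cover fraction
dT_dry_here = dry[0] + dry[1] * vi    #  6.6 degC

no_stress   = wdi(dT_wet_here, vi, wet, dry)   # on the wet edge -> 0.0
full_stress = wdi(dT_dry_here, vi, wet, dry)   # on the dry edge -> 1.0
```

In practice the edges come from an energy balance model (as in the paper) or from the envelope of observed pixels in VI/temperature space.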

30 pages, 4691 KiB  
Article
Evaluation of RGB and Multispectral Unmanned Aerial Vehicle (UAV) Imagery for High-Throughput Phenotyping and Yield Prediction in Barley Breeding
by Paul Herzig, Peter Borrmann, Uwe Knauer, Hans-Christian Klück, David Kilias, Udo Seiffert, Klaus Pillen and Andreas Maurer
Remote Sens. 2021, 13(14), 2670; https://doi.org/10.3390/rs13142670 - 07 Jul 2021
Cited by 18 | Viewed by 5033
Abstract
With advances in plant genomics, plant phenotyping has become a new bottleneck in plant breeding and the need for reliable high-throughput plant phenotyping techniques has emerged. In the face of future climatic challenges, it does not seem appropriate to continue to solely select for grain yield and a few agronomically important traits. Therefore, new sensor-based high-throughput phenotyping has been increasingly used in plant breeding research, with the potential to provide non-destructive, objective and continuous plant characterization that reveals the formation of the final grain yield and provides insights into the physiology of the plant during the growth phase. In this context, we present the comparison of two sensor systems, Red-Green-Blue (RGB) and multispectral cameras, attached to unmanned aerial vehicles (UAV), and investigate their suitability for yield prediction using different modelling approaches in a segregating barley introgression population at three environments with weekly data collection during the entire vegetation period. In addition to vegetation indices, morphological traits such as canopy height, vegetation cover and growth dynamics traits were used for yield prediction. Repeatability analyses and genotype association studies of sensor-based traits were compared with reference values from ground-based phenotyping to test the use of conventional and new traits for barley breeding. The relative height estimation of the canopy by UAV achieved high precision (up to r = 0.93) and repeatability (up to R2 = 0.98). In addition, we found a great overlap of detected significant genotypes between the reference heights and sensor-based heights. The yield prediction accuracy of both sensor systems was at the same level and reached a maximum prediction accuracy of r2 = 0.82 with a continuous increase in precision throughout the entire vegetation period. 
Due to the lower costs and the consumer-friendly handling of image acquisition and processing, the RGB imagery seems to be more suitable for yield prediction in this study. Full article
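The RGB-versus-multispectral comparison above comes down to which indices each sensor can supply: an RGB camera supports colour-only indices such as Excess Green, while NDVI needs a NIR band. A generic illustration with hypothetical per-plot reflectances:

```python
def excess_green(r, g, b):
    """ExG = 2g - r - b on chromatic (sum-normalised) coordinates; RGB-only."""
    total = r + g + b
    rn, gn, bn = r / total, g / total, b / total
    return 2 * gn - rn - bn

def ndvi(nir, red):
    """NDVI requires a NIR band, i.e. the multispectral camera."""
    return (nir - red) / (nir + red)

# Hypothetical per-plot mean reflectances
veg  = excess_green(0.10, 0.30, 0.08)   # green-dominated canopy -> clearly positive
soil = excess_green(0.30, 0.28, 0.22)   # soil-like mixture -> near zero
nd   = ndvi(0.55, 0.08)                 # dense vegetation -> high NDVI
```

Time series of such per-plot index values, together with canopy height, are the kind of features the study feeds into its yield-prediction models.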

20 pages, 8205 KiB  
Article
Machine Learning Techniques to Predict Soybean Plant Density Using UAV and Satellite-Based Remote Sensing
by Luthfan Nur Habibi, Tomoya Watanabe, Tsutomu Matsui and Takashi S. T. Tanaka
Remote Sens. 2021, 13(13), 2548; https://doi.org/10.3390/rs13132548 - 29 Jun 2021
Cited by 14 | Viewed by 4232
Abstract
The plant density of soybean is a critical factor affecting plant canopy structure and yield. Predicting the spatial variability of plant density would be valuable for improving agronomic practices. The objective of this study was to develop a model for plant density estimation using several data sets with different spatial resolutions, including unmanned aerial vehicle (UAV) imagery, PlanetScope satellite imagery, and climate data. The model establishment process comprised (1) high-throughput measurement of actual plant density from UAV imagery with the You Only Look Once version 3 (YOLOv3) object detection algorithm, which served as the response variable of the estimation models in the next step, and (2) development of regression models to estimate plant density in the extended areas using various combinations of predictors derived from PlanetScope imagery and climate data. Our results showed that the YOLOv3 model can accurately measure actual soybean plant density from UAV imagery with a root mean square error (RMSE) of 0.96 plants m−2. Furthermore, the two regression models, partial least squares and random forest (RF), successfully expanded the plant density prediction areas with RMSE values ranging from 1.78 to 3.67 plants m−2. Model refinement using the variable importance feature in RF improved the prediction accuracy to an RMSE of 1.72 plants m−2. These results demonstrate that the established model had acceptable accuracy for estimating plant density. Although the model often could not capture the within-field spatial variability of soybean plant density, the predicted values were sufficient for informing the field-specific status. Full article
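Turning object detections into a plant density requires deduplicating overlapping boxes first, typically with non-maximum suppression. This is a generic numpy sketch of that counting step, with made-up boxes and an assumed 0.5 IoU threshold, not the study's YOLOv3 pipeline itself:

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def nms(boxes, scores, thr=0.5):
    """Keep only the highest-scoring box of each overlapping group."""
    order = np.argsort(scores)[::-1]
    keep = []
    while len(order):
        i = order[0]
        keep.append(int(i))
        order = order[1:][[iou(boxes[i], boxes[j]) < thr for j in order[1:]]]
    return keep

# Two detections of the same soybean seedling plus one distinct plant
boxes = np.array([[0, 0, 10, 10], [1, 1, 11, 11], [50, 50, 60, 60]], float)
scores = np.array([0.9, 0.8, 0.95])

plants = nms(boxes, scores)
density = len(plants) / 4.0   # plants per m^2 for a hypothetical 4 m^2 plot
```

The per-plot densities produced this way are what the abstract's regression models then extrapolate from satellite and climate predictors.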

19 pages, 8084 KiB  
Article
Spatio-Temporal Estimation of Biomass Growth in Rice Using Canopy Surface Model from Unmanned Aerial Vehicle Images
by Clement Oppong Peprah, Megumi Yamashita, Tomoaki Yamaguchi, Ryo Sekino, Kyohei Takano and Keisuke Katsura
Remote Sens. 2021, 13(12), 2388; https://doi.org/10.3390/rs13122388 - 18 Jun 2021
Cited by 12 | Viewed by 2726
Abstract
Awareness of spatial and temporal variations in site-specific crop parameters, such as aboveground biomass (total dry weight, TDW), plant length (PL) and leaf area index (LAI), helps in formulating appropriate management decisions. However, conventional monitoring methods rely on time-consuming manual field operations. In this study, the feasibility of using an unmanned aerial vehicle (UAV)-based remote sensing approach for monitoring rice growth was evaluated using a digital surface model (DSM). Approximately 160 images of paddy fields were captured during each UAV survey campaign over two vegetation seasons. The canopy surface model (CSM) was developed from the differences between each DSM and the first DSM after transplanting. Mean canopy height (CH) was used as the predictor variable in the estimation models for LAI and TDW. The mean CSM of the mesh covering several hills was sufficient to explain PL (R2 = 0.947). The TDW and LAI prediction accuracy of the model was high (relative RMSE of 20.8% and 28.7%, and RMSE of 0.76 m2 m−2 and 141.4 g m−2, respectively) across the rice varieties studied (R2 = 0.937 (Basmati370), 0.837 (Nipponbare and IR64) for TDW, and 0.894 (Basmati370), 0.866 (Nipponbare and IR64) for LAI). The results of this study support the benefits of DSM-derived CH for predicting biomass development. In addition, LAI and TDW could be estimated temporally and spatially using the UAV-based CSM, which is not easily affected by weather conditions. Full article
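The CSM construction described is a per-pixel subtraction: each later DSM minus the first DSM after transplanting, with mean canopy height per mesh cell then fed into a height-to-biomass regression. A tiny sketch (the 2×2 grids and the linear coefficients are illustrative assumptions, not the paper's values):

```python
import numpy as np

# Digital surface models (metres): right after transplanting vs mid-season.
# Real DSMs come from structure-from-motion photogrammetry of UAV imagery.
dsm_start = np.array([[10.0, 10.1],
                      [10.0, 10.2]])
dsm_now   = np.array([[10.6, 10.9],
                      [10.5, 11.0]])

csm = dsm_now - dsm_start          # canopy surface model = height above start
mean_ch = float(csm.mean())        # mean canopy height for this mesh cell

# A linear height-to-biomass model of the kind fitted in such studies
# (slope and intercept here are hypothetical, not the paper's coefficients)
tdw = 180.0 * mean_ch + 20.0       # total dry weight, g m^-2
```

Because the same differencing works on every survey date, the time series of `mean_ch` gives the spatio-temporal biomass growth the title refers to.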

32 pages, 30917 KiB  
Article
Investigating the Potential of a Newly Developed UAV-Mounted VNIR/SWIR Imaging System for Monitoring Crop Traits—A Case Study for Winter Wheat
by Alexander Jenal, Hubert Hüging, Hella Ellen Ahrends, Andreas Bolten, Jens Bongartz and Georg Bareth
Remote Sens. 2021, 13(9), 1697; https://doi.org/10.3390/rs13091697 - 27 Apr 2021
Cited by 13 | Viewed by 3182
Abstract
UAV-based multispectral multi-camera systems are widely used in scientific research for non-destructive crop traits estimation to optimize agricultural management decisions. These systems typically provide data from the visible and near-infrared (VNIR) domain. However, several key absorption features related to biomass and nitrogen (N) are located in the short-wave infrared (SWIR) domain. Therefore, this study investigates a novel multi-camera system prototype that addresses this spectral gap with a sensitivity from 600 to 1700 nm by implementing dedicated bandpass filter combinations to derive application-specific vegetation indices (VIs). In this study, two VIs, GnyLi and NRI, were applied using data obtained on a single observation date at a winter wheat field experiment located in Germany. Ground truth data were destructively sampled for the entire growing season. Likewise, crop heights were derived from UAV-based RGB image data using an improved approach developed within this study. Based on these variables, regression models were derived to estimate fresh and dry biomass, crop moisture, N concentration, and N uptake. The relationships between the NIR/SWIR-based VIs and the estimated crop traits were successfully evaluated (R2: 0.57 to 0.66). Both VIs were further validated against the sampled ground truth data (R2: 0.75 to 0.84). These results indicate the imaging system’s potential for monitoring crop traits in agricultural applications, but further multitemporal validations are needed. Full article
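Both indices in the abstract are simple band-ratio computations. The formulations below follow those commonly cited for GnyLi (bands near 900, 955, 1050 and 1220 nm; Gnyp et al.) and NRI (874 and 1225 nm); verify the exact band centres against the paper before reuse, and note the reflectance values are hypothetical.

```python
def gnyli(r900, r955, r1050, r1220):
    """GnyLi index from four NIR/SWIR reflectances (Gnyp et al. formulation)."""
    num = r900 * r1050 - r955 * r1220
    den = r900 * r1050 + r955 * r1220
    return num / den

def nri(r874, r1225):
    """Normalised Ratio Index from a NIR and a SWIR band."""
    return (r874 - r1225) / (r874 + r1225)

# Hypothetical canopy reflectances for a well-developed wheat canopy
g = gnyli(0.45, 0.43, 0.44, 0.30)
n = nri(0.45, 0.30)
```

Both indices exploit SWIR water-absorption features, which is why the prototype camera's 600–1700 nm sensitivity matters for the biomass and N estimates reported.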

28 pages, 19593 KiB  
Article
New Orthophoto Generation Strategies from UAV and Ground Remote Sensing Platforms for High-Throughput Phenotyping
by Yi-Chun Lin, Tian Zhou, Taojun Wang, Melba Crawford and Ayman Habib
Remote Sens. 2021, 13(5), 860; https://doi.org/10.3390/rs13050860 - 25 Feb 2021
Cited by 22 | Viewed by 4336
Abstract
Remote sensing platforms have become an effective data acquisition tool for digital agriculture. Imaging sensors onboard unmanned aerial vehicles (UAVs) and tractors are providing unprecedented high-geometric-resolution data for several crop phenotyping activities (e.g., canopy cover estimation, plant localization, and flowering date identification). Among potential products, orthophotos play an important role in agricultural management. Traditional orthophoto generation strategies suffer from several artifacts (e.g., double mapping, excessive pixilation, and seamline distortions). The above problems are more pronounced when dealing with mid- to late-season imagery, which is often used for establishing flowering date (e.g., tassel and panicle detection for maize and sorghum crops, respectively). In response to these challenges, this paper introduces new strategies for generating orthophotos that are conducive to the straightforward detection of tassels and panicles. The orthophoto generation strategies are valid for both frame and push-broom imaging systems. The target function of these strategies is striking a balance between the improved visual appearance of tassels/panicles and their geolocation accuracy. The new strategies are based on generating a smooth digital surface model (DSM) that maintains the geolocation quality along the plant rows while reducing double mapping and pixilation artifacts. Moreover, seamline control strategies are applied to avoid having seamline distortions at locations where the tassels and panicles are expected. The quality of generated orthophotos is evaluated through visual inspection as well as quantitative assessment of the degree of similarity between the generated orthophotos and original images. Several experimental results from both UAV and ground platforms show that the proposed strategies do improve the visual quality of derived orthophotos while maintaining the geolocation accuracy at tassel/panicle locations. Full article

21 pages, 4927 KiB  
Article
Vegetation Indices Data Clustering for Dynamic Monitoring and Classification of Wheat Yield Crop Traits
by Stefano Marino and Arturo Alvino
Remote Sens. 2021, 13(4), 541; https://doi.org/10.3390/rs13040541 - 03 Feb 2021
Cited by 17 | Viewed by 4060
Abstract
Monitoring the spatial and temporal variability of yield crop traits using remote sensing techniques is the basis for the correct adoption of precision farming. Vegetation index images are mainly associated with yield and yield-related physiological traits, although quick and sound strategies for classifying areas of plants with homogeneous agronomic crop traits remain to be explored. A classification technique based on remote sensing spectral information analysis was performed to discriminate between wheat cultivars. The study analyzes the ability of the cluster method, applied to data from three vegetation indices (VIs) collected by high-resolution UAV at three crop stages (seedling, tillering, and flowering), to detect the yield and yield component dynamics of seven durum wheat cultivars. Ground truth data were grouped according to the identified clusters for VI cluster validation. The yield variability recorded in the field at harvest ranged from 2.55 to 7.90 t. At seedling, the VI clusters already showed a notable ability to detect areas with different yield potential (5.88 t ha−1 for the first cluster, 4.22 t ha−1 for the fourth). At tillering, the clusters struggled in particular to differentiate the less productive areas (5.66 t ha−1 for cluster 1 and 4.74, 4.31, and 4.66 t ha−1 for clusters 2, 3, and 4, respectively). At flowering, an excellent ability to group areas of equal yield was recorded for cluster 1 (6.44 t ha−1), followed by cluster 2 (5.6 t ha−1), cluster 3 (4.31 t ha−1), and cluster 4 (3.85 t ha−1). Agronomic crop traits, cultivars, and environmental variability were analyzed. The multiple use of VIs improved the sensitivity of k-means clustering for a new image segmentation strategy. 
The cluster method can be considered an effective and simple tool for the dynamic monitoring and assessment of agronomic traits in open field wheat crops. Full article
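The k-means clustering of multi-VI data described above can be sketched in a few lines of numpy. The per-plot VI values below are synthetic; real inputs would be the UAV-derived index maps at one crop stage.

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Plain k-means: cluster plots by their vegetation-index vectors.
    Centers are initialised on evenly spaced samples for determinism."""
    centers = X[:: max(1, len(X) // k)][:k].astype(float).copy()
    for _ in range(iters):
        # Assign each plot to its nearest center, then recompute centers
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for c in range(k):
            if (labels == c).any():
                centers[c] = X[labels == c].mean(0)
    return labels, centers

# Hypothetical per-plot values of three VIs at one crop stage
rng = np.random.default_rng(1)
low  = rng.normal([0.3, 0.2, 0.25], 0.02, size=(30, 3))   # low-vigour plots
high = rng.normal([0.8, 0.7, 0.75], 0.02, size=(30, 3))   # high-vigour plots
X = np.vstack([low, high])

labels, centers = kmeans(X, k=2)
```

Ground-truth yields grouped by `labels`, exactly as in the study's a posteriori validation, then show whether each cluster corresponds to a distinct yield level.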

Review

23 pages, 786 KiB  
Review
Automatic Identification and Monitoring of Plant Diseases Using Unmanned Aerial Vehicles: A Review
by Krishna Neupane and Fulya Baysal-Gurel
Remote Sens. 2021, 13(19), 3841; https://doi.org/10.3390/rs13193841 - 25 Sep 2021
Cited by 70 | Viewed by 11480
Abstract
Disease diagnosis is one of the major tasks for increasing food production in agriculture. Although precision agriculture (PA) takes less time and provides a more precise application of agricultural activities, the detection of disease using an Unmanned Aerial System (UAS) is a challenging task. Several Unmanned Aerial Vehicles (UAVs) and sensors have been used for this purpose. The UAVs’ platforms and their peripherals have their own limitations in accurately diagnosing plant diseases. Several types of image processing software are available for vignetting correction and orthorectification. The training and validation of datasets are important characteristics of data analysis. Currently, different algorithms and architectures of machine learning models are used to classify and detect plant diseases. These models help in image segmentation and feature extraction to interpret results. Researchers also use the values of vegetation indices, such as the Normalized Difference Vegetation Index (NDVI), Crop Water Stress Index (CWSI), etc., acquired from different multispectral and hyperspectral sensors, fitted to statistical models to deliver results. There are still various gaps in the automatic detection of plant diseases, as imaging sensors are limited by their spectral bandwidth, resolution, background noise of the image, etc. The future of crop health monitoring using UAVs should include a gimbal carrying multiple sensors, large datasets for training and validation, the development of site-specific irradiance systems, and so on. This review briefly highlights the advantages of the automatic detection of plant diseases for growers. Full article
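Of the indices the review cites, CWSI in its standard empirical (Idso-style) form positions the measured canopy-air temperature difference between a well-watered and a non-transpiring baseline. A minimal sketch with hypothetical baseline values:

```python
def cwsi(dT, dT_lower, dT_upper):
    """
    Empirical Crop Water Stress Index: 0 at the well-watered (lower)
    baseline, 1 at the non-transpiring (upper) baseline. dT is canopy
    minus air temperature, in the same units as the baselines.
    """
    return (dT - dT_lower) / (dT_upper - dT_lower)

# Hypothetical baselines for a given air vapour pressure deficit
lower, upper = -3.0, 5.0

stressed = cwsi(3.0, lower, upper)    # canopy much warmer than baseline
watered  = cwsi(-3.0, lower, upper)   # sitting on the well-watered baseline
```

In practice the lower baseline is a linear function of vapour pressure deficit fitted per crop, which is one reason the review stresses site-specific calibration.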

21 pages, 3492 KiB  
Review
Reference Measurements in Developing UAV Systems for Detecting Pests, Weeds, and Diseases
by Jere Kaivosoja, Juho Hautsalo, Jaakko Heikkinen, Lea Hiltunen, Pentti Ruuttunen, Roope Näsi, Oiva Niemeläinen, Madis Lemsalu, Eija Honkavaara and Jukka Salonen
Remote Sens. 2021, 13(7), 1238; https://doi.org/10.3390/rs13071238 - 24 Mar 2021
Cited by 18 | Viewed by 5692
Abstract
The development of UAV (unmanned aerial vehicle) imaging technologies for precision farming applications is rapid, and new studies are published frequently. In cases where measurements are based on aerial imaging, ground truth or reference data are needed in order to develop reliable applications. However, in several precision farming use cases, such as pest, weed, and disease detection, the reference data can be subjective or relatively difficult to capture. Furthermore, the collection of reference data is usually laborious and time-consuming, and it appears difficult to develop generalisable solutions for these areas. This review examines previous research on the detection and mapping of pests, weeds, and diseases using UAV imaging in the precision farming context, with emphasis on the reference measurement techniques applied. The majority of the reviewed studies utilised subjective visual observations of UAV images, and only a few applied in situ measurements. The conclusion of the review is that there is a lack of quantitative and repeatable reference data measurement solutions in the areas of mapping pests, weeds, and diseases. In addition, the results that the studies present should be interpreted in light of the references applied. A future option could be the use of synthetic data as reference. Full article
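The review's concern about subjective visual reference data can be made concrete: when two observers label the same plots, their agreement is commonly quantified with Cohen's kappa, which discounts agreement expected by chance. The labels below are hypothetical.

```python
from collections import Counter

def cohens_kappa(a, b):
    """Chance-corrected agreement between two annotators' label sequences."""
    assert len(a) == len(b)
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                    # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] * cb[k] for k in set(a) | set(b)) / n ** 2     # chance agreement
    return (po - pe) / (1 - pe)

# Two observers labelling the same 10 plots as weed-infested (1) or clean (0)
obs1 = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
obs2 = [1, 1, 0, 0, 1, 0, 0, 1, 0, 1]

kappa = cohens_kappa(obs1, obs2)
```

A low kappa between observers signals exactly the kind of subjective, hard-to-repeat reference data the review argues should be replaced by quantitative in situ or synthetic references.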
