UAS-Remote Sensing Methods for Mapping, Monitoring and Modeling Crops

A special issue of Remote Sensing (ISSN 2072-4292). This special issue belongs to the section "Environmental Remote Sensing".

Deadline for manuscript submissions: closed (15 September 2020) | Viewed by 31952

Printed Edition Available!
A printed edition of this Special Issue is available here.

Special Issue Editors


Guest Editor
Engineering Department, School of Science and Technology, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
Interests: UAV; image processing algorithms (RGB, NIR, multi- and hyperspectral, thermal and LiDAR sensors); InSAR; precision agriculture; precision forestry

Special Issue Information

Dear Colleagues,

Advances in UASs have attracted considerable interest from the agricultural community. Near-continuous observation with RGB, multispectral, hyperspectral, thermal, and/or LiDAR sensors, recording data at ultra-high spatial resolution, enables not only the mapping but also the monitoring and modeling of crop characteristics, thereby improving farmers' decision-making. Methods developed to process and analyze UAS data in agricultural scenarios have already advanced practice and will continue to support the important work of the agricultural community.

This Special Issue includes original and innovative manuscripts demonstrating the use of UASs for remote sensing in agricultural areas. Papers were selected for publication based on the quality and rigor of the research and of the manuscript. Specific topics include, but are not limited to:

  • UAS-based RGB imaging in agriculture;
  • UAS-based multispectral imaging in agriculture;
  • UAS-based hyperspectral imaging in agriculture;
  • UAS-based thermal imaging in agriculture;
  • UAS-based laser scanning in agriculture;
  • Multitemporal analysis;
  • Artificial intelligence in remote sensing;
  • Accuracy and precision evaluations of UAS-based techniques;
  • Integration of UAS data with ground-based data or other measurements;
  • Precision agriculture applications.

Prof. Dr. Francisco Javier Mesas Carrascosa
Prof. Dr. Joaquim João Moreira de Sousa
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers are published continuously in the journal (as soon as accepted) and are listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Accuracy and precision
  • Agriculture
  • Ecosystem
  • Hyperspectral
  • Laser scanning
  • Multispectral
  • Phenology
  • Object extraction
  • Plant physiology
  • Plant structure
  • RGB
  • Spatiotemporal patterns
  • Thermal
  • UAV
  • UAS
  • Vegetation
  • Vegetation indices
  • Vitality

Published Papers (8 papers)


Editorial


3 pages, 163 KiB  
Editorial
UAS-Remote Sensing Methods for Mapping, Monitoring and Modeling Crops
by Francisco Javier Mesas-Carrascosa
Remote Sens. 2020, 12(23), 3873; https://doi.org/10.3390/rs12233873 - 26 Nov 2020
Cited by 5 | Viewed by 1992
Abstract
The advances in Unmanned Aerial Vehicle (UAV) platforms and on-board sensors in the past few years have greatly increased our ability to monitor and map crops. The ability to register images at ultra-high spatial resolution at any moment has made remote sensing techniques increasingly useful in crop management. These technologies have revolutionized the way remote sensing is applied in precision agriculture, allowing decisions to be made in a matter of days instead of weeks. However, further research is still needed to maximize the potential of UAV remote sensing in agriculture. This Special Issue of Remote Sensing covers a range of UAV remote sensing applications for crop management, including RGB, multispectral, hyperspectral, and Light Detection and Ranging (LiDAR) sensors on board UAVs. The papers present innovative techniques for image and point cloud analysis. It should be emphasized, however, that this Special Issue is only a small sample of UAV applications in agriculture and that much remains to be investigated.
(This article belongs to the Special Issue UAS-Remote Sensing Methods for Mapping, Monitoring and Modeling Crops)

Research


35 pages, 16037 KiB  
Article
Multi-Temporal Predictive Modelling of Sorghum Biomass Using UAV-Based Hyperspectral and LiDAR Data
by Ali Masjedi, Melba M. Crawford, Neal R. Carpenter and Mitchell R. Tuinstra
Remote Sens. 2020, 12(21), 3587; https://doi.org/10.3390/rs12213587 - 1 Nov 2020
Cited by 23 | Viewed by 4449
Abstract
High-throughput phenotyping using high spatial, spectral, and temporal resolution remote sensing (RS) data has become a critical part of the plant breeding chain, focused on reducing the time and cost of selecting the "best" genotypes with respect to the trait(s) of interest. In this paper, the potential of accurate and reliable sorghum biomass prediction using visible and near-infrared (VNIR) and short-wave infrared (SWIR) hyperspectral data, as well as light detection and ranging (LiDAR) data acquired by sensors mounted on UAV platforms, is investigated. Predictive models are developed using classical regression-based machine learning methods for nine experiments conducted during the 2017 and 2018 growing seasons at the Agronomy Center for Research and Education (ACRE) at Purdue University, Indiana, USA. The impact of the regression method, data source, timing of RS and field-based biomass reference data acquisition, and the number of samples on the prediction results is investigated. R2 values for end-of-season biomass ranged from 0.64 to 0.89 across experiments when features from all the data sources were included. Geometry-based features derived from the LiDAR point cloud to characterize plant structure and chemistry-based features extracted from hyperspectral data provided the most accurate predictions. Evaluation of the impact of the time of data acquisition on the prediction results indicated that although the most accurate and reliable predictions of final biomass were achieved using remotely sensed data from mid-season to end-of-season, mid-season predictions were adequate to differentiate between promising varieties for selection. An analysis of variance (ANOVA) of the accuracies of the predictive models showed that both the data source and the regression method are important factors for a reliable prediction; however, the data source was the more important, with 69% significance versus 28% for the regression method.
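As a rough illustration of this kind of regression-based biomass modelling, the sketch below fits a ridge regression to synthetic plot-level features standing in for hyperspectral and LiDAR-derived predictors. All data, dimensions, and names here are made up for illustration; the paper's actual feature extraction and model comparison are far more involved.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-ins: 90 plots, 10 hyperspectral band features plus
# 3 LiDAR height metrics (e.g., canopy height percentiles).
n_plots = 90
hyperspectral = rng.normal(size=(n_plots, 10))
lidar_height = rng.normal(size=(n_plots, 3))
X = np.hstack([hyperspectral, lidar_height])

# Simulated end-of-season biomass, driven mostly by structure (LiDAR),
# mirroring the paper's finding that geometry features matter most.
biomass = (2.0 * lidar_height[:, 0]
           + 0.5 * hyperspectral[:, 2]
           + rng.normal(scale=0.3, size=n_plots))

model = Ridge(alpha=1.0)
scores = cross_val_score(model, X, biomass, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")
```

Cross-validated R2, as printed above, is the same kind of figure of merit the paper reports per experiment.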

19 pages, 3410 KiB  
Article
Wavelength Selection Method Based on Partial Least Square from Hyperspectral Unmanned Aerial Vehicle Orthomosaic of Irrigated Olive Orchards
by Antonio Santos-Rufo, Francisco-Javier Mesas-Carrascosa, Alfonso García-Ferrer and Jose Emilio Meroño-Larriva
Remote Sens. 2020, 12(20), 3426; https://doi.org/10.3390/rs12203426 - 19 Oct 2020
Cited by 22 | Viewed by 3373
Abstract
Identifying and mapping irrigated areas is essential for a variety of applications such as agricultural planning and water resource management. Irrigated plots are mainly identified using supervised classification of multispectral images from satellite or manned aerial platforms. Recently, hyperspectral sensors on board Unmanned Aerial Vehicles (UAV) have proven to be useful analytical tools in agriculture due to their high spectral resolution. However, few efforts have been made to identify which wavelengths provide relevant information in specific scenarios. In this study, hyperspectral reflectance data from a UAV were used to compare the performance of several wavelength selection methods based on Partial Least Square (PLS) regression, with the purpose of discriminating two irrigation systems commonly used in olive orchards. The tested PLS methods include filter methods (Loading Weights, Regression Coefficient and Variable Importance in Projection); wrapper methods (Genetic Algorithm-PLS, Uninformative Variable Elimination-PLS, Backward Variable Elimination-PLS, Sub-window Permutation Analysis-PLS, Iterative Predictive Weighting-PLS, Regularized Elimination Procedure-PLS, Backward Interval-PLS, Forward Interval-PLS and Competitive Adaptive Reweighted Sampling-PLS); and an embedded method (Sparse-PLS). In addition, two non-PLS based methods, Lasso and Boruta, were also used. Linear Discriminant Analysis and nonlinear K-Nearest Neighbors techniques were established for identification and assessment. The results indicate that wavelength selection methods commonly used in other disciplines are also useful in remote sensing for agronomic purposes, the identification of irrigation systems being one such example. These PLS and non-PLS based methods can also play an important role in multivariate analysis and subsequent model building. Of all the methods evaluated, Genetic Algorithm-PLS and Boruta eliminated nearly 90% of the original spectral wavelengths acquired from a hyperspectral sensor on board a UAV while increasing the identification accuracy of the classification.

20 pages, 6442 KiB  
Article
UAV-Borne LiDAR Crop Point Cloud Enhancement Using Grasshopper Optimization and Point Cloud Up-Sampling Network
by Jian Chen, Zichao Zhang, Kai Zhang, Shubo Wang and Yu Han
Remote Sens. 2020, 12(19), 3208; https://doi.org/10.3390/rs12193208 - 1 Oct 2020
Cited by 9 | Viewed by 2723
Abstract
Because of the low accuracy and density of crop point clouds obtained by Unmanned Aerial Vehicle (UAV)-borne Light Detection and Ranging (LiDAR) scanning systems, an integrated navigation and positioning optimization method based on the grasshopper optimization algorithm (GOA) and a point cloud density enhancement method are proposed. First, a global positioning system (GPS)/inertial navigation system (INS) integrated navigation and positioning information fusion method based on a Kalman filter was constructed. The GOA was then employed to find the optimal solution by iterating over the system noise covariance matrix Q and the measurement noise covariance matrix R of the Kalman filter. By feeding the optimal solution into the Kalman filter, the error variance of longitude was reduced from 0.0091 to 0.00046, and the error variance of latitude was reduced from 0.0047 to 0.00034. Based on the integrated navigation, a UAV-borne LiDAR scanning system was built to acquire the crop point cloud. During offline processing, the crop point cloud was filtered and transformed into WGS-84, and a density clustering algorithm improved by particle swarm optimization (PSO) was employed for cluster segmentation. After segmentation, a pre-trained Point Cloud Up-sampling Network (PU-net) was used to enhance the density of the point cloud data and to carry out three-dimensional reconstruction. The features of the crop point cloud were preserved through the reconstruction, while its density was quadrupled.
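The Kalman-filter tuning step described above can be sketched in miniature: a 1D constant-velocity filter whose process and measurement noise parameters (q, r) are chosen to minimise the position error variance. A small grid search stands in for the grasshopper optimizer, and all signals are synthetic; the paper's GPS/INS state model is of course richer.

```python
import numpy as np

def run_kf(z, q, r, dt=0.1):
    """Constant-velocity Kalman filter over measurements z; returns position estimates."""
    F = np.array([[1.0, dt], [0.0, 1.0]])               # state transition
    H = np.array([[1.0, 0.0]])                          # position-only measurement
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])                 # process noise covariance
    R = np.array([[r]])                                 # measurement noise covariance
    x, P, out = np.zeros((2, 1)), np.eye(2), []
    for zk in z:
        x = F @ x                                       # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                             # update
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[zk]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0, 0])
    return np.array(out)

rng = np.random.default_rng(2)
t = np.arange(0, 20, 0.1)
truth = 0.5 * t                                         # true position track
z = truth + rng.normal(scale=0.5, size=t.size)          # noisy "GPS" fixes

# Grid search over (q, r) as a simple stand-in for the grasshopper
# optimizer: pick the pair that minimises position error variance.
best = min(((q, r) for q in (0.01, 0.1, 1.0) for r in (0.1, 0.25, 1.0)),
           key=lambda qr: np.var(run_kf(z, *qr) - truth))
print("selected (q, r):", best)
```

The same idea scales to the full Q and R matrices of a GPS/INS fusion filter, which is where a metaheuristic like the GOA earns its keep over exhaustive search.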

18 pages, 30221 KiB  
Article
Automatic Grapevine Trunk Detection on UAV-Based Point Cloud
by Juan M. Jurado, Luís Pádua, Francisco R. Feito and Joaquim J. Sousa
Remote Sens. 2020, 12(18), 3043; https://doi.org/10.3390/rs12183043 - 17 Sep 2020
Cited by 30 | Viewed by 3988
Abstract
The optimisation of vineyard management requires efficient and automated methods able to identify individual plants. In the last few years, Unmanned Aerial Vehicles (UAVs) have become one of the main sources of remote sensing information for Precision Viticulture (PV) applications. In fact, high-resolution UAV-based imagery offers a unique capability for modelling plant structure, making possible the recognition of significant geometrical features in photogrammetric point clouds. Despite the proliferation of innovative technologies in viticulture, the identification of individual grapevines still relies on image-based segmentation techniques, in which grapevine and non-grapevine features are separated and individual plants are estimated, usually assuming a fixed distance between them. In this study, an automatic method for grapevine trunk detection using 3D point cloud data is presented. The proposed method focuses on the recognition of key geometrical parameters to ensure the existence of every plant in the 3D model. The method was tested in different commercial vineyards; to push it to its limit, a vineyard characterised by several missing plants along the vine rows, irregular distances between plants, and trunks occluded by dense vegetation in some areas was also used. The proposed method represents a departure from the state of the art, being able to identify individual trunks, posts, and missing plants based on the interpretation and analysis of a 3D point cloud. Moreover, a validation process was carried out, allowing us to conclude that the method performs very well, especially when applied to 3D point clouds generated in phases in which the leaves are not yet very dense (January to May). However, with correct flight parametrization, the method remains effective throughout the entire vegetative cycle.
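A minimal sketch of the geometric idea — trunks are thin, dense, near-vertical structures below the canopy — is to slice the point cloud by height and cluster the remaining points in the ground plane. The point cloud below is synthetic, and the slice height and clustering parameters are illustrative rather than the paper's.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(3)

# Synthetic vine row: three trunks at x = 0, 2, 4 m (illustrative layout).
trunks = []
for x0 in (0.0, 2.0, 4.0):
    n = 200
    trunks.append(np.column_stack([
        x0 + rng.normal(scale=0.03, size=n),   # x: thin vertical trunk
        rng.normal(scale=0.03, size=n),        # y
        rng.uniform(0.0, 0.6, size=n),         # z: trunk height band
    ]))
canopy = np.column_stack([
    rng.uniform(-0.5, 4.5, size=2000),
    rng.normal(scale=0.3, size=2000),
    rng.uniform(0.8, 1.8, size=2000),          # canopy sits above the trunks
])
cloud = np.vstack(trunks + [canopy])

# Keep only the low height slice where trunks live, then cluster in XY.
slice_pts = cloud[cloud[:, 2] < 0.7]
labels = DBSCAN(eps=0.15, min_samples=20).fit_predict(slice_pts[:, :2])
n_trunks = len(set(labels)) - (1 if -1 in labels else 0)
print("detected trunk candidates:", n_trunks)
```

On real photogrammetric clouds the hard part is exactly what the abstract highlights: occlusion by dense foliage, which thins out the trunk slice and motivates flying earlier in the season.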

21 pages, 7638 KiB  
Article
Evaluation of Rapeseed Winter Crop Damage Using UAV-Based Multispectral Imagery
by Łukasz Jełowicki, Konrad Sosnowicz, Wojciech Ostrowski, Katarzyna Osińska-Skotak and Krzysztof Bakuła
Remote Sens. 2020, 12(16), 2618; https://doi.org/10.3390/rs12162618 - 13 Aug 2020
Cited by 17 | Viewed by 4084
Abstract
This research is related to the exploitation of multispectral imagery from an unmanned aerial vehicle (UAV) in the assessment of damage to rapeseed after winter. Such damage is one of a few cases for which reimbursement may be claimed in agricultural insurance. Since direct measurements are difficult in such cases, mainly because of large, hard-to-reach areas, it is important to be able to use remote sensing to assess the plant surface affected by frost damage. In this experiment, UAV images were taken using a Sequoia multispectral camera that collected data in four spectral bands: green, red, red-edge, and near-infrared. Data were acquired from three altitudes above the ground, which resulted in different ground sampling distances. Various vegetation indices calculated from the four spectral bands were used in the experiment: normalized difference vegetation index (NDVI), normalized difference vegetation index—red edge (NDVI_RE), optimized soil adjusted vegetation index (OSAVI), optimized soil adjusted vegetation index—red edge (OSAVI_RE), soil adjusted vegetation index (SAVI), and soil adjusted vegetation index—red edge (SAVI_RE). Selected vegetation indices were then used to classify the areas qualifying for reimbursement due to frost damage. The negative influence of visible technical roads was demonstrated and eliminated using object-based image analysis (OBIA) to select and remove roads from the images chosen for classification. Detection of damaged areas was performed using three different approaches, one object-based and two pixel-based. Different ground sampling distances and different vegetation indices were tested within the experiment, which demonstrated the possibility of using modern low-altitude UAV photogrammetry with a multispectral sensor in applications related to agriculture. The tests showed that detection using UAV-based multispectral data can be a successful alternative to direct field measurements for estimating the area of winterkill damage. The best results were achieved using OSAVI and NDVI on images with a ground sampling distance (GSD) of 10 cm, with an overall classification accuracy of 95% and an F1-score of 0.87. Results for other flight settings and vegetation indices were also promising.
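The six indices tested in the paper have standard definitions, sketched below for Sequoia-style band rasters: NDVI-family indices use the red band, the `_RE` variants substitute the red-edge band, `L = 0.5` is the usual SAVI soil-adjustment factor, and 0.16 is the OSAVI constant. The toy reflectance values are illustrative.

```python
import numpy as np

def vegetation_indices(red, red_edge, nir, L=0.5):
    """Standard NDVI/SAVI/OSAVI and their red-edge variants."""
    eps = 1e-9  # guard against division by zero on dark pixels
    return {
        "NDVI":     (nir - red) / (nir + red + eps),
        "NDVI_RE":  (nir - red_edge) / (nir + red_edge + eps),
        "SAVI":     (1 + L) * (nir - red) / (nir + red + L + eps),
        "SAVI_RE":  (1 + L) * (nir - red_edge) / (nir + red_edge + L + eps),
        "OSAVI":    (nir - red) / (nir + red + 0.16 + eps),
        "OSAVI_RE": (nir - red_edge) / (nir + red_edge + 0.16 + eps),
    }

# Toy 2x2 reflectance patches (0-1 scale).
nir = np.array([[0.60, 0.50], [0.40, 0.30]])
red = np.array([[0.10, 0.20], [0.20, 0.25]])
red_edge = np.array([[0.30, 0.30], [0.30, 0.28]])

idx = vegetation_indices(red, red_edge, nir)
print(np.round(idx["NDVI"], 3))
```

Frost-damaged canopy lowers NIR reflectance and raises red reflectance, so all six indices drop over damaged patches; the paper's classification step then separates those pixels from healthy crop.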

23 pages, 99013 KiB  
Article
Crop Row Detection through UAV Surveys to Optimize On-Farm Irrigation Management
by Giulia Ronchetti, Alice Mayer, Arianna Facchi, Bianca Ortuani and Giovanna Sona
Remote Sens. 2020, 12(12), 1967; https://doi.org/10.3390/rs12121967 - 18 Jun 2020
Cited by 25 | Viewed by 5437
Abstract
Climate change and competition among water users are increasingly leading to a reduction of water availability for irrigation; at the same time, traditionally non-irrigated crops require irrigation to achieve high quality standards. In the context of precision agriculture, particular attention is given to the optimization of on-farm irrigation management, based on knowledge of the within-field variability of crop and soil properties, to increase crop yield quality and ensure efficient water use. Unmanned Aerial Vehicle (UAV) imagery is used in precision agriculture to monitor crop variability, but in the case of row crops, image post-processing is required to separate crop rows from soil background and weeds. This study focuses on crop row detection and extraction from images acquired by a UAV during the 2018 cropping season. Thresholding algorithms, classification algorithms, and Bayesian segmentation are tested and compared on three different crop types, namely grapevine, pear, and tomato, to analyze the suitability of these methods with respect to the characteristics of each crop. The obtained results are promising, with overall accuracy greater than 90% and producer's accuracy over 85% for the class "crop canopy". The methods' performances vary according to the crop type, input data, and parameters used. Some important outcomes can be pointed out from our study: NIR information does not add any particular value, and RGB sensors should be preferred for identifying crop rows; the presence of shadows in the inter-row spacing may affect crop detection in vineyards. Finally, the best methodologies to adopt for practical applications are discussed.
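As a toy version of the thresholding family of methods the paper tests, the sketch below computes an Excess Green index on a synthetic RGB ortho-patch and separates crop rows from soil with Otsu's method. The image, row geometry, and parameters are made up; the paper's pipelines (including Bayesian segmentation) are more elaborate.

```python
import numpy as np

def excess_green(rgb):
    """Excess Green index on chromatic coordinates: 2g - r - b."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = r + g + b + 1e-9
    return (2 * g - r - b) / total

def otsu_threshold(values, bins=256):
    """Otsu's method: the threshold maximising between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    hist = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(hist)                      # class-0 weight per candidate
    w1 = 1 - w0
    cum_mean = np.cumsum(hist * centers)
    mu0 = cum_mean / np.maximum(w0, 1e-12)
    mu1 = (cum_mean[-1] - cum_mean) / np.maximum(w1, 1e-12)
    between = w0 * w1 * (mu0 - mu1) ** 2
    return centers[np.argmax(between)]

rng = np.random.default_rng(4)
# Synthetic ortho-patch: green crop rows every 8 columns on brown soil.
img = np.empty((64, 64, 3))
img[...] = [0.4, 0.3, 0.2]                    # soil background
img[:, ::8, :] = [0.2, 0.6, 0.2]              # crop rows
img += rng.normal(scale=0.02, size=img.shape)

exg = excess_green(img)
mask = exg > otsu_threshold(exg.ravel())
print("canopy fraction:", round(mask.mean(), 3))
```

The binary canopy mask is the starting point for row extraction; the road-removal and OBIA steps in papers like this one operate on exactly such masks.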

22 pages, 14426 KiB  
Article
Multi-Temporal Unmanned Aerial Vehicle Remote Sensing for Vegetable Mapping Using an Attention-Based Recurrent Convolutional Neural Network
by Quanlong Feng, Jianyu Yang, Yiming Liu, Cong Ou, Dehai Zhu, Bowen Niu, Jiantao Liu and Baoguo Li
Remote Sens. 2020, 12(10), 1668; https://doi.org/10.3390/rs12101668 - 22 May 2020
Cited by 36 | Viewed by 5071
Abstract
Vegetable mapping from remote sensing imagery is important for precision agricultural activities such as automated pesticide spraying. Multi-temporal unmanned aerial vehicle (UAV) data has the merits of both very high spatial resolution and useful phenological information, which shows great potential for accurate vegetable classification, especially under complex and fragmented agricultural landscapes. In this study, an attention-based recurrent convolutional neural network (ARCNN) is proposed for accurate vegetable mapping from multi-temporal UAV red-green-blue (RGB) imagery. The proposed model first utilizes a multi-scale deformable CNN to learn and extract rich spatial features from UAV data. The extracted features are then fed into an attention-based recurrent neural network (RNN), from which the sequential dependency between multi-temporal features can be established. Finally, the aggregated spatial-temporal features are used to predict the vegetable category. Experimental results show that the proposed ARCNN yields a high performance, with an overall accuracy of 92.80%. Compared with mono-temporal classification, the incorporation of multi-temporal UAV imagery boosts accuracy by 24.49% on average, which supports the hypothesis that the low spectral resolution of RGB imagery can be compensated by the inclusion of multi-temporal observations. In addition, the attention-based RNN outperforms other feature fusion methods such as feature stacking, and the deformable convolution operation yields higher classification accuracy than a standard convolution unit. The results demonstrate that the ARCNN provides an effective way of extracting and aggregating discriminative spatial-temporal features for vegetable mapping from multi-temporal UAV RGB imagery.
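The attention-based temporal aggregation at the core of the ARCNN can be illustrated in miniature: per-date feature vectors are scored against an attention vector, and the softmax of the scores weights their sum, in contrast to plain feature stacking. The feature dimensions and the attention vector below are random stand-ins for the learned quantities in the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(5)

# Per-date features for one pixel block: 5 acquisition dates x 16 dims
# (in the paper these would come from the deformable CNN backbone).
features = rng.normal(size=(5, 16))

# An attention vector scores each date; the softmax of the scores
# gives the temporal weights used for aggregation.
attn_vector = rng.normal(size=16)
scores = features @ attn_vector          # one relevance score per date
weights = softmax(scores)                # sums to 1 across dates
aggregated = weights @ features          # weighted sum: a single 16-dim vector

print(weights.round(3), aggregated.shape)
```

Because the weights are data-dependent, phenologically informative dates can dominate the aggregate, which is the intuition behind attention outperforming fixed stacking in this setting.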
