
Advances of Remote Sensing in Precision Agriculture

A special issue of Remote Sensing (ISSN 2072-4292). This special issue belongs to the section "AI Remote Sensing".

Deadline for manuscript submissions: closed (30 September 2022) | Viewed by 30821

Special Issue Editors


Dr. Ganesh Bora
Guest Editor
National Program Leader, Advanced Technology in Agriculture and Data Science, National Institute of Food and Agriculture (NIFA), United States Department of Agriculture (USDA), Kansas City, MO 64105, USA
Interests: precision agriculture; remote sensing and UAS; data science; AI and machine learning; robotics

Dr. Dharmendra Saraswat
Guest Editor
Agricultural and Biological Engineering, Purdue University, 225 S University Ave., West Lafayette, IN 47907, USA
Interests: ICT; spatial and temporal modeling; UAS remote sensing; farm data processing

Special Issue Information

Dear Colleagues,

Information and data are key to digital and precision agriculture. Collecting quality data and transforming them into a useful format are essential for optimizing input resources, increasing agricultural production, and mitigating climate change by reducing greenhouse gas (GHG) emissions. Remote sensing plays a significant role in providing geospatial and temporal data for sustainable crop and animal management systems. Recent advances in remote sensing platforms, such as unmanned aerial systems (UAS), have generated big data that have opened the path for Artificial Intelligence (AI) and Machine Learning (ML) in digital and precision agriculture.

This Special Issue of Remote Sensing addresses broad aspects of non-invasive sensing for data collection to promote automation and precision crop and livestock farming.

Dr. Ganesh Bora
Dr. Dharmendra Saraswat
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Precision Agriculture
  • Remote Sensing
  • UAV/UAS
  • Internet of Things (IoT)
  • Artificial Intelligence (AI) and Machine Learning
  • Geospatial Technology
  • Robotics

Published Papers (10 papers)


Research


18 pages, 6856 KiB  
Article
Feasibility of Early Yield Prediction per Coffee Tree Based on Multispectral Aerial Imagery: Case of Arabica Coffee Crops in Cauca-Colombia
by Julian Bolaños, Juan Carlos Corrales and Liseth Viviana Campo
Remote Sens. 2023, 15(1), 282; https://doi.org/10.3390/rs15010282 - 03 Jan 2023
Cited by 1 | Viewed by 2276
Abstract
Crop yield is an important factor for evaluating production processes and determining the profitability of growing coffee. Frequently, the total number of coffee beans per area unit is estimated manually by physically counting the coffee cherries, the branches, or the flowers. However, estimating yield requires an investment in time and work, so it is not usual for small producers. This paper studies a non-intrusive and attainable alternative to predicting coffee crop yield through multispectral aerial images. The proposal is designed for small low-tech producers monitored by capturing aerial photos with a MapIR camera on an unmanned aerial vehicle. This research shows how to predict yields in the early stages of the coffee tree productive cycle, such as at flowering by using aerial imagery. Physical and spectral descriptors were evaluated as predictors for yield prediction models. The results showed correlations between the selected predictors and 370 yield samples of a Colombian Arabica coffee crop. The coffee tree volume, the Normalized Difference Vegetation Index (NDVI), and the Coffee Ripeness Index (CRI) showed the highest values with 71%, 55%, and 63%, respectively. Further, these predictors were used as the inputs for regression models to analyze their precision in predicting coffee crop yield. The validation stage concluded that Linear Regression and Stochastic Descending Gradient Regression were better models with determination coefficient values of 56% and 55%, respectively, which are promising for predicting yield. Full article
(This article belongs to the Special Issue Advances of Remote Sensing in Precision Agriculture)
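As a rough illustration of the kind of pipeline this abstract describes (not the authors' code), the minimal sketch below computes NDVI from red and near-infrared reflectance, combines it with a tree-volume descriptor, and fits the two regression families the abstract names; all data, values, and variable names are assumptions.

```python
# Hypothetical sketch: NDVI + tree volume regressed against per-tree yield with
# Linear Regression and stochastic-gradient-descent regression (scikit-learn).
# The data are synthetic stand-ins, not the Cauca-Colombia measurements.
import numpy as np
from sklearn.linear_model import LinearRegression, SGDRegressor
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trees = 370                                  # sample count mentioned in the abstract
red = rng.uniform(0.05, 0.25, n_trees)         # red-band reflectance per crown (assumed)
nir = rng.uniform(0.30, 0.60, n_trees)         # near-infrared reflectance per crown (assumed)
volume = rng.uniform(0.5, 3.0, n_trees)        # coffee tree volume in m^3 (assumed)

ndvi = (nir - red) / (nir + red)               # Normalized Difference Vegetation Index
X = np.column_stack([volume, ndvi])
y = 2.0 * volume + 3.0 * ndvi + rng.normal(0.0, 0.3, n_trees)   # synthetic yield proxy

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
linear = LinearRegression().fit(X_tr, y_tr)
sgd = make_pipeline(StandardScaler(), SGDRegressor(max_iter=5000, random_state=0)).fit(X_tr, y_tr)

print("LinearRegression R^2:", round(linear.score(X_te, y_te), 3))
print("SGDRegressor R^2:", round(sgd.score(X_te, y_te), 3))
```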

23 pages, 4043 KiB  
Article
Evaluation of the Use of UAV-Derived Vegetation Indices and Environmental Variables for Grapevine Water Status Monitoring Based on Machine Learning Algorithms and SHAP Analysis
by Hsiang-En Wei, Miles Grafton, Mike Bretherton, Matthew Irwin and Eduardo Sandoval
Remote Sens. 2022, 14(23), 5918; https://doi.org/10.3390/rs14235918 - 23 Nov 2022
Cited by 3 | Viewed by 1765
Abstract
Monitoring and management of grapevine water status (GWS) over the critical period between flowering and veraison plays a significant role in producing grapes of premium quality. Although unmanned aerial vehicles (UAVs) can provide efficient mapping across the entire vineyard, most commercial UAV-based multispectral sensors do not contain a shortwave infrared band, which makes the monitoring of GWS problematic. The goal of this study is to explore whether and which of the ancillary variables (vegetation characteristics, temporal trends, weather conditions, and soil/terrain data) may improve the accuracy of GWS estimation using multispectral UAV and provide insights into the contribution, in terms of direction and intensity, for each variable contributing to GWS variation. UAV-derived vegetation indices, slope, elevation, apparent electrical conductivity (ECa), weekly or daily weather parameters, and day of the year (DOY) were tested and regressed against stem water potential (Ψstem), measured by a pressure bomb, and used as a proxy for GWS using three machine learning algorithms (elastic net, random forest regression, and support vector regression). Shapley Additive exPlanations (SHAP) analysis was used to assess the relationship between selected variables and Ψstem. The results indicate that the root mean square error (RMSE) of the transformed chlorophyll absorption reflectance index-based model improved from 213 to 146 kPa when DOY and elevation were included as ancillary inputs. RMSE of the excess green index-based model improved from 221 to 138 kPa when DOY, elevation, slope, ECa, and daily average windspeed were included as ancillary inputs. The support vector regression best described the relationship between Ψstem and selected predictors. This study has provided proof of the concept for developing GWS estimation models that potentially enhance the monitoring capacities of UAVs for GWS, as well as providing individual GWS mapping at the vineyard scale. This may enable growers to improve irrigation management, leading to controlled vegetative growth and optimized berry quality. Full article
(This article belongs to the Special Issue Advances of Remote Sensing in Precision Agriculture)
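A minimal sketch of the modelling-plus-SHAP step outlined here, assuming synthetic inputs: a random forest regresses stem water potential on a few UAV-derived indices and ancillary variables, and SHAP values rank each variable's contribution. The feature names, the use of the shap package, and the data are illustrative assumptions, not the study's implementation.

```python
# Hypothetical sketch: random forest for stem water potential (kPa) from
# vegetation indices + ancillary variables, with SHAP-based attribution.
# Requires scikit-learn and shap; all data below are synthetic stand-ins.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 200
X = pd.DataFrame({
    "tcari":     rng.uniform(0.1, 0.9, n),     # transformed chlorophyll absorption reflectance index
    "exg":       rng.uniform(0.0, 0.5, n),     # excess green index
    "doy":       rng.integers(1, 100, n),      # day of year (assumed seasonal window)
    "elevation": rng.uniform(20.0, 120.0, n),  # metres above sea level
    "eca":       rng.uniform(1.0, 30.0, n),    # apparent soil electrical conductivity
})
# Synthetic stem water potential: more negative means drier vines (illustration only).
y = -400.0 + 300.0 * X["tcari"] - 2.0 * X["doy"] + rng.normal(0.0, 40.0, n)

model = RandomForestRegressor(n_estimators=300, random_state=1).fit(X, y)
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Mean absolute SHAP value per feature gives a global ranking of contributions.
ranking = pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns)
print(ranking.sort_values(ascending=False))
```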

19 pages, 15925 KiB  
Article
GeoDLS: A Deep Learning-Based Corn Disease Tracking and Location System Using RTK Geolocated UAS Imagery
by Aanis Ahmad, Varun Aggarwal, Dharmendra Saraswat, Aly El Gamal and Gurmukh S. Johal
Remote Sens. 2022, 14(17), 4140; https://doi.org/10.3390/rs14174140 - 23 Aug 2022
Cited by 11 | Viewed by 2074
Abstract
Deep learning-based solutions for precision agriculture have recently achieved promising results. Deep learning has been used to identify crop diseases at the initial stages of disease development in an effort to create effective disease management systems. However, the use of deep learning and unmanned aerial system (UAS) imagery to track the spread of diseases, identify diseased regions within cornfields, and notify users with actionable information remains a research gap. Therefore, in this study, high-resolution, UAS-acquired, real-time kinematic (RTK) geotagged, RGB imagery at an altitude of 12 m above ground level (AGL) was used to develop the Geo Disease Location System (GeoDLS), a deep learning-based system for tracking diseased regions in corn fields. UAS images (resolution 8192 × 5460 pixels) were acquired in cornfields located at Purdue University’s Agronomy Center for Research and Education (ACRE), using a DJI Matrice 300 RTK UAS mounted with a 45-megapixel DJI Zenmuse P1 camera during corn stages V14 to R4. A dataset of 5076 images was created by splitting the UAS-acquired images using tile and simple linear iterative clustering (SLIC) segmentation. For tile segmentation, the images were split into tiles of sizes 250 × 250 pixels, 500 × 500 pixels, and 1000 × 1000 pixels, resulting in 1804, 1112, and 570 image tiles, respectively. For SLIC segmentation, 865 and 725 superpixel images were obtained using compactness (m) values of 5 and 10, respectively. Five deep neural network architectures, VGG16, ResNet50, InceptionV3, DenseNet169, and Xception, were trained to identify diseased, healthy, and background regions in corn fields. DenseNet169 identified diseased, healthy, and background regions with the highest testing accuracy of 100.00% when trained on images of tile size 1000 × 1000 pixels. Using a sliding window approach, the trained DenseNet169 model was then used to calculate the percentage of diseased regions present within each UAS image. Finally, the RTK geolocation information for each image was used to update users with the location of diseased regions with an accuracy of within 2 cm through a web application, a smartphone application, and email notifications. The GeoDLS could be a potential tool for an automated disease management system to track the spread of crop diseases, identify diseased regions, and provide actionable information to the users. Full article
(This article belongs to the Special Issue Advances of Remote Sensing in Precision Agriculture)
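To make the tile-and-classify idea concrete, here is a heavily simplified, hypothetical sketch: a UAS frame is cut into 1000 × 1000 pixel tiles, each tile is scored with a DenseNet169-based classifier, and the fraction of tiles labelled "diseased" is reported. The classifier head, class ordering, untrained weights, and synthetic frame are assumptions; the published system also covers SLIC superpixels, RTK geolocation, and user notifications, which are omitted here.

```python
# Hypothetical sketch of tile-based disease mapping with DenseNet169 (Keras).
# Untrained weights and a random frame are used purely for illustration.
import numpy as np
import tensorflow as tf

TILE = 1000                                       # tile edge in pixels, as in the study
CLASSES = ["background", "diseased", "healthy"]   # assumed class ordering

def build_model(n_classes: int = len(CLASSES)) -> tf.keras.Model:
    base = tf.keras.applications.DenseNet169(include_top=False, weights=None,
                                              input_shape=(TILE, TILE, 3), pooling="avg")
    out = tf.keras.layers.Dense(n_classes, activation="softmax")(base.output)
    return tf.keras.Model(base.input, out)

def tiles(img: np.ndarray, tile: int = TILE):
    """Yield non-overlapping tile arrays from a (H, W, 3) image."""
    h, w = img.shape[:2]
    for r in range(0, h - tile + 1, tile):
        for c in range(0, w - tile + 1, tile):
            yield img[r:r + tile, c:c + tile]

def diseased_fraction(model: tf.keras.Model, img: np.ndarray) -> float:
    batch = np.stack(list(tiles(img))).astype("float32")
    batch = tf.keras.applications.densenet.preprocess_input(batch)
    preds = model.predict(batch, verbose=0).argmax(axis=1)
    return float(np.mean(preds == CLASSES.index("diseased")))

model = build_model()
# Small synthetic frame for illustration; the study's frames are 8192 x 5460 pixels.
frame = np.random.randint(0, 255, size=(2000, 3000, 3), dtype=np.uint8)
print("fraction of tiles classified as diseased:", diseased_fraction(model, frame))
```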

20 pages, 5646 KiB  
Article
Prediction of the Nitrogen, Phosphorus and Potassium Contents in Grape Leaves at Different Growth Stages Based on UAV Multispectral Remote Sensing
by Xuelian Peng, Dianyu Chen, Zhenjiang Zhou, Zhitao Zhang, Can Xu, Qing Zha, Fang Wang and Xiaotao Hu
Remote Sens. 2022, 14(11), 2659; https://doi.org/10.3390/rs14112659 - 02 Jun 2022
Cited by 18 | Viewed by 3188
Abstract
The rapid and accurate acquisition of nitrogen, phosphorus and potassium nutrient contents in grape leaves is critical for improving grape yields and quality and for industrial development. In this study, crop growth was non-destructively monitored based on unmanned aerial vehicle (UAV) remote sensing technology. Three irrigation levels (W1, W2 and W3) and four fertilization levels (F3, F2, F1 and F0) were set in this study, and drip irrigation fertilization treatments adopted a complete block design. A correlation analysis was conducted using UAV multispectral image data obtained from 2019 to 2021 and the field-measured leaf nitrogen content (LNC), leaf potassium content (LKC) and leaf phosphorus content (LPC) values; from the results, the vegetation indices (VIs) that were sensitive to LNC, LKC and LPC were determined. By combining spectral indices with partial least squares (PLS), random forest (RF), support vector machine (SVM) and extreme learning machine (ELM) machine-learning algorithms, prediction models were established. Finally, the optimal combinations of spectral variables and machine learning models for predicting LNC, LPC and LKC in each grape growth period were determined. The results showed that: (1) there were high demands for nitrogen during the new shoot growth and flowering periods, potassium was the main nutrient absorbed in the fruit expansion period, and phosphorus was the main nutrient absorbed in the veraison and maturity periods; (2) combining multiple spectral variables with the RF, SVM and ELM models could result in improved LNC, LPC and LKC predictions. The optimal prediction model determination coefficient (R2) derived during the new shoot growth period was above 0.65, and that obtained during the other growth periods was above 0.75. The relative root mean square error (RRMSE) of the above models was below 0.20, and the Willmott consistency index (WIA) was above 0.88. In conclusion, UAV multispectral images have good application effects when predicting nutrient contents in grape leaves. This study can provide technical support for accurate vineyard nutrient management using UAV platforms. Full article
(This article belongs to the Special Issue Advances of Remote Sensing in Precision Agriculture)
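For readers unfamiliar with the accuracy measures quoted here, the short sketch below implements R2, relative RMSE (RRMSE), and the Willmott index of agreement (WIA) from their standard definitions; the sample values are invented, not study data.

```python
# Standard-definition sketch of the three accuracy metrics quoted above.
# The observed/predicted values are invented examples, not study data.
import numpy as np

def r2(obs: np.ndarray, pred: np.ndarray) -> float:
    """Coefficient of determination."""
    return 1.0 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rrmse(obs: np.ndarray, pred: np.ndarray) -> float:
    """Root mean square error relative to the observed mean."""
    return np.sqrt(np.mean((obs - pred) ** 2)) / obs.mean()

def wia(obs: np.ndarray, pred: np.ndarray) -> float:
    """Willmott index of agreement, bounded by 0 (poor) and 1 (perfect)."""
    num = np.sum((obs - pred) ** 2)
    den = np.sum((np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return 1.0 - num / den

obs = np.array([2.8, 3.1, 3.5, 2.9, 3.3, 3.0])    # e.g., measured leaf N content (%)
pred = np.array([2.7, 3.2, 3.4, 3.0, 3.2, 3.1])   # corresponding model predictions
print(f"R2 = {r2(obs, pred):.3f}, RRMSE = {rrmse(obs, pred):.3f}, WIA = {wia(obs, pred):.3f}")
```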

17 pages, 6829 KiB  
Article
Combining Spectral and Textural Information from UAV RGB Images for Leaf Area Index Monitoring in Kiwifruit Orchard
by Youming Zhang, Na Ta, Song Guo, Qian Chen, Longcai Zhao, Fenling Li and Qingrui Chang
Remote Sens. 2022, 14(5), 1063; https://doi.org/10.3390/rs14051063 - 22 Feb 2022
Cited by 25 | Viewed by 3813
Abstract
The use of a fast and accurate unmanned aerial vehicle (UAV) digital camera platform to estimate leaf area index (LAI) of kiwifruit orchard is of great significance for growth, yield estimation, and field management. LAI, as an ideal parameter for estimating vegetation growth, plays a significant role in reflecting crop physiological process and ecosystem function. At present, LAI estimation mainly focuses on winter wheat, corn, soybean, and other food crops; in addition, LAI on forest research is also predominant, but there are few studies on the application of orchards such as kiwifruit. Concerning this study, high-resolution UAV images of three growth stages of kiwifruit orchard were acquired from May to July 2021. The extracted significantly correlated spectral and textural parameters were used to construct univariate and multivariate regression models with LAI measured for corresponding growth stages. The optimal model was selected for LAI estimation and mapping by comparing the stepwise regression (SWR) and random forest regression (RFR). Results showed the model combining texture features was superior to that only based on spectral indices for the prediction accuracy of the modeling set, with the R2 of 0.947 and 0.765, RMSE of 0.048 and 0.102, and nRMSE of 7.99% and 16.81%, respectively. Moreover, the RFR model (R2 = 0.972, RMSE = 0.035, nRMSE = 5.80%) exhibited the best accuracy in estimating LAI, followed by the SWR model (R2 = 0.765, RMSE = 0.102, nRMSE = 16.81%) and univariate linear regression model (R2 = 0.736, RMSE = 0.108, nRMSE = 17.84%). It was concluded that the estimation method based on UAV spectral parameters combined with texture features can provide an effective method for kiwifruit growth process monitoring. It is expected to provide scientific guidance and practical methods for the kiwifruit management in the field for low-cost UAV remote sensing technology to realize large area and high-quality monitoring of kiwifruit growth, thus providing a theoretical basis for kiwifruit growth investigation. Full article
(This article belongs to the Special Issue Advances of Remote Sensing in Precision Agriculture)
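As a loose illustration of combining a colour index with grey-level co-occurrence (GLCM) texture statistics for LAI estimation, the sketch below uses scikit-image and scikit-learn on synthetic patches; the feature set and data are assumptions and far simpler than the paper's spectral and textural variables.

```python
# Hypothetical sketch: excess-green colour index + GLCM texture features from
# RGB canopy patches, fed to a random forest to estimate LAI. Synthetic data only.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19
from sklearn.ensemble import RandomForestRegressor

def texture_features(gray_patch: np.ndarray) -> list:
    """GLCM contrast, homogeneity, energy, and correlation for one 8-bit patch."""
    glcm = graycomatrix(gray_patch, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return [graycoprops(glcm, p)[0, 0] for p in
            ("contrast", "homogeneity", "energy", "correlation")]

rng = np.random.default_rng(2)
X, y = [], []
for _ in range(60):                                   # 60 hypothetical sample plots
    rgb = rng.integers(0, 255, size=(64, 64, 3), dtype=np.uint8)
    r, g, b = [rgb[..., i].astype(float) for i in range(3)]
    exg = float(np.mean(2 * g - r - b) / 255.0)       # excess-green colour index
    gray = rgb.mean(axis=2).astype(np.uint8)
    X.append([exg, *texture_features(gray)])
    y.append(1.5 + 2.0 * exg + rng.normal(0, 0.1))    # synthetic LAI values

model = RandomForestRegressor(n_estimators=200, random_state=2).fit(X, y)
print("in-sample R^2:", round(model.score(X, y), 3))
```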

21 pages, 21095 KiB  
Article
Exploratory Analysis on Pixelwise Image Segmentation Metrics with an Application in Proximal Sensing
by Paul Melki, Lionel Bombrun, Estelle Millet, Boubacar Diallo, Hakim ElChaoui ElGhor and Jean-Pierre Da Costa
Remote Sens. 2022, 14(4), 996; https://doi.org/10.3390/rs14040996 - 18 Feb 2022
Cited by 2 | Viewed by 3308
Abstract
A considerable number of metrics can be used to evaluate the performance of machine learning algorithms. While much work is dedicated to the study and improvement of data quality and models’ performance, much less research is focused on the study of these evaluation metrics, their intrinsic relationship, the interplay of the influence among the metrics, the models, the data, and the environments and conditions in which they are to be applied. While some works have been conducted on general machine learning tasks such as classification, fewer efforts have been dedicated to more complex problems such as object detection and image segmentation, in which the evaluation of performance can vary drastically depending on the objectives and domains of application. Working in an agricultural context, specifically on the problem of the automatic detection of plants in proximal sensing images, we studied twelve evaluation metrics that we used to evaluate three image segmentation models recently presented in the literature. After a unified presentation of these metrics, we carried out an exploratory analysis of their relationships using a correlation analysis, a clustering of variables, and two factorial analyses (namely principal component analysis and multiple factorial analysis). We distinguished three groups of highly linked metrics and, through visual inspection of the representative images of each group, identified the aspects of segmentation that each group evaluates. The aim of this exploratory analysis was to provide some clues to practitioners for understanding and choosing the metrics that are most relevant to their agricultural task. Full article
(This article belongs to the Special Issue Advances of Remote Sensing in Precision Agriculture)
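A toy sketch of the kind of analysis the abstract describes, under obvious assumptions: four common pixelwise metrics (the paper studies twelve) are computed over synthetic mask pairs, and their correlation matrix plus a two-component PCA show how the metrics group.

```python
# Hypothetical sketch: compute pixelwise segmentation metrics over many mask
# pairs, then examine how the metrics co-vary (correlation matrix + PCA).
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA

def metrics(gt: np.ndarray, pred: np.ndarray) -> dict:
    gt, pred = gt.astype(bool), pred.astype(bool)
    tp = np.logical_and(gt, pred).sum()
    fp = np.logical_and(~gt, pred).sum()
    fn = np.logical_and(gt, ~pred).sum()
    tn = np.logical_and(~gt, ~pred).sum()
    return {
        "iou": tp / (tp + fp + fn),
        "dice": 2 * tp / (2 * tp + fp + fn),
        "pixel_acc": (tp + tn) / gt.size,
        "recall": tp / (tp + fn),
    }

rng = np.random.default_rng(3)
rows = []
for _ in range(100):                       # 100 hypothetical image/prediction pairs
    gt = rng.random((64, 64)) < 0.3        # ground-truth plant mask
    noise = rng.random((64, 64)) < 0.1
    pred = np.logical_xor(gt, noise)       # imperfect predicted mask
    rows.append(metrics(gt, pred))

df = pd.DataFrame(rows)
print(df.corr().round(2))                  # pairwise metric correlations
pca = PCA(n_components=2).fit(df)
print("variance explained by 2 PCs:", pca.explained_variance_ratio_.round(2))
```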

17 pages, 3754 KiB  
Article
Machine Learning in Evaluating Multispectral Active Canopy Sensor for Prediction of Corn Leaf Nitrogen Concentration and Yield
by Razieh Barzin, Hossein Lotfi, Jac J. Varco and Ganesh C. Bora
Remote Sens. 2022, 14(1), 120; https://doi.org/10.3390/rs14010120 - 28 Dec 2021
Cited by 14 | Viewed by 2368
Abstract
Applying the optimum rate of fertilizer nitrogen (N) is a critical factor for field management. Multispectral information collected by active canopy sensors can potentially indicate the leaf N status and aid in predicting grain yield. Crop Circle multispectral data were acquired with the purpose of measuring the reflectance data to calculate vegetation indices (VIs) at different growth stages. Applying the optimum rate of fertilizer N can have a considerable impact on grain yield and profitability. The objectives of this study were to evaluate the reliability of a handheld Crop Circle ACS-430, to estimate corn leaf N concentration and predict grain yield of corn using machine learning (ML) models. The analysis was conducted using four ML models to identify the best prediction model for measurements acquired with a Crop Circle ACS-430 field sensor at three growth stages. Four fertilizer N levels from deficient to excessive in 50/50 split were applied to corn at 1–2 leaves, with visible leaf collars (V1–V2 stage) and at the V6–V7 stage to establish widely varying N nutritional status. Crop Circle spectral observations were used to derive 25 VIs for different growth stages (V4, V6, and VT) of corn at the W. B. Andrews Agricultural Systems farm of Mississippi State University. Multispectral raw data, along with VIs, were used to quantify leaf N status and predict the yield of corn. In addition, the accuracies of wavelength-based and VI-based models were compared to examine the best model inputs. Due to limited observed data, the stratification approach was used to split the data into training and test sets to obtain balanced data for each stage. Repeated cross validation (RCV) was then used to train the models. Results showed that the Simplified Canopy Chlorophyll Content Index (SCCCI) and Red-edge ratio vegetation index (RERVI) were the most effective VIs for estimating leaf N% and that SCCCI, Red-edge chlorophyll index (CIRE), RERVI, Soil Adjusted Vegetation Index (SAVI), and Normalized Difference Vegetation Index (NDVI) were the most effective VIs for predicting corn grain yield. Additionally, among the four ML models utilized in this research, support vector regression (SVR) achieved the most accurate results for estimating leaf N concentration using either spectral bands or VIs as the model inputs. Full article
(This article belongs to the Special Issue Advances of Remote Sensing in Precision Agriculture)
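To make the index definitions and the modelling step more concrete, here is a hedged sketch assuming the ACS-430's three bands (red ~670 nm, red edge ~730 nm, NIR ~780 nm): NDVI, NDRE, SCCCI, and RERVI are derived from synthetic reflectance following their common definitions, and a support vector regression is scored with repeated cross-validation. The data and the leaf N relationship are invented.

```python
# Hypothetical sketch: vegetation indices from three Crop Circle bands, then
# SVR with repeated cross-validation against leaf N concentration (synthetic).
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import RepeatedKFold, cross_val_score

rng = np.random.default_rng(4)
n = 120
red = rng.uniform(0.03, 0.10, n)      # ~670 nm reflectance
re  = rng.uniform(0.15, 0.30, n)      # ~730 nm (red edge) reflectance
nir = rng.uniform(0.35, 0.60, n)      # ~780 nm reflectance

ndvi  = (nir - red) / (nir + red)
ndre  = (nir - re) / (nir + re)
sccci = ndre / ndvi                   # Simplified Canopy Chlorophyll Content Index
rervi = nir / re                      # red-edge ratio vegetation index

X = np.column_stack([sccci, rervi, ndvi])
y = 1.5 + 2.5 * sccci + 0.3 * rervi + rng.normal(0, 0.1, n)   # synthetic leaf N (%)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
cv = RepeatedKFold(n_splits=5, n_repeats=10, random_state=4)
scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
print(f"repeated-CV R^2: {scores.mean():.3f} +/- {scores.std():.3f}")
```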

22 pages, 128572 KiB  
Article
Deep Learning-Based Object Detection System for Identifying Weeds Using UAS Imagery
by Aaron Etienne, Aanis Ahmad, Varun Aggarwal and Dharmendra Saraswat
Remote Sens. 2021, 13(24), 5182; https://doi.org/10.3390/rs13245182 - 20 Dec 2021
Cited by 25 | Viewed by 4850
Abstract
Current methods of broadcast herbicide application cause a negative environmental and economic impact. Computer vision methods, specifically those related to object detection, have been reported to aid in site-specific weed management procedures for targeted herbicide application within a field. However, a major challenge to developing a weed detection system is the requirement for a properly annotated database to differentiate between weeds and crops under field conditions. This research involved creating an annotated database of 374 red, green, and blue (RGB) color images organized into monocot and dicot weed classes. The images were acquired from corn and soybean research plots located in north-central Indiana using an unmanned aerial system (UAS) flown at 30 and 10 m heights above ground level (AGL). A total of 25,560 individual weed instances were manually annotated. The annotated database consisted of four different subsets (Training Image Sets 1–4) to train the You Only Look Once version 3 (YOLOv3) deep learning model for five separate experiments. The best results were observed with Training Image Set 4, consisting of images acquired at 10 m AGL. For monocot and dicot weeds, respectively, an average precision (AP) score of 91.48 % and 86.13% was observed at a 25% IoU threshold (AP @ T = 0.25), as well as 63.37% and 45.13% at a 50% IoU threshold (AP @ T = 0.5). This research has demonstrated a need to develop large, annotated weed databases to evaluate deep learning models for weed identification under field conditions. It also affirms the findings of other limited research studies utilizing object detection for weed identification under field conditions. Full article
(This article belongs to the Special Issue Advances of Remote Sensing in Precision Agriculture)
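For readers unpacking the "AP @ T" numbers, the sketch below computes box IoU and average precision at a chosen IoU threshold for a single class, using a simplified greedy matching and made-up boxes; it is not the evaluation code used in the study.

```python
# Hypothetical sketch: IoU between (x1, y1, x2, y2) boxes and a simplified
# average precision (AP) at an IoU threshold for one detection class.
import numpy as np

def iou(a, b):
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def average_precision(gts, preds, thr=0.5):
    """gts: list of boxes; preds: list of (score, box); greedy one-to-one matching."""
    preds = sorted(preds, key=lambda p: -p[0])
    matched = set()
    tp, fp = np.zeros(len(preds)), np.zeros(len(preds))
    for i, (_, box) in enumerate(preds):
        ious = [(iou(box, g), j) for j, g in enumerate(gts) if j not in matched]
        best = max(ious, default=(0.0, -1))
        if best[0] >= thr:
            tp[i] = 1
            matched.add(best[1])
        else:
            fp[i] = 1
    recall = np.concatenate(([0.0], np.cumsum(tp) / max(len(gts), 1)))
    precision = np.concatenate(([1.0], np.cumsum(tp) / (np.cumsum(tp) + np.cumsum(fp))))
    # all-point interpolation of the precision-recall curve
    return float(np.trapz(np.maximum.accumulate(precision[::-1])[::-1], recall))

gts = [(10, 10, 50, 50), (60, 60, 90, 90)]
preds = [(0.9, (12, 12, 48, 52)), (0.7, (61, 58, 92, 88)), (0.4, (100, 100, 120, 120))]
print("AP @ 0.5 :", round(average_precision(gts, preds, thr=0.5), 3))
print("AP @ 0.25:", round(average_precision(gts, preds, thr=0.25), 3))
```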

Review


25 pages, 46175 KiB  
Review
Remote Sensing on Alfalfa as an Approach to Optimize Production Outcomes: A Review of Evidence and Directions for Future Assessments
by Danilo Tedesco, Luciana Nieto, Carlos Hernández, Juan F. Rybecky, Doohong Min, Ajay Sharda, Kevin J. Hamilton and Ignacio A. Ciampitti
Remote Sens. 2022, 14(19), 4940; https://doi.org/10.3390/rs14194940 - 03 Oct 2022
Cited by 1 | Viewed by 2338
Abstract
Alfalfa (Medicago sativa L.) is one of the most relevant forage crops due to its importance for livestock. Timely harvesting is critical to secure adequate forage quality. However, farmers face challenges not only to decide the optimal harvesting time but to predict the optimum levels for both forage production and quality. Fortunately, remote sensing technologies can significantly contribute to obtaining production and quality insights, providing scalability, and supporting complex farming decision-making. Therefore, we aim to develop a systematic review of the current scientific literature to identify the current status of research in remote sensing for alfalfa and to evaluate new perspectives for enhancing prediction of both biomass and quality (herein defined as crude protein and fibers) for alfalfa. Twelve papers were included in the database from a total of 198 studies included in the initial screening process. The main findings were (i) more than two-thirds of the studies focused on predicting biomass; (ii) half of the studies used terrestrial platforms, with only 33% using drones and 17% using satellite for remote sensing; (iii) no studies have used satellites to assess alfalfa quality traits; (iv) improved biomass and quality estimations were obtained when remote sensing data was combined with environmental information; (v) due to a direct relationship between biomass and quality, modeling them algorithmically improves the accuracy of estimation as well; (vi) from spectral wavelengths, dry biomass was better estimated in regions near 398, 551, 670, 730, 780, 865, and 1077 nm, wet biomass in regions near 478, 631, 670, 730, 780, 834, 933, 1034, and 1538 nm, and quality traits were identified with narrow and very specific wavelengths (e.g., 398, 461, 551, 667, 712, and 1077 nm). Our findings might serve as a foundation to guide further research and the development of handheld sensors for assessing alfalfa biomass and quality. Full article
(This article belongs to the Special Issue Advances of Remote Sensing in Precision Agriculture)

24 pages, 4113 KiB  
Review
Precision Oliviculture: Research Topics, Challenges, and Opportunities—A Review
by Eliseo Roma and Pietro Catania
Remote Sens. 2022, 14(7), 1668; https://doi.org/10.3390/rs14071668 - 30 Mar 2022
Cited by 14 | Viewed by 3171
Abstract
Since the beginning of the 21st century, there has been an increase in the agricultural area devoted to olive growing and in the consumption of extra virgin olive oil (EVOO). The continuous change in cultivation techniques implemented poses new challenges to ensure environmental and economic sustainability. In this context, precision oliviculture (PO) is having an increasing scientific interest and impact on the sector. Its implementation depends on various technological developments: sensors for local and remote crop monitoring, global navigation satellite system (GNSS), equipment and machinery to perform site-specific management through variable rate application (VRA), implementation of geographic information systems (GIS), and systems for analysis, interpretation, and decision support (DSS). This review provides an overview of the state of the art of technologies that can be employed and current applications and their potential. It also discusses the challenges and possible solutions and implementations of future technologies such as IoT, unmanned ground vehicles (UGV), and machine learning (ML). Full article
(This article belongs to the Special Issue Advances of Remote Sensing in Precision Agriculture)
