Sensors and Remote Sensing in Precision Horticulture

A special issue of Agriculture (ISSN 2077-0472). This special issue belongs to the section "Digital Agriculture".

Deadline for manuscript submissions: closed (20 December 2022) | Viewed by 2409

Special Issue Editors


Co-Guest Editor
Geosystems Research Institute, Mississippi State University, Starkville, MS 39762, USA
Interests: signal processing; learning problems in wireless; sensor networks using optimization; probability

Special Issue Information

Dear Colleagues,

Precision horticulture is a data-driven management method that collects site- or plant-specific information on fruit and vegetable crops in order to (1) make in-season decisions that improve production and (2) manage postharvest processing. Precision horticulture is particularly advantageous to farmers because of the high value of horticultural products and the large quantities of crop inputs required to produce them. Any cost reduction therefore significantly boosts producer profits, and effective utilization of crop inputs may lessen the environmental impact of horticultural crop production.

In horticulture, analysis of product quality is more crucial than in any other crop sector. Fields are typically smaller than in broadacre agriculture, and because planting density is lower, even single plants may be managed individually according to spatial or temporal patterns. The implementation of precision horticulture relies primarily on sensors and systems that can collect weather, soil, and plant-specific data at a reasonable cost. Optical sensors are the most prevalent, and many approaches have demonstrated their promise for effective, quick, non-invasive in-situ disease diagnosis and yield estimation. The most common applications include biotic and abiotic stress detection at asymptomatic or early stages, canopy size and density measurement, yield estimation, and crop quality assessment.

Dr. Alessandro Matese
Dr. Santhana Krishnan Boopalan
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Agriculture is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • precision horticulture
  • yield monitor
  • quality monitor
  • agricultural decision support systems (AgriDSS)
  • remote sensing applications
  • proximal sensors
  • artificial intelligence (AI) and machine learning (ML) methodologies
  • Internet of Things (IoT)
  • variable-rate input applications
  • automated machinery and agricultural robots

Published Papers (1 paper)


Research

35 pages, 7748 KiB  
Article
Automation of Crop Disease Detection through Conventional Machine Learning and Deep Transfer Learning Approaches
by Houda Orchi, Mohamed Sadik, Mohammed Khaldoun and Essaid Sabir
Agriculture 2023, 13(2), 352; https://doi.org/10.3390/agriculture13020352 - 31 Jan 2023
Cited by 5 | Viewed by 1866
Abstract
With rapid population growth, increasing agricultural productivity is an urgent requirement to meet demand. Early identification of crop diseases is essential to prevent yield loss. Nevertheless, manually monitoring leaf diseases is a tedious task, as it demands in-depth knowledge of plant pathogens, considerable labor, and excessive processing time. To this end, researchers have developed and examined various methods based on image processing, machine learning, and deep learning for crop leaf disease identification, often with significant results. Motivated by this existing work, we conducted an extensive comparative study between traditional machine learning (SVM, LDA, KNN, CART, RF, and NB) and deep transfer learning (VGG16, VGG19, InceptionV3, ResNet50, and CNN) models in terms of precision, accuracy, f1-score, and recall on a dataset taken from the PlantVillage Dataset composed of diseased and healthy crop leaves, framed as binary classification. Moreover, we applied several activation functions and deep learning optimizers to further enhance these CNN architectures' performance. The classification accuracy (CA) of leaf diseases that we obtained by experimentation is quite impressive for all models. Our findings reveal that NB gives the lowest CA at 60.09%, while the InceptionV3 model yields the best CA, reaching 98.01%.
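The evaluation protocol described in the abstract — fitting several classic classifiers and comparing them on accuracy, precision, recall, and f1-score — can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: it uses synthetic features in place of PlantVillage leaf images and scikit-learn defaults for all hyperparameters.

```python
# Hypothetical sketch of the paper's classic-ML comparison.
# Synthetic features stand in for extracted leaf-image features.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB

# Binary classification: healthy (0) vs diseased (1) leaves.
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

models = {
    "SVM": SVC(),
    "LDA": LinearDiscriminantAnalysis(),
    "KNN": KNeighborsClassifier(),
    "CART": DecisionTreeClassifier(random_state=0),
    "RF": RandomForestClassifier(random_state=0),
    "NB": GaussianNB(),
}

results = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    # Collect the four metrics the paper reports for each model.
    results[name] = {
        "accuracy": accuracy_score(y_te, pred),
        "precision": precision_score(y_te, pred),
        "recall": recall_score(y_te, pred),
        "f1": f1_score(y_te, pred),
    }

for name, scores in results.items():
    print(f"{name}: " + ", ".join(f"{k}={v:.3f}" for k, v in scores.items()))
```

Extending the same loop to the deep transfer-learning models (VGG16, InceptionV3, etc.) would replace the feature matrix with image tensors and the scikit-learn estimators with pretrained networks fine-tuned on the leaf dataset.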
(This article belongs to the Special Issue Sensors and Remote Sensing in Precision Horticulture)