Drones in Sustainable Agriculture

A special issue of Drones (ISSN 2504-446X). This special issue belongs to the section "Drones in Agriculture and Forestry".

Deadline for manuscript submissions: closed (25 November 2023) | Viewed by 6211

Special Issue Editors


Prof. Dr. Zhiyan Zhou
Guest Editor
College of Engineering, South China Agricultural University, Guangzhou 510642, China
Interests: onboard sensor design; sensor fusion; signal/image processing; agriculture; control systems; navigation and position/orientation; autonomous take-off and landing

Dr. Rui Jiang
Guest Editor
Key Laboratory of Key Technology on Agricultural Machine and Equipment, Ministry of Education, South China Agricultural University, Guangzhou 510642, China
Interests: geometric and radiometric sensors; onboard sensor design; sensor fusion; calibration of imageries; agriculture

Special Issue Information

Dear Colleagues,

UAVs/drones have become an integral part of smart, sustainable agriculture, helping farmers cope with a variety of challenges and obtain substantial benefits. The success of agriculture depends on many factors, and farmers have little or no control over weather, soil conditions, temperature, precipitation, etc. Here, UAV technology can be a genuine game changer: by collecting large amounts of data, farmers can improve crop yields, save time, and reduce costs. UAVs have the potential to transform modern agriculture in many ways and to become intrinsic to smart farming. With an increasing number of UAV-based agricultural applications (UAV-based sensing, spraying, seeding, fertilizing, weeding, etc.), more and more farmers will benefit in the future.

This Special Issue aims to collect new developments and methodologies, best practices, and applications of UAVs/drones in sustainable agriculture. We welcome submissions that present the community with the most recent advances in all aspects of UAVs in sustainable agriculture, including (but not limited to) the following:

  • Sensor fusion;
  • Signal/image processing;
  • Agriculture/farm;
  • Control systems/flight control systems;
  • Navigation and position/orientation;
  • Online and real-time processing of remote sensing data;
  • UAV-based sensing, spraying, seeding, fertilizing, weeding, etc.;
  • Novel sensing systems/methods for phenotyping;
  • Autonomous take-off and landing;
  • Geometric and radiometric sensors;
  • UAV control, obstacle sensing, and avoidance in farms;
  • Calibration of imagery.

We invite relevant research articles and reviews on UAVs/drones in sustainable agriculture.

Prof. Dr. Zhiyan Zhou
Dr. Rui Jiang
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Drones is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • sensor fusion
  • signal/image processing
  • agriculture/farm
  • control system/flight control system
  • navigation and position/orientation
  • online and real-time processing of remote sensing data
  • UAV-based sensing, spraying, seeding, fertilizing, weeding, etc.
  • novel sensing system/method for phenotyping
  • autonomous take-off and landing
  • geometric and radiometric sensors
  • UAV control, obstacle sensing, and avoidance in farms
  • calibration of imagery

Published Papers (3 papers)


Research

25 pages, 8258 KiB  
Article
Quantity Monitor Based on Differential Weighing Sensors for Storage Tank of Agricultural UAV
by Junhao Huang, Weizhuo He, Deshuai Yang, Jianqin Lin, Yuanzhen Ou, Rui Jiang and Zhiyan Zhou
Drones 2024, 8(3), 92; https://doi.org/10.3390/drones8030092 - 7 Mar 2024
Viewed by 1028
Abstract
Nowadays, unmanned aerial vehicles (UAVs) play a pivotal role in agricultural production. In scenarios involving the release of particulate materials, the precision of the quantity monitor for a UAV's storage tank directly impacts its operational accuracy. This paper therefore introduces a novel noise-mitigation design for the quantity monitors of agricultural UAVs, based on differential weighing sensors. The design addresses three primary noise sources: sensor-intrinsic noise, vibration noise, and weight-loading uncertainty. In addition, two comprehensive data processing methods are proposed for noise reduction: the first combines the Butterworth low-pass filter, the Kalman filter, and the moving average filter (BKM), while the second integrates the Least Mean Squares (LMS) adaptive filter, the Kalman filter, and the moving average filter (LKM). Rigorous data processing was conducted, and the monitor's performance was assessed in three typical UAV states: static, hovering, and in flight. Specifically, compared to the BKM, the LKM's maximum relative error ranges between 1.24% and 2.74%, with an average relative error of 0.31%~0.58%, when the UAV is hovering. In flight mode, the LKM's maximum relative error varies from 1.68% to 10.06%, while the average relative error ranges between 0.74% and 2.54%. Furthermore, the LKM can effectively suppress noise interference near 75 Hz and 150 Hz. The results reveal that the LKM approach demonstrates superior adaptability to noise and effectively mitigates its impact in quantity monitoring for the storage tanks of agricultural UAVs.
(This article belongs to the Special Issue Drones in Sustainable Agriculture)
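For readers who want to experiment with the general idea behind the LKM pipeline described in the abstract, the sketch below chains a simple LMS adaptive noise canceller, a scalar Kalman filter, and a moving-average filter on a simulated weighing-sensor signal. It is a minimal illustration under stated assumptions, not the authors' implementation; the filter order, step size, noise covariances, and simulated signal are all placeholders.

```python
# Minimal sketch (not the authors' implementation) of an LKM-style chain:
# LMS adaptive noise cancellation -> scalar Kalman filter -> moving average.
# Filter order, step size, noise covariances and the simulated signal are
# illustrative assumptions only.
import numpy as np

def lms_filter(noisy, reference, order=8, mu=0.01):
    """Estimate correlated noise from a reference channel and subtract it."""
    w = np.zeros(order)
    cleaned = noisy.copy()
    for k in range(order, len(noisy)):
        x = reference[k - order:k][::-1]   # most recent reference samples
        e = noisy[k] - w @ x               # error = estimate of the clean signal
        w += 2 * mu * e * x                # LMS weight update
        cleaned[k] = e
    return cleaned

def kalman_filter(z, q=1e-4, r=1e-2):
    """Scalar Kalman filter with a constant-level state model."""
    x, p, out = z[0], 1.0, np.zeros_like(z)
    for k, meas in enumerate(z):
        p += q                             # predict
        g = p / (p + r)                    # Kalman gain
        x += g * (meas - x)                # update state estimate
        p *= (1 - g)
        out[k] = x
    return out

def moving_average(x, window=20):
    return np.convolve(x, np.ones(window) / window, mode="same")

if __name__ == "__main__":
    t = np.arange(0, 10, 0.001)                      # 1 kHz sampling, 10 s
    true_weight = 5.0 - 0.3 * t                      # tank emptying over time
    vibration = 0.2 * np.sin(2 * np.pi * 75 * t)     # vibration-like interference
    noisy = true_weight + vibration + 0.05 * np.random.randn(t.size)
    cleaned = moving_average(kalman_filter(lms_filter(noisy, vibration)))
    print("RMS error:", np.sqrt(np.mean((cleaned - true_weight) ** 2)))
```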

18 pages, 6362 KiB  
Article
Deep Learning-Based Weed Detection Using UAV Images: A Comparative Study
by Tej Bahadur Shahi, Sweekar Dahal, Chiranjibi Sitaula, Arjun Neupane and William Guo
Drones 2023, 7(10), 624; https://doi.org/10.3390/drones7100624 - 7 Oct 2023
Cited by 3 | Viewed by 3370
Abstract
Semantic segmentation has been widely used in precision agriculture, for example in weed detection, which is pivotal to increasing crop yields. A variety of well-established and rapidly evolving AI models have recently been developed for semantic segmentation in weed detection; nevertheless, there is insufficient comparative information to guide optimal model selection in this field. Identifying such a model helps the agricultural community make the best use of the technology. We therefore perform a comparative study of cutting-edge deep learning-based segmentation models for weed detection using a UAV-acquired RGB image dataset, CoFly-WeedDB. For this, we leverage segmentation models ranging from SegNet to DeepLabV3+, combined with five backbone convolutional neural networks (VGG16, ResNet50, DenseNet121, EfficientNetB0, and MobileNetV2). The results show that UNet with EfficientNetB0 as the backbone CNN is the best-performing of the candidate models on the CoFly-WeedDB dataset, achieving a Precision of 88.20%, Recall of 88.97%, F1-score of 88.24%, and mean Intersection over Union of 56.21%. Based on this study, we suggest that the UNet model combined with EfficientNetB0 could be used by the relevant stakeholders (e.g., farmers, the agricultural industry) to detect weeds more accurately in the field, enabling earlier removal and higher crop yields.
(This article belongs to the Special Issue Drones in Sustainable Agriculture)
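The pixel-wise metrics reported in the abstract (Precision, Recall, F1-score, and mean Intersection over Union) can be computed for any predicted/ground-truth mask pair with a few lines of NumPy. The sketch below is a minimal illustration for binary weed/background masks; the class handling and averaging conventions used in the paper may differ.

```python
# Illustrative sketch (not the authors' code): pixel-wise Precision, Recall,
# F1-score and mean Intersection over Union for binary weed/background masks.
import numpy as np

def segmentation_metrics(pred, gt):
    """pred, gt: boolean arrays of the same shape (True = weed pixel)."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()
    fp = np.logical_and(pred, ~gt).sum()
    fn = np.logical_and(~pred, gt).sum()
    tn = np.logical_and(~pred, ~gt).sum()
    precision = tp / (tp + fp + 1e-9)
    recall = tp / (tp + fn + 1e-9)
    f1 = 2 * precision * recall / (precision + recall + 1e-9)
    iou_weed = tp / (tp + fp + fn + 1e-9)
    iou_background = tn / (tn + fp + fn + 1e-9)
    miou = (iou_weed + iou_background) / 2          # mean over the two classes
    return {"precision": precision, "recall": recall, "f1": f1, "miou": miou}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gt = rng.random((256, 256)) > 0.8               # synthetic ground-truth mask
    pred = gt ^ (rng.random((256, 256)) > 0.95)     # prediction with some errors
    print(segmentation_metrics(pred, gt))
```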

20 pages, 9862 KiB  
Article
Recognition of Rubber Tree Powdery Mildew Based on UAV Remote Sensing with Different Spatial Resolutions
by Tiwei Zeng, Jihua Fang, Chenghai Yin, Yuan Li, Wei Fu, Huiming Zhang, Juan Wang and Xirui Zhang
Drones 2023, 7(8), 533; https://doi.org/10.3390/drones7080533 - 16 Aug 2023
Cited by 2 | Viewed by 1365
Abstract
The rubber tree is an essential tropical economic crop, and rubber tree powdery mildew (PM) is the most damaging disease to the growth of rubber trees. Accurate and timely detection of PM is the key to preventing its large-scale spread. Recently, unmanned aerial vehicle (UAV) remote sensing technology has been widely used in the field of agroforestry. The objective of this study was to establish a method for distinguishing rubber trees infected by PM from uninfected ones using UAV-based multispectral images. We resampled the original multispectral image with a 3.4 cm spatial resolution to multispectral images with coarser spatial resolutions (7 cm, 14 cm, and 30 cm) using the nearest neighbor method, extracted 22 vegetation index features and 40 texture features to construct the initial feature space, and then used the SPA, ReliefF, and Boruta–SHAP algorithms to optimize the feature space. Finally, rubber tree PM monitoring models were constructed using the optimized features as input to the KNN, RF, and SVM algorithms. The results for the simulated spatial resolutions show that, at resolutions of 7 cm and finer, promising classification accuracy (>90%) is achieved with the full feature set and all three optimized feature subsets, with the 3.4 cm resolution performing best, ahead of 7 cm, 14 cm, and 30 cm. The best classification accuracy was achieved by combining the Boruta–SHAP-optimized feature subset with the SVM model, reaching 98.16%, 96.32%, 95.71%, and 88.34% at the 3.4 cm, 7 cm, 14 cm, and 30 cm resolutions, respectively. Compared with SPA–SVM and ReliefF–SVM, classification accuracy improved by 6.14%, 5.52%, 12.89%, and 9.2%, and by 1.84%, 0.61%, 1.23%, and 6.13%, respectively. The results of this study will help guide rubber tree plantation management and PM monitoring.
(This article belongs to the Special Issue Drones in Sustainable Agriculture)
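The abstract describes resampling a 3.4 cm multispectral image to coarser resolutions with the nearest neighbor method and classifying optimized features with an SVM. The sketch below illustrates that general workflow on synthetic data; the band ordering, vegetation index, labels, and SVM parameters are assumptions, and the paper's 22 vegetation indices, 40 texture features, and Boruta–SHAP selection are not reproduced.

```python
# Minimal sketch (not the study's pipeline): nearest-neighbour resampling of a
# multispectral image to a coarser resolution, a placeholder NDVI-style feature,
# and an SVM classifier on synthetic data.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def nearest_neighbour_resample(image, factor):
    """Downsample an (H, W, bands) image by an integer factor (nearest neighbour)."""
    return image[::factor, ::factor, :]

rng = np.random.default_rng(1)
scene = rng.random((80, 80, 5))                   # synthetic 5-band scene, reflectance in [0, 1]
coarse = nearest_neighbour_resample(scene, 2)     # e.g. roughly 3.4 cm -> 7 cm pixels

nir, red = coarse[..., 4], coarse[..., 2]         # assumed NIR and red band indices
ndvi = (nir - red) / (nir + red + 1e-9)           # per-pixel vegetation index (placeholder)

X = np.column_stack([ndvi.ravel(), coarse.reshape(-1, 5)])
y = (rng.random(X.shape[0]) > 0.5).astype(int)    # synthetic healthy/infected labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```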
