UAS in Smart Agriculture: 2nd Edition

A special issue of Drones (ISSN 2504-446X). This special issue belongs to the section "Drones in Agriculture and Forestry".

Deadline for manuscript submissions: 15 September 2024

Special Issue Editors


Prof. Dr. Fei Liu
Guest Editor
College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
Interests: smart agriculture; UAS; remote sensing; plant phenotype and disease-pest monitoring; crop yield prediction; variable spraying systems; deep learning; image processing technology

Prof. Dr. Yangquan Chen
Guest Editor

Special Issue Information

Dear Colleagues,

Continuing from the last Special Issue on “UAS in Smart Agriculture”, we are pleased to announce the Special Issue on “UAS in Smart Agriculture: 2nd Edition”.

With the development of emerging information and digital technologies, unmanned technology and equipment have become increasingly important for developing sustainable agriculture. UAS applications in smart agriculture span unmanned control systems, remote sensing information collection, and variable operation systems. Unmanned control systems mainly include intelligent control algorithms, communication technology, environmental awareness, and autonomous obstacle avoidance technology, which aim to improve the level of intelligent control. Remote sensing technology mainly includes plant phenotyping, disease and pest monitoring, yield estimation, 3D information acquisition, multispectral and hyperspectral imaging sensors, and intelligent modelling technology, which provide more efficient and dynamic data. Variable operation technology mainly includes intelligent decision making, prescription map technology, and variable spraying and sowing operations, which enable precise management and operations. Unmanned systems are widely used for field crops, orchards, and unmanned ecological farms. UASs offer strong support for promoting the green, healthy, ecological, and sustainable development of smart agriculture.

In this Special Issue on “UAS in Smart Agriculture: 2nd Edition”, original research articles and reviews are welcome. Research areas may include (but are not limited to) the topics listed under Keywords below.

Prof. Dr. Fei Liu
Prof. Dr. Yangquan Chen
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Drones is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • smart agriculture
  • digital agriculture
  • remote sensing
  • unmanned control and operation systems
  • multi-resource image processing technology
  • deep learning
  • plant phenotype
  • plant disease and pest diagnosis
  • crop and orchard yield monitoring
  • weed detection
  • soil monitoring
  • 3D digital technology
  • spraying and sowing systems
  • variable operation prescription technology

Published Papers (3 papers)


Research

15 pages, 9376 KiB  
Article
Comparison and Optimal Method of Detecting the Number of Maize Seedlings Based on Deep Learning
by Zhijie Jia, Xinlong Zhang, Hongye Yang, Yuan Lu, Jiale Liu, Xun Yu, Dayun Feng, Kexin Gao, Jianfu Xue, Bo Ming, Chenwei Nie and Shaokun Li
Drones 2024, 8(5), 175; https://doi.org/10.3390/drones8050175 - 28 Apr 2024
Abstract
Effective agricultural management in maize production starts with the early quantification of seedlings. Accurately determining plant presence allows growers to optimize planting density, allocate resources, and detect potential growth issues early on. This study provides a comprehensive analysis of the performance of various object detection models in maize production, with a focus on the effects of planting density, growth stage, and flight altitude. The findings show that one-stage models, particularly YOLOv8n and YOLOv5n, delivered superior performance, with AP50 scores of 0.976 and 0.951, respectively, outperforming two-stage models in terms of resource efficiency and seedling quantification accuracy. YOLOv8n, along with Deformable DETR, Faster R-CNN, and YOLOv3-tiny, was identified for further examination based on performance metrics and architectural features. The study also highlights the significant impact of plant density and growth stage on detection accuracy. Increased planting density and advanced growth stages (particularly V6) were associated with decreased model accuracy due to increased leaf overlap and image complexity. The V2–V3 growth stages were identified as the optimal periods for detection. Additionally, flight altitude negatively affected image resolution and detection accuracy, with higher altitudes leading to poorer performance. In field applications, YOLOv8n proved highly effective, maintaining robust performance across different agricultural settings and consistently achieving rRMSEs below 1.64% in high-yield fields. The model also demonstrated high reliability, with Recall, Precision, and F1 scores exceeding 99.00%, affirming its suitability for practical agricultural use. These findings suggest that UAV-based image collection systems employing models such as YOLOv8n can significantly enhance the accuracy and efficiency of seedling detection in maize production.
The research elucidates the critical factors that affect the accuracy of deep learning detection models for corn seedling detection and selects a model suited to this task in practical agricultural production. These findings offer valuable insights into the application of object detection technology and lay a foundation for the future development of precision agriculture, particularly in optimizing deep learning models for the varying environmental conditions that affect corn seedling detection.
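As a brief illustration of the rRMSE metric reported above, the sketch below computes relative RMSE over plot-level seedling counts. The function name and all count values are hypothetical, not taken from the paper:

```python
import math

def rrmse(observed, predicted):
    """Relative RMSE (%): RMSE of predicted vs. observed seedling counts,
    normalized by the mean observed count."""
    n = len(observed)
    rmse = math.sqrt(sum((p - o) ** 2 for o, p in zip(observed, predicted)) / n)
    return 100.0 * rmse / (sum(observed) / n)

# Hypothetical plot-level counts: ground truth vs. model detections
truth = [120, 135, 128, 142]
detected = [119, 136, 127, 141]
print(round(rrmse(truth, detected), 2))  # → 0.76, i.e. well under 1.64%
```

Because the metric is normalized by the mean observed count, it stays comparable across plots with different planting densities.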
(This article belongs to the Special Issue UAS in Smart Agriculture: 2nd Edition)

21 pages, 12187 KiB  
Article
Establishment and Verification of the UAV Coupled Rotor Airflow Backward Tilt Angle Controller
by Han Wu, Dong Liu, Yinwei Zhao, Zongru Liu, Yunting Liang, Zhijie Liu, Taoran Huang, Ke Liang, Shaoqiang Xie and Jiyu Li
Drones 2024, 8(4), 146; https://doi.org/10.3390/drones8040146 - 08 Apr 2024
Abstract
At present, flight controllers for agricultural UAVs cannot accurately and quickly control the factors influencing the UAV coupled rotor airflow backward tilt angle during pesticide application. To solve this problem, a Rotor Airflow Backward Tilt Angle (RABTA) controller is established in this paper. The RABTA controller integrates advanced sensor technology with a novel algorithmic approach, utilizing real-time data acquisition and state-space analysis to dynamically adjust the UAV's rotor airflow, ensuring precise control of the backward tilt angle. The control effect of a traditional flight controller and of the RABTA controller during pesticide application, along with the corresponding operational effect, are compared and analyzed. The comparison shows that the RABTA controller reduces the control error to less than 1 degree and achieves a 48.3% improvement in the uniformity of pesticide droplet distribution across the crop canopy, realizing an improved field application effect and an innovative UAV field application control mode.
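The abstract does not spell out the controller internals, so the following is only a loose sketch of the general idea: a feedback loop driving a measured backward tilt angle toward a setpoint. The first-order lag plant, gains, and function name are all assumptions for illustration, not the paper's identified rotor-airflow model:

```python
def simulate_rabta(setpoint_deg, steps=50, dt=0.02, tau=0.15, k=4.0):
    """Drive the measured backward tilt angle toward setpoint_deg.

    Assumed plant: first-order lag  d(angle)/dt = (u - angle) / tau.
    Assumed controller: proportional feedback u = setpoint + k * error.
    """
    angle = 0.0  # measured backward tilt angle, degrees
    for _ in range(steps):
        error = setpoint_deg - angle
        u = setpoint_deg + k * error        # feedback-corrected command
        angle += (dt / tau) * (u - angle)   # Euler step of the lag model
    return angle

final = simulate_rabta(8.0)
print(round(final, 2))  # → 8.0 (residual control error well under 1 degree)
```

In the paper, the corresponding state estimate would come from real-time sensor data rather than a simulated plant, but the closed-loop structure is analogous.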
(This article belongs to the Special Issue UAS in Smart Agriculture: 2nd Edition)

20 pages, 14112 KiB  
Article
Mapping Maize Planting Densities Using Unmanned Aerial Vehicles, Multispectral Remote Sensing, and Deep Learning Technology
by Jianing Shen, Qilei Wang, Meng Zhao, Jingyu Hu, Jian Wang, Meiyan Shu, Yang Liu, Wei Guo, Hongbo Qiao, Qinglin Niu and Jibo Yue
Drones 2024, 8(4), 140; https://doi.org/10.3390/drones8040140 - 03 Apr 2024
Abstract
Maize is a globally important cereal and fodder crop. Accurate monitoring of maize planting densities is vital for informed decision-making by agricultural managers. Compared to traditional manual methods for collecting crop trait parameters, approaches using unmanned aerial vehicle (UAV) remote sensing can enhance efficiency, minimize personnel costs and biases, and, more importantly, rapidly provide density maps of maize fields. This study involved the following steps: (1) Two UAV remote sensing-based methods were developed for monitoring maize planting densities, based on (a) ultrahigh-definition imagery combined with object detection (UHDI-OD) and (b) multispectral remote sensing combined with machine learning (Multi-ML). (2) Maize planting density measurements and the collection of UAV ultrahigh-definition and multispectral imagery were carried out at a maize breeding trial site, and the proposed monitoring methods were tested and validated experimentally. (3) An in-depth analysis of the applicability and limitations of both methods was conducted to explore the advantages and disadvantages of the two estimation models. The study revealed the following findings: (1) UHDI-OD provides highly accurate maize density estimates (R2 = 0.99, RMSE = 0.09 plants/m2). (2) Multi-ML provides accurate maize density estimates by combining remote sensing vegetation indices (VIs) and gray-level co-occurrence matrix (GLCM) texture features (R2 = 0.76, RMSE = 0.67 plants/m2). (3) UHDI-OD is highly sensitive to image resolution, making it unsuitable for UAV remote sensing images with pixel sizes greater than 2 cm. In contrast, Multi-ML is comparatively insensitive to image resolution, with model accuracy decreasing only gradually as the resolution decreases.
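As a toy illustration of the Multi-ML idea, regressing a spectral feature against measured density, the sketch below fits a least-squares line from a vegetation index to planting density. NDVI stands in for the paper's fuller VI + GLCM feature set, and all reflectances and densities are hypothetical:

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical mean plot reflectances (NIR, red) and measured densities
plots = [(0.62, 0.10), (0.55, 0.12), (0.70, 0.08), (0.48, 0.15)]
density = [6.1, 5.0, 7.2, 4.0]  # plants/m^2

vi = [ndvi(n, r) for n, r in plots]
slope, intercept = fit_line(vi, density)
predicted = [slope * v + intercept for v in vi]
print(slope > 0)  # denser plots carry higher canopy greenness here
```

A real Multi-ML pipeline would use many VIs plus GLCM texture statistics as inputs to a machine learning regressor, but the feature-to-density mapping follows this same pattern.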
(This article belongs to the Special Issue UAS in Smart Agriculture: 2nd Edition)
