Drones for Rural Areas Management

A special issue of Drones (ISSN 2504-446X). This special issue belongs to the section "Drones in Agriculture and Forestry".

Deadline for manuscript submissions: closed (31 October 2022) | Viewed by 9975

Special Issue Editors


Dr. Lorena Parra
Guest Editor
Research Institute for Integrated Management of Coastal Areas (IGIC), Universitat Politècnica de València, 46730 Grau de Gandia, Spain
Interests: environmental monitoring; precision agriculture; image processing; crop management; smart cities; physical sensors

Dr. David Mostaza-Colado
Guest Editor
Instituto Madrileño de Investigación y Desarrollo Rural, Agrario y Alimentario (IMIDRA), Finca “El Encín”, A-2, Km 38.2, 28805 Alcalá de Henares, Spain
Interests: water management; water quality; crop management; precision agriculture; satellite and drone imagery

Dr. Laura Garcia
Guest Editor
Network and Telecommunication Research Group, University of Haute Alsace, 34 Rue du Grillenbreit, 68008 Colmar, France
Interests: precision agriculture; ambient monitoring; water quality monitoring systems; communication protocols

Special Issue Information

Dear Colleagues,

The use of drones is growing worldwide, with new applications and uses emerging regularly. In rural areas, drones can perform many different tasks, such as monitoring based on cameras or other sensors, providing connectivity as a mobile gateway, or even transporting products to hard-to-reach areas. Many of these uses are directly linked to primary-sector activities (agriculture, livestock farming, etc.), but they can also be extended to other industries and to population centers.

This Special Issue aims to collect the latest advances on this broad topic, covering both the possible applications and the infrastructure required to support them. The particular characteristics of rural areas mean that applications must account for certain constraints to operate successfully, such as a lack of internet access, low population density, large areas to be monitored, and heterogeneous terrain. The fields of application for drones in rural areas are broad; they include agriculture, livestock farming, forestry, mining and other industries, environmental monitoring, water quality monitoring of rivers and lakes, firefighting, and assistance to the population, especially older adults. Original research articles and reviews are welcome in this Special Issue. Research areas may include (but are not limited to) the following:

  • Environmental monitoring;
  • Remote sensing;
  • Remote monitoring;
  • Wireless sensor networks;
  • Time-series analysis;
  • 3D mapping;
  • Satellite and drone imagery integration;
  • Primary sector;
  • Rural areas management;
  • Drone mesh networks.

We look forward to receiving your contributions.

Dr. Lorena Parra
Dr. David Mostaza-Colado
Dr. Laura Garcia
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Drones is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • precision agriculture
  • remote sensing
  • unmanned aerial vehicles
  • photogrammetry
  • image processing
  • wireless transmission
  • object detection

Published Papers (3 papers)


Research

23 pages, 4184 KiB  
Article
UAS Hyperspatial LiDAR Data Performance in Delineation and Classification across a Gradient of Wetland Types
by Narcisa Gabriela Pricope, Asami Minei, Joanne Nancie Halls, Cuixian Chen and Yishi Wang
Drones 2022, 6(10), 268; https://doi.org/10.3390/drones6100268 - 22 Sep 2022
Cited by 6 | Viewed by 2164
Abstract
Wetlands play a critical role in maintaining stable and productive ecosystems, and they continue to be at heightened risk from anthropogenic and natural degradation, especially along the rapidly developing Atlantic Coastal Plain of North America. As such, strategies to develop up-to-date and high-resolution wetland inventories and classifications remain highly relevant in the context of accelerating sea-level rise and coastal changes. Historically, satellite and airborne remote sensing data along with traditional field-based methods have been used for wetland delineation, yet, more recently, the advent of Uncrewed Aerial Systems (UAS) platforms and sensors is opening new avenues for performing rapid and accurate wetland classifications. To test the relative advantages and limitations of UAS technologies for wetland mapping and classification, we developed wetland classification models using UAS-collected multispectral and UAS-collected light detection and ranging (LiDAR) data relative to airborne-derived LiDAR models of wetland types ranging from palustrine to estuarine. The models were parameterized through a pixel-based random forest algorithm to evaluate model performance systematically and establish variable importance for a suite of variables including topographic, hydrologic, and vegetation-based indices. Based on our experimental results, the average overall classification accuracy and kappa coefficients for the UAS LiDAR-derived models are 75.29% and 0.74, respectively, compared to 79.80% and 0.75 for the airborne LiDAR-derived models, with significant differences in the spatial representation of final wetland classes. The resulting classification maps for the UAS models capture more precise wetland delineations than those of airborne models when trained with ground reference data collected at the same time as the UAS flights. The similar accuracy between the airborne and UAS models suggests that the UAS LiDAR is comparable to the airborne LiDAR. However, given the poor revisit time of the airborne surveys and the high spatial resolution and precision of the UAS data, UAS-collected LiDAR provides excellent complementary data to statewide airborne missions or for specific applications that require hyperspatial data. For more structurally complex wetland types (such as the palustrine scrub shrub), UAS hyperspatial LiDAR data performs better and is much more advantageous to use in delineation and classification models. The results of this study contribute towards enhancing wetland delineation and classification models using data collected from multiple UAS platforms.
(This article belongs to the Special Issue Drones for Rural Areas Management)
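The pixel-based random forest workflow described in the abstract above can be illustrated with a short Python sketch. The sketch below is a minimal, hypothetical example only: the arrays `features` (stacked topographic, hydrologic, and vegetation-based predictor rasters, flattened to pixels) and `labels` (ground-reference wetland classes) are placeholder data, not data from the study, and scikit-learn's RandomForestClassifier stands in for whatever implementation the authors used.

```python
# Minimal sketch of a pixel-based random forest wetland classification,
# assuming hypothetical predictor rasters and ground-reference labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder data: each pixel is a row of predictor values
# (e.g., elevation, canopy height, NDVI) with a wetland-class label.
n_pixels, n_predictors = 10_000, 8
features = rng.normal(size=(n_pixels, n_predictors))  # stacked rasters, flattened
labels = rng.integers(0, 4, size=n_pixels)            # e.g., 4 wetland classes

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, random_state=0
)

model = RandomForestClassifier(n_estimators=500, random_state=0)
model.fit(X_train, y_train)
predicted = model.predict(X_test)

# Overall accuracy and kappa, the two metrics reported in the abstract.
print("Overall accuracy:", accuracy_score(y_test, predicted))
print("Kappa coefficient:", cohen_kappa_score(y_test, predicted))

# Variable importances, as used to rank the topographic, hydrologic,
# and vegetation-based predictors.
print("Variable importances:", model.feature_importances_)
```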

12 pages, 3573 KiB  
Article
A Faster Approach to Quantify Large Wood Using UAVs
by Daniel Sanhueza, Lorenzo Picco, Alberto Paredes and Andrés Iroumé
Drones 2022, 6(8), 218; https://doi.org/10.3390/drones6080218 - 22 Aug 2022
Cited by 7 | Viewed by 1767
Abstract
Large wood (LW; logs at least 1 m long and 0.1 m in diameter) in river channels has great relevance in fluvial environments. Historically, the most widely used approach to estimating the volume of LW has been field surveys, measuring all the pieces of wood, both as single elements and as pieces forming accumulations. Lately, the use of aerial photographs and data obtained from remote sensors has increased in studies of the amount, distribution, and dynamics of LW. The continuing development of unmanned aerial vehicle (UAV) technology allows for the acquisition of high-resolution data. By applying the structure-from-motion approach, it is possible to reconstruct the 3D geometry through the acquisition of point clouds and then generate high-resolution digital elevation models of the same area. In this short communication, the aim was to improve a recently developed procedure that uses aerial photos and geographic information software to analyze LW stored in wood jams (WJ), shortening the entire process. Digital measurement was simplified using only AgiSoft Metashape® software, greatly speeding up the workflow. The proposed improvement is more than five times faster in terms of measuring LW stored in jams.
(This article belongs to the Special Issue Drones for Rural Areas Management)
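The paper's own workflow runs through AgiSoft Metashape, so no attempt is made to reproduce it here. The sketch below only illustrates the final, conventional step of turning per-piece length and diameter measurements into a wood volume, assuming each log is approximated as a cylinder; the measurement values and threshold handling are hypothetical, not data or code from the study.

```python
# Minimal sketch: converting digitally measured large-wood (LW) pieces
# into a total volume, assuming each piece is approximated as a cylinder.
# The lengths/diameters below are hypothetical, not data from the study.
import math

# (length_m, diameter_m) for each measured piece; LW requires >= 1 m length
# and >= 0.1 m diameter, per the definition in the abstract.
pieces = [(2.4, 0.18), (1.6, 0.12), (3.1, 0.25)]

def piece_volume(length_m: float, diameter_m: float) -> float:
    """Cylindrical volume of a single wood piece, in cubic metres."""
    return math.pi * (diameter_m / 2) ** 2 * length_m

lw_pieces = [(l, d) for l, d in pieces if l >= 1.0 and d >= 0.1]
total_volume = sum(piece_volume(l, d) for l, d in lw_pieces)
print(f"LW volume in the jam: {total_volume:.3f} m^3")
```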

12 pages, 2246 KiB  
Article
The Time of Day Is Key to Discriminate Cultivars of Sugarcane upon Imagery Data from Unmanned Aerial Vehicle
by Marcelo Rodrigues Barbosa Júnior, Danilo Tedesco, Vinicius dos Santos Carreira, Antonio Alves Pinto, Bruno Rafael de Almeida Moreira, Luciano Shozo Shiratsuchi, Cristiano Zerbato and Rouverson Pereira da Silva
Drones 2022, 6(5), 112; https://doi.org/10.3390/drones6050112 - 29 Apr 2022
Cited by 3 | Viewed by 3226
Abstract
Remote sensing can provide useful imagery data to monitor sugarcane in the field, whether for precision management or high-throughput phenotyping (HTP). However, research and technological development in aerial remote sensing for distinguishing cultivars are still at an early stage, driving the need for further in-depth investigation. The primary objective of this study was therefore to analyze whether it is possible to discriminate market-grade cultivars of sugarcane from imagery data acquired by an unmanned aerial vehicle (UAV). A secondary objective was to analyze whether the time of day could impact the expressiveness of spectral bands and vegetation indices (VIs) in the biophysical modeling. The remote sensing platform acquired high-resolution imagery data, making it possible to discriminate cultivars based on spectral bands and VIs without the task becoming computationally infeasible. Midday (12:00 p.m.) proved to be the most reliable time of day to perform the flight over the field and model the cultivars from spectral bands. In contrast, discrimination based on VIs was not specific to the time of flight. Therefore, this study provides further information on distinguishing sugarcane cultivars merely by processing UAV imagery data. These insights will help build the knowledge necessary to advance low-altitude remote sensing of sugarcane.
(This article belongs to the Special Issue Drones for Rural Areas Management)
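As a rough illustration of the kind of processing involved, the sketch below derives a common vegetation index (NDVI) from hypothetical red and near-infrared band rasters and feeds per-pixel band values into a simple classifier to separate cultivars. The abstract does not specify which VIs or classifier the authors used, so the arrays, the choice of NDVI, and the LDA model are all assumptions for illustration only.

```python
# Minimal sketch: deriving a vegetation index from UAV band rasters and
# discriminating cultivars from per-pixel band values. All data and the
# choice of NDVI + LDA are illustrative assumptions, not the study's method.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Hypothetical reflectance rasters (e.g., from a 12:00 p.m. flight).
red = rng.uniform(0.02, 0.20, size=(100, 100))
nir = rng.uniform(0.30, 0.60, size=(100, 100))

# NDVI, a widely used vegetation index: (NIR - Red) / (NIR + Red).
ndvi = (nir - red) / (nir + red)

# Per-pixel features (bands + VI) and hypothetical cultivar labels.
features = np.column_stack([red.ravel(), nir.ravel(), ndvi.ravel()])
cultivar = rng.integers(0, 3, size=features.shape[0])  # e.g., 3 cultivars

X_train, X_test, y_train, y_test = train_test_split(
    features, cultivar, test_size=0.3, random_state=42
)
model = LinearDiscriminantAnalysis().fit(X_train, y_train)
print("Cultivar classification accuracy:",
      accuracy_score(y_test, model.predict(X_test)))
```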
