Drones, Volume 4, Issue 1 (March 2020) – 8 articles

Cover Story: Unmanned Aerial Vehicles (UAVs) are widely used in remote sensing applications because of their ability to cover large areas in a relatively short time and capture high-resolution aerial images. In recent years, UAVs have also been used for water quality assessment. UAVs can provide water quality data with high spatial resolution and offer a safe method for sample collection from remote locations. A water sampling device (WSD) and a UAV platform were developed to integrate both a water sampling apparatus and a sensor array to enable adaptive water sampling. The WSD-integrated UAV measures the water quality parameters of dissolved oxygen, pH, electrical conductivity, temperature, water depth, and turbidity. The WSD is activated to capture water samples by an onboard computer that compares the measured parameters with pre-programmed permissible limits.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
15 pages, 5032 KiB  
Article
Applications of Unmanned Aerial Systems (UAS): A Delphi Study Projecting Future UAS Missions and Relevant Challenges
by Alberto Sigala and Brent Langhals
Drones 2020, 4(1), 8; https://doi.org/10.3390/drones4010008 - 10 Mar 2020
Cited by 18 | Viewed by 5281
Abstract
Over recent decades, the world has experienced a growing demand for and reliance upon unmanned aerial systems (UAS) to perform a broad spectrum of applications, including military operations such as surveillance/reconnaissance and strike/attack. As UAS technology matures and capabilities expand, especially with respect to increased autonomy, acquisition professionals and operational decision makers must determine how best to incorporate advanced capabilities into existing and emerging mission areas. This research seeks to predict which autonomous UAS capabilities are most likely to emerge over the next 20 years, as well as the key challenges to implementing each capability. Employing the Delphi method and relying on subject matter experts from operations, acquisitions, and academia, future autonomous UAS mission areas and the corresponding levels of autonomy are forecasted. The study finds consensus for a broad range of increased UAS capabilities with ever-increasing levels of autonomy, and identifies the most promising areas for research and development as intelligence, surveillance, and reconnaissance (ISR) mission areas and sense-and-avoid and data-link technologies.
14 pages, 2002 KiB  
Letter
Deep Neural Networks and Transfer Learning for Food Crop Identification in UAV Images
by Robert Chew, Jay Rineer, Robert Beach, Maggie O’Neil, Noel Ujeneza, Daniel Lapidus, Thomas Miano, Meghan Hegarty-Craver, Jason Polly and Dorota S. Temple
Drones 2020, 4(1), 7; https://doi.org/10.3390/drones4010007 - 26 Feb 2020
Cited by 63 | Viewed by 8400
Abstract
Accurate projections of seasonal agricultural output are essential for improving food security. However, the collection of agricultural information through seasonal agricultural surveys is often not timely enough to inform public and private stakeholders about crop status during the growing season. Acquiring timely and accurate crop estimates can be particularly challenging in countries with predominantly smallholder farms because of the large number of small plots, intense intercropping, and high diversity of crop types. In this study, we used RGB images collected from unmanned aerial vehicles (UAVs) flown in Rwanda to develop a deep learning algorithm for identifying crop types, specifically bananas, maize, and legumes, which are key strategic food crops in Rwandan agriculture. The model leverages advances in deep convolutional neural networks and transfer learning, employing the VGG16 architecture and the publicly accessible ImageNet dataset for pretraining. The developed model achieves an overall test set F1 of 0.86, with individual classes ranging from 0.49 (legumes) to 0.96 (bananas). Our findings suggest that although certain staple crops such as bananas and maize can be classified at this scale with high accuracy, crops involved in intercropping (legumes) can be difficult to identify consistently. We discuss potential use cases for the developed model and recommend directions for future research in this area.
(This article belongs to the Special Issue Deep Learning for Drones and Its Applications)
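As an illustration of the transfer-learning setup the abstract describes, the following Python sketch builds a VGG16-based classifier with ImageNet-pretrained weights and a new three-class head (bananas, maize, legumes). The input size, added layers, optimizer, and training call are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch of VGG16 transfer learning for three crop classes. The classification
# head and hyperparameters are assumptions, not the paper's exact setup.
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

NUM_CLASSES = 3  # bananas, maize, legumes

# Load VGG16 pretrained on ImageNet, without its original classification head.
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze convolutional features for transfer learning

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# train_ds and val_ds would be tf.data.Dataset objects of labelled UAV image tiles:
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```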
15 pages, 11848 KiB  
Article
Measuring Height Characteristics of Sagebrush (Artemisia sp.) Using Imagery Derived from Small Unmanned Aerial Systems (sUAS)
by Ryan G. Howell, Ryan R. Jensen, Steven L. Petersen and Randy T. Larsen
Drones 2020, 4(1), 6; https://doi.org/10.3390/drones4010006 - 19 Feb 2020
Cited by 10 | Viewed by 4602
Abstract
In situ measurements of sagebrush have traditionally been expensive and time consuming. Recent improvements in small Unmanned Aerial Systems (sUAS) technology make it possible to quantify sagebrush morphology and community structure with high-resolution imagery on western rangelands, especially in sensitive habitat of the Greater sage-grouse (Centrocercus urophasianus). The emergence of photogrammetry algorithms that generate 3D point clouds from true color imagery can potentially increase the efficiency and accuracy of measuring shrub height in sage-grouse habitat. Our objective was to determine optimal parameters for measuring sagebrush height, including flight altitude, single- vs. double-pass flights, and continuous vs. paused flight. We acquired imagery using a DJI Mavic Pro 2 multi-rotor Unmanned Aerial Vehicle (UAV) equipped with an RGB camera, flown at 30.5, 45, 75, and 120 m, implementing single-pass and double-pass methods with both continuous and paused flight for each method. We generated a Digital Surface Model (DSM) from which we derived plant height, and then performed an accuracy assessment using ground measurements taken at the time of flight. We found high correlation between field-measured heights and estimated heights, with a mean difference of approximately 10 cm (SE = 0.4 cm) and little variability in accuracy between flights at different altitudes and with other parameter settings after statistical correction using linear regression. We conclude that higher-altitude flights using a single-pass method are optimal for measuring sagebrush height because they require less data storage and processing time.
(This article belongs to the Special Issue Ecological Applications of Drone-Based Remote Sensing)
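The height-correction step mentioned above can be illustrated with a short regression sketch: field-measured heights are regressed on DSM-derived heights, and the fitted model is then used to adjust DSM estimates. The paired values below are hypothetical stand-ins for the study's field and UAV measurements.

```python
# Sketch of a linear-regression correction of DSM-derived shrub heights
# against field measurements. The paired heights (metres) are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

dsm_height = np.array([0.45, 0.62, 0.38, 0.71, 0.55]).reshape(-1, 1)
field_height = np.array([0.52, 0.70, 0.47, 0.80, 0.66])

reg = LinearRegression().fit(dsm_height, field_height)
corrected = reg.predict(dsm_height)

print(f"slope = {reg.coef_[0]:.2f}, intercept = {reg.intercept_:.2f}")
print(f"mean absolute difference after correction: "
      f"{np.mean(np.abs(corrected - field_height)):.3f} m")
```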
16 pages, 4905 KiB  
Article
Adaptive Water Sampling Device for Aerial Robots
by Cengiz Koparan, A. Bulent Koc, Charles V. Privette and Calvin B. Sawyer
Drones 2020, 4(1), 5; https://doi.org/10.3390/drones4010005 - 6 Feb 2020
Cited by 26 | Viewed by 8943
Abstract
Water quality monitoring and predicting changes in water characteristics require the collection of water samples in a timely manner. Water sample collection based on in situ measurable water quality indicators can increase the efficiency and precision of data collection while reducing the cost of laboratory analyses. The objective of this research was to develop an adaptive water sampling device for an aerial robot and demonstrate the accuracy of its functions in laboratory and field conditions. The prototype device consisted of a sensor node with dissolved oxygen, pH, electrical conductivity, temperature, turbidity, and depth sensors, a microcontroller, and a sampler with three cartridges. Activation of the water-capturing cartridges was based on in situ measurements from the sensor node. The activation mechanism of the prototype device was tested with standard solutions in the laboratory and with autonomous water sampling flights over an 11-ha section of a lake. A total of seven sampling locations were selected based on a grid system. Each cartridge collected 130 mL of water at a 3.5 m depth. Mean water quality parameters were measured as 8.47 mg/L dissolved oxygen, pH of 5.34, electrical conductivity of 7 µS/cm, temperature of 18 °C, and turbidity of 37 Formazin Nephelometric Units (FNU). The dissolved oxygen was within the allowable limits pre-set in the self-activation computer program, while the pH, electrical conductivity, and temperature were outside the allowable limits specified by the Environmental Protection Agency (EPA). The activation mechanism of the device was therefore triggered, and water samples were collected successfully from all the sampling locations. Adaptive water sampling with the UAV-assisted water sampling device proved to be a successful method for water quality evaluation.
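The self-activation logic described above amounts to comparing each in situ reading against pre-programmed permissible limits and releasing a cartridge when any reading falls outside its range. The Python sketch below illustrates that decision; the limit values and field names are illustrative assumptions, not the EPA thresholds used in the study.

```python
# Sketch of threshold-based sampler activation. Limits are illustrative only.
PERMISSIBLE_LIMITS = {                 # (lower, upper) bounds per parameter
    "dissolved_oxygen_mg_l": (5.0, 12.0),
    "ph": (6.5, 8.5),
    "conductivity_us_cm": (50.0, 1500.0),
    "temperature_c": (0.0, 30.0),
}

def out_of_limits(reading: dict) -> list:
    """Return the parameters whose measured value violates its limits."""
    violations = []
    for name, (low, high) in PERMISSIBLE_LIMITS.items():
        value = reading.get(name)
        if value is not None and not (low <= value <= high):
            violations.append(name)
    return violations

def should_activate_cartridge(reading: dict, cartridges_left: int) -> bool:
    """Trigger a sampling cartridge when any parameter is out of range."""
    return cartridges_left > 0 and bool(out_of_limits(reading))

# Reading similar to the mean values reported in the abstract.
reading = {"dissolved_oxygen_mg_l": 8.47, "ph": 5.34,
           "conductivity_us_cm": 7.0, "temperature_c": 18.0}
print(should_activate_cartridge(reading, cartridges_left=3))  # True: pH and EC out of range
```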
7 pages, 204 KiB  
Communication
The Quality of Blood is not Affected by Drone Transport: An Evidential Study of the Unmanned Aerial Vehicle Conveyance of Transfusion Material in Japan
by Fumiatsu Yakushiji, Koki Yakushiji, Mikio Murata, Naoki Hiroi, Keiji Takeda and Hiroshi Fujita
Drones 2020, 4(1), 4; https://doi.org/10.3390/drones4010004 - 22 Jan 2020
Cited by 18 | Viewed by 5962
Abstract
Unmanned aerial vehicles (UAVs), or drones, are used in Rwanda for transfusion transport, but they have not yet been used in Japan. This technology holds promise for transporting medical supplies during disasters or to remote places where the terrain makes it difficult to travel by land. One of the difficulties in using UAVs is the temperature-control requirement for red blood cell (RBC) solutions, i.e., 2 °C to 6 °C according to Japanese regulations. This study aimed to describe the effectiveness of UAV-based transport of RBC solution. For testing, we gradually increased the UAV travel distance, monitored the temperature of the RBC solution, and conducted laboratory tests to check the integrity of the blood sample. Lactate dehydrogenase (LD) was used as a hemolytic index to indicate the effect of the UAV flight on the blood samples. The UAV was able to travel more than 7 km despite the relatively heavy load required for RBC solution storage. The LD level was not significantly different between the flight and non-flight (control) samples. However, we were not able to completely maintain a temperature of 2 °C to 6 °C; nonetheless, the deviation was within the safe range.
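A simple post-flight check of the temperature requirement mentioned above can be expressed in a few lines of Python: logged storage temperatures are compared against the 2 °C to 6 °C band and any deviations are reported. The readings below are hypothetical; the study's actual logger and sampling interval are not specified here.

```python
# Sketch of a compliance check on logged RBC storage temperatures (degrees C).
# The log values are hypothetical, not the study's measurements.
temperature_log_c = [4.1, 4.3, 4.8, 5.2, 5.9, 6.3, 6.1, 5.7]

LOW, HIGH = 2.0, 6.0  # Japanese regulatory band for RBC solutions
deviations = [t for t in temperature_log_c if not (LOW <= t <= HIGH)]

print(f"{len(deviations)} of {len(temperature_log_c)} readings "
      f"outside {LOW}-{HIGH} C: {deviations}")
```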
3 pages, 380 KiB  
Editorial
Acknowledgement to Reviewers of Drones in 2019
by Drones Editorial Office
Drones 2020, 4(1), 3; https://doi.org/10.3390/drones4010003 - 21 Jan 2020
Viewed by 1695
Abstract
The editorial team greatly appreciates the reviewers who have dedicated their considerable time and expertise to the journal’s rigorous editorial process over the past 12 months, regardless of whether the papers were ultimately published [...]
25 pages, 16691 KiB  
Article
EyeTrackUAV2: A Large-Scale Binocular Eye-Tracking Dataset for UAV Videos
by Anne-Flore Perrin, Vassilios Krassanakis, Lu Zhang, Vincent Ricordel, Matthieu Perreira Da Silva and Olivier Le Meur
Drones 2020, 4(1), 2; https://doi.org/10.3390/drones4010002 - 8 Jan 2020
Cited by 12 | Viewed by 6246
Abstract
The rapid evolution of unmanned aerial vehicle (UAV) imagery has multiplied its applications in fields such as military and civilian surveillance, delivery services, and wildlife monitoring. Combining UAV imagery with the study of dynamic saliency further extends the range of future applications. Indeed, considerations of visual attention open the door to new avenues in a number of scientific fields such as compression, retargeting, and decision-making tools. To conduct saliency studies, we identified the need for new large-scale eye-tracking datasets for visual saliency in UAV content. We address this need by introducing the EyeTrackUAV2 dataset. It consists of precise binocular gaze information (1000 Hz) collected over 43 videos (RGB, 30 fps, 1280 × 720 or 720 × 480). Thirty participants observed the stimuli under both free-viewing and task conditions. Fixations and saccades were then computed with the dispersion-threshold identification (I-DT) algorithm, while gaze density maps were calculated by filtering eye positions with a Gaussian kernel. An analysis of the collected gaze positions provides recommendations for generating visual saliency ground truth. It also sheds light on how saliency biases in UAV videos differ from those in conventional content, especially regarding the center bias.
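The gaze-density-map step described above (accumulating eye positions on the frame grid and smoothing them with a Gaussian kernel) can be sketched as follows. The frame size matches the 1280 × 720 videos in the dataset; the kernel width and the example gaze points are illustrative assumptions.

```python
# Sketch of gaze density map construction from binocular gaze positions.
# Sigma and the example points are illustrative, not the authors' settings.
import numpy as np
from scipy.ndimage import gaussian_filter

FRAME_W, FRAME_H = 1280, 720

def gaze_density_map(gaze_points, sigma=30.0):
    """Accumulate (x, y) gaze positions in pixels and smooth with a Gaussian."""
    counts = np.zeros((FRAME_H, FRAME_W), dtype=np.float64)
    for x, y in gaze_points:
        if 0 <= x < FRAME_W and 0 <= y < FRAME_H:
            counts[int(y), int(x)] += 1.0
    density = gaussian_filter(counts, sigma=sigma)
    if density.max() > 0:
        density /= density.max()  # normalise to [0, 1]
    return density

# Hypothetical gaze samples from several observers for one frame.
points = [(640, 360), (655, 352), (630, 370), (200, 500)]
saliency_map = gaze_density_map(points)
print(saliency_map.shape, saliency_map.max())
```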
21 pages, 7725 KiB  
Article
Prediction of Optical and Non-Optical Water Quality Parameters in Oligotrophic and Eutrophic Aquatic Systems Using a Small Unmanned Aerial System
by Juan G. Arango and Robert W. Nairn
Drones 2020, 4(1), 1; https://doi.org/10.3390/drones4010001 - 24 Dec 2019
Cited by 21 | Viewed by 4446
Abstract
The purpose of this study was to create statistically reliable predictive algorithms for trophic state or water quality based on optical (total suspended solids (TSS), Secchi disk depth (SDD), and chlorophyll-a (Chl-a)) and non-optical (total phosphorus (TP) and total nitrogen (TN)) water quality indicators in an oligotrophic system (Grand River Dam Authority (GRDA) Duck Creek Nursery Ponds) and a eutrophic system (City of Commerce, Oklahoma, Wastewater Lagoons) using remote sensing images from a small unmanned aerial system (sUAS) equipped with a multispectral imaging sensor. To develop these algorithms, two sets of data were acquired: (1) in situ water quality measurements and (2) spectral reflectance values from the sUAS imagery. Reflectance values for each band were extracted under three scenarios: (1) value-to-point extraction, (2) average-value extraction around the stations, and (3) point extraction using kriged surfaces. Results indicate that multiple-variable linear regression models in the visible portion of the electromagnetic spectrum best describe the relationships between reflectance and TSS (R² = 0.99, p < 0.01), SDD (R² = 0.88, p < 0.01), Chl-a (R² = 0.85, p < 0.01), TP (R² = 0.98, p < 0.01), and TN (R² = 0.98, p < 0.01). In addition, this study concluded that ordinary kriging does not improve the fit between the different water quality parameters and reflectance values.
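A minimal version of the multiple-variable linear regression approach described above is sketched below: a water quality indicator (TSS in this example) is regressed on reflectance extracted from the visible bands at each station. The reflectance and TSS values are hypothetical placeholders, not the study's data.

```python
# Sketch of a multiple linear regression of TSS on visible-band reflectance.
# All values are hypothetical placeholders, not the study's measurements.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Columns: blue, green, red reflectance extracted at each sampling station.
reflectance = np.array([
    [0.042, 0.081, 0.067],
    [0.055, 0.095, 0.080],
    [0.038, 0.072, 0.058],
    [0.061, 0.104, 0.092],
    [0.047, 0.088, 0.074],
])
tss_mg_l = np.array([12.0, 21.5, 9.8, 27.3, 16.4])  # in situ TSS measurements

model = LinearRegression().fit(reflectance, tss_mg_l)
predicted = model.predict(reflectance)

print("R^2 =", round(r2_score(tss_mg_l, predicted), 3))
print("coefficients:", model.coef_, "intercept:", model.intercept_)
```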