Article

A Comparative Study of Multi-Rotor Unmanned Aerial Vehicles (UAVs) with Spectral Sensors for Real-Time Turbidity Monitoring in the Coastal Environment

1 Environmental Process Modelling Centre, Nanyang Environment and Water Research Institute, Nanyang Technological University, Singapore 637141, Singapore
2 Interdisciplinary Graduate Programme, Graduate College, Nanyang Technological University, Singapore 637335, Singapore
3 Maritime and Port Authority of Singapore, Singapore 119963, Singapore
4 School of Civil and Environmental Engineering, Nanyang Technological University, Singapore 639798, Singapore
* Author to whom correspondence should be addressed.
Drones 2024, 8(2), 52; https://doi.org/10.3390/drones8020052
Submission received: 29 December 2023 / Revised: 26 January 2024 / Accepted: 29 January 2024 / Published: 5 February 2024
(This article belongs to the Special Issue Unconventional Drone-Based Surveying 2nd Edition)

Abstract

Complex coastal environments pose unique logistical challenges when deploying unmanned aerial vehicles (UAVs) for real-time image acquisition during monitoring operations of marine water quality. One of the key challenges is the difficulty in synchronizing the images acquired by UAV spectral sensors with ground-truth in situ water quality measurements for calibration, due to the typical time delay between these two modes of data acquisition. This study investigates the logistics of concurrently deploying UAV-borne spectral sensors and a sampling vessel for water quality measurements, and the effects of the time delay between these two operations on turbidity predictions. The results show that minimizing the time delay can significantly enhance the efficiency of data acquisition and consequently improve the calibration process. In particular, the outcomes highlight notable improvements in the model's predictive accuracy for turbidity distribution derived from UAV-borne spectral images. Furthermore, a comparative analysis based on a pilot study is conducted between two multirotor UAV configurations: the DJI M600 Pro with a hyperspectral camera and the DJI M300 RTK with a multispectral camera. The performance evaluation covers deployment complexity, image processing productivity, and sensitivity to environmental noise. The DJI M300 RTK, equipped with a multispectral camera, is found to offer higher cost-effectiveness, faster setup times, and better endurance while yielding good image quality, making it a more compelling choice for widespread industry adoption. Overall, the results from this study contribute to advancements in the deployment of UAVs for marine water quality monitoring.

1. Introduction

Regulatory monitoring is of paramount importance to Environmental Monitoring and Management Plans (EMMPs) for coastal engineering, particularly in safeguarding coastal ecosystems and for environmental protection during marine operations such as reclamation and dredging. Immediate access to high-quality data is crucial for timely decision-making, ensuring that interventions are prompt and effective in minimizing potential ecological and environmental risks. As coastal areas continue to experience heightened human activities, such as land reclamation and resource exploration, the demand for real-time monitoring solutions becomes increasingly pronounced.
Coastal environments, characterized by their complexities and dynamic nature, introduce significant challenges to the acquisition of real-time monitoring data. The complex interplay of sea surface conditions (e.g., waves, sunglint, marine traffic), varying topographies and bathymetries, and dynamic weather conditions demands innovative solutions for effective data collection. Presently, water quality monitoring relies on in situ samples, fixed water quality sensors, and acoustic Doppler current profilers (ADCPs) mounted on traversing vessels or on stationary buoys [1]. These methods typically entail prolonged preparation and setup times and primarily offer point measurements (as shown in Figure 1) that inadequately capture the dynamic and rapidly changing coastal environment. This limitation becomes evident when monitoring the spatial distribution of turbidity plumes generated during marine operations, such as land reclamation or dredging activities. The urgency of monitoring and safeguarding coastal ecosystems calls for a different approach based on remote sensing [2,3]. Satellite imaging can cover large areas, but it has certain disadvantages. One notable limitation is the relatively low spatial resolution of satellite imagery compared to UAV imagery, which may hinder detailed observations of smaller-scale phenomena in coastal environments. Additionally, cloud cover and atmospheric conditions can degrade image quality and increase the uncertainties in extracting and processing satellite data [4]. Furthermore, the low frequency of satellite data acquisition leads to gaps in monitoring and potential delays in obtaining crucial information, because the fixed orbital paths of satellites may miss events in which water quality changes rapidly [5]. These drawbacks highlight the need for complementary remote sensing approaches, such as systems based on unmanned aerial vehicles (UAVs), to achieve on-demand monitoring in the context of marine operations in coastal waters [6,7,8].
UAVs, with their versatility and adaptability for remote sensing, have emerged as invaluable tools. Their flexibility and quick deployment enable on-demand monitoring of coastal ecosystems, providing a distinct advantage over satellite imagery [9,10]. UAV deployment also supports environmental protection efforts in marine activities by allowing closer proximity to the areas of interest than traditional point-based methods [11]. UAV-based remote sensing has been utilized in various coastal applications over the past decade [12,13,14,15,16], and there have been numerous efforts to establish protocols or frameworks for coastal remote sensing using UAVs [17,18,19,20]. However, efforts to streamline UAV planning and on-site image processing remain limited, owing to several critical challenges. The first challenge is the difficulty of image processing and mosaicking over the largely featureless and homogeneous coastal waters, which has been addressed in recent studies [21,22,23]. The second is the fast dispersion of suspended sediment in the open sea, which significantly amplifies noise in the measured reflectance of UAV imaging [24]. The intricacies posed by coastal environments can be effectively addressed through a strategic approach that streamlines UAV deployment and ground-truth measurements. To the best of our knowledge, this challenge has not been comprehensively studied in prior research, underscoring the need for a deeper understanding of the advantages gained from a more efficient UAV deployment strategy.
Throughout this pilot study, two critical aspects were identified for streamlining UAV operations: (1) pre-flight preparation, involving equipment settings and logistics planning, and (2) the procedure during data acquisition. Ground-truth measurements taken from vessels in conjunction with UAV deployment serve as an important procedure for validating the water quality properties retrieved from UAV imagery, contributing to a comprehensive understanding of the coastal environment. The aim of this study is twofold: first, to identify and understand the challenges inherent in deploying UAVs for image acquisition in complex coastal environments, and second, to develop a solution-oriented approach by integrating UAVs with sampling vessels. This integration aims to minimize time delays between UAV image acquisition and in situ water quality measurements, improve data collection efficiency, and ultimately optimize the efficacy of EMMPs in coastal regions.

2. Materials and Methods

2.1. Study Area

The study area encompasses the southwestern water region of Singapore, as illustrated in Figure 2. Throughout the survey period, the study area witnessed extensive coastal operations, notably dredging and reclamation activities, resulting in the generation of sediment plumes and elevated turbidity levels in the water. The UAV flights and field sampling were strategically carried out during diverse marine conditions and operational scenarios. One such scenario took place within the inner basin of the southwestern water region of Singapore, characterized by a water depth of less than 20 m. This contributed to relatively calm marine conditions and a reduced impact of the tidal currents compared to the open basin. In addition, certain UAV surveys were conducted over the open basin, specifically at the open channel of the Singapore Strait, where the hydrodynamics play a significant role. This approach allows for a comprehensive exploration of varying environmental conditions within the study area.
Throughout our comprehensive study, a total of 21 surveys were conducted, comprising 62 UAV flights: 12 flights with the UAV hyperspectral camera system and 50 flights with the UAV multispectral camera system. The 12 hyperspectral flights were scheduled in conjunction with multispectral flights, facilitating a thorough comparative analysis and calibration, while the remaining 38 multispectral-only flights were designed to verify the sensor's performance and refine the model training process. Following each survey, the acquired images were analyzed in detail, driving ongoing refinement of the logistics planning procedures to improve image quality and survey efficiency in subsequent flights. Each survey day comprised two to three UAV flights, with each flight lasting approximately 10 to 20 min, coupled with additional water sampling for ground-truth measurements. The surveys focused on monitoring high turbidity concentrations, which prompted swift activation of each flight and sampling session within 5 to 10 min after dumping operations commenced. The data collected from the water sampling served as valuable ground truth for both the UAV-borne hyperspectral and multispectral images, enhancing the reliability and accuracy of the survey outcomes.

2.2. Equipment

2.2.1. DJI M600 Pro with a Hyperspectral Camera

To conduct the UAV survey with a hyperspectral sensor, a rotary-wing hexacopter DJI Matrice 600 (M600) Pro (SZ DJI Technology Co., Ltd., Shenzhen, China) was chosen as the airborne system to fly over the survey area. The UAV carried a Bayspec OCI™-F push-broom hyperspectral camera (BaySpec Incorporated, San Jose, CA, USA) to capture a broad range of spectral data in the visible and near-infrared (VIS-NIR) spectrum, as shown in Figure 3.
The specifications of the DJI M600 Pro and the Bayspec OCI™-F hyperspectral system are indicated in Table A1. Additional accessories were deployed to improve the quality of data acquired during the UAV flights, as listed below.
  • A spectrometer, Ocean Optics FLAME-S (Ocean Insight, Florida, USA), was used to record the absolute downwelling irradiance during the flight mission, which served as the reference values for the reflectance calculation during image processing (Figure 3c).
  • A gimbal, DJI Ronin MX (DJI, Shenzhen, China), was used to stabilize the camera during flight, thereby reducing distortion and misalignment of images.
  • A D-RTK GNSS system (Figure 3b) mounted on top of the UAV and an RTK base station (Figure 3e) placed near the take-off point were both deployed. The RTK base station serves as an additional high-precision navigation aid to improve the positioning of the UAV system.

2.2.2. DJI M300 RTK with a Multispectral Camera

The UAV multispectral imagery system employed in this study comprises a DJI Matrice 300 RTK (Figure 4a) equipped with a MicaSense RedEdge-MX Duo multispectral camera (refer to Table A1, Figure 4b). The DJI M300 RTK has a wingspan of 1.2 m and a maximum take-off mass of 9 kg. The seamless integration of the MicaSense RedEdge-MX Duo multispectral sensor is achieved through a built-in DJI SkyPort, ensuring adequate stabilization during the UAV flight. The camera captures images across 10 spectral bands in the visible and near-infrared (VIS-NIR) spectrum. With a total external payload capacity of 2.7 kg and a payload of less than 1 kg for the camera and other accessories, the system allows for 45 min of continuous operation. The simplified setup facilitates plug-and-play operation, enabling UAV pilots to operate the drone efficiently.
Throughout the survey, the multispectral camera settings were consistently maintained to ensure optimal data acquisition. The frontal/along-track overlap was set at 80–85%, and the side/cross overlap was set at 70–75% to promote robust data consistency across adjacent swaths. These standardized settings ensured the systematic and precise acquisition of multispectral imagery, yielding high-quality data for subsequent analysis and interpretation.
To improve images acquired under changing ambient light conditions, a Downwelling Light Sensor (DLS), shown in Figure 4c, was installed on top of the aircraft to record the downwelling irradiance for each of the 10 bands of the MicaSense RedEdge-MX Duo camera and for each image captured throughout the flight (the DLS information is embedded within the metadata of each image for each band). The DLS unit was secured on a mounting plate on top of the DJI M300 RTK to avoid interference with the downwelling light conditions, and it was removed from the UAV before transportation to minimize damage.
A Calibration Reflectance Panel (CRP) was used to capture the reference reflectance for all 10 spectral bands. The reference reflectance was used in conjunction with the DLS data to perform radiometric correction for changing illumination conditions. The CRP has a prescribed reflectance value at each of the 10 band wavelengths, namely 444 ± 28 nm (coastal blue), 475 ± 32 nm (blue), 531 ± 14 nm and 560 ± 27 nm (green), 650 ± 16 nm and 668 ± 14 nm (red), 705 ± 10 nm, 717 ± 12 nm, and 740 ± 18 nm (red edge), and 842 ± 57 nm (NIR). To support the radiometric correction, CRP reference images are taken before and after each UAV flight to capture the change in illumination conditions over the entire operation. In particular, the camera should be held 1 m above the reflectance panel while avoiding shadows, as shown in Figure 5.
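To illustrate how the CRP and DLS measurements combine in this correction, the sketch below converts raw band images to reflectance. It is a minimal sketch only: the function names and the irradiance-ratio rescaling are our own simplified formulation, not the MicaSense processing pipeline, which applies its full radiometric camera model.

```python
import numpy as np

def panel_calibration_factor(panel_image: np.ndarray, panel_mask: np.ndarray,
                             panel_reflectance: float) -> float:
    """Per-band factor converting raw pixel values to reflectance, derived
    from the shadow-free CRP pixels in a pre-flight reference image."""
    return panel_reflectance / panel_image[panel_mask].mean()

def to_reflectance(band_image: np.ndarray, factor: float,
                   dls_at_capture: float, dls_at_panel: float) -> np.ndarray:
    """Apply the panel factor, rescaled by the ratio of the DLS downwelling
    irradiance at panel-capture time to that at frame-capture time, so a
    darkening sky does not read as a drop in reflectance."""
    return band_image * factor * (dls_at_panel / dls_at_capture)
```

In practice, the factor would be computed per band from the pre- and post-flight panel shots, and each frame corrected with the DLS reading embedded in its metadata.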
The installation of the MicaSense RedEdge-MX Duo Camera System adhered to the manufacturer’s recommended guidelines (https://support.micasense.com/hc/article_attachments/360053314653/RedEdge-MX_Dual_Camera_Integration_Guide_Rev2.pdf, accessed on 25 December 2023). After installing the camera system, it is crucial to conduct a thorough check to ensure the stability and secure positioning of the cameras, avoiding any obstructions that could impact the quality of captured images. The camera settings are shown in Figure A2.

2.3. Survey Setup and Logistics Planning

2.3.1. UAV Operation Planning

The logistics planning for the coastal field deployment of UAV spectral imaging systems followed the framework developed in our previous study [25], which included permits and licenses acquisition, weather monitoring, standby drone salvage, field coordination during the UAV surveys, and data transfer and management after the surveys. This framework had proven effective through end-to-end procedural handling in our recent surveys. The flight planning for both hyperspectral and multispectral imaging followed the same pattern, as illustrated in Figure 6. Frontal and side overlaps were meticulously adjusted, taking into account the flight course angle and speed during each survey. Following multiple trial flights to optimize image stitching, specific values were established. Notably, the frontal overlap, representing the overlap between consecutive images in the same flight line, was set at 85% for multispectral imaging and 75.6% for hyperspectral imaging. Simultaneously, the side overlap, indicating the overlap between images in two consecutive flight lines, was configured at 75% for multispectral imaging and 35.6% for hyperspectral imaging.
The flight altitude was maintained at a steady 60 m above mean sea level (AMSL), the maximum allowable altitude in Singapore. The prescribed flight speeds ranged from 4 to 4.8 m/s, while the time between captures varied between 1.5 and 2.5 s to achieve the assigned overlap rate. The total flight time, typically within the range of 10 to 12 min for hyperspectral imaging and 15 to 20 min for multispectral imaging, was contingent upon the flight speed and the predetermined overlap ratio.
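The interplay between altitude, speed, and capture interval can be checked with simple geometry: the along-track footprint of one frame follows from the altitude and the camera's field of view, and the frontal overlap is one minus the ratio of the distance flown between captures to that footprint. The sketch below assumes a nominal 35° along-track field of view purely for illustration; the actual value depends on the camera and lens used.

```python
import math

def ground_footprint(altitude_m: float, fov_deg: float) -> float:
    """Ground distance covered by a single frame along one axis."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

def frontal_overlap(speed_ms: float, interval_s: float,
                    altitude_m: float, along_track_fov_deg: float) -> float:
    """Fraction of overlap between consecutive frames in a flight line."""
    footprint = ground_footprint(altitude_m, along_track_fov_deg)
    return 1.0 - (speed_ms * interval_s) / footprint

# 60 m AMSL, 4 m/s, 1.5 s between captures, assumed 35 deg along-track FOV:
print(f"{frontal_overlap(4.0, 1.5, 60.0, 35.0):.1%}")  # ~84%, inside the 80-85% target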

2.3.2. UAV Deployment and Field Measurement

Ground-truth measurements obtained through field sampling play a pivotal role in providing essential calibration data, significantly enhancing the accuracy and reliability of information obtained through UAV imaging. The calibration process involves concurrent water sampling and turbidity logging, executed in tandem to ensure precise correlation. Simultaneous in situ sampling and UAV imaging are discouraged to prevent potential obstructions caused by the vessel's presence in the images, which could introduce noise into the spectral data. Consequently, sampling activities commence only after the UAV has completed its passage. For grab sampling, a pump continually drew water from the surface, and laboratory measurements of total suspended solids (TSS) in milligrams per liter (mg/L) were conducted according to APHA standards (Part 2540 D).
A YSI ProDSS sonde equipped with an auto-logging turbidity probe (refer to Figure 7 for specifications) was affixed to a sampling vessel. This instrument continuously recorded turbidity measurements in Formazin Nephelometric Units (FNU) and corresponding location coordinates every second during the vessel’s movement.
The placement of instruments during field measurement is critical in ensuring accurate data acquisition. The turbidity probe is consistently positioned at the front of the vessel, as shown in Figure 8a, to minimize potential turbulence caused by the vessel’s motion. To achieve optimal data collection, the probe is fully submerged at a depth of approximately 0.5 m below the sea surface (Figure 8b). The collection time and coordinates of each sample are recorded by the probe for model calibration in the later stage.

2.3.3. Improvement in Marine Operation Tracking for UAV Deployment

The primary objective of UAV monitoring was to rapidly and efficiently capture the high turbidity generated during short-lived marine operations, typically lasting less than 10 min. To achieve this, UAV operations needed to commence promptly after the barge released the sediments. However, in the early stages of our study, UAV operations were conducted without coordination with the ground control team (i.e., the control room), resulting in a lack of information about the schedule of marine activities. Consequently, fewer high-concentration plume events were captured, and UAV operators faced challenges activating UAV flights on short notice. To address this issue, a new protocol for land–marine communication was established, outlined in Figure 9 and summarized in the following steps.
  • About 1 to 1.5 h before UAV operation (T1), operators would communicate with the on-site control room to coordinate vessel and UAV mobilization. Using the FindShip app [26] shown in Figure A3, the UAV operator and the marine navigator would track the marine traffic and barge schedule to estimate the dumping time and hence flight commencement.
  • As the barge approaches the dumping location after undergoing a material quality check at the marine inspection station, the UAV operator powers up batteries, configures camera settings, and performs calibration 30 min prior to the operation (T2).
  • Once dumping operations are completed, the barge departs (T3), and the sampling vessel starts moving toward the plume center, with the UAV following closely.
  • The UAV hovers above the plume center (i.e., the home point), while the operator generates a flight plan for comprehensive coverage of the entire plume area (T4).
  • After the flight plan is established, both the UAV and sampling vessel move to the plume edge, and the survey begins (T5).
This protocol enhances coordination and communication, increasing the likelihood of capturing high-turbidity events during marine operations.
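To make the timeline concrete, the sketch below encodes the protocol checkpoints relative to the dumping time estimated from vessel tracking. The T3 to T5 offsets are nominal placeholders for illustration, since in practice they depend on the barge's actual movements.

```python
from datetime import datetime, timedelta

def protocol_schedule(est_dump_time: datetime) -> dict:
    """Checkpoint times anchored to the dumping time estimated from
    marine-traffic tracking (e.g., the FindShip app)."""
    return {
        "T1: coordinate vessel/UAV mobilization": est_dump_time - timedelta(minutes=75),
        "T2: power up, configure camera, calibrate": est_dump_time - timedelta(minutes=30),
        "T3: barge departs; move toward plume center": est_dump_time,  # nominal
        "T4: hover at home point; generate flight plan": est_dump_time + timedelta(minutes=5),
        "T5: move to plume edge; survey begins": est_dump_time + timedelta(minutes=10),
    }

for step, t in protocol_schedule(datetime(2022, 5, 18, 10, 0)).items():
    print(f"{t:%H:%M}  {step}")
```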

2.4. Image Processing

The stitching and mosaicking of acquired images are critically important for generating high-spatial-coverage images of the monitoring area for analysis. However, unlike land surfaces, which generally have distinct features [5], coastal water surfaces are relatively homogeneous, and their images are typically featureless. This property poses a challenge for stitching and mosaicking the aerial image frames captured during UAV flights, as highlighted in the literature [27]. Existing commercial image stitching methods that rely on feature detection often fail to identify distinctive features in the coastal hydro-environment images captured at the study sites, which leads to subpar quality of the stitched image or even failure of the feature-based processing required for image stitching. To overcome this challenge, we adopted two software solutions in this study, CoastalWQL (version 1.0) and Pontuspectra (version 1.0), both of which implement a GPS-based stitching algorithm that automates the processing and stitching of featureless water surface images.
The stitching algorithm is embedded in two software packages, CoastalWQL version 1.0 [23] for hyperspectral imaging and Pontuspectra version 1.0 (Nanyang Technological University—NTUitive Pte Ltd., Singapore, 2022) for multispectral imaging. Both operate as standalone programs, requiring no third-party software dependencies. They utilize the GPS coordinates recorded by the embedded GPS module in the imager to calculate the overlap rate of consecutive images. This method directly benefits the processing of coastal hydro-environmental images, as it overcomes the challenges of stitching the featureless water surfaces common in coastal imagery. Both packages follow the same approach for image processing, utilizing the GPS-based stitching algorithm over homogeneous water surfaces. Although they employ different methods for calculating reflectance, their performance under identical hydro-environmental and weather conditions does not vary significantly.
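A minimal sketch of this GPS-based placement idea is given below, assuming single-band frames of equal size, a known ground sampling distance, and a hypothetical frame-center GPS fix per image. The actual CoastalWQL and Pontuspectra implementations additionally handle heading, band alignment, and blending; in this sketch, overlapping frames simply overwrite one another.

```python
import math
import numpy as np

EARTH_RADIUS_M = 6_371_000.0

def latlon_to_local(lat, lon, lat0, lon0):
    """Equirectangular projection to metres; adequate over a survey area
    of a few hundred metres."""
    x = math.radians(lon - lon0) * EARTH_RADIUS_M * math.cos(math.radians(lat0))
    y = math.radians(lat - lat0) * EARTH_RADIUS_M
    return x, y

def gps_mosaic(frames, fixes, gsd_m):
    """Place frames on a shared canvas from their GPS fixes alone,
    bypassing feature matching over featureless water.

    frames : list of 2-D arrays (single-band images), all the same shape
    fixes  : list of (lat, lon) tuples for each frame centre
    gsd_m  : ground sampling distance in metres per pixel
    """
    lat0, lon0 = fixes[0]
    offsets = []
    for lat, lon in fixes:
        x, y = latlon_to_local(lat, lon, lat0, lon0)
        # Image rows grow southward, hence the sign flip on y.
        offsets.append((int(round(-y / gsd_m)), int(round(x / gsd_m))))

    h, w = frames[0].shape
    rows = [r for r, _ in offsets]
    cols = [c for _, c in offsets]
    r0, c0 = min(rows), min(cols)
    canvas = np.zeros((max(rows) - r0 + h, max(cols) - c0 + w))
    for frame, (r, c) in zip(frames, offsets):
        canvas[r - r0:r - r0 + h, c - c0:c - c0 + w] = frame  # later frames overwrite
    return canvas
```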

2.4.1. Software for Hyperspectral Image Processing

This study adopted the CoastalWQL software (version 1.0) for the processing of hyperspectral images acquired using the hyperspectral system (BaySpec OCI-F). CoastalWQL is an open-source software available on GitHub (https://github.com/pakhuiying/CoastalWQL, accessed on 29 December 2023) and is a platform for automated hyperspectral image stitching and processing of coastal water images. This software provides an end-to-end procedure for stitching the hyperspectral push-broom images (Figure A4). Some highlighted features of CoastalWQL are described below.
  • A GPS-based image stitching method for improved image mosaicking over homogeneous water surfaces.
  • Automatic radiometric correction to account for varying irradiance during field surveys.
  • Encompassing visualization features such as image alignment, correction of stripe noises, masking/classification of non-water objects, and sunglint correction, which contribute to an improved representation of stitched images.
  • A turbidity map generated using trained machine learning models.
  • Visualization platform to display turbidity maps.

2.4.2. Software for Multispectral Image Processing

Pontuspectra provides a fully automated process to generate turbidity distribution maps from images of coastal waters taken by a UAV-borne multispectral imager, as shown in Figure A5. Pontuspectra (version 1.0) has the following unique features.
  • Intelligent dual GPS image stitching with smart blending for improved image alignment over homogeneous water surfaces.
  • Automated radiometric and sunglint corrections on images.
  • Isolation and masking of non-water objects to focus on turbidity concentration in the water body.
  • Embedded turbidity prediction model trained using machine learning with extensive ground-truth data of over 110,000 samples.
  • Generation of quantitative turbidity maps.

3. Results

3.1. Analysis of the Capabilities of the DJI M600 Pro and DJI M300 RTK Systems

This section will present a comparative analysis between two multirotor UAV configurations, the DJI M600 Pro with a hyperspectral camera and the DJI M300 RTK with a multispectral camera, based on the pilot study comprising 62 UAV flights.

3.1.1. Trade-off between Payload Capacity and Flight Endurance

Considering the trade-off between total payload and flight time is crucial when determining the most suitable UAV system for coastal remote sensing applications. The loading capacity, or maximum payload, is a major factor in determining the sensors used during monitoring and significantly influences UAV capabilities such as flight endurance, maneuverability, and wind resistance. For instance, the flight time of the DJI M600 Pro decreases from 32–38 min without payload to 16–18 min at maximum payload, whereas the DJI M300 RTK achieves up to 55 min without payload and 30–45 min with a payload of 0.5–2.7 kg. Figure 10 illustrates the relationship between flight time and external payload for both systems.
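For planning purposes, the endurance at an intermediate payload can be roughly estimated by interpolating between the quoted endpoints. The sketch below assumes a linear relationship, which is only a first-order approximation of the curves in Figure 10.

```python
def endurance_min(payload_kg: float, max_payload_kg: float,
                  t_empty_min: float, t_full_min: float) -> float:
    """Linear interpolation between the no-payload and max-payload flight
    times; treat the result as a first-order planning estimate only."""
    return t_empty_min + (payload_kg / max_payload_kg) * (t_full_min - t_empty_min)

# Endpoints quoted above: DJI M300 RTK, ~55 min empty, ~30 min at 2.7 kg.
print(f"{endurance_min(1.0, 2.7, 55, 30):.0f} min")  # ~46 min with a ~1 kg sensor payload
```

The estimate is consistent with the roughly 45 min of continuous operation quoted earlier for the sub-1 kg multispectral payload.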
As shown in Table 1, the DJI M300 RTK has more robust wind resistance (15 m/s) compared to the DJI M600 Pro (9 m/s), making it exceptionally well-suited for coastal environments with strong winds. In addition, the DJI M300 RTK equipped with the selected multispectral camera has a compact design, featuring four lighter-weight propellers, which enhances its maneuverability—a critical advantage in coastal settings. It offers superior cost-effectiveness, faster setup, and extended endurance, making the DJI M300 RTK a compelling choice for widespread industry adoption.

3.1.2. Setup Time and Operation Planning

Operating the hyperspectral imaging system requires a minimum of one hour of preparation with a team of at least three UAV operators. The process involves the following steps:
  • Hardware Setup (15–20 min): Configure the UAV and onboard spectrometer, balance the camera and gimbal, and install the ground-based D-RTK for enhancing UAV positioning.
  • Software Setup (20–30 min): Connect and calibrate the spectrometer using the OceanView software version 1.4.1 installed in the mini-computer and adjust camera settings using the SpecGrabber software version 1100 (Figure A1).
  • Flight Planning (15–20 min): Due to the lack of a built-in RGB camera on the DJI M600 Pro and its limited flight endurance (up to 18 min), the UAV only takes off after receiving the GPS coordinates of the plume center from the vessel (Figure 11a). The scanning area is adjusted in the DJI GS Pro app based on the settings of the gimbal and camera angle.
Meanwhile, the procedure for DJI M300 RTK—Micasense RedEdge-MX Duo is simpler and more automated, requiring less than 25 min with a team of two operators. The entire process is described below.
  • Hardware Setup (up to 10 min): Configure the UAV and onboard DLS and seamlessly attach the camera to the compatible DJI SkyPort.
  • Camera Calibration (less than 5 min): Follow the procedures described in Section 2.2.2, Figure 5.
  • Flight Planning (7–10 min): Unlike hyperspectral imaging, UAV multispectral imaging does not require GPS coordinates from the vessel. The UAV can launch directly to the plume center, hover above the sampling vessel, and map the scanning area while the vessel moves to its starting point at the edge of the plume area (Figure 11b).

3.1.3. Performance of Spectral Sensors

1. Sensitivity to noise
In open survey areas, the light conditions can vary with cloud movement and sun angle, and the stitching of water surface images was found to be very sensitive to such variations in light intensity during flights. Intensity correction can address the variation of ambient light during the UAV survey. To obtain accurate reflectance data, radiometric correction has to be conducted using additional spectrometers set up either onboard the UAV or on the ground [9]; these record the absolute downwelling irradiance during the flight mission, which is later used as a reference for the reflectance calculation.
Under challenging weather and varying light, the stitching of hyperspectral images faced disruptions in areas with low light. Radiometric correction, crucial for both spectral cameras, introduces uncertainties, particularly under poor illumination and at extreme wavelengths. This correction becomes essential when illumination changes, such as when passing clouds darken the scene. For hyperspectral imaging with the Bayspec camera, the exposure time was adjusted manually before flights, and radiometric correction was performed during post-processing using the downwelling irradiance measured by the spectrometer. In contrast, the Micasense RedEdge-MX Duo includes a DLS, and radiometric correction was conducted using the CRP, with the downwelling irradiance measured by the DLS.
It is imperative to recognize that radiometric correction is imperfect, with uncertainties heightened under suboptimal illumination, such as cloudy conditions. The calibration curve in Figure A6 illustrates the relationship between the normalized digital number (DN) and irradiance, revealing challenges at extreme wavelengths. Poor illumination affects image stitching, reducing the frames per second (fps) and compromising overlap ratios and image alignment, as depicted in Figure 12. Both sensors are affected by poor light conditions, but the hyperspectral camera is more sensitive because its exposure time is adjusted manually and its reflectance calibration relies on the onboard spectrometer under changing ambient light. Moreover, manual adjustment of light intensity during hyperspectral camera calibration can introduce higher uncertainty than the automated calibration with the reflectance panel used for the multispectral camera. As evident in Figure A7, excessive exposure time led to oversaturated images during data acquisition, adversely impacting the stitching and masking processes: the masking algorithm misclassified oversaturated areas, hindering the extraction of reflectance over those masked regions, which is particularly problematic in areas with numerous water quality sampling points.
2. Data processing time
When comparing UAV-borne hyperspectral systems, such as the Bayspec OCI-F, with UAV-borne multispectral systems like the Micasense RedEdge-MX Duo, a critical distinction arises in terms of spectral resolution. The Bayspec OCI-F captures a considerably higher spectral resolution, with 61 spectral bands providing finer details for image processing across a broader spectrum. However, this heightened spectral detail comes at the cost of increased data complexity, necessitating a more intricate data processing approach to handle the larger number of spectral bands.
Conversely, multispectral cameras like the Micasense RedEdge-MX Duo operate by capturing data in distinct bands within the visible and near-infrared spectrum, offering a more straightforward approach with only 10 spectral bands. This discrete band method simplifies data processing, resulting in faster processing speeds, which is advantageous for applications requiring frequent UAV deployment and real-time monitoring in dynamic marine operations.
As illustrated in Figure A8, for a UAV flight duration of 12 min at a consistent speed of 4 m/s, the hyperspectral imaging system acquired a substantial 26,050 images. The subsequent data processing for this hyperspectral dataset took over one hour using CoastalWQL software version 1.0. In contrast, the multispectral imaging system captured a significantly lower number of images (7070 images), and the corresponding data processing required approximately 22 min. This comparison underscores the trade-off between spectral detail and processing efficiency, emphasizing the suitability of each system for specific operational requirements.
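The quoted figures translate into rough acquisition and processing rates, computed below as a back-of-envelope comparison (the "over one hour" hyperspectral processing time is taken as 60 min for the calculation).

```python
# Back-of-envelope acquisition and processing rates from the figures above.
flight_s = 12 * 60  # 12 min flight

systems = {
    "hyperspectral (CoastalWQL)": (26_050, 60),   # images, processing minutes
    "multispectral (Pontuspectra)": (7_070, 22),
}
for name, (n_images, proc_min) in systems.items():
    print(f"{name}: {n_images / flight_s:.1f} img/s acquired, "
          f"{n_images / (proc_min * 60):.1f} img/s processed")
```

The hyperspectral system acquires roughly 36 images per second but processes only about 7 per second, whereas the multispectral system's lower acquisition rate keeps processing well within a field-turnaround window.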
3. Performance compatibility of the two platforms
Due to operational restrictions for UAV flights in the reclamation area of Singapore, simultaneous operation of the two UAV systems was not feasible. However, we successfully minimized the time interval between two flights to below 15 min and conducted the surveys over the same survey area. Figure 13 illustrates the turbidity distribution acquired by both the hyperspectral (Figure 13a) and multispectral (Figure 13b) imaging systems at the same location in the open sea of the Singapore Strait, where noticeable hydrodynamic changes occur. Both systems effectively captured low- and high-concentration sediment plumes discharged from the barge, ranging from 2 FNU to over 30 FNU. Remarkably, the predicted turbidity results from the multispectral datasets align closely with the actual turbidity recorded by the probe. It is worth noting that, because the distance from the take-off location to the plume area was up to 600 m and the battery capacity of the DJI M600—Bayspec imaging system was limited, hyperspectral imaging could only cover a 100 m × 100 m area. In contrast, the DJI M300—Micasense system could capture a 200 m × 200 m plume area, four times as large, resulting in broader coverage during multispectral imaging.
Table 2 summarizes the comparative analysis between two UAV imaging systems.
As described in Table 2, the UAV hyperspectral imaging system with the Bayspec OCI-F captures a much wider spectral range with 61 spectral bands, yielding spectral data with fine resolution for the prediction model. However, this type of camera also comes at a high cost and is heavier, thus requiring larger UAVs, which can be unfavorable for frequent usage in the industry; as a result, the adoption of UAV hyperspectral imagery for engineering applications is very limited. Meanwhile, UAV multispectral remote sensing is more practical and cost-effective, but it only has 10 discrete bands to cover the non-linear spectral signal expected from high TSS concentrations during land reclamation.
In the pursuit of high accuracy in remote sensing with multispectral imaging, Kieu et al. (2023) [24] conducted a meticulous spectral analysis of the 61 spectral bands of the Bayspec OCI-F to pinpoint the optimal bands for monitoring turbidity generated by marine operations, particularly land reclamation. Leveraging a dataset comprising over 2300 data points obtained from UAV-borne hyperspectral surveys, the study identified seven spectral ranges: 448–494 nm, 504–559 nm, 715–760 nm, 806–833 nm, 896–905 nm, 596–651 nm, and 568–586 nm. This comprehensive analysis led to the selection of the MicaSense RedEdge-MX Duo multispectral camera system, which precisely met the spectral band requirements for turbidity monitoring in coastal waters. The subsequent turbidity distribution predictions demonstrated compatibility with hyperspectral imaging performance and consistency with the field measurements, as shown in Figure 13. For frequent turbidity monitoring in land reclamation projects, our team subsequently focused exclusively on deploying the UAV-borne multispectral camera, the MicaSense RedEdge-MX Duo. The following section elaborates on the improved logistics planning implemented to enhance the accuracy of data acquisition during UAV-borne multispectral imaging.

3.2. Improvement to the Operations for Data Acquisition

3.2.1. Enhancement of Equipment Setup during Field Measurements

To improve measurement accuracy during multispectral imaging, the probe installation point and the water sampling point were relocated. In the initial setup, these two instruments were positioned 2 m apart, as depicted in Figure 14a. Because the sediment plume disperses continuously, the mounting distance between the two instruments could introduce noise into the measurements. This observation also suggests potentially non-uniform dispersion of the plume following dumping operations; for example, the plume could be more concentrated closer to the source of the sediment dumping. In the improved configuration, the instruments were brought closer together to minimize measurement errors, as illustrated in Figure 14b. This adjustment ensures better synchronization between the two measurement methods.
As seen in Figure 15a, discrepancies were observed between TSS and turbidity during some surveys before April 2022, when the YSI probe installation point and the water sampling point were 2 m apart. For surveys conducted from May 2022 onward, the in situ sampling location was shifted closer to the YSI probe, and this setup was maintained for all subsequent surveys. Consequently, the correlation between the TSS measured by water sampling and the turbidity recorded by the probe improved over time, as shown in Figure 15b. Specifically, the R-squared (R2) value increased from 0.4414 in the survey conducted on 28 April 2022 to 0.6971 in the survey conducted on 18 May 2022, as depicted in Figure 15c,d.
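The R2 values reported here correspond to an ordinary least-squares fit between the paired measurements. A minimal sketch of the computation is shown below, with hypothetical array names and illustrative values standing in for the paired probe and laboratory data.

```python
import numpy as np

def r_squared(x: np.ndarray, y: np.ndarray) -> float:
    """Coefficient of determination for the least-squares line y ~ a*x + b."""
    a, b = np.polyfit(x, y, 1)
    ss_res = np.sum((y - (a * x + b)) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# turbidity_fnu: probe log (FNU); tss_mgl: paired lab TSS results (mg/L)
turbidity_fnu = np.array([2.1, 5.3, 9.8, 14.2, 21.7, 30.4])  # illustrative values
tss_mgl = np.array([4.0, 11.2, 19.5, 30.1, 44.8, 66.3])      # illustrative values
print(f"R2 = {r_squared(turbidity_fnu, tss_mgl):.4f}")
```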

3.2.2. Reducing the Time Delay for UAV—Sampling Deployment

The dynamic nature of the coastal environment, influenced by waves and near-shore currents, introduces temporal variations to turbidity levels as sediment plumes disperse over time. Unlike inland water bodies [28,29,30,31], where turbidity dispersion can last for hours, coastal environments exhibit sub-hourly dispersion time scales [24]. Consequently, significant differences may arise between data collected through UAV imaging and field measurements, primarily attributed to the dispersion of sediment plumes during the time delay between in situ water sampling and the capturing of the UAV images. Hence, if the time delay is extensive, the change in turbidity might be significant, especially during the near-field stage when coarse materials descend rapidly [32]. However, the concentration change becomes minimal after 10 min during the far-field stage, with a decreasing concentration rate of around 2% per min [24]. Therefore, conducting turbidity monitoring after this crucial window, where concentration has plateaued, is imperative. As UAV and sampling vessel operations cannot occur simultaneously to avoid vessel disturbance noise in UAV images, it is essential to minimize the time delay between these two measurement modes. This approach enhances the accuracy of model training based on ground-truth measurements.
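To illustrate why the delay matters, the sketch below compounds the roughly 2% per minute far-field decrease quoted from [24]; the minute-by-minute compounding assumption is ours, used purely for illustration. Under this assumption, the 20 min gap used in the initial surveys described below would leave only about two-thirds of the initial turbidity signal.

```python
def relative_turbidity(delay_min: float, decay_per_min: float = 0.02) -> float:
    """Fraction of the far-field plume turbidity remaining after a given
    delay, compounding the quoted ~2%/min decrease minute by minute."""
    return (1.0 - decay_per_min) ** delay_min

for t in (3, 5, 10, 20):
    print(f"{t:>2} min delay -> {relative_turbidity(t):.1%} of initial turbidity")
# 3 -> 94.1%, 5 -> 90.4%, 10 -> 81.7%, 20 -> 66.8%
```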
In the initial surveys (Figure 16a), staggered UAV flights and field measurements were conducted with a 20 min gap to prevent vessel capture in UAV images. However, this temporal gap introduced inconsistency into the spectral-turbidity data due to sediment plume dispersion, as depicted in Figure 17a. To achieve reliable synchronization between the UAV and vessel sampling operations, the refined protocol in Figure 16b was implemented. In this improved approach, turbidity logging commenced no later than 3 min after the UAV initiated its flight mission and concluded no later than 5 min after the UAV mission ended, avoiding excessive time delays between UAV imaging and water sampling. The refined protocol significantly reduced the time gap between UAV imaging and water sampling, resulting in enhanced consistency between UAV-captured turbidity distribution images and actual turbidity measurements, as demonstrated in Figure 17b.
Furthermore, to minimize turbulence and prevent disturbance to the existing sediment plume during UAV scanning, the sampling vessel maintained a consistent speed of 4–5 m/s and followed at a distance of 8–10 m behind the UAV, ensuring its exclusion from the captured images (Figure 18a). The vessel moved in a single direction without backtracking, and samples were not collected from the area behind the vessel, as the sediments there were already disturbed by the vessel's motor and movement. In the survey depicted in Figure 17a, which followed a random route, inconsistencies were observed between the turbidity derived from UAV imaging and the actual ground-truth values. In contrast, both flights shown in Figure 17b and Figure 18b followed a uniform sampling route, resulting in greater consistency between the turbidity predictions derived from the UAV multispectral datasets and the actual turbidity values.
By implementing the aforementioned refined protocol, which reduces the time delay (t) between UAV imaging and field sampling, an enhancement in the model's performance for predicting TSS from multispectral imaging has been observed. As depicted in Figure 19, the R2 between the turbidity predicted from UAV imaging and the turbidity obtained from field measurements decreases as the time delay increases, particularly beyond the 5 min threshold. More precisely, the R2 drops from 0.6848 at t = 3 min to 0.669 at t = 5 min, followed by a gradual decline to 0.6512 at t = 8 min and 0.6497 at t = 10 min.
Figure 19 also reveals outliers in both high- (>200 mg/L) and low-turbidity (below 50 mg/L) regions, highlighting the challenges in fully synchronizing UAV imagery with in situ measurements due to the logistical limitations and complex coastal conditions. The observed outliers suggest that the predictive model can be more reliable for higher turbidity concentrations but less accurate for low turbidity levels. This, again, underscores the importance of streamlining UAV operation with field sampling (as discussed in Section 2.3) to achieve higher accuracy in model prediction.

4. Conclusions

This study reports a comparison of two UAV-based remote sensing systems, namely hyperspectral and multispectral, for the monitoring of turbidity in coastal waters. Notably, the UAV multispectral remote sensing system outperforms due to its ability to cover coastal areas up to four times larger, accompanied by a simpler setup and shorter processing time than the UAV hyperspectral system, while maintaining comparable accuracy. It is therefore the preferred choice for industrial adoption, particularly in the context of real-time and on-demand coastal monitoring applications. In addition, the results show that the time delay between UAV deployment and in situ measurements during calibration can be significantly reduced through meticulous logistics planning. This reduction improves accuracy, with the R2 increasing from 0.6497 to 0.6848 as the time delay decreases from 10 min to 3 min. In other words, despite the many other inherent uncertainties in coastal environments, the reduction in time delay alone can lead to a noticeable improvement in the model's prediction performance based on remote sensing images.
The adaptability of UAVs with spectral sensors can extend beyond coastal EMMP for turbidity monitoring and include other applications such as the assessment of algal blooms in coastal waters and the identification of black smoke emissions from marine vessels. Furthermore, large-scale monitoring can potentially be achieved through the fusion of UAV and satellite imagery data. We hope that the outcomes from this study can contribute towards the wider adoption of real-time data acquisition with UAV-borne spectral sensors in the coastal environment in the future.

Author Contributions

Conceptualization, H.L.T. and A.W.-K.L.; methodology, H.L.T., H.T.K., H.Y.P. and A.W.-K.L.; formal analysis, H.L.T., H.T.K. and H.Y.P.; data curation, H.L.T., H.T.K. and H.Y.P.; software, H.T.K. and H.Y.P.; writing—original draft preparation, H.L.T.; writing—review and editing, H.T.K., H.Y.P., D.S.C.P. and A.W.-K.L.; visualization, H.L.T., H.T.K. and H.Y.P.; supervision, A.W.-K.L.; project administration, H.L.T. and D.S.C.P.; funding acquisition, A.W.-K.L., E.K. and W.W.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Singapore Maritime Institute (SMI) under the research project “UAV-based Remote Sensing of Turbidity in Coastal Waters,” grant number SMI-2020-MA-02.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Restrictions apply to the datasets.

Acknowledgments

The authors would like to acknowledge the contributions of the Maritime and Port Authority of Singapore (MPA) and DHI Water and Environment (S) Pte Ltd., as well as Surbana Jurong Pte Ltd. (SJ).

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Table A1. The specifications of all equipment used for the UAV deployment.
Equipment | Model | Specification

Survey with hyperspectral flights
Portable rotary UAV | DJI Matrice 600 Pro (SZ DJI Technology Co., Ltd., Shenzhen, China) | Maximum flight height: 2500 m; maximum payload: 5.5 kg; maximum flight time (with max payload): 18 min; maximum wind resistance: 8 m/s
Hyperspectral sensor | BaySpec OCI-F Hyperspectral Imager (BaySpec, Inc., San Jose, CA, USA) | Sensor type: push-broom scanner; spectral range: 400–1000 nm; no. of spectral bands: 61; spectral resolution: 10–12 nm; 16 mm lens; spatial pixel resolution: 1024 × scan length
Onboard calibration spectrometer | Ocean Optics Flame-S-VIS-NIR Spectrometer | Wavelength range: 350–1000 nm; optical resolution: 1.33 nm FWHM; integration time: 1 ms–65 s

Survey with multispectral flights
Portable rotary UAV | DJI Matrice 300 RTK (SZ DJI Technology Co., Ltd., Shenzhen, China) | Maximum flight height: 7000 m; maximum payload: 2.7 kg; maximum flight time (with max payload): 31 min; maximum wind resistance: 12 m/s
Multispectral sensor | MicaSense RedEdge-MX Dual Camera (MicaSense, Inc., Seattle, WA, USA) | Sensor type: scanner; no. of spectral bands: 10; spectral bands (nm): coastal blue 444 ± 28, blue 475 ± 32, green 531 ± 14, green 560 ± 27, red 650 ± 16, red 668 ± 14, red edge 705 ± 10, red edge 717 ± 12, red edge 740 ± 18, NIR 842 ± 57; sensor resolution: 1280 × 960

Field measurement
Turbidity probe | YSI ProDSS Multiparameter Digital Water Quality Meter with ProDSS Turbidity Sensor (YSI, Yellow Springs, OH, USA) | Measurement range: 0 to 4000 FNU; resolution: 0.1 FNU
Figure A1. Example of the SpecGrabber GUI software for Bayspec camera settings prior to UAV operations, with manual adjustment of the exposure time.
Figure A2. Settings of the MicaSense RedEdge-MX Duo system during one example flight.
Figure A3. GUI of the FindShip app to check the status of marine vehicles for UAV flight planning. The location of the dumping barge is shown at the pinpoint, including the tracking time and the average speed of the barge.
Figure A4. (a) User interface of CoastalWQL, (b) data optimization of the georeferenced stitched image, and (c) visualization of extracted spectral information from water sampling points [23].
Figure A5. GUI of the Pontuspectra software embedded with all features (i.e., automated flight line detection, sunglint correction, object masking, and featureless image stitching).
Figure A6. Relationship between the ratio of DN to exposure time and irradiance for each wavelength in the hyperspectral camera [23].
Figure A7. Example of how manual adjustment of the exposure time using SpecGrabber before hyperspectral flights causes missing reflectance values in many acquired images.
Figure A8. Processing time of (a) CoastalWQL for hyperspectral imaging and (b) Pontuspectra for multispectral imaging for 12 min UAV flights.

References

  1. Gholizadeh, M.H.; Melesse, A.M.; Reddi, L. A Comprehensive Review on Water Quality Parameters Estimation Using Remote Sensing Techniques. Sensors 2016, 16, 1298. [Google Scholar] [CrossRef] [PubMed]
  2. Iacobelli, M.; Orlandi, M.; Cimini, D.; Marzano, F.S. Remote Sensing of Coastal Water-Quality Parameters from Sentinel-2 Satellite Data in the Tyrrhenian and Adriatic Seas. In Proceedings of the 2019 PhotonIcs & Electromagnetics Research Symposium-Spring (PIERS-Spring), Rome, Italy, 17–20 June 2019; pp. 2783–2788. [Google Scholar]
  3. Splinter, K.D.; Harley, M.D.; Turner, I.L. Remote Sensing Is Changing Our View of the Coast: Insights from 40 Years of Monitoring at Narrabeen-Collaroy, Australia. Remote Sens. 2018, 10, 1744. [Google Scholar] [CrossRef]
  4. Zheng, G.; DiGiacomo, P.M. Uncertainties and Applications of Satellite-Derived Coastal Water Quality Products. Prog. Oceanogr. 2017, 159, 45–72. [Google Scholar] [CrossRef]
  5. Cillero Castro, C.; Domínguez Gómez, J.A.; Delgado Martín, J.; Hinojo Sánchez, B.A.; Cereijo Arango, J.L.; Cheda Tuya, F.A.; Díaz-Varela, R. An UAV and Satellite Multispectral Data Approach to Monitor Water Quality in Small Reservoirs. Remote Sens. 2020, 12, 1514. [Google Scholar] [CrossRef]
  6. Castelvecchi, D. Invasion of the Drones. Sci. Am. 2010, 302, 25–27. [Google Scholar] [CrossRef] [PubMed]
  7. Marris, E. Fly, and Bring Me Data. Nature 2013, 498, 156. [Google Scholar] [PubMed]
  8. Turner, I.L.; Harley, M.D.; Drummond, C.D. UAVs for Coastal Surveying. Coast. Eng. 2016, 114, 19–24. [Google Scholar] [CrossRef]
  9. Becker, R.H.; Sayers, M.; Dehm, D.; Shuchman, R.; Quintero, K.; Bosse, K.; Sawtell, R. Unmanned Aerial System Based Spectroradiometer for Monitoring Harmful Algal Blooms: A New Paradigm in Water Quality Monitoring. J. Great Lakes Res. 2019, 45, 444–453. [Google Scholar] [CrossRef]
  10. Nex, F.; Armenakis, C.; Cramer, M.; Cucci, D.A.; Gerke, M.; Honkavaara, E.; Kukko, A.; Persello, C.; Skaloud, J. UAV in the advent of the twenties: Where we stand and what is next. ISPRS J. Photogramm. Remote Sens. 2022, 184, 215–242. [Google Scholar] [CrossRef]
  11. Koparan, C.; Koc, A.B.; Privette, C.V.; Sawyer, C.B. In Situ Water Quality Measurements Using an Unmanned Aerial Vehicle (UAV) System. Water 2018, 10, 264. [Google Scholar] [CrossRef]
  12. Klemas, V.V. Coastal and Environmental Remote Sensing from Unmanned Aerial Vehicles: An Overview. J. Coast. Res. 2015, 31, 1260–1267. [Google Scholar] [CrossRef]
  13. Kieu, H.T.; Law, A.W.-K. Remote Sensing of Coastal Hydro-Environment with Portable Unmanned Aerial Vehicles (PUAVs): A State-of-the-Art Review. J. Hydro-Environ. Res. 2021, 37, 32–45. [Google Scholar] [CrossRef]
  14. Cheng, K.H.; Chan, S.N.; Lee, J.H.W. Remote sensing of coastal algal blooms using unmanned aerial vehicles (UAVs). Mar. Pollut. Bull. 2020, 152, 110889. [Google Scholar] [CrossRef] [PubMed]
  15. Wu, D.; Li, D.; Zhang, F.; Liu, J. A review on drone-based harmful algae blooms monitoring. Environ. Monit. Assess. 2019, 191, 4. [Google Scholar] [CrossRef] [PubMed]
  16. Hayes, M.; Puckett, B.; Deaton, C.; Ridge, J. Estimating Dredge-Induced Turbidity using Drone Imagery. Preprints 2022, 2022010424. [Google Scholar] [CrossRef]
  17. Doukari, M.; Batsaris, M.; Papakonstantinou, A.; Topouzelis, K. A Protocol for Aerial Survey in Coastal Areas Using UAS. Remote Sens. 2019, 11, 1913. [Google Scholar] [CrossRef]
  18. Duffy, J.P.; Cunliffe, A.M.; DeBell, L.; Sandbrook, C.; Wich, S.A.; Shutler, J.D.; Myers-Smith, I.H.; Varela, M.R.; Anderson, K. Location, Location, Location: Considerations When Using Lightweight Drones in Challenging Environments. Remote Sens. Ecol. Conserv. 2018, 4, 7–19. [Google Scholar] [CrossRef]
  19. Ratcliffe, N.; Guihen, D.; Robst, J.; Crofts, S.; Stanworth, A.; Enderlein, P. A Protocol for the Aerial Survey of Penguin Colonies Using UAVs. J. Unmanned Veh. Syst. 2015, 3, 95–101. [Google Scholar] [CrossRef]
  20. Vize, S.; Coggan, R. Review of Standards and Protocols for Seabed Habitats Mapping, 2nd ed.; MESH: Dublin, Ireland, 2005. [Google Scholar]
  21. McEliece, R.; Hinz, S.; Guarini, J.-M.; Coston-Guarini, J. Evaluation of Nearshore and Offshore Water Quality Assessment Using UAV Multispectral Imagery. Remote Sens. 2020, 12, 2258. [Google Scholar] [CrossRef]
  22. Taha, A.; Rabah, M.; Mohie, R.; Elhadary, A.; Ghanem, E. Assessment of Using UAV Imagery over Featureless Surfaces for Topographic Applications. MEJ Mansoura Eng. J. 2022, 47, 25–34. [Google Scholar] [CrossRef]
  23. Pak, H.Y.; Kieu, H.T.; Law, A.W.K.; Lin, W.; Khoo, E. CoastalWQL: An Open-Source Tool for Drone-Based Mapping of Coastal Water Quality Using Push Broom Hyperspectral Imagery. Remote Sens. 2023. under revision. [Google Scholar]
  24. Kieu, H.T.; Pak, H.Y.; Trinh, H.L.; Pang, D.S.C.; Khoo, E.; Law, A.W.-K. UAV-Based Remote Sensing of Turbidity in Coastal Environment for Regulatory Monitoring and Assessment. Mar. Pollut. Bull. 2023, 196, 115482. [Google Scholar] [CrossRef]
  25. Trinh, H.L.; Kieu, H.T.; Pak, H.Y.; Pang, D.S.C.; Cokro, A.A.; Law, A.W.-K. A Framework for Survey Planning Using Portable Unmanned Aerial Vehicles (pUAVs) in Coastal Hydro-Environment. Remote Sens. 2022, 14, 2283. [Google Scholar] [CrossRef]
  26. Wang, L. FindShip—Track Vessels. China, 2018, Version 5.2.22. Available online: https://apps.apple.com/us/app/findship-track-vessels/id768240068 (accessed on 29 January 2024).
  27. Kislik, C.; Dronova, I.; Kelly, M. UAVs in Support of Algal Bloom Research: A Review of Current Applications and Future Opportunities. Drones 2018, 2, 35. [Google Scholar] [CrossRef]
  28. Olivetti, D.; Roig, H.; Martinez, J.-M.; Borges, H.; Ferreira, A.; Casari, R.; Salles, L.; Malta, E. Low-Cost Unmanned Aerial Multispectral Imagery for Siltation Monitoring in Reservoirs. Remote Sens. 2020, 12, 1185. [Google Scholar] [CrossRef]
  29. Matsui, K.; Shirai, H.; Kageyama, Y.; Yokoyama, H. Improving the resolution of UAV-based remote sensing data of water quality of Lake Hachiroko, Japan by neural networks. Ecol. Inform. 2021, 62, 101276. [Google Scholar] [CrossRef]
  30. Logan, R.D.; Torrey, M.A.; Feijó-Lima, R.; Colman, B.P.; Valett, H.M.; Shaw, J.A. UAV-Based Hyperspectral Imaging for River Algae Pigment Estimation. Remote Sens. 2023, 15, 12. [Google Scholar] [CrossRef]
  31. Larson, M.D.; Simic Milas, A.; Vincent, R.K.; Evans, J.E. Multi-depth suspended sediment estimation using high-resolution remote-sensing UAV in Maumee River, Ohio. Int. J. Remote Sens. 2018, 39, 5472–5489. [Google Scholar] [CrossRef]
  32. Er, J.W.; Law, A.W.K.; Adams, E.E. Spreading and Deposition of Turbidity Currents: Application to Open-Water Sediment Disposal. J. Waterw. Port Coast. Ocean. Eng. 2020, 146, 4020002. [Google Scholar] [CrossRef]
Figure 1. Simulation of the coverage of UAV imaging compared with that of in situ measurements.
Figure 2. Survey areas in the southwestern water region of Singapore.
Figure 3. (a) The UAV-borne imager system used in the study, consisting of the DJI Matrice M600 Pro with (b) a D-RTK GNSS mounted on top of the UAV body, (c) a BaySpec OCI-F hyperspectral camera secured on (d) a fabricated metal frame mounted on a Ronin RX gimbal, and (e) the RTK base system for ground control points.
Figure 4. Airframe configuration of the UAV multispectral system: (a) the UAV-borne multispectral imager system based on the DJI Matrice 300 RTK, (b) the Micasense RedEdge-MX Duo multispectral sensor, and (c) the downwelling light sensor mounted on top of the UAV.
Figure 5. Demonstration of the standard calibration position for the DJI M300 RTK–Micasense RedEdge-MX Duo, with (a) the camera held 1 m above the reflectance panel and (b) the panel placed with no shadows cast on its surface.
Figure 6. Flight pattern for UAV operation with spectral cameras. Sequence of a UAV flight: take-off, heading to the starting point, patterned scanning, return to home, and landing.
Figure 7. Specifications of the YSI ProDSS sonde for turbidity measurement deployed during UAV spectral imaging surveys.
Figure 8. Setup on the sampling vessel: (a) placement of the probe at the front of the sampling vessel and (b) the submerged depth of the YSI probe.
Figure 9. A protocol for land–marine coordination for monitoring marine activities using UAVs.
Figure 10. Relationship between payload and flight endurance for the two UAV platforms.
Figure 11. Comparative illustration of flight planning for the two UAV systems. For hyperspectral imaging (a), the UAV is still at the shoreline at T1′ and T2′ while the flight path is being planned. Meanwhile, for multispectral imaging (b), the UAV takes off at T1 to head to the center of the plume and starts planning the flight path at T2 (T2 < T2′). At T3 and T3′, both UAV systems start their flight missions.
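The timing advantage sketched in Figure 11 amounts to a simple scheduling identity: the multispectral workflow overlaps flight-path planning with the transit leg, whereas the hyperspectral workflow performs the two steps sequentially at the shoreline. The following back-of-the-envelope sketch uses assumed durations (the paper reports no per-step timings):

```python
# Illustrative durations only; both values are assumptions, not measurements.
t_transit = 3.0  # minutes from the shoreline to the plume centre
t_plan = 2.0     # minutes to draft the lawnmower flight path

# Hyperspectral workflow (Figure 11a): plan first, then fly out.
sequential_delay = t_plan + t_transit
# Multispectral workflow (Figure 11b): plan while in transit.
parallel_delay = max(t_plan, t_transit)

print(f"Mission start delay: {sequential_delay:.0f} min (sequential) "
      f"vs. {parallel_delay:.0f} min (parallel)")
```

Under these assumptions, the parallel workflow begins scanning two minutes earlier, which directly shortens the delay between imaging and in situ sampling examined in Figure 19.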
Figure 12. Examples of missing flight lines and/or image misalignment during the stitching process in low-light-intensity areas for (a) a hyperspectral imaging flight and (b) a multispectral imaging flight.
Figure 13. Field measurements and plume distribution from (a) CoastalWQL and (b) Pontuspectra for the hyperspectral imaging flight at 1:22–1:33 p.m. and the multispectral imaging flight at 1:45–2:03 p.m. on 25 August 2022, with the dots representing the measured turbidity acquired from the YSI probe.
Figure 14. Placement of the YSI probe and seawater pump (a) with a 2 m gap (before April 2022) and (b) on a steel pole with a 50 cm gap (from May 2022 onwards).
Figure 15. Comparison of the TSS from water sampling and the turbidity from the YSI probe before (a,c) and after (b,d) shifting the water sampling point.
Figure 16. (a) Previous and (b) improved operation protocol adopted in the surveys.
Figure 17. Comparison of the turbidity generated by the UAV multispectral system and ground-truth values during UAV multispectral surveys on (a) 28 April 2022, with inconsistencies highlighted in the red box, and (b) 16 August 2022, with higher consistency. The map color shows the turbidity predicted from the multispectral images, and the dashed line shows the measured turbidity level.
Figure 18. (a) Standard route of the sampling vessel following the UAV flight path indicated by the blue arrow and (b) examples of sampling points (shown as dots) during a UAV multispectral flight on 11 August 2022.
Figure 19. Model performance (indicated by the blue/green lines) for different time delays (t) between UAV imaging and field sampling: (a) t = 3 min, (b) t = 5 min, (c) t = 8 min, and (d) t = 10 min, based on the multispectral images from all UAV multispectral imaging flights.
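As a concrete illustration of how the time-delay tolerance in Figure 19 can be screened, the sketch below pairs in situ turbidity samples with the nearest-in-time UAV pixel spectra and refits a simple regression at each delay threshold. This is not the authors' pipeline: the column names, toy data, and the linear model are all illustrative assumptions.

```python
from datetime import datetime, timedelta

import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

def matchups_within_delay(pixels: pd.DataFrame, samples: pd.DataFrame,
                          max_delay_min: float) -> pd.DataFrame:
    """Pair each in situ sample with the nearest-in-time UAV pixel,
    keeping only pairs within the allowed time delay."""
    return pd.merge_asof(
        samples.sort_values("time"),
        pixels.sort_values("time"),
        on="time",
        direction="nearest",
        tolerance=pd.Timedelta(minutes=max_delay_min),
    ).dropna()

# Toy stand-in data for one flight: pixel spectra every 10 s over ~20 min,
# vessel samples every 2 min over ~28 min (all column names are illustrative).
rng = np.random.default_rng(0)
t0 = datetime(2022, 8, 16, 13, 0)
pixels = pd.DataFrame({
    "time": [t0 + timedelta(seconds=10 * i) for i in range(120)],
    "band_red": rng.random(120),
    "band_nir": rng.random(120),
})
samples = pd.DataFrame({
    "time": [t0 + timedelta(minutes=2 * i) for i in range(15)],
    "turbidity_ntu": 5.0 + 20.0 * rng.random(15),
})

for delay in (3, 5, 8, 10):  # the delays examined in Figure 19
    pairs = matchups_within_delay(pixels, samples, delay)
    X, y = pairs[["band_red", "band_nir"]], pairs["turbidity_ntu"]
    r2 = LinearRegression().fit(X, y).score(X, y)
    print(f"t = {delay:2d} min: n = {len(pairs)}, R2 = {r2:.2f}")
```

Loosening the tolerance admits more calibration pairs but allows the plume to drift further between imaging and sampling, which is the trade-off that Figure 19 quantifies.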
Table 1. Comparison between the specifications of the two UAV platforms.

Parameter | DJI M600 Pro System | DJI M300 RTK System
Max flight endurance (without payload) | 32 min (TB 47S battery); 38 min (TB 48S battery) | 55 min
Max flight endurance (with max payload) | 16 min (TB 47S battery); 18 min (TB 48S battery) | 45 min
Wingspan | 1.12 m | 1.2 m
Number of rotors (propellers) | 6 | 4
Number of batteries | 6 (TB 47S or TB 48S) | 2 (TB 60)
Max take-off weight (with batteries) | 15.5 kg | 9 kg
Max flight speed | 17.8 m/s | 17 m/s (P mode); 23 m/s (S mode)
Max payload | 5.5–6 kg | 2.7 kg
Max wind resistance | 9 m/s | 15 m/s
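Figure 10 and Table 1 suggest that endurance falls roughly linearly as payload increases. As a rough check (a linearity assumption, not a manufacturer model), the endpoints in Table 1 can be interpolated for the payloads actually flown in this study:

```python
def endurance_min(payload_kg: float, max_payload_kg: float,
                  t_empty_min: float, t_full_min: float) -> float:
    """Linear interpolation between the no-payload and max-payload
    endurance figures in Table 1 (battery discharge is assumed to be
    linear in payload, which is only approximately true)."""
    frac = min(max(payload_kg / max_payload_kg, 0.0), 1.0)
    return t_empty_min + frac * (t_full_min - t_empty_min)

# Payloads taken from Table 2 (TB 48S figures used for the M600 Pro):
print(endurance_min(6.0, 6.0, 38.0, 18.0))   # M600 Pro + hyperspectral: 18 min
print(endurance_min(0.6, 2.7, 55.0, 45.0))   # M300 RTK + multispectral: ~52.8 min
```

Both interpolated values exceed the survey endurances reported in Table 2 (10–12 min and 15–20 min), presumably because the survey figures absorb transit legs, turning, wind, and battery reserve margins.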
Table 2. Parameters of the two UAV platforms and imaging systems operated at 58 m AMSL during the surveys.

Parameter | DJI M600 Pro + BaySpec OCI-F Hyperspectral Imager | DJI M300 RTK + Micasense RedEdge-MX Duo Multispectral Imager
Flight endurance | 10–12 min | 15–20 min
Average coverage area | 100 × 100 m | 200 × 200 m
Payload (camera, batteries, gimbal) | 6 kg | 0.6 kg
Flight velocity | 5 m/s | 4–5 m/s
Side overlap | ~1 line per 10 m of lateral dimension (35.6%) | ~1 line per 15 m of lateral dimension (85%)
Frontal overlap | 75.61% | 75%
Setup time | ~1 h | 10–15 min
Spectral bands | 61 | 10
Ground resolution | 2 cm/pixel | 4 cm/pixel
Processing time | Up to 4 h | 35–55 min
Sensitivity to noise | Moderate–high | Moderate
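The overlap and speed entries in Table 2 determine the flight-line spacing and, in turn, the scan time for a lawnmower pattern. The sketch below reproduces that arithmetic; the swath widths are back-calculated assumptions chosen to match the ~10 m and ~15 m line spacings quoted in the table, not sensor specifications:

```python
import math

def line_spacing_m(swath_m: float, side_overlap: float) -> float:
    """Distance between adjacent flight lines for a given swath width
    and fractional side overlap."""
    return swath_m * (1.0 - side_overlap)

def scan_time_min(width_m: float, length_m: float, swath_m: float,
                  side_overlap: float, speed_ms: float) -> float:
    """Rough scan time for a lawnmower pattern, ignoring turns and transit."""
    n_lines = math.ceil(width_m / line_spacing_m(swath_m, side_overlap)) + 1
    return n_lines * length_m / speed_ms / 60.0

# Hyperspectral: 100 x 100 m area, ~10 m line spacing, 5 m/s  -> ~4 min
print(scan_time_min(100, 100, swath_m=15.5, side_overlap=0.356, speed_ms=5.0))
# Multispectral: 200 x 200 m area, ~15 m line spacing, 4.5 m/s -> ~11 min
print(scan_time_min(200, 200, swath_m=100.0, side_overlap=0.85, speed_ms=4.5))
```

The resulting ~4 min and ~11 min scan times sit within the 10–12 min and 15–20 min flight endurances in Table 2, leaving margin for transit and turning between lines.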