Article

Improvement in Target Range Estimation and the Range Resolution Using Drone

Kwan Hyeong Lee
Division of Human IT Convergence, Daejin University, 1007 Hoguk-ro, Pocheon-si, Gyeonggi-do 11159, Korea
Electronics 2020, 9(7), 1136; https://doi.org/10.3390/electronics9071136
Submission received: 2 June 2020 / Revised: 11 July 2020 / Accepted: 11 July 2020 / Published: 13 July 2020
(This article belongs to the Section Systems & Control Engineering)

Abstract

This study measured the speed of vehicles moving in multiple lanes using a drone. Existing methods measure the speed of moving vehicles either with a sensor mounted on a fixed structure over the road or with a hand-held speed-measuring tool operated by a person at the edge of the road. The structure-mounted approach requires the installation of a gantry, the roadside approach exposes the operator to a high risk of traffic accidents, and neither approach can measure vehicle speeds in multiple lanes at once. In this paper, a method that uses a drone to measure the speed of moving vehicles in multiple lanes was proposed. The suggested method mounts two LiDAR sensor sets on the drone, with each set measuring the speed of vehicles moving in one lane; thus, the speed of moving vehicles in multiple lanes can be estimated by moving the drone over the road. The proposed method’s performance was compared with that of existing equipment in order to evaluate the speed measurements obtained with the manufactured drone. In the experiment, the Root Mean Square Error (RMSE) of the speed measured in the first lane and the second lane was 3.30 km/h and 2.27 km/h, respectively. The vehicle detection rate was 100% in the first lane and 94.12% in the second lane, where one vehicle was not detected; the average vehicle detection rate was 97.06%. Compared with existing measurement systems, the multi-lane speed measurement method using the drone developed in this study reduced the risk of accidents, increased the convenience of movement, and measured the speed of vehicles moving in multiple lanes with a single piece of equipment. In addition, it was more efficient than current measurement systems because it allowed accurate speed measurement in poor environmental conditions.

1. Introduction

Recently, drone technology has developed rapidly in various fields. Drones are highly useful for aerial surveillance because of their remote sensing capability. In addition, multiple-target detection is essential for recognizing harmful threats in advance.
It is difficult to estimate the desired target because of signal noise and interference caused by multipath fading of transmitted signals in a wireless communication environment [1]. The strength of the signal that carries information is reduced by multipath fading, which is caused by natural and man-made structures [2]. Target estimation using radio waves [3,4,5,6] has been studied in the radar, sonar, mobile communication, and laser fields. In particular, methods for estimating targets using radar have been developed to protect military personnel from enemy threats [7,8,9].
Moving target detection is an essential preprocessing step for marking regions of interest in applications such as abandoned-animal detection, surveillance tracking, and vehicle estimation. Since it affects all subsequent steps, robust and accurate moving target detection is essential to ensure optimal performance of the application. Technological advances in areas such as signal processing, sensors, and position tracking have broadened the range of UAV applications.
In recent years, target detection has been studied using cameras mounted on drones. Drones can be significantly useful in many fields, and moving target detection is essential to identify harmful threats in advance. Drones can acquire data by detecting vehicles after flying from one point to another under program control or with a remote controller. In Reference [10], various applications were proposed for the automatic video analysis technology of smart surveillance by drones. In Reference [11], aerial image acquisition was studied by developing an all-in-one camera-based target detection and positioning system for search and rescue missions. In Reference [12], speed estimation for moving objects was studied by registering and subtracting frames captured by a camera mounted on a drone; image registration is the process of aligning two or more images of the same scene to the same coordinates. In Reference [13], pedestrian movement detection with a small drone camera consists of frame subtraction, thresholding, morphological filtering, and false-alarm reduction that takes the actual target size into account. Reference [14] studied a Ground Penetrating Radar (GPR) system that detects land mines and Improvised Explosive Devices (IEDs) using vehicles and drones. The system consists of a transmitter mounted on a vehicle and a receiver mounted on a drone. This method uses a Synthetic Aperture Radar (SAR) algorithm to reduce surface clutter and detect targets, but the drone cannot detect objects without the vehicle. Reference [15] studied a methodology for acquiring a growth deficit map with an accuracy of up to 5 cm and a spatial resolution of 1 m using drone-borne Differential Interferometric SAR (DInSAR). In order to form high-resolution maps for crop growth monitoring, satellite-based radar and optical sensors and drone-borne optical sensors are necessary.
In past studies, drones have used cameras to detect objects. This approach is difficult to apply in environments such as fog and darkness. In this study, a LiDAR sensor mounted on a drone detects the speed of vehicles driving on the road; the LiDAR sensor used is LightWare’s SF30C. Moving targets are detected for vehicles driving in the first and second lanes of the road, and the drone’s detection position is vertically above the edge of the road. After the drone moves to the measuring point, it detects targets while hovering in place; the hovering drift is within 1 m. The speed of moving vehicles in multiple lanes is measured using the LiDAR sensors mounted on the drone. Recently, research on Intelligent Transportation Systems (ITS) for smart roads has been increasing, and information about vehicles driving on roads [16,17,18] has been found to be closely related to road safety. Currently, it is not possible to measure the speed of vehicles in multiple lanes at once with a single piece of equipment. Therefore, it is necessary to develop an ITS system that estimates the speed of moving vehicles in multiple lanes using one piece of equipment.
This study presented a drone-based speed measurement system that estimates the speed of moving vehicles in multiple lanes [19,20,21,22]. There are currently two methods for measuring the speed of a moving vehicle. In the first, a person measures the speed of a moving vehicle from the roadside using speed-measuring equipment [23,24,25]. In the second, a sensor mounted on a fixed structure measures the vehicle’s speed [26,27]. The results obtained when a person measures a vehicle may differ depending on the weather and on the person performing the measurement. A sensor mounted on a fixed structure obtains information about vehicles from the side of the road, so it cannot measure the speed of moving vehicles over all lanes. In addition, there is a risk of roadside traffic accidents when a person measures the speed of moving vehicles in multiple lanes; thus, a method that minimizes this risk is needed.
This paper is organized as follows. Section 2 discusses the methodology of vehicle detection and speed estimation. Section 3 describes the components of the developed drone system that measures a vehicle’s speed. Section 4 presents the analysis of information about moving vehicles detected with the developed drone. Section 5 analyzes vehicle speed measurements made with the developed drone in field experiments. Section 6 compares the proposed method with existing methods and concludes the paper.

2. The Methodology of Vehicle Detection and Speed Estimation

The speed of the vehicles was estimated using a drone equipped with sets of LiDAR sensors that measure distance. These LiDAR sensor sets detected moving vehicles in multiple lanes.
Figure 1 shows the proposed method for estimating the speed of vehicles in multiple lanes using multiple LiDAR sensors. Each LiDAR sensor [28] obtained information about vehicles passing along its lane. The LiDAR set that detected vehicles in one lane consisted of two sensors. The front sensor in Figure 1 scanned point A, and the rear sensor scanned point B; the front sensor pointed at point A perpendicular to the ground. Hf was the distance from the front sensor to point A, Hr was the distance from the rear sensor to point B, and θ was the angle between the two LiDAR sensors. Figure 2 shows the flow by which the drone detected a vehicle and calculated its speed. A sensor counter read and stored the distances measured by the front and rear sensors at each measurement point. When a vehicle entered or exited a measurement point, the measured Hf or Hr differed from the reference distance; from this difference, it was determined that the vehicle had passed the measurement point. The distance between the measurement points was calculated from Hf, Hr, and the sensor angle, and the vehicle speed was obtained from this distance and the time difference between the vehicle’s entry and exit. The distance between point A and point B was denoted by L, and the larger the angle between the sensors, the larger this distance. In Figure 1, the distance L between point A and point B could be expressed as follows [29,30,31,32].
L = \sqrt{H_f^{2} + H_r^{2} - 2 H_f H_r \cos\theta}
A moving vehicle was detected by the front and rear sensors, which transmitted data to the computer on the ground. The ground computer estimated the vehicle’s speed from the difference between the detection times of the two sensors. The speed of the vehicle shown in Figure 1 was calculated as follows
V = \frac{L}{\Delta t}
where V is the vehicle’s speed and Δt is the time difference between the detection time at point A and the detection time at point B in Figure 1. This paper proposed a method for measuring the speed of vehicles driving in up to two lanes. In contrast to the existing roadside method, the proposed method obtained information about vehicles from directly above the road. It simultaneously measured the speed of vehicles driving in multiple lanes and assumed that vehicles moving in one lane did not obstruct the sensors’ view of vehicles moving in other lanes.
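As a concrete illustration of the two equations above, the following Python sketch computes the distance L between the two measurement points from Hf, Hr, and θ and then derives the vehicle speed from the entry/exit time difference. All numerical values are hypothetical and serve only to illustrate the geometry described in Figure 1.

```python
import math

def measurement_point_distance(h_f: float, h_r: float, theta_deg: float) -> float:
    """Distance L between points A and B by the law of cosines (see Figure 1)."""
    theta = math.radians(theta_deg)
    return math.sqrt(h_f**2 + h_r**2 - 2.0 * h_f * h_r * math.cos(theta))

def vehicle_speed_kmh(l_m: float, delta_t_s: float) -> float:
    """Speed V = L / dt, converted from m/s to km/h."""
    return (l_m / delta_t_s) * 3.6

# Hypothetical example: drone hovering about 35 m above the road, 25 deg between sensors.
h_f = 35.0          # distance from front sensor to point A (m)
h_r = 38.6          # distance from rear sensor to point B (m)
theta_deg = 25.0    # fixed angle between the two LiDAR sensors (deg)

L = measurement_point_distance(h_f, h_r, theta_deg)
dt = 0.75           # measured entry/exit time difference (s), hypothetical
print(f"L = {L:.2f} m, V = {vehicle_speed_kmh(L, dt):.1f} km/h")
```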
Four LiDAR sensors mounted on the drone were used to detect vehicles moving in the first and second lanes. Two LiDAR sensors were used per lane to detect vehicle entry/exit at the measurement points: the front sensor scanned the vehicle-entry measurement point, and the rear sensor scanned the vehicle-exit measurement point. The data measured in each lane were transmitted to the ground computer over TCP in data frames whose headers identified the lane, so the ground computer could distinguish the measurement data of each lane from the frame header; a sketch of such a frame is given below.
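The paper does not specify the exact layout of these data frames, so the following Python sketch shows one plausible encoding of a measurement frame whose header carries the lane identifier; every field name and size here is an assumption made for illustration.

```python
import struct
import time

# Hypothetical frame layout (not from the paper):
#   header: lane id (uint8), sensor id (uint8: 0 = front, 1 = rear)
#   body:   timestamp in ms (uint32), measured distance in cm (uint16)
FRAME_FORMAT = "!BBIH"  # network byte order

def pack_measurement(lane_id: int, sensor_id: int, distance_cm: int) -> bytes:
    timestamp_ms = int(time.time() * 1000) & 0xFFFFFFFF
    return struct.pack(FRAME_FORMAT, lane_id, sensor_id, timestamp_ms, distance_cm)

def unpack_measurement(frame: bytes) -> dict:
    lane_id, sensor_id, timestamp_ms, distance_cm = struct.unpack(FRAME_FORMAT, frame)
    return {"lane": lane_id, "sensor": sensor_id,
            "t_ms": timestamp_ms, "distance_cm": distance_cm}

# The ground computer separates incoming frames by lane id taken from the header.
frame = pack_measurement(lane_id=1, sensor_id=0, distance_cm=3500)
print(unpack_measurement(frame))
```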
The angle (θ) between the drone’s LiDAR sensors was fixed, and the moving vehicle was measured with the angle held constant. In Figure 1, the distance L increased when the drone flew upward and decreased when the drone flew downward, so the distance between the measurement points was controlled by the drone’s height. In Figure 1, the drone’s LiDAR sensors measured two points (entry/exit) in each lane, and these data were transmitted to the ground computer system using a transceiver. When a vehicle entered or exited the two points, the measured Hf and Hr changed; when such a change was detected, the ground computer system determined that the vehicle had passed the two points and calculated its speed.
When the drone’s LiDAR sensor detected a moving vehicle, there was no direct way to verify that it was pointing at the intended road measurement point. By aligning the LiDAR sensor with an Infrared (IR) laser sensor, the LiDAR measurement point could be identified through the IR laser spot, which is visible to the human eye. Once the two were aligned, the IR laser measurement point coincided with the LiDAR sensor measurement point.

3. Developed Drone System Component

The developed system comprises a drone system and ground equipment and can be used to obtain information about traffic in multiple lanes. The drone for obtaining this traffic information includes a multi-lane LiDAR detection system [33,34,35,36] based on LiDAR distance measurement sensors. The drone system consists of an airframe, an autonomous flight control system, LiDAR sensors, a gimbal device, communication and video devices, and a video camera that can transmit video, vehicle speed, and traffic data. The multi-lane LiDAR detection system calculates a vehicle’s speed from the distance between the two measurement points, which is derived from the measured distances to those points and the fixed angle between the two LiDAR distance measurement sensors that make up a sensor set. Therefore, it is possible to accurately measure the speed of a vehicle regardless of changes in the UAV’s altitude during flight. In order to minimize damage in the event that the drone crashes, the sensors are oriented so that measurements can be taken while the drone stops and hovers directly above the road shoulder.
Figure 3 shows the drone that was developed in this study. It consisted of propellers, a Flight Control Computer (FCC), a gimbal, a multi-lane detection system, and a camera. The developed drone was designed as a quadcopter with four propellers for weight reasons. The FCC controlled the drone’s flight and the various devices mounted on the drone. The drone’s communication system was equipped with a transceiver so that a remote controller could control the flight and the drone could transmit data to the ground computer using Radio Frequency (RF) communication. Continuous measurement at a specific point was difficult because the airframe shook when changing direction, so a three-axis gimbal was used to keep the payload attached to the drone in constant balance. The vehicle detection module consisted of a LiDAR sensor, a camera, and a transceiver, and the multi-lane vehicle detection system consisted of LiDAR distance measurement sensors and the vehicle detection module. The vehicle detection module detected moving vehicles in a lane, and two LiDAR sensors were needed per lane, so the drone in this study carried four sensors and detected vehicles in the first and second lanes simultaneously.
The multi-lane LiDAR vehicle detection system had an interface port, by which the system could be connected to the sensors for each lane, a Central Processing Unit (CPU) circuit part that could perform calculations, a memory slot for data storage, and an external communication port. The camera transmitted images of the road to the computer on the ground to check the traffic flow.
Figure 4 presents a block diagram of the control system for the multi-lane traffic information detection module. The diagram shows how information about a vehicle was derived from the distance between the two measurement points, the front/rear sensor measurement values, and the installation angle used to calculate the vehicle’s speed. The control system for estimating a vehicle’s speed had two parts: a vehicle detection part and a ground control part. The vehicle detection part consisted of a camera and a Detection Information Module (DIM); the ground control part comprised a Detection Control Module (DCM) and a server. The camera of the vehicle detection part transmitted images of a vehicle’s entry into and exit from the measurement points to the ground control part using 5.8 GHz RF communication, and the DIM transmitted the information obtained by the drone’s sensor sets to the ground control part using 915 MHz RF communication. The DCM of the ground control part shown in Figure 4 calculated the vehicle’s speed and the traffic volume from the information received from the DIM and stored the data on the server. In addition, the DCM transmitted vehicle detection messages to the DIM and saved the video image data on the server in order to process the data received from the camera of the vehicle detection part. The multi-lane speed and traffic detection function calculated the traffic volume in each lane and the speed of vehicles from the front/rear sensor information transmitted by the DIM and stored the acquired data. The detection confirmation function displayed the video recorded during the data collection time and a still frame captured at the moment a vehicle was detected, together with the corresponding time information. The detection information generation parameters were set remotely so that the DIM could detect vehicles; they included the following (an illustrative configuration sketch is given after the list).
  • The front/rear sensor angle for each lane was set in order to generate speed information.
  • The indicator was turned on or off to check the exact measurement point indicated by the sensor when the drone moved to the measurement point.
  • The drone moved to a measurement point and collected traffic measurements during stationary flight; in this mode, the speed of vehicles moving in multiple lanes was measured.
  • The debug mode displayed the raw information obtained by the multi-lane detection sensors and checked for the presence or absence of a vehicle detection.
  • When the calculation of a vehicle’s speed was completed, the speed information was logged.
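The paper lists these parameters without giving a concrete data structure; the following Python sketch shows one possible representation of the remotely set detection parameters, and all field names and default values are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class LaneSensorConfig:
    """Per-lane front/rear LiDAR sensor angles (degrees) -- hypothetical fields."""
    front_angle_deg: float = 0.0   # front sensor points straight down at point A
    rear_angle_deg: float = 25.0   # rear sensor tilted toward point B

@dataclass
class DetectionParameters:
    """Remotely set parameters for the Detection Information Module (assumed layout)."""
    lanes: dict = field(default_factory=lambda: {1: LaneSensorConfig(), 2: LaneSensorConfig()})
    indicator_on: bool = False     # IR indicator used to verify the measurement point
    debug_mode: bool = False       # dump raw multi-lane sensor data
    log_speed: bool = True         # log every completed speed calculation

params = DetectionParameters()
params.indicator_on = True         # e.g., switched on while aligning the sensors
print(params)
```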

4. Vehicle Detection and Speed Analysis Measurement

The detection information analysis combined the DIM’s vehicle detection with the DCM’s speed measurement. The traffic volume and the vehicle speeds were extracted from the raw-data file collected by the DIM. The multi-lane moving vehicle detection parameters, such as Hf, Hr, and the sensor angle, were vehicle detection algorithm parameters included in the header of the raw-data file.
The speed measurement for each lane extracted the speed of vehicles in that lane at the time the front/rear sensors sensed the vehicle. The vehicle detection parameters consisted of a base slope for determining entry/exit at the measurement point, an on/off wait switch, and the front and rear sensor angles for each lane. Once vehicle detection and the speed calculation were complete, the data were stored in the same log file used by the DCM, and the detection information about all vehicles was displayed in a graph.
Figure 5 shows a block diagram of the vehicle detection information for the vehicle detection algorithm and the speed calculation algorithm. The parameters for the vehicle detection algorithm were set in BtnModify. BtnPlayCtrl operated the LiDAR front/rear sensors for the vehicle detection and speed information generation using the parameters set in BtnModify when there was a detection event.
BtnStop stopped detecting vehicles when it received a stop event message from the DCM. Information about a vehicle’s speed was then generated by processing the data collected by BtnPlayCtrl. BtnSave stored the log files used to transmit the data processed from the front/rear sensors to the computer on the ground. The DCM loaded the raw data from the DIM through BtnRawPath.
The block diagram of information about a vehicle’s detection and speed for data processing contained the following:
  • BtnModify: Applied the vehicle detection parameter to the vehicle detection algorithm.
  • BtnPlayCtrl: Started vehicle detection and speed generation.
  • BtnStop: Stopped vehicle detection and speed generation.
  • BtnSave: Saved the created information about speed in a log file.
  • BtnRawPath: Loaded the file containing raw information from the DIM.
  • Data Processing: Detected vehicles and calculated their speed according to an event message.
Figure 6 shows the hierarchical control procedure for each vehicle detection and speed calculation event in the detection information analysis shown in Figure 5. BtnPlayCtrl controlled vehicle detection according to the event message from the DCM: the drone detected moving vehicles when BtnPlayCtrl received an ‘operate’ message and stopped detecting when it received a ‘stop’ message. BtnModify operated when vehicle detection was not required, and each parameter setting could be changed at this time. BtnSave created a log file and a header file when it received a ‘start saving’ event message and stopped saving them when the state of the message changed to ‘end’. BtnStop was applied when the user wanted to change the status of an event; in this mode, it stopped the measurement of a moving vehicle’s speed as well as log file creation and storage, and the parameters could be changed again.
The hierarchical control procedure for each event was as follows:
  • BtnPlayCtrl: Saved raw data by changing the status of each event in the analysis of vehicle detection information.
  • BtnModify: Changed the parameters when it was not in data acquisition mode.
  • BtnSave: Created the log file and the header file and saved the log file.
  • BtnStop: Updated the data display event to ‘stop’.
Figure 7 shows the Hierarchical Control Procedure (HCP) for extracting the data from the DIM in the vehicle detection part. The thread ran at intervals of 8 ms and executed functions such as initialization, data collection, and data processing according to the operating state in order to extract information about detected vehicles. The data processing unit controlled data acquisition according to the event state of each cycle. Speed errors occurred in this study due to diffuse reflection and noise in the sensor receiver; in particular, the error was larger when the sensor scanned the wheels of a vehicle. An averaging filter, sketched below, was used to minimize the effect of this error. Since the system was designed to detect vehicles at distances over 15 m, an additional error arose from the diagonal scanning of the sensor; this error was corrected by treating it as a DC offset.
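The filter length and offset value are not given in the paper, so the following sketch only illustrates the idea: the raw LiDAR distance samples are smoothed with a moving-average filter and a constant DC offset is subtracted. The window size, offset, and sample values used here are assumptions.

```python
from collections import deque

def smooth_distances(samples_cm, window=5, dc_offset_cm=0.0):
    """Moving-average filter over raw LiDAR distance samples (cm),
    followed by a constant DC-offset correction. Window and offset are assumed values."""
    buf = deque(maxlen=window)
    filtered = []
    for s in samples_cm:
        buf.append(s)
        filtered.append(sum(buf) / len(buf) - dc_offset_cm)
    return filtered

# Hypothetical raw samples with a noisy spike (e.g., a reflection off a wheel).
raw = [3500, 3502, 3499, 3650, 3501, 3498, 3500]
print(smooth_distances(raw, window=3, dc_offset_cm=2.0))
```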
When the drone arrived at the measurement point to detect vehicles, it would receive an event message from the DCM telling it to initialize the parameters for vehicle detection. The drone would transition to a standby state and would not detect vehicles when the DIM received a ‘stop’ event message from the DCM. The drone would transition from the standby state to the vehicle detection state when the DIM received an event message from the DCM stating that it was operational. The drone transitioned between states according to the messages from the DCM, which controlled vehicle detection. The data obtained by the drone were transmitted to the DCM, and the DCM calculated the vehicle’s speed by processing the data received from the DIM.
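The state transitions just described (initialize, standby, detect) can be summarized as a small state machine. The following sketch is only a schematic of the message handling; the state and message names are assumptions based on the description above.

```python
# Hypothetical DIM state machine driven by DCM event messages.
TRANSITIONS = {
    ("init", "initialize"): "standby",    # parameters initialized on arrival
    ("standby", "operate"): "detecting",  # start detecting vehicles
    ("detecting", "stop"): "standby",     # pause detection
    ("standby", "stop"): "standby",       # remain idle
}

def next_state(state: str, message: str) -> str:
    """Return the DIM state after handling a DCM event message."""
    return TRANSITIONS.get((state, message), state)

state = "init"
for msg in ["initialize", "operate", "stop", "operate"]:
    state = next_state(state, msg)
    print(msg, "->", state)
```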
The procedure shown in Figure 7 was as follows (a simplified code sketch of these steps is given after the list).
  • In the initial state, the parameters were initialized, and the graph output’s state was changed.
  • The algorithm’s state was changed to acquire data.
  • Filtering was performed after obtaining raw data.
  • The vehicle detection algorithm was applied to extract the data rate.
  • The speed calculation algorithm was applied using the data from the vehicle detection algorithm.
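To make the entry/exit detection and speed calculation concrete, the following sketch processes the front- and rear-sensor distance streams: a sample is flagged as a vehicle when it deviates from the reference distance by more than a threshold, and the speed follows from the distance L between the measurement points and the time between the front- and rear-sensor detections. The 8 ms sampling period reflects the description above, but the threshold and the sample values are assumptions.

```python
def first_detection_time(distances_cm, reference_cm, period_s, threshold_cm=100):
    """Return the time (s) of the first sample whose distance deviates from the
    reference by more than the threshold, i.e., a vehicle under the beam."""
    for i, d in enumerate(distances_cm):
        if abs(d - reference_cm) > threshold_cm:
            return i * period_s
    return None

def lane_speed_kmh(front_cm, rear_cm, reference_cm, l_m, period_s=0.008):
    """Speed estimate for one lane from the front/rear distance streams."""
    t_front = first_detection_time(front_cm, reference_cm, period_s)
    t_rear = first_detection_time(rear_cm, reference_cm, period_s)
    if t_front is None or t_rear is None or t_rear <= t_front:
        return None  # vehicle missed by one of the sensors
    return l_m / (t_rear - t_front) * 3.6

# Hypothetical streams sampled every 8 ms: the vehicle passes the front point first.
ref = 3500
front = [ref] * 10 + [3350] * 5 + [ref] * 85
rear = [ref] * 100 + [3340] * 5 + [ref] * 45
print(lane_speed_kmh(front, rear, reference_cm=ref, l_m=16.0))
```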

5. Experiment

Figure 8 shows the distance-measuring sensor that measured the distance between the drone and the road. The sensor was built around LightWare’s SF30C LiDAR sensor. To determine the accuracy of the developed distance-measuring sensor, the sensor was fixed in place and the target was moved while the distance was measured. The maximum detection distance of the sensor was found to be 100 m, and it had a beam angle of 3.5 mrad.
Figure 9 shows the peak-to-peak measurement error of the distance from 10 m to 100 m obtained with the distance-measuring sensor shown in Figure 8. The range of 10 m to 100 m was chosen after taking into consideration the minimum and maximum distances from the ground required for safe flight. The experiment measured the distance between the distance-measuring sensor and the target in 10 m increments, repeated 10 times at each distance. The results showed that the maximum peak-to-peak error was 7 cm at 100 m and the minimum was 3 cm at 20 m and 80 m; the average peak-to-peak error of the distance-measuring sensor was 4.5 cm.
Table 1 shows the average, maximum, minimum, and peak-to-peak measurements for each distance. When the measurement distance was 60 m and 80 m, the average measured distance was 60.02 m and 80.02 m, respectively, and the sensor produced the smallest difference between the set and average distances. When the measurement distance was 100 m, the difference was largest, with an average measured distance of 100.7 m. The maximum measured value at 100 m was 101.1 m, the largest deviation shown in Table 1, whereas the maximum measured value at 80 m was 80.03 m, the smallest. Since the minimum measured value at 80 m was exactly 80 m, this distance was the most accurately measured in the experiment; the minimum measured value at 100 m, 100.4 m, showed the largest deviation among the minimums.
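As an illustration of how the Table 1 statistics are derived from the repeated measurements, the following snippet computes the average, maximum, minimum, and peak-to-peak values for one set of ten hypothetical distance samples; the per-trial raw samples themselves are not given in the paper.

```python
def distance_statistics(samples_cm):
    """Average, max, min, and peak-to-peak of repeated distance measurements (cm)."""
    return {
        "average": sum(samples_cm) / len(samples_cm),
        "max": max(samples_cm),
        "min": min(samples_cm),
        "peak_to_peak": max(samples_cm) - min(samples_cm),
    }

# Ten hypothetical samples at the nominal 20 m (2000 cm) point.
samples = [2004, 2005, 2006, 2004, 2005, 2003, 2005, 2006, 2005, 2005]
print(distance_statistics(samples))
```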
Figure 10 shows images taken during an experiment performed to compare a speed-gun with the developed drone and to confirm the accuracy of the speed measurements from the system developed in this study. The drone measured the speed of a moving vehicle in only one lane and transmitted the data to the ground control part, where the speed was calculated in the DCM. The numbers shown in red and yellow in Figure 10 are the vehicle speeds measured by the speed-gun and by the drone. Table 2 shows the speeds measured by the two devices for each detected vehicle on one road. In the 5th test, the difference in measured speed between the speed-gun and the drone was the smallest (0.17 km/h); in the 3rd test, it was the largest (4.34 km/h). The average difference in measured speed between the speed-gun and the drone for a vehicle moving in one lane was 1.47 km/h.
Table 3 shows the drone and speed-gun measurements of vehicle speed in multiple lanes. The existing equipment was a speed-gun, and the drone hovered at a fixed height of 35 m above the road. In the experiments, the vehicle speed was measured 17 times in multiple lanes. In the 16th test in the first lane, the speeds measured with the speed-gun and the drone were 55 km/h and 54.7 km/h, respectively; the difference of 0.3 km/h was the minimum value shown in Table 3. In the 6th test in the first lane, the speeds measured with the speed-gun and the drone were 85 km/h and 78.19 km/h, respectively; the difference of 6.81 km/h was the maximum shown in Table 3. In the 6th test in the second lane, the speeds measured with the drone and the speed-gun were 84.78 km/h and 85 km/h, respectively; the difference of 0.22 km/h was the minimum obtained in the second lane during the experiment. The maximum difference in the second lane occurred in the 15th test, when the drone and the speed-gun measured 44.68 km/h and 50 km/h, respectively, a difference of 5.32 km/h.
In Table 3, the ‘X’ symbol indicates that the vehicle’s speed was not measured. The results of the multi-lane experiment showed some differences in measured speed between the developed system and the existing equipment. The reason was that the drone scanned the beam along a diagonal line, so a vehicle’s detected position and detection time could vary with the scanning position and the vehicle’s driving pattern. In addition, differences in measured speed could arise from the difference between the detection times of the entry and exit sensors; if the distance between the two sensors was insufficient, a vehicle might not be detected and its speed might not be measured. Table 4 shows the Root Mean Square Error (RMSE) and the vehicle detection rate for the data shown in Table 3. The RMSE was 3.30 km/h for the first lane and 2.27 km/h for the second lane. The vehicle detection rate was 100% in the first lane, whereas in the second lane it was 94.12% because one vehicle was not detected. The average vehicle detection rate was 97.06%.
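For reference, the RMSE and detection rate reported in Table 4 can be computed from paired drone/speed-gun measurements as in the following sketch; the example lists are hypothetical, and undetected vehicles are marked with None (the ‘X’ in Table 3).

```python
import math

def rmse_and_detection_rate(drone_kmh, gun_kmh):
    """RMSE over the tests where both devices produced a speed, and the
    percentage of tests in which the drone detected the vehicle."""
    paired = [(d, g) for d, g in zip(drone_kmh, gun_kmh) if d is not None and g is not None]
    rmse = math.sqrt(sum((d - g) ** 2 for d, g in paired) / len(paired))
    detection_rate = 100.0 * sum(d is not None for d in drone_kmh) / len(drone_kmh)
    return rmse, detection_rate

# Hypothetical data: None marks a missed detection ('X' in Table 3).
drone = [58.3, 69.7, None, 41.5]
gun = [55.0, 65.0, 84.0, 40.0]
print(rmse_and_detection_rate(drone, gun))
```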
There were two reasons for the differences in the speed measurements obtained during the experiment. First, since the developed drone’s sensors scanned the beam diagonally, the detected position of a vehicle varied depending on the scanning position and the vehicle’s driving pattern. Second, the scanning distance was calculated indirectly from the distances measured by the two sensors and the angle between them when a vehicle was not detected cleanly, and diffuse reflection or sensor noise in this indirect calculation increased the difference in measured speed.

6. Conclusions

In this study, a drone that accurately and safely measures the speed of vehicles moving on a road was developed. The drone detected moving vehicles 17 times while hovering, and its measurements were compared with reference vehicle speed measurements. The proposed drone system comprised the drone for detecting moving vehicles and ground equipment for processing the data. The difference between the drone system proposed in this study and existing speed measurement systems is that a set of sensors can be installed on the drone to acquire information about all vehicles moving in each lane. To determine the accuracy of the speed measurements from the developed drone system, its performance was compared with that of a speed-gun, which is commonly used to measure vehicle speeds. The advantages of the proposed drone system were as follows: (1) it reduced the risk of roadside traffic accidents entailed by the existing method for measuring vehicle speeds; (2) it obtained information about the speed of vehicles without requiring the installation of a gantry structure. In addition, the system improved the accuracy of vehicle speed measurements on roads affected by fog and darkness. Since the weight of the sensor module and the gimbal was located below the center of the drone, the drone withstood winds of up to 15 m/s with an appropriate Proportional Integral Derivative (PID) control setting. The weight of the drone developed in this study was 20 kg, including an 8 kg dummy carried to demonstrate flight at a wind speed of 15 m/s; without the dummy, the 12 kg drone could withstand winds of up to 10 m/s. The drone’s drift during stationary flight (hovering) was within 1 m. It would be dangerous to test the entire system on a public road without being able to ensure that the drone’s flight is completely safe, and it was impossible to fully account for road irregularities because the drone had to fly in places without streetlights and wires. In the future, research should be conducted on improving the drone’s wind resistance so that it remains stable during flights in strong winds, and a method using an averaging filter and a speed correction algorithm should be developed to reduce the difference in measured speed.

Funding

This research received no external funding.

Conflicts of Interest

The author declares no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
LiDAR   Light detection and ranging
TCP     Transmission control protocol
DC      Direct current
DCM     Detection control module
DIM     Detection information module
HCP     Hierarchical control procedure
RMSE    Root mean square error
PID     Proportional integral derivative

References

  1. Colone, F.; Cristallini, D.; Cerutti-Maori, D.; Lombardo, P. Direction of arrival estimation performance comparison of dual cancelled channels space-time adaptive processing techniques. IET J. Mag. 2014, 8, 7–26.
  2. Xiang, W.; Zhital, H.; Yiyu, Z. Underdetermined DOA estimation and blind separation of non-disjoint sources in time-frequency domain based on sparse representation method. IEEE Trans. Syst. Eng. Electron. J. 2014, 25, 247–258.
  3. Huang, H.; Bai, J.; Zhou, H. Present situation and key technologies of unmanned cooperative operation under intelligent air combat system. Navig. Control 2019, 18, 15–23.
  4. Yan, J.; Jiu, B.; Liu, H.; Bao, Z. Joint cluster and power allocation algorithm for multiple targets tracking in multistatic radar systems. J. Electron. Inf. Technol. 2013, 35, 1875–1881.
  5. He, Q.; Blum, R.; Godrich, H.; Haimovich, A. Target velocity estimation and antenna placement for MIMO radar with widely separated antennas. IEEE J. Sel. Top. Signal Process. 2010, 4, 79–100.
  6. Ye, Y.; Wei, Y.; Kirubarajan, T. Scaled accuracy based power allocation for multi-target tracking with colocated MIMO radars. IEEE Trans. Signal Process. 2019, 158, 227–240.
  7. Yan, J.; Pu, W.; Zhou, S. Collaborative detection and power allocation framework for target tracking in multiple radar system. Inf. Fusion 2020, 55, 173–183.
  8. Yan, J.; Liu, H.; Pu, W. Joint beam selection and power allocation for multiple targets tracking in netted collocated MIMO radar system. IEEE Trans. Signal Process. 2016, 64, 6417–6427.
  9. Chavali, P.; Nehorai, A. Scheduling and power allocation in a cognitive radar network for multiple-target tracking. IEEE Trans. Signal Process. 2012, 60, 715–729.
  10. Hampapur, A.; Brown, L.; Connell, J.; Pankanti, S.; Senior, A. Smart surveillance: Applications, technologies and implications. In Proceedings of the Fourth International Conference on Information, Communications and Signal Processing and the Fourth Pacific Rim Conference on Multimedia, Singapore, 15–18 December 2003; pp. 1133–1138.
  11. Sun, J.; Li, B.; Jiang, Y.; Wen, C.-Y. A camera-based target detection and positioning UAV system for search and rescue (SAR) purposes. Sensors 2016, 16, 1778.
  12. Nam, D.; Yeom, S. Moving vehicle detection and drone velocity estimation with a moving drone. Int. J. Fuzzy Log. Intell. Syst. 2020, 20, 43–51.
  13. Yeom, S.; Cho, I. Detection and tracking of moving pedestrians with a small unmanned aerial vehicle. Appl. Sci. 2019, 9, 3359.
  14. Fernandez, M.G.; Morgenthaler, A.; Lopez, Y.A.; Heras, F.L.; Rappaport, C. Bistatic landmine and IED detection combining vehicle and drone mounted GPR sensors. Remote Sens. 2019, 11, 1–14.
  15. Ore, G.; Alcantara, M.S.; Goes, J.A.; Oliveira, L.P.; Yepes, J.; Teruel, B.; Castro, V.; Bins, L.S.; Castro, F.; Luebeck, D.; et al. Crop growth monitoring with drone-borne DInSAR. Remote Sens. 2020, 12, 615.
  16. Almeida, J.; Rufino, J.; Alam, M.; Ferreira, J. A survey on fault tolerance techniques for wireless vehicular networks. Electronics 2019, 8, 1358.
  17. Botte, M.; Pariota, L.; D’Acierno, L.; Bifulco, G.N. An overview of cooperative driving in the European Union: Policies and practices. Electronics 2019, 8, 616.
  18. Liu, X.; Jaekel, A. Congestion control in V2V safety communication: Problem, analysis, approaches. Electronics 2019, 8, 540.
  19. Koutalakis, P.; Tzoraki, O.; Zaimes, G. UAVs for hydrologic scopes: Application of a low-cost UAV to estimate surface water velocity by using three different image-based methods. Drones 2019, 3, 14.
  20. Yang, L.; Zhou, S.; Zhao, L.; Bi, G. A data-driven approach for monitoring forward velocity for small and lightweight drone. In Proceedings of the 2015 IEEE International Conference on Aerospace Electronics and Remote Sensing Technology (ICARES), Bali, Indonesia, 3–5 December 2015.
  21. Di Giovanni, D.; Fumian, F.; Malizia, A. Application of miniaturized sensors to unmanned aerial vehicles, a new pathway for the survey of critical areas. J. Instrum. 2019, 14, 3.
  22. Cai, C.; Carter, B.; Srivastava, M.; Tsung, J.; Vahedi-Faridi, J.; Wiley, C. Designing a radiation sensing UAV system. In Proceedings of the IEEE Systems and Information Engineering Design Conference, Charlottesville, VA, USA, 29 April 2016.
  23. Boudergui, K.; Carrel, F.; Domenech, T.; Guenard, N.; Poli, J.P.; Ravet, A.; Schoepff, V.; Woo, R. Development of a drone equipped with optimized sensors for nuclear and radiological risk characterization. In Proceedings of the 2nd International Conference on Advancements in Nuclear Instrumentation, Measurement Methods and their Applications, Ghent, Belgium, 6–9 June 2011.
  24. Miranda, S.; Baker, C.; Woodbridge, K. Fuzzy logic approach for prioritisation of radar tasks and sectors of surveillance in multifunction radar. IET Radar Sonar Navig. 2007, 1, 131–141.
  25. Hassanalian, M.; Abdelkefi, A. Classifications, applications, and design challenges of drones: A review. Prog. Aerosp. Sci. 2017, 91, 99–131.
  26. Kukko, A.; Kaartinen, H.; Hyyppa, J.; Chen, Y. Multiplatform mobile laser scanning: Usability and performance. Sensors 2012, 12, 11712–11733.
  27. Kim, B.H.; Khan, D.; Bohak, C.; Kim, J.K.; Choi, W.; Lee, H.J.; Kim, M.Y. LiDAR data generation fused with virtual targets and visualization for small drone detection system. In Proceedings of the Technologies for Optical Countermeasures XV, International Society for Optics and Photonics, Berlin, Germany, 10–13 September 2018; p. 107970I.
  28. Busset, J.; Perrodin, F.; Wellig, P.; Ott, B.; Heutschi, K.; Rühl, T.; Nussbaumer, T. Detection and tracking of drones using advanced acoustic cameras. In Proceedings of the Unmanned/Unattended Sensors and Sensor Networks XI, and Advanced Free-Space Optical Communication Techniques and Applications, International Society for Optics and Photonics, Toulouse, France, 23–24 September 2015; p. 96470F.
  29. Tse, D.; Viswanath, P. Fundamentals of Wireless Communication; Cambridge University Press: Cambridge, UK, 2005.
  30. Mirčeta, K.; Bohak, C.; Kim, B.H.; Kim, M.Y.; Marolt, M. Drone segmentation and tracking in grounded sensor scanned LiDAR datasets. In Proceedings of the Zbornik Sedemindvajsete Mednarodne Elektrotehniške in Računalniške Konference ERK 2018, Portorož, Slovenia, 17–18 September 2018; pp. 384–387.
  31. Pichon, L.; Ducanchez, A.; Fonta, H.; Tisseyre, B. Quality of digital elevation models obtained from unmanned aerial vehicles for precision viticulture. OENO One 2016, 50, 101–111.
  32. Escobar Villanueva, J.R.; Iglesias Martinez, L.; Perez Montiel, J.I. DEM generation from fixed-wing UAV imaging and LiDAR-derived ground control points for flood estimations. Sensors 2019, 19, 3205.
  33. Jung, J.; Bae, S. Real-time road lane detection in urban areas using LiDAR data. Electronics 2018, 7, 276.
  34. Zhou, H.; Kong, H.; Wei, L.; Creighton, D.; Nahavandi, S. Efficient road detection and tracking for unmanned aerial vehicle. IEEE J. Mag. 2014, 16, 297–309.
  35. Pali, E.; Mathe, K.; Tamas, L.; Busoniu, L. Railway track following with the AR.drone using vanishing point detection. In Proceedings of the 2014 IEEE International Conference on Automation, Quality and Testing, Robotics, Cluj-Napoca, Romania, 22–24 May 2014; pp. 1–6.
  36. Liu, X.; Yang, T.; Li, J. Real-time ground vehicle detection in aerial infrared imagery based on convolutional neural network. Electronics 2018, 7, 78.
Figure 1. The vehicle measurement using a sensor set in each lane.
Figure 2. The vehicle speed calculation flow.
Figure 3. The components of the developed drone.
Figure 4. The detection and speed measurement of the module control system.
Figure 5. Block diagram of detection information for data processing.
Figure 6. The hierarchical control procedure for each event.
Figure 7. The Hierarchical Control Procedure for data analysis.
Figure 8. The distance-measuring LiDAR sensor.
Figure 9. The measurement speed error according to the change in distance.
Figure 10. Comparison of speed measurements from a speed-gun and the developed drone.
Table 1. Distance measurement according to the change in distance.

Distance (m) | Average (cm) | Max (cm) | Min (cm) | Peak to Peak (cm)
10  | 1003.519  | 1005   | 1001   | 4
20  | 2004.823  | 2006   | 2003   | 3
30  | 3004.391  | 3007   | 3002   | 5
40  | 4007.657  | 4010   | 4006   | 4
50  | 5003.617  | 5006   | 5001   | 5
60  | 6002.320  | 6006   | 6002   | 4
70  | 7003.086  | 7005   | 7001   | 4
80  | 8002.362  | 8003   | 8000   | 3
90  | 9005.368  | 9010   | 9004   | 6
100 | 10,070.568 | 10,110 | 10,040 | 7
Table 2. Data on the velocity of vehicles in one lane measured using a speed-gun and the developed drone.

Test | Speed-Gun (km/h) | Developed Drone (km/h) | Difference in Speed (km/h)
1  | 31 | 31.42 | 0.42
2  | 23 | 24.4  | 1.4
3  | 38 | 42.34 | 4.34
4  | 28 | 29.71 | 1.71
5  | 50 | 50.17 | 0.17
6  | 60 | 63    | 3
7  | 59 | 59.68 | 0.68
8  | 48 | 48.09 | 0.09
9  | 68 | 70.01 | 2.01
10 | 83 | 82.16 | 0.84
Table 3. Data on the velocity of vehicles in multiple lanes measured using a speed-gun and the developed drone.

Test | 1st Lane: Developed Drone (km/h) | 1st Lane: Speed-Gun (km/h) | 2nd Lane: Developed Drone (km/h) | 2nd Lane: Speed-Gun (km/h)
1  | 58.28 | 55 | 55.72 | 55
2  | 69.69 | 65 | 67.38 | 65
3  | 78.04 | 75 | 73.32 | 75
4  | 27.47 | 25 | 27.43 | 25
5  | 11.46 | X  | 11.76 | X
6  | 78.19 | 85 | 84.78 | 85
7  | 88.96 | 84 | X     | 84
8  | 41.5  | 40 | 40.52 | 40
9  | 10.76 | 10 | 10.84 | 10
10 | 35.5  | 32 | 35.01 | 32
11 | 25.33 | 24 | 24.32 | 24
12 | 46.62 | 43 | 44.62 | 43
13 | 57.13 | 55 | 57.13 | 55
14 | 38.52 | 36 | 40.16 | 36
15 | 53.7  | 50 | 44.68 | 50
16 | 54.7  | 55 | 56.21 | 55
17 | 93.66 | 93 | 93.83 | 93
Table 4. RMSE (Root Mean Square Error) and vehicle detection rates of the developed drone in multiple lanes.

Lane | RMSE (km/h) | Vehicle Detection Rate (%)
1st lane | 2.83 | 100
2nd lane | 1.83 | 94.12
