Article

A Study on Distance Measurement Module for Driving Vehicle Velocity Estimation in Multi-Lanes Using Drones

Kwan-Hyeong Lee
Division of IT Convergence, Daejin University, Pocheon 1007, Korea
Appl. Sci. 2021, 11(9), 3884; https://doi.org/10.3390/app11093884
Submission received: 24 March 2021 / Revised: 15 April 2021 / Accepted: 22 April 2021 / Published: 25 April 2021
(This article belongs to the Special Issue Smart Manufacturing Technology)

Abstract
Driving vehicle information is usually estimated with a speed gun or a fixed speed camera. Measuring with a speed gun exposes the operator to a high risk of traffic accidents, and a fixed speed camera is inefficient in terms of installation cost and maintenance. In addition, existing methods can measure only one lane per device, so multiple lanes cannot be measured simultaneously with a single instrument. This study develops a distance measurement module that acquires driving vehicle information in multiple lanes simultaneously with a single drone-based system. The module uses two LiDAR sensors per lane: the drone hovers above the edge of the road, and each sensor radiates the front or rear measuring point on the road to detect the driving vehicle. The vehicle velocity is estimated from the distance between the two measuring points and the vehicle's transit time between them. As an experiment, the velocity accuracy of the drone system is compared with speed gun measurements. The vehicle velocity RMSE is 0.75 km/h for the first lane and 1.31 km/h for the second lane, and the average error probability between the drone and the speed gun is 1.2% and 2.05% in the first and second lanes, respectively. The developed drone is more efficient than existing measurement equipment because it can acquire driving vehicle information in a dark environment while keeping the operator safe.

1. Introduction

Drones can be applied to detect a wide variety of objects. Object detection methods involving drones include global positioning system (GPS) jamming, radar, radio wave signal detection (radio frequency sensing), image cameras and light detection and ranging (LiDAR). The GPS jamming method [1] can disable a drone's GPS safety mode by emitting a fake GPS signal to a nearby object, diverting the object's path and falsifying its perceived position; however, this method is not legally permitted outside of exhibition situations and specific time windows. The radar method [2,3] detects the object by radiating radio waves and extracting information from the received signal, but its resolution is lower than that of other methods. Radio wave signal detection [4,5] finds both the drone and its operator by receiving the radio waves used by the drone, without emitting radio waves as radar does, and acquiring the frequency used for communication with the drone controller. However, this method can only establish the presence or absence of radio waves; the specific direction of the drone cannot be determined, and it is challenging to find drones that fly autonomously without communication. Methods that detect an object using an image camera [6,7] can identify the object, but their detection ability degrades in rain, snow and dark environments. The LiDAR method [8,9,10,11] detects the object by emitting a transmission signal with a small beam width and analyzing the signal returned from the object. This method has a higher object detection accuracy than the other methods.
Giuseppina et al. [12] studied the lane support system to prevent accidents caused by road departure and used a decision tree method to analyze the cause of the defects and the importance of the variables involved in the process. Marek et al. [13] studied a method of identifying the location of garbage from images collected by a drone camera and displaying it on a global map using the on-board sensor set. Hyeon et al. [14] studied a hardware architecture applicable to a system-on-chip and a vision-based tracking algorithm to track objects while the drone hovers. Muhammad et al. [15] studied 3D maps built from data acquired by a drone. Ruiqian et al. [16] studied a method for optimizing the resolution of images detected by a drone; it improved the image resolution extracted from an existing object detection network by applying an extended convolutional network to a global density model for the drone's object detection. Victor et al. [17] studied a method to improve above-ground biomass estimation by attaching a LiDAR sensor and a multispectral camera to an unmanned aerial vehicle platform. To compare the drone LiDAR data with the above-ground biomass estimate in the same area, a point cloud was created by applying a red-green-blue (RGB) image and a structure-from-motion process, and the normalized difference vegetation index of the multispectral image was calculated. Srinivasa et al. [18] studied autonomous vehicles that detect and avoid obstacles; a mapping technique that detects obstacles and creates an image of the surrounding environment using a Raspberry Pi and a LiDAR module, without computer vision technology, was examined. Razvan et al. [19] studied a method in which a vehicle detects objects, obstacles, pedestrians or traffic signs in foggy weather by using laser and LiDAR sensors to estimate driving visibility; the method determines the vehicle's operation accordingly and improves the stability of autonomous vehicles. Charles [20] presented two methods to extract vehicle velocity: the first represents the 3D coordinates of vehicle points with two images acquired by a static camera and minimizes residual errors and dimensional deviations, and its velocity extraction performance was compared with a second, LiDAR-based vehicle velocity measurement. Takashi et al. [21] studied a pedestrian recognition algorithm that detects pedestrians using high-resolution vehicle LiDAR sensors and adapts to traffic congestion as a way to reduce pedestrian accidents. Fernando et al. [22] detected vehicles using a laser radar mounted on the vehicle bumper in a road environment; the method estimates vehicle movement and shape by fusing computer vision and laser radar data. Donho et al. [23] studied object detection and tracking using a camera mounted on a drone. The motion compensation of the drone used frame subtraction, morphological operations and false blob removal; however, errors occur in the object's position and velocity if the sampling time is inaccurate and the object moves slowly. Jianqing et al. [24] studied ground filtering and point clustering techniques in windy weather by attaching a LiDAR unit to a fixed device and compared the performance with existing data processing algorithms. Shuang et al. [25] studied an object matching framework based on affine-function transformation for vehicle detection from unmanned aerial vehicle camera images. This method effectively handles vehicles under various conditions such as scale change, direction change, shadow and partial occlusion.
Driving vehicle information is usually measured using a speed video camera on a gantry structure installed over the road or a speed gun at the edge of the road. The gantry-mounted detector is fixed and therefore offers no mobility, and measuring with a speed gun at the roadside exposes the operator to a high risk of traffic accidents.
In Figure 1, the vehicle velocity estimation is performed by a person using the speed measuring device on the road's outskirts. When a measurer detects the vehicle information at the roadside, the measurer's safety against traffic accidents cannot be guaranteed. In addition, when estimating the driving vehicle information with a handheld speed measuring device on the outskirts of the road, it is difficult to obtain accurate information because the position and inclination of the measuring device change due to wind resistance.
In this study, a drone-mounted distance measurement module is developed to obtain driving vehicle information safely and with efficient mobility. The module measures the distance between the drone and the driving vehicle and the vehicle's passing time between two measurement points on the road.
Figure 2 shows the road driving vehicle information acquisition system using the drone developed in this paper. The vehicle information acquisition system consists of a drone vehicle detection part (airborne equipment) and a ground control part (ground equipment). The drone vehicle detection part is composed of the LiDAR sensors, the vehicle detection module (VDM) and the detection information acquisition module (DIAM), whereas the ground control part is composed of the detection module control analysis (DMCA), which monitors and controls the state of the drone, and the detection information acquisition analysis (DIAA), which calculates the vehicle velocity. The drone vehicle detection part indicates two specific points on the road lane using the LiDAR sensors installed on the drone. The measuring points are set as a front point and a rear point on the road, and the drone measures the distance between the drone and each measurement point as the vehicle passes through the two points. The vehicle detection module uses two LiDAR sensors per lane to obtain vehicle information: four LiDAR sensors cover two lanes, and the complete module developed in this study carries six LiDAR sensors to acquire information on vehicles driving in multiple lanes.
The drone vehicle distance measurement method sets the distance between the drone and each measurement point by indicating the road's front and rear points with the two LiDAR sensors mounted on the drone. The distance between a LiDAR sensor and its measuring point changes when the vehicle passes through that measuring point.

2. Distance Measurement Module Development

2.1. Vehicle Detection Module Analysis

The vehicle detection module acquires information using LiDAR sensors at two measuring points in the lane for the vehicle detection and the velocity calculation and transmits the information to the ground control part. The acquired information is essential for detecting the vehicle and determining its velocity. The vehicle detection module must detect the driving vehicle, change the angle of each sensor, check the sensor arrangement and transmit data. The sensor angle change and the sensor placement determine the vehicle velocity for the two points on the road.
Figure 3 is a connection diagram of the ground control part and the drone vehicle detection part. The drone vehicle detection part is composed of the VDM, the DIAM, a camera and a storage medium. The VDM creates raw sensor information for the vehicle detection and velocity calculation in the measurement range. The VDM collects the detection sensor information of the two measuring points to calculate the vehicle velocity and transmits it to the DIAM. The collected sensor information is important data for the driving vehicle detection and the velocity calculation. The VDM functions for detecting the driving vehicle are as follows:
  • detection of the driving vehicle distance between two measuring points;
  • adjustment of the LiDAR sensor angle toward the measuring point;
  • check the LiDAR sensor placement;
  • transmission and processing of the collected raw data.
The VDM requires two LiDAR sensors to generate raw data for detecting the vehicle in each lane and a laser indicator to check the location that each LiDAR sensor is projecting onto. The VDM communicates with the DIAM using three universal asynchronous receiver transmitters (UARTs).
The VDM and DIAM of the vehicle detection part exchange data and control signals using serial communication, and the camera and DIAM exchange information using high-definition multimedia interface (HDMI) communication. The 915 MHz radio frequency (RF) communication is used to transmit control signals and data between the drone vehicle detection part and the ground control part. The LiDAR sensor front (LSF) and LiDAR sensor rear (LSR) are the front and rear LiDAR sensors for the road's vehicle measurement points. The VDM is a device for detecting the driving vehicle, and one VDM is required per lane: three VDMs, each equipped with two LiDAR sensors, are required to detect all vehicles driving in three lanes. Vehicle information and the VDM status collected from the LiDAR sensors are transmitted to the DIAM. The DIAM transmits start and stop signals and the sampling rate to the VDM for the vehicle detection. The DIAM collects each lane's sensor distance information as measured by the camera and the VDM and stores the raw detection data on the storage medium (micro SD memory). The DIAM transmits data to the ground control part so that traffic volume and vehicle velocity can be calculated in real time. The ground-side DMCA calculates the vehicle velocity using the DIAM data, and the DIAA extracts the vehicle detections and velocities from the raw information on the DIAM storage medium to obtain more accurate traffic information.
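The exact serial frame format between the VDM and the DIAM is not given in the paper. The following Python sketch illustrates one plausible way a distance sample could be encapsulated for that link; the layout (sync byte, lane and sensor IDs, 16-bit distance, XOR checksum) is an illustrative assumption, not the format used by the authors' hardware.

```python
# A minimal sketch of encapsulating one LiDAR distance sample for the
# VDM-to-DIAM serial link. The frame layout is an assumption for illustration.
import struct

SYNC = 0xA5  # assumed start-of-frame marker

def encode_sample(lane: int, sensor: int, distance_cm: int) -> bytes:
    """Pack one sample into a 6-byte frame: sync, lane, sensor, u16 distance, checksum."""
    body = struct.pack(">BBBH", SYNC, lane, sensor, distance_cm)
    checksum = 0
    for b in body:
        checksum ^= b  # simple XOR over the body as an error check
    return body + bytes([checksum])

def decode_sample(frame: bytes):
    """Validate the checksum and unpack a frame; returns None on a corrupt frame."""
    if len(frame) != 6 or frame[0] != SYNC:
        return None
    checksum = 0
    for b in frame[:-1]:
        checksum ^= b
    if checksum != frame[-1]:
        return None
    _, lane, sensor, distance_cm = struct.unpack(">BBBH", frame[:-1])
    return lane, sensor, distance_cm

# Example: lane 1, front sensor (0), 3250 cm between drone and measuring point.
frame = encode_sample(1, 0, 3250)
assert decode_sample(frame) == (1, 0, 3250)
```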

2.2. Vehicle Detection Module Data Flow

The distance measurement module collects information on the driving vehicle by connecting to the two LiDAR sensors that detect the two measuring points in the lane. The collected raw information is encapsulated and transmitted to the DIAM. In addition, it should be possible to control and configure the operation of the LiDAR sensors according to DIAM command signals. Therefore, the VDM control specifications for the distance measurement module are as follows:
  • use of three or more communication ports;
  • 32 kbps or higher communication speed specification;
  • circular queue that can store 100 data per communication port;
  • interrupt processing capacity of 2000 bytes per second per communication port;
  • sensor information parsing function;
  • framing function for the DIAM transmission;
  • sensor control function and status check function.
Figure 4 shows the vehicle detection flow chart of the VDM. The periodic task checks the hardware status of the VDM (light emitting diode (LED), watchdog and LiDAR sensor) and passes the results to the data processing task over the UART. The data processing task receives raw data and the vehicle detection status from the periodic task and the BeagleBone Black communication task (BB comm), and transmits the data to the gate keeper in the form of frames. In addition, the measured data from the LiDAR sensor are converted into distance information and stored in a queue. The BB comm task controls the measurement LiDAR sensor of each lane and analyzes the information received from the DIAM using the UART. The gate keeper sequentially transmits information from the distance information queue to the DIAM.
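As an illustration of the queue and gate-keeper hand-off just described, the following minimal Python sketch models a fixed-depth distance queue per communication port, with the 100-entry capacity taken from the specification above; the overflow policy and round-robin draining order are assumptions, not details from the firmware.

```python
# A minimal sketch of the per-port circular queue and gate-keeper hand-off.
from collections import deque

QUEUE_DEPTH = 100  # "circular queue that can store 100 data per communication port"

class PortQueue:
    """Fixed-depth FIFO per UART; the oldest sample is dropped on overflow,
    one common circular-buffer policy (an assumption here)."""
    def __init__(self, depth: int = QUEUE_DEPTH):
        self._q = deque(maxlen=depth)

    def push(self, distance_cm: int) -> None:
        self._q.append(distance_cm)

    def pop(self):
        return self._q.popleft() if self._q else None

def gate_keeper(ports):
    """Drain the port queues round-robin, mimicking the sequential
    transmission of queued distance information to the DIAM."""
    while any(p._q for p in ports):
        for p in ports:
            sample = p.pop()
            if sample is not None:
                yield sample

# Example with two LiDAR ports (front/rear of one lane).
front, rear = PortQueue(), PortQueue()
front.push(3250); rear.push(3310); front.push(3248)
print(list(gate_keeper([front, rear])))  # [3250, 3310, 3248]
```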

2.3. Vehicle Detection Module Design and Development

The VDM connects the two LiDAR sensors for the front/rear measurement points of each lane to generate raw vehicle information per lane, and the laser indicator is used to check the measuring point of each LiDAR sensor. The VDM uses three UARTs to communicate with the DIAM and requires general purpose input output (GPIO) for LED and laser indicator control. The drone developed in this paper is located above the outside of the road and detects the driving vehicles from the first to the third lane while hovering. The drone is designed to have a detection error of about 3% or less at a maximum driving velocity of 170 km/h. The factors with the most significant influence on the accuracy of the vehicle detection information are the sensor performance (collection cycle/distance measurement) and how accurately the sensor indicates the road measurement point. It is crucial for each sensor to precisely indicate the lane measurement point to collect accurate vehicle information. In this paper, a high-power laser indicator is used to confirm that each LiDAR sensor accurately indicates the lane's measurement point, and the vehicle information of each lane is detected by switching the indicator on and off. Figure 5a,b show the VDM hardware artwork and the developed VDM printed circuit board (PCB), respectively. The circuit is designed with OrCAD and the PCB is manufactured through an outsourcing service. The VDM is connected to the two LiDAR sensors that detect the two lane points and transmits the encapsulated raw information to the DIAM; the VDM also controls and configures sensor operation according to the commands received from the DIAM.
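As a rough sanity check of the 3% design target (this reasoning is not from the paper), the sketch below estimates the velocity error caused by one tick of timing quantization at the 170 km/h design maximum, using the 800 µs tick rate stated later in Section 2.4. It suggests the two measuring points should be spaced at least about 1.3 m apart.

```python
# A back-of-the-envelope check (not from the paper) of how the 800 us tick
# quantization bounds the velocity error: since V = L / t, a one-tick timing
# uncertainty gives a relative velocity error of roughly V * tick / L.
TICK_S = 800e-6          # tick period stated in Section 2.4
V_MAX_KMH = 170.0        # design maximum driving velocity
V_MAX_MS = V_MAX_KMH / 3.6

def relative_velocity_error(point_spacing_m: float) -> float:
    """Worst-case relative error from one tick of timing quantization."""
    transit_time_s = point_spacing_m / V_MAX_MS
    return TICK_S / transit_time_s

for spacing in (1.0, 1.5, 2.0, 5.0):
    print(f"L = {spacing:.1f} m -> error ~ {100 * relative_velocity_error(spacing):.1f}%")
# L = 1.0 m -> ~3.8%; a spacing of about 1.3 m or more keeps the
# quantization error under the 3% design target.
```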
The VDM control specification is as follows:
  • three or more communication ports (UART);
  • 32 kbps or higher communication speed;
  • circular queue capable of storing 1000 data per communication port;
  • interrupt processing capacity of more than 2000 bytes per second of communication port;
  • sensor information parsing function and framing function for the DIAM transmission;
  • sensor control function and status check function.

2.4. DIAM Design and Development

The DIAM collects the sensor distance information of each lane from the three VDMs and saves it in real time. The stored data are transmitted to the DMCA of the ground control part in real time to calculate the traffic and velocity information. The real-time raw information is stored in the micro SD memory unit, and the DIAA of the ground control part measures the accurate vehicle information once the traffic information collection is complete. The detection algorithm determines the vehicle front/rear using the collected raw information. The generated information is transferred to the DMCA queue.
The vehicle is detected using the detection algorithm, and the DMCA transmission is determined by the command mode received from the DMCA. The detection algorithm has a normal mode and a debug mode. The normal mode transmits the vehicle entry/exit detection information so that the DMCA can calculate the vehicle velocity. The information of each sensor is checked in debug mode: the DMCA sends a sensor selection command (sensor 1 or 2) and the selected sensor's information is grouped in sets of 10 and transmitted to the DMCA.
Normal mode activity:
  • sensor number: identifies the sensor that detected the entry/exit;
  • status: front/rear status;
  • tick: the time between the front and rear detections, counted in ticks (operates at 800 µs/tick);
  • base value: data used to obtain the distance between the two sensor measurement points;
  • data produce: all distance information received from the VDM is saved in a file.
Debug mode activity:
  • sensor number: identify and check the selected sensor information;
  • current value: check the current sensor distance data;
  • base value: check the currently applied base value;
  • status: check the current status of sensors such as vehicle entry/exit, standby, status, etc.;
  • wait count: a specific time is counted when the condition for transitioning from the current state to another state is satisfied (800 µs/count); a small debounce sketch follows below.
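The wait count acts as a debounce on state transitions. The following sketch, with an assumed threshold of five counts, shows how a transition could be confirmed only after the triggering condition persists for several consecutive 800 µs counts; the threshold value and data are illustrative.

```python
# A minimal sketch of the wait-count debounce implied above: a state change is
# accepted only after the new condition persists for several 800 us counts.
TICK_S = 800e-6
WAIT_THRESHOLD = 5  # assumed number of consecutive counts to confirm a transition

def confirm_transition(samples, predicate):
    """Return the index at which `predicate` has held for WAIT_THRESHOLD
    consecutive samples, or None if the transition is never confirmed."""
    run = 0
    for i, s in enumerate(samples):
        run = run + 1 if predicate(s) else 0
        if run >= WAIT_THRESHOLD:
            return i
    return None

# Example: distances drop when a vehicle enters the beam; require 5 counts
# (5 * 800 us = 4 ms) below the base value before declaring "vehicle present".
base_cm = 3250
distances = [3250, 3249, 2100, 2105, 2098, 2102, 2099, 2101]
print(confirm_transition(distances, lambda d: d < base_cm - 500))  # 6
```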
Figure 6 shows the DIAM artwork and the developed DIAM. The DIAM simultaneously processes more than 1000 distance measurements per second from the three VDMs and transmits them to the DMCA. The DIAM performs the following functions:
VDM and communication function (RS232 communication):
  • acquisition of front/rear real-time sensor distance information;
  • acquisition of sensor operation status information;
  • control commands such as start/stop of collection;
  • simultaneous processing of multiple VDM information.
DMCA and communication function (Ethernet 915 MHz wireless):
  • compression function of real-time detection information (transmission of entry/exit event information);
  • integrity of event information delivery (transmission confirmation and error check);
  • real-time information transmission function for validating detection processing.
Vehicle detection function:
  • distance value slope and change amount search function;
  • front/rear decision;
  • determine the validity of status transition;
  • separate management of parameters for each sensor (expected to differ for each detection location and inspection pattern).
Real-time raw information storage function:
  • include in the parameter file header for vehicle detection.
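The event compression and delivery integrity functions listed above are not specified in detail. The sketch below shows one plausible scheme, in which the DIAM sends only entry/exit events over the 915 MHz link, each carrying a sequence number so the DMCA can acknowledge them and unacknowledged events can be retransmitted; all field names are illustrative assumptions.

```python
# A hedged sketch of entry/exit event compression with delivery integrity:
# instead of streaming every raw distance sample, only detection events are
# sent, and sequence numbers let the receiver acknowledge and spot losses.
from dataclasses import dataclass

@dataclass
class DetectionEvent:
    seq: int    # monotonically increasing, lets the receiver detect gaps
    lane: int   # 1..3
    kind: str   # "entry" or "exit"
    tick: int   # 800 us ticks since collection start

class EventLink:
    """Send events and re-send any that were never acknowledged."""
    def __init__(self):
        self._seq = 0
        self._unacked = {}

    def send(self, lane: int, kind: str, tick: int) -> DetectionEvent:
        ev = DetectionEvent(self._seq, lane, kind, tick)
        self._unacked[self._seq] = ev
        self._seq += 1
        return ev  # handed to the radio in a real system

    def ack(self, seq: int) -> None:
        self._unacked.pop(seq, None)

    def pending(self):
        """Events to retransmit after a timeout."""
        return list(self._unacked.values())

link = EventLink()
link.send(1, "entry", 1200); link.send(1, "exit", 1253)
link.ack(0)
print(link.pending())  # only the unacknowledged exit event remains
```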

2.5. Velocity Estimation

This study develops the VDM and the DIAM to be mounted on the drone as mission equipment. The driving vehicle information of each lane is obtained by hovering the drone above the outside of the lane. Figure 7 shows the vehicle detection method using the drone. The drone LiDAR sensors radiate two measuring points to obtain information on the vehicle driving in the lane. $H_f$ is the distance between the entrance measurement point and the LiDAR front sensor, $H_r$ is the distance between the LiDAR rear sensor and the exit measurement point, $\theta_s$ is the angle between the front/rear sensors and $L$ is the distance between the two measurement points. Two LiDAR sensors are assigned to one lane and each sensor detects the vehicle by radiating its front or rear point. The distance $L$ between the two measurement points changes according to the flying height of the drone and is calculated by Equation (1) [26].
The distance between the two measuring points is expressed as follows:
$$L = \sqrt{H_f^2 + H_r^2 - 2 H_f H_r \cos\theta_s} \qquad (1)$$
From Equation (1), the driving vehicle velocity for each lane is expressed as follows:
$$V = \frac{L}{t} \qquad (2)$$
In Equation (2), $V$ is the vehicle velocity and $t$ is the time difference between the detections of the LiDAR front sensor and the LiDAR rear sensor at the two measuring points.
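For concreteness, Equations (1) and (2) can be transcribed directly into code. The numerical values in the example (30 m slant ranges, a 10° sensor angle and a 0.32 s transit time) are assumed for illustration and are not taken from the paper.

```python
# A direct transcription of Equations (1) and (2): the spacing L between the
# two measuring points follows from the law of cosines on H_f, H_r and the
# sensor angle theta_s, and the velocity is L over the transit time t.
import math

def point_spacing(h_front_m: float, h_rear_m: float, theta_s_rad: float) -> float:
    """Equation (1): L = sqrt(Hf^2 + Hr^2 - 2*Hf*Hr*cos(theta_s))."""
    return math.sqrt(h_front_m**2 + h_rear_m**2
                     - 2.0 * h_front_m * h_rear_m * math.cos(theta_s_rad))

def velocity_kmh(spacing_m: float, transit_time_s: float) -> float:
    """Equation (2): V = L / t, converted to km/h."""
    return 3.6 * spacing_m / transit_time_s

# Example with assumed values: 30 m slant ranges and a 10 degree sensor angle
# give L ~ 5.23 m; a 0.32 s transit time then corresponds to ~58.8 km/h.
L = point_spacing(30.0, 30.0, math.radians(10.0))
print(L, velocity_kmh(L, 0.32))
```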
Figure 8 shows the vehicle velocity algorithm. First, the LiDAR front sensor measures the distance to the entry point. The distance $H_f$ changes when the vehicle drives through the entry measurement point. From the change in $H_f$, it is determined that the vehicle has passed the entrance measurement point and the time of the entry measurement point is recorded; no passing time is recorded if $H_f$ does not change. The passing time of the vehicle exit point is recorded when $H_r$ changes while the LiDAR rear sensor is radiating the exit point. The transit time $t$ is obtained from the entrance passing time of the LiDAR front sensor and the exit passing time of the LiDAR rear sensor. The passing vehicle count is determined by the changes in the $H_r$ measurement distance.
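The following Python sketch mirrors the flow of Figure 8 under stated assumptions: a vehicle is declared present when a distance reading drops a fixed threshold below the base (road) distance, and the front and rear sensor streams are sampled on the same 800 µs tick. The threshold value and the synthetic data are illustrative.

```python
# A minimal sketch of the detection loop in Figure 8. A drop below the base
# distance marks a vehicle in the beam; each front-then-rear detection pair
# yields one velocity estimate. Threshold and data are assumptions.
TICK_S = 800e-6      # sample period (800 us/tick)
THRESHOLD_M = 1.0    # assumed drop below the base distance that marks a vehicle

def estimate_velocities(front_m, rear_m, base_front_m, base_rear_m, spacing_m):
    """Scan synchronized front/rear distance streams and return velocities in km/h."""
    velocities, entry_tick, armed = [], None, True
    for tick, (df, dr) in enumerate(zip(front_m, rear_m)):
        front_hit = df < base_front_m - THRESHOLD_M
        rear_hit = dr < base_rear_m - THRESHOLD_M
        if not armed:
            armed = not front_hit and not rear_hit  # wait for both beams to clear
        elif entry_tick is None and front_hit:
            entry_tick = tick                        # vehicle crossed the entry point
        elif entry_tick is not None and rear_hit:
            t = (tick - entry_tick) * TICK_S         # transit time between the points
            velocities.append(3.6 * spacing_m / t)
            entry_tick, armed = None, False          # re-arm after the beams clear
    return velocities

# Synthetic vehicle: the front beam drops at tick 10 and the rear beam at
# tick 415, i.e. 405 ticks = 0.324 s over a 5.23 m spacing -> ~58.1 km/h.
front = [30.0] * 10 + [28.5] * 500
rear = [30.0] * 415 + [28.5] * 95
print(estimate_velocities(front, rear, 30.0, 30.0, 5.23))
```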

3. Experiment of Vehicle Detection

This chapter describes an experiment to detect driving vehicles on the road using the drone. The drone has six VDM sensor units to detect driving vehicles from the first to the third lane of the road, but the experiment is conducted on the first and second lanes only: no suitable test road with three full lanes was available, and testing on urban roads is not safe.
Figure 9a shows the octa-quad vehicle detection drone developed in this paper. The detection sensor module part is mounted on the gimbal at the bottom of the drone. Figure 9b shows the drone vehicle detection part of the detection sensor module unit; a green indicator is displayed to confirm the radiation position of each LiDAR sensor. The drone is equipped with a total of six LiDAR sensors to detect vehicle information from lane 1 to lane 3. The radiating point's position can be checked by the green indicator spot inside the red circle. The VDM angle can be controlled so that the LiDAR sensors accurately radiate the measuring points on the road regardless of the drone's flying height.
Figure 10 shows the test environment on the road for vehicle detection. The drone hovers in the air above the outside of the road, and the detection module sensor unit radiates the position of the entry/exit measurement points for each lane. The indicator is used to check the location of each LiDAR sensor's entry/exit measurement point. Figure 11 shows the experiment for measuring vehicle velocity in the lanes. The drone radiates the entry/exit measurement points with the LiDAR sensors to measure vehicle velocity in two lanes and acquires the vehicle's velocity when it passes through the measurement points. A van is driven as the test vehicle in the first lane and a small car in the second lane.
Figure 11a,c show the velocity measurements of the vehicle driving in the first lane using the speed gun and the drone, and Figure 11b,d show those of the vehicle driving in the second lane. In Figure 11a, the speed gun and the drone measure the driving vehicle at 58.28 km/h and 58.70 km/h, respectively. In Figure 11b, the speed gun measures 54.83 km/h and the drone 55.49 km/h. Figure 11c shows 60.99 km/h and 63.15 km/h for the speed gun and the drone, respectively, and in Figure 11d the speed gun measures 53.14 km/h and the drone 56.34 km/h.
Table 1 shows the velocities of the vehicles driving in the first and second lanes using the drone and the speed gun. The average errors between the speed gun and the drone in the first and second lanes are 0.58 km/h and 1.06 km/h, respectively. The root mean square errors (RMSEs) between the speed gun and the drone are 0.75 km/h and 1.31 km/h in the first and second lanes, respectively, and the average error probabilities are 1.2% and 2.05%.
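For reference, the sketch below shows how such error statistics are conventionally computed from paired measurements. Whether the paper uses signed or absolute differences for the averages is not stated, so absolute values are assumed here.

```python
# A hedged sketch of the error statistics reported in Table 1: mean absolute
# error, RMSE and mean percentage error of the drone against the speed gun.
import math

def error_stats(speed_gun_kmh, drone_kmh):
    diffs = [d - g for g, d in zip(speed_gun_kmh, drone_kmh)]
    mean_abs = sum(abs(e) for e in diffs) / len(diffs)
    rmse = math.sqrt(sum(e * e for e in diffs) / len(diffs))
    mean_pct = 100 * sum(abs(e) / g for e, g in zip(diffs, speed_gun_kmh)) / len(diffs)
    return mean_abs, rmse, mean_pct

# Example with the first four first-lane rows of Table 1.
gun = [49.49, 52.57, 52.32, 43.25]
drone = [50.17, 52.32, 52.12, 43.63]
print(error_stats(gun, drone))  # ~ (0.38, 0.42, 0.78)
```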
Figure 12 shows the wind resistance test of the developed drone's ability to detect driving vehicles. The wind speed is varied by changing the distance between the fan and the drone, and the drone's wind resistance is measured at each wind speed. Figure 12a shows the wind resistance of the drone at a wind speed of 6.6 m/s with the fan 8 m from the drone; Figure 12b at 7.7 m/s with the fan at 6 m; Figure 12c at 8.7 m/s with the fan at 4 m; Figure 12d at 10.1 m/s with the fan at 2 m; and Figure 12e at 12.1 m/s with the fan at 1 m.
This study detects the driving vehicle using the LiDAR sensor module mounted on the drone. It is very dangerous to test the entire system on a general road without ensuring complete safety for the drone flight, so the experiment was conducted not on a city road but on an outlying road with no other vehicles driving. Since the drone vehicle detection experiment was conducted in a place without obstacles such as streetlights and electric wires, exceptional situations that occur on general city roads could not be fully considered. The vehicle velocity measurement using the drone was conducted 25 times. In the experiment, the average velocity error and the total vehicle detection rate were 1.6% and 100%, respectively. Although the wind resistance experiment could not be conducted in an actual vehicle driving environment, the drone demonstrated wind resistance up to a wind speed of 12.1 m/s in an artificially configured environment.

4. Conclusions

In this paper, the driving vehicle was detected using the drone and the vehicle velocity was calculated from the detection information. The driving vehicle velocity was estimated from the detection distance and passing time obtained by the drone LiDAR sensors at the two measurement points on the road. The drone hovered above the outside of the road and detected the vehicles driving in the first and second lanes from that position. The velocity estimates of the developed drone were compared with those of the speed gun, the existing equipment. The RMSEs between the speed gun and the drone were 0.75 km/h in the first lane and 1.31 km/h in the second lane; the drone's velocity measurement performance was thus similar to that of the existing equipment.
Existing vehicle velocity measuring equipment requires one measuring device per lane, whereas the developed drone system can measure vehicles in all lanes with a single drone. Drone velocity measurement reduces the risk of roadside traffic accidents and removes the need to install a gantry structure. In addition, vehicle information can be obtained in a dark environment. The vehicle velocity can be measured at wind speeds of up to 12.1 m/s. The drone's total weight is about 12.7 kg and its hovering range is within 1 m.
The drone system developed in this research and development (R&D) effort detects driving vehicles in one to three lanes, but no test road environment with one to three lanes was available, so the driving vehicle velocity measurement was tested in the first and second lanes. Testing the driving vehicle's velocity in an urban road environment with up to three lanes was impossible because drone flight regulations and laws restrict it. In addition, South Korea limits the area and time of drone flights due to the situation with neighboring countries. Since the driving vehicle velocity could not be tested in various road topologies and night environments, irregular situations could not be fully considered.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are available in a publicly accessible repository.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Wang, J. How to use the C/A code of GPS to detect the exospheric passive radar target. In Proceedings of the 7th International Conference on Signal Processing, Beijing, China, 31 August–4 September 2004; pp. 2194–2197. [Google Scholar]
  2. Yan, J.; Liu, H.; Jiu, B.; Liu, Z.; Bao, Z. Joint detection and Tracking processing algorithm for target tracking in multiple radar system. IEEE Sens. 2015, 15, 6534–6541. [Google Scholar] [CrossRef]
  3. Zhang, W.; Li, H.; Sun, G.; He, Z. Enhanced detection of Doppler-spread target for FMCW radar. IEEE Trans. Aerosp. Electron. Syst. 2019, 55, 2066–2078. [Google Scholar] [CrossRef]
  4. Dempsey, T.P.; Brooker, M.C. An SPW computer simulation analysis of an RF signal detection system. In Proceedings of the Tactical Communications Conference, Fort Wayne, IN, USA, 30 April–2 May 1996; pp. 239–301. [Google Scholar]
  5. Prasad, K.N.R.S.V.; D’souza, K.B.; Bhargava, V.K. A downscaled faster-RCNN Framework for signal detection and time frequency localization in wideband RF systems. IEEE Trans. Wirel. Commun. 2020, 19, 4847–4862. [Google Scholar] [CrossRef]
  6. Zhangjing, W.; Xianhan, M.; Zhen, H.; Haoran, L. Research of target detection and classification techniques using millimeter wave radar and vision sensors. Remote Sens. 2021, 13, 1064. [Google Scholar]
  7. Liu, H.; Pi, W.; Zha, H. Motion detection for multiple moving targets by using an omnidirectional camera. In Proceedings of the International Conference of Robotics, Intelligent System and Signal Processing, Changsha, China, 8–13 October 2003; pp. 422–426. [Google Scholar]
  8. Escobar Villanueva, J.R.; Iglesias Martinez, L.; Perez Montiel, J.I. DEM generation from fixed-wing UAV imaging and LiDAR-derived ground control points for flood estimations. Sensors 2019, 19, 3205. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  9. Premebida, C.; Ludwig, O.; Nunes, U. LIDAR and vision-based pedestrian detection system. J. Field Robot. 2009, 26, 696–711. [Google Scholar] [CrossRef]
  10. Kim, B.H.; Khan, D.; Bohak, C.; Kim, J.K.; Choi, W.; Lee, H.J.; Kim, M.Y. Lidar data generation fused with virtual targets and visualization for small drone detection system. In Proceedings of the Technologies for Optical Countermeasures XV International Society for Optics and Photonics, Berlin, Germany, 10–13 September 2018; pp. 284–293. [Google Scholar]
  11. Mirčeta, K.; Bohak, C.; Kim, B.H.; Kim, M.Y.; Marolt, M. Drone segmentation and tracking in grounded sensor scanned LiDAR datasets. In Proceedings of the Zbornik Sedemindvajsete Mednarodne Elektrotehniške in Računalniške Conference, Portorož, Slovenija, 17–18 September 2018; pp. 384–387. [Google Scholar]
  12. Giuseppina, P.; Salvatore, C.; Alessandro, D.; Alessandro, S. Decision tree method to analyze the performance of lane support systems. Sustainability 2021, 13, 846. [Google Scholar]
  13. Marek, K.; Mateusz, P.; Bartosz, P.; Krzysztof, W. Autonomous, onboard vision-based trash and litter detection in low altitude aerial images collected by an unmanned aerial vehicle. Remote Sens. 2021, 13, 965. [Google Scholar]
  14. Hyeon, K.; Jaechan, C.; Yongchul, J.; Seongjoo, L.; Yunho, J. Area-efficient vision-based feature tracker for autonomous hovering of unmanned aerial vehicle. Electronics 2020, 9, 1591. [Google Scholar]
  15. Muhammad, H.C.; Anuar, A.; Qudsia, G.; Muhammad, S.F.; Himan, S.; Nadhir, A.I. Assessment of DSM based on radiometric transformation of UAV data. Sensors 2021, 21, 1649. [Google Scholar]
  16. Ruiqian, Z.; Zhenfeng, S.; Xial, H.; Jiaming, W.; Deren, L. Object detection in UAV images via global density fused convolution Network. Remote Sens. 2020, 12, 3140. [Google Scholar]
  17. Victor, G.J.; Andreas, F.; Jorg, B. AGB estimation in a tropical mountain forest (TMF) by means of RGB and multispectral images using an unmanned aerial vehicle(UAV). Remote Sens. 2019, 11, 1413. [Google Scholar]
  18. Srinivasa, N.R.; Alekhya, E.; Niyzauddin Suleman, M.D.; Sai Nikhil, K. Autonomous obstacle avoidance vehicle using LIDAR and embedded system. IJRASET 2020, 8, 25–31. [Google Scholar]
  19. Razvan, C.M.; Ciprian, D.; Florin, A.; Florin, S.; Ioan, S. Laser and LIDAR in a system for visibility distance estimation in fog conditions. Sensors 2020, 20, 6322. [Google Scholar]
  20. Charles, B. Vehicle speed estimation from two images for LIDAR second assessment. In Proceedings of the International Conference on Computer Vision Theory and Application, Rome, Italy, 24–26 February 2012; pp. 381–386. [Google Scholar]
  21. Takashi, O.; Hiroshi, S.; Yasuhiro, S.; Kiyokazu, T.; Katsuhiro, M. Pedestrian detection and tracking using in-vehicle Lidar for automotive application. In Proceedings of the IEEE Intelligent Vehicles Symposium, Baden-Baden, Germany, 5–9 June 2011; pp. 734–739. [Google Scholar]
  22. Fernando, G.; Pietro, C.; Alberto, B. Vehicle Detection Based on Laser Radar. LNCS 2009, 5717, 391–397. [Google Scholar]
  23. Donho, N.; Seokwon, Y. Moving vehicle detection and drone velocity estimation with a moving drone. IJFIS Int. J. Fuzzy Log. Intell. Syst. 2020, 20, 43–51. [Google Scholar]
  24. Wu, J.; Xu, H.; Tian, Y.; Pi, R.; Yue, R. Vehicle detection under adverse weather from roadside LiDAR data. Sensors 2020, 20, 3433. [Google Scholar] [CrossRef]
  25. Cao, S.; Yu, Y.; Guan, H.; Peng, D.; Yan, W. Affine-Function Transformation-Based Object Matching for Vehicle Detection from Unmanned Aerial Vehicle Imagery. Remote Sens. 2019, 11, 1708. [Google Scholar] [CrossRef] [Green Version]
  26. Lee, K.H. Improvement in target range estimation and the range resolution using drone. Electronics 2020, 9, 1136. [Google Scholar] [CrossRef]
Figure 1. Vehicle velocity measurement with existing equipment.
Figure 2. Vehicle information acquisition system: LSF—LiDAR sensor front; LSR—LiDAR sensor rear; VDM—Vehicle detection module; DIAM—Detection information acquisition module; DMCA—Detection module control analysis; DIAA—Detection information acquisition analysis.
Figure 3. Vehicle information detection system.
Figure 4. Vehicle detection module (VDM) data flow diagram.
Figure 5. The VDM of the vehicle detection module part: (a) VDM hardware (H/W) artwork; (b) developed VDM.
Figure 6. The detection information acquisition module (DIAM) of the vehicle detection module part: (a) DIAM H/W artwork; (b) developed DIAM.
Figure 7. Vehicle detection on a lane.
Figure 8. Vehicle measurement time flow chart.
Figure 9. The developed drone for vehicle detection in a lane: (a) drone with detection sensor part; (b) drone VDM indicator shown in the red circle.
Figure 10. Vehicle detection experiment method on the first and second lanes.
Figure 11. Vehicle speed experiment method on the first and second lanes: (a) speed gun 58.28 km/h and drone 58.70 km/h on the first lane; (b) speed gun 54.83 km/h and drone 55.49 km/h on the second lane; (c) speed gun 60.99 km/h and drone 63.15 km/h on the first lane; (d) speed gun 53.14 km/h and drone 56.34 km/h on the second lane.
Figure 12. The wind resistance of the developed drone: (a) wind speed of 6.6 m/s at a distance of 8 m from the fan to the drone; (b) wind speed 7.7 m/s at a distance of 6 m from the fan to the drone; (c) wind speed 8.7 m/s at a distance of 4 m from the fan to the drone; (d) wind speed 10.1 m/s at a distance of 2 m from the fan to the drone; (e) wind speed 12.1 m/s at a distance of 1 m from the fan to the drone.
Table 1. Comparison of the vehicle measurements for the speed gun and the drone on the multi-lane.

| No | First Lane Speed Gun (km/h) | First Lane Drone (km/h) | Difference (km/h) | Difference Rate (%) | Second Lane Speed Gun (km/h) | Second Lane Drone (km/h) | Difference (km/h) | Difference Rate (%) |
|----|----|----|----|----|----|----|----|----|
| 1 | 49.49 | 50.17 | 0.68 | 1.37 | 45.48 | 42.97 | −2.51 | 5.52 |
| 2 | 52.57 | 52.32 | −0.25 | 0.48 | 48.53 | 49.63 | 1.1 | 2.27 |
| 3 | 52.32 | 52.12 | −0.20 | 0.38 | 45.44 | 46.25 | 0.81 | 1.78 |
| 4 | 43.25 | 43.63 | 0.38 | 0.88 | 46.58 | 46.37 | −0.21 | 0.45 |
| 5 | 33.00 | 34.8 | 1.8 | 5.45 | 33.28 | 33.70 | 0.42 | 1.26 |
| 6 | 47.61 | 48.09 | 0.48 | 1.01 | 49.13 | 50.11 | 0.98 | 1.99 |
| 7 | 60.99 | 63.15 | 2.16 | 3.54 | 53.14 | 56.34 | 3.2 | 6.02 |
| 8 | 56.63 | 56.99 | 0.36 | 0.64 | 47.28 | 48.53 | 1.25 | 2.64 |
| 9 | 54.83 | 55.49 | 0.66 | 1.20 | 47.15 | 47.42 | 0.27 | 0.57 |
| 10 | 55.52 | 56.45 | 0.93 | 1.68 | 58.28 | 58.70 | 0.42 | 0.72 |
| 11 | 28.02 | 28.24 | 0.22 | 0.79 | 26.43 | 27.76 | 1.33 | 5.03 |
| 12 | 44.80 | 44.48 | −0.32 | 0.71 | 45.10 | 47.05 | 1.95 | 4.32 |
| 13 | 57.75 | 57.21 | −0.54 | 0.94 | 49.21 | 49.54 | 0.33 | 0.67 |
| 14 | 54.22 | 54.11 | −0.11 | 0.20 | 45.05 | 45.27 | 0.22 | 0.49 |
| 15 | 58.32 | 59.53 | 1.21 | 2.07 | 46.90 | 45.24 | −1.66 | 3.54 |
| 16 | 68.53 | 68.68 | 0.13 | 0.24 | 68.87 | 70.97 | 2.1 | 3.98 |
| 17 | 71.64 | 72.32 | 0.68 | 1.53 | 79.34 | 80.00 | 0.66 | 1.02 |
| 18 | 75.06 | 74.81 | −0.25 | 0.52 | 87.37 | 86.47 | −0.90 | 1.38 |
| 19 | 80.13 | 79.93 | −0.20 | 0.41 | 92.65 | 93.03 | 0.38 | 0.54 |
| 20 | 84.68 | 85.06 | 0.38 | 0.86 | 100.89 | 100.00 | −0.89 | 1.24 |
| 21 | 90.35 | 91.01 | 0.66 | 0.11 | 112.38 | 113.06 | 0.68 | 0.91 |
| 22 | 98.53 | 99.00 | 0.47 | 0.68 | 125.23 | 125.98 | 0.75 | 0.99 |
| 23 | 113.58 | 113.98 | 0.40 | 0.54 | 138.85 | 139.02 | 0.17 | 0.22 |
| 24 | 127.06 | 127.39 | 0.33 | 1.70 | 150.17 | 148.57 | −1.60 | 1.50 |
| 25 | 136.85 | 138.57 | 1.72 | 2.11 | 161.35 | 159.68 | −1.67 | 2.12 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.


