Review

YOLO-Based UAV Technology: A Review of the Research and Its Applications

1 College of Information and Electrical Engineering, Shenyang Agricultural University, Shenyang 110866, China
2 College of Electronic Engineering and Artificial Intelligence, South China Agricultural University, Guangzhou 510642, China
* Authors to whom correspondence should be addressed.
Drones 2023, 7(3), 190; https://doi.org/10.3390/drones7030190
Submission received: 17 February 2023 / Revised: 4 March 2023 / Accepted: 7 March 2023 / Published: 10 March 2023

Abstract

In recent decades, scientific and technological development has continued to accelerate, with researchers focusing not only on the innovation of single technologies but also on the cross-fertilization of multidisciplinary technologies. Unmanned aerial vehicle (UAV) technology has seen great progress in many aspects, such as geometric structure, flight characteristics, and navigation control. The You Only Look Once (YOLO) algorithm has been developed and refined over the years to provide satisfactory performance for the real-time detection and classification of multiple targets. With technology cross-fusion becoming a new focus, researchers have proposed YOLO-based UAV technology (YBUT) by integrating the two technologies. This integration strengthens the application of emerging technologies and broadens the development directions of both YOLO algorithms and drone technology. This paper therefore presents the development history of YBUT, together with reviews of its practical applications in engineering, transportation, agriculture, automation, and other fields. The aim is to help new users quickly understand YBUT and to help researchers, consumers, and stakeholders quickly grasp the research progress of the technology. The future of YBUT is also discussed to help explore the application of this technology in new areas.

1. Introduction

As science and technology develop, new ways of living and working emerge, and new technologies with higher value gradually replace old ones. The value of a new technology lies not only in its innovation and development but in whether it is effective in improving productivity and contributing to human society, i.e., whether applying it changes traditional ways of addressing existing problems and thereby improves social productivity. Today, interdisciplinary or multifield cooperation is a popular topic: mature technologies from multiple fields are combined into a new technology that retains the advantages of each to compensate for the disadvantages of the others. Integrating existing technologies across fields can not only quickly generate new methods and ideas for addressing existing problems but can also greatly reduce resource consumption. Currently, unmanned aerial vehicles (UAVs), or aerial robots, are in a period of rapid development [1], and target detection based on the You Only Look Once (YOLO) algorithm [2] has reached a high level in industry, although the algorithm still needs to be modified and improved [3]. UAVs can carry a variety of devices to accomplish different tasks, such as spraying pesticide solution [4], mapping [5], logistics transportation [6], disaster management [7], aerial photography [8], and sowing fertilizer or seeds [9]. Object detection based on the YOLO algorithm has achieved human behavior analysis [10], face mask recognition [11], medical diagnosis analysis [12], autonomous driving [13], traffic assessment [14], multitarget tracking [15], and robot vision [16]. However, UAVs face complex scenarios and must maintain good data communication with ground control terminals, so the innovation and development of UAV technology may be limited by certain application environments. Moreover, object detection based on the YOLO algorithm must be deployed on high-performance processors and used in conjunction with image or video data, which places certain requirements on its usage scenarios. These two technologies can be combined to create a new technology: YOLO-based UAV technology (YBUT). UAVs provide more application scenarios for the YOLO algorithm, and the YOLO algorithm can assist UAVs in completing more novel tasks. In this way, drone technology and the YOLO algorithm can further facilitate people’s daily lives while contributing to the productivity of their respective industries.
A UAV is often defined as an unmanned flying device that can either fly autonomously according to a course or program established within its system or be manually controlled by an operator. UAVs can be classified into various types depending on different parameters. In recent years, as a hot spot in the new round of the global scientific, technological, and industrial revolution, UAVs have been able to take over most of the tasks that used to be completed by manned aircraft. At the same time, as UAV technology continues to mature, the number of UAVs worldwide is increasing every year; according to global commercial drone annual sales statistics [17], illustrated in Figure 1, there will be approximately 2,679,000 UAVs in the world by 2025, with a market size worth approximately USD 5.3 billion. With such a large number of UAVs worldwide, it may be possible to make them even more valuable by using them as aerial platforms on which to deploy YOLO algorithms.
YOLO is a widely used deep learning algorithm because it treats object detection as a single regression problem, which gives the algorithm its core strengths: a very simple structure, a small model size, and fast computation. In the seven years since the introduction of YOLO (as of February 2023), researchers have released seven versions of the algorithm [18,19,20,21,22]. After the YOLO algorithm was popularized, researchers and users improved it for various applications thanks to its openness and ease of secondary development, and various derivatives were introduced, such as YOLODrone [23], YOLOv4_Drone [24], ViT-YOLO [25], YOLO-RTUAV [26], YOLO-Neck [27], and YOLOv7-DeepSORT [28]. The mechanism of the YOLO-based object detection algorithm is as follows: the input image is resized to a fixed size and divided into S × S grid cells of equal size, and each grid cell detects objects within it. If the center of a target falls into a grid cell, that cell is responsible for predicting the target. Each grid cell may predict N detection boxes, each of which not only regresses its own position but also outputs a confidence score representing the likelihood that a target is present in the predicted box. As there may be multiple boxes in a grid cell, YOLO automatically selects the highest-scoring target category for prediction (see Figure 2).
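For illustration, the sketch below (Python) decodes a YOLO-style S × S grid of predictions into scored boxes; the tensor layout, the number of boxes N per cell, and the confidence threshold are illustrative assumptions rather than the exact format of any particular YOLO release.

```python
import numpy as np

def decode_yolo_grid(pred, conf_thresh=0.25):
    """Decode a YOLO-style prediction tensor into scored boxes.

    pred: array of shape (S, S, N, 5 + C) -- for each of the S x S grid
    cells, N boxes with (x, y, w, h, objectness) plus C class scores.
    This layout is an illustrative assumption.
    """
    S = pred.shape[0]
    detections = []
    for row in range(S):
        for col in range(S):
            for box in pred[row, col]:
                x, y, w, h, obj = box[:5]
                class_scores = box[5:]
                cls = int(np.argmax(class_scores))   # highest-scoring category wins
                score = obj * class_scores[cls]      # confidence = objectness * class prob
                if score < conf_thresh:
                    continue
                # (x, y) are offsets within the responsible cell; map to image-relative coords
                cx, cy = (col + x) / S, (row + y) / S
                detections.append((cx, cy, w, h, cls, float(score)))
    return detections
```

In practice, the surviving boxes are then passed through non-maximum suppression so that each object is reported only once.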
When UAVs are used for operations in various industries and an operation involves target identification and positioning, manual operation is often required to confirm the target location and operation location through real-time images transmitted by the UAV camera, as with transport UAVs performing precise parcel delivery. The process is tedious and cumbersome and requires a high level of UAV handling skill, which can easily lead to accidents and financial losses. Although YOLO-based target detection technology emerged relatively recently, it has been continuously improved by researchers and already achieves high accuracy in object detection and image recognition; its detection speed and accuracy have reached the forefront of the industry, and it plays an important role in target recognition in UAV aerial images [29]. Khang et al. [30] conducted experiments on the VisDrone2019 dataset, containing 96 videos and 39,988 annotated frames, and evaluated deep learning detectors, including Faster R-CNN, RFCN, SNIPER, YOLO, RetinaNet, and CenterNet, with FPS and mAP as evaluation metrics. Ammar et al. [31] evaluated the performance of convolutional neural network models, such as Faster R-CNN, YOLOv3, YOLOv4, and EfficientDet, using IoU, precision, recall, F1, AP, and mAP as evaluation metrics. The experimental results showed that YOLO is ideal for real-time target detection applications. If a UAV is equipped with an embedded processor deploying the YOLO algorithm, object detection and recognition can be performed on real-time footage from the UAV camera, turning the two steps of UAV acquisition and computer detection into simultaneous acquisition and detection, which greatly saves operational time and improves operational efficiency. Improving the level of autonomous target recognition by drones can strongly promote automated or unmanned drone operation in most industries.
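For reference, the core metrics used in these comparisons can be computed as in the following sketch; the corner-coordinate box format (x1, y1, x2, y2) is an assumption for illustration.

```python
def iou(a, b):
    """Intersection over union of two boxes in (x1, y1, x2, y2) format."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1]) +
             (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / (union + 1e-9)

def precision_recall_f1(tp, fp, fn):
    """True/false positive and false negative counts -> precision, recall,
    and F1 (their harmonic mean). AP averages precision over recall levels,
    and mAP averages AP over classes."""
    precision = tp / (tp + fp + 1e-9)
    recall = tp / (tp + fn + 1e-9)
    f1 = 2 * precision * recall / (precision + recall + 1e-9)
    return precision, recall, f1
```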
Over the past few years, YBUT has become a popular research area of interest, but the application scenarios and impact of the technology have yet to be enhanced; a summary overview of the recent state of the application in this technology area is lacking. This paper, therefore, presents the history of YBUT and provides an overview of case studies of the application of YBUT in several industry sectors, intending to provide researchers, beginners, and consumers with a better understanding of the field, as well as a reference for the research and application of the technology in new areas. Here, we also discuss the direction of the technology and provide an outlook on its application.

2. Survey Methodology

In this section, we explain the methodology and rationale behind the selection of the papers studied and the main areas of application of YBUT. To screen the literature efficiently for papers within the scope of this overview, a clear and simple screening process was defined for the published literature; its methodology is explained below, together with an analysis of the main research directions of interest in international and Chinese journals.

2.1. Screening Methods for Research Papers Related to YBUT

To search for high-impact research on aerial robots or UAVs that use deep learning YOLO models or algorithms, keywords were collected from top journals and conferences indexed in databases including the Web of Science Core Collection, the KCI (Korean Journal Database), MEDLINE®, the SciELO Citation Index, and the China National Knowledge Infrastructure. The collected keywords were grouped into groups A1, A2, A3, and A4 and searched in various search engines; the results were then filtered for the next step. The keyword groupings used and the detailed search method for the articles are shown in Figure 3.
Taking the Web of Science search engine as an example, searching for the above keywords yielded 243 articles in 41 research fields as of February 2023, covering publication types such as articles, conference papers, reviews, and letters, with the specific subject directions shown in Table 1.
Based on the applicability of the filtered articles, we excluded several publication types, such as reviews and letters. The articles were also rigorously reviewed for content to remove those that did not contribute to the topic of this review, with a focus on checking the image data within each article and the dataset used. At the same time, the articles were examined for algorithmic improvements and innovations, and those that were progressive or identified as implementable for the development of the relevant industry were selected. Finally, the introduction, discussion, summary, and outlook of the articles remaining after the screening process were checked and categorized. Each step of the methodology used in the screening process is shown in Figure 4.

2.2. Research Topics Utilizing YBUT

Through the method described above, the search results were analyzed by using both English and Chinese search engines, such as Web of Science and China National Knowledge Infrastructure, to obtain the main research themes of English and Chinese journals in the relevant fields. Computer vision technology has developed a great variety of algorithms to date, among which the YOLO algorithm was proposed in 2016 and then first applied in 2017 by Jiang et al. [32], who combined the YOLO algorithm with UAVs. Since then, the YOLO algorithm and UAV fusion technology have been continuously developed, and there has been a surge in related research results or applications. The technology has also moved from an exploratory experiment to an academic research hotspot (see Figure 5).
Based on our survey of YBUT application areas, the information for popular topics in this field in English journals is summarized in a pie chart, as shown in the survey results in Figure 6. As seen from the pie chart, the popular topics are mainly in the industries of technical studies, engineering, and transportation, and the number of published papers or conference literature represents the interest of researchers. We also surveyed Chinese journals on popular topics in this area and found that they focus more on the technical studies, engineering, and automation sectors. As UAV technology and YOLO algorithms continue to evolve, this technology is beginning to be explored in most areas, and in a few areas, there have been some successes. The development and research of YBUT have been hot topics in top journals and conferences, and now the practical application of the technology is gradually attracting their interest.

3. YBUT Development

YBUT could not advance without the support of high-performance computer processors. UAVs have moved from being operated manually by remote control to being controlled automatically by computers, and image recognition has moved from being run on computer systems to being run on onboard embedded systems for real-time detection and recognition. Each of these technological advances has taken the application of the technology to a new level in some areas.

3.1. Early Development of YBUT

At the beginning of research on YBUT, the technology was proposed as a fusion of UAV technology and YOLO algorithms in the context of a trend towards cross-disciplinary development. UAV technology research began in the 1920s and has since achieved successful applications in agriculture, surveillance, monitoring, traffic construction, transportation, system inspection, and other fields. The YOLO algorithm was proposed in 2016 [2] and, after several improvements, has reached the forefront of the object recognition field in terms of detection speed, detection accuracy, and recognition classification.
The application of YBUT in real production operations started in 2017. In the early stages, the main working mechanism was image or video data acquisition by UAVs, followed by object detection, identification, and classification by computers running YOLO-based object detection algorithms. To explore methods of detecting vehicles from UAV-captured images for application in traffic monitoring and management, and because deep learning algorithms had shown significant advantages in target detection, researchers tried applying YOLO-based object detection algorithms to vehicle detection in UAV images. Jiang et al. [32] constructed a multisource data acquisition system by integrating a thermal infrared imaging sensor and a visible-light imaging sensor on a UAV, corrected and aligned the images through feature point extraction and homography matrix methods, and then performed image fusion on the multisource data. Finally, they utilized the deep learning YOLO algorithm for data training and vehicle detection (see Figure 7). The experimental results showed that including a thermal infrared image dataset could improve the accuracy of vehicle detection and verified that the YOLO framework is an advanced and effective framework for real-time target detection. This first attempt by Jiang et al. [32] to combine the YOLO algorithm with UAV technology demonstrated the usability of YOLO-based UAV technology. Although the detection performance of the early YOLOv1 algorithm was not very good, the experimental results were relatively satisfactory for a first exploration of the technology, made innovative by the incorporation of thermal infrared image data.
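A minimal sketch of such an alignment step is given below, assuming OpenCV is available and substituting ORB features for whichever feature extractor the authors actually used; it illustrates the general feature-point-plus-homography approach rather than reproducing their pipeline.

```python
import cv2
import numpy as np

def align_thermal_to_visible(thermal, visible):
    """Warp a thermal image onto a visible image via feature matching
    and a homography, in the spirit of the multisource pipeline above."""
    orb = cv2.ORB_create(2000)
    kp_t, des_t = orb.detectAndCompute(thermal, None)
    kp_v, des_v = orb.detectAndCompute(visible, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_t, des_v), key=lambda m: m.distance)[:200]
    src = np.float32([kp_t[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_v[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # robust to mismatches
    h, w = visible.shape[:2]
    return cv2.warpPerspective(thermal, H, (w, h))
```

The RANSAC step makes the homography estimate robust to the cross-spectral mismatches that are inevitable when matching thermal and visible imagery.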
Building on this study, Xu et al. [33] proposed an improved algorithm for small vehicle detection based on YOLOv2, whose detection structure model is shown in Figure 8. Compared with the YOLOv2 model structure, the algorithm adds an additional feature layer whose size reaches 1/32 of the input image, making the algorithm more adept at detecting small targets and giving it higher localization accuracy than YOLOv2. This research greatly contributed to researchers’ understanding of YBUT and inspired targeted improvements to the YOLO algorithm structure in subsequent applications in this field. Since then, Ruan et al. [34] and Yang et al. [35] have further explored the application of YBUT in other fields.
In addition, Ruan et al. [34] used a deep-learning- and vision-based drogue detection and localization method to address the accurate detection and localization of the refueling drogue for autonomous aerial refueling of UAVs in complex environments. They used a trained YOLO algorithm for drogue detection, a least-squares ellipse fit to determine the semi-major axis of the ellipse after determining the fiducial location, and, finally, a monocular vision camera for drogue localization (see Figure 9). The simulation results show that the method can not only correctly identify drogues in complex environments but also accurately locate them at ranges of 2.5–45 m, indicating that the YOLO method performs well for target detection and localization in various complex environments. Yang et al. [35] investigated a method for real-time pedestrian detection and tracking on a mobile platform under multiple disturbances; they used a UAV hovering in the air for data acquisition of specific targets, while a ground station deployed with YOLOv2 received the video stream from the UAV for analysis and detection. The results of outdoor pedestrian detection experiments showed the robustness of the method under varying brightness and continual pedestrian interference, demonstrating a stable approach to tracking specific pedestrians from UAV platforms. Most of these early studies explored simple applications of the fusion of the two technologies, owing to the immaturity of the fused technology, but the information gained is of considerable reference value for subsequent YBUT research. An increasing number of researchers are focusing on and exploring the field of YOLO-based UAVs, continuing to drive progress in the field, and a new generation of YBUT has emerged as the performance of high-performance computer processors has increased while the size of the hardware has decreased.
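As a rough illustration of the monocular localization step, under a pinhole-camera assumption the distance to a circular drogue of known radius R can be recovered from the semi-major axis r of the fitted ellipse, since r ≈ fR/Z; the sketch below uses OpenCV’s least-squares ellipse fit, with purely hypothetical parameter values.

```python
import cv2
import numpy as np

def drogue_distance(edge_points, drogue_radius_m, focal_px):
    """Estimate the distance to a circular drogue from its elliptical image.

    Pinhole sketch: a circle of radius R imaged at distance Z with focal
    length f (pixels) yields a semi-major axis r ~= f * R / Z, so
    Z ~= f * R / r.
    """
    (_cx, _cy), (d1, d2), _angle = cv2.fitEllipse(edge_points)  # least-squares fit
    semi_major_px = max(d1, d2) / 2.0
    return focal_px * drogue_radius_m / semi_major_px

# Hypothetical usage with synthetic edge points on an ellipse (semi-major axis 40 px):
t = np.linspace(0, 2 * np.pi, 36)
pts = np.stack([150 + 40 * np.cos(t), 120 + 25 * np.sin(t)], axis=1).astype(np.float32)
print(drogue_distance(pts, drogue_radius_m=0.3, focal_px=800.0))  # -> 6.0 m
```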

3.2. YBUT Develops by Leaps and Bounds

As YBUT continues to evolve, a new generation has emerged in which the UAV is equipped with a high-performance processor rich in computing resources, on which YOLO-based object detection algorithms are deployed, allowing the processor to detect, identify, and classify mission objects in real time as the UAV collects data. To explore the feasibility of this new generation of technology, Zhang et al. [36] embedded the YOLOv3 algorithm into the resource-limited NVIDIA Jetson TX1 platform environment (see Figure 10) and had a UAV carry the embedded platform for real-time pedestrian detection experiments. The experimental results demonstrated the feasibility of implementing YOLO-based real-time target detection on a resource-limited mobile platform and provided a reference for the development of next-generation YBUT. To alleviate the computational pressure on the UAV’s onboard embedded processor and to enhance the practicality of YBUT, Alam et al. [37] proposed a cost-effective aerial surveillance system that keeps the lightweight Tiny-YOLO computation on the onboard embedded Movidius VPU, shifts the heavier Tiny-YOLO computational tasks to the cloud, and maintains minimal communication between the UAV and the cloud. Experimental results showed that the system is six times faster in target detection processing, in frames per second, than other state-of-the-art approaches, while the onboard embedded processing reduces end-to-end latency and network resource consumption (see Figure 11). Similar research was conducted by Dimithe et al. [38]. The new generation of YBUT brings the YOLO algorithm and drone technology closer together. Although it does not show higher performance than previous approaches, due to the limited computational resources of the onboard embedded processor, its advantages were demonstrated with practical results by Zhang et al. [36]. This is sufficient to show that the future development of YBUT will tend towards a high degree of integration of YOLO algorithms with UAV technology.
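As a minimal sketch of the onboard-inference pattern discussed here, the snippet below runs a Darknet YOLOv3 model on a live camera stream with OpenCV’s DNN module; the file paths are placeholders, and the CUDA backend lines assume an OpenCV build with CUDA support, as on a Jetson-class board.

```python
import cv2

# Placeholder paths to standard Darknet config/weights files
net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
# On a Jetson-class board, the CUDA backend cuts latency if OpenCV was built with it
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_CUDA)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_CUDA)

cap = cv2.VideoCapture(0)  # onboard camera stream
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break  # stream ended or camera failed
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(net.getUnconnectedOutLayersNames())
    # ... decode outputs and apply cv2.dnn.NMSBoxes as sketched in Section 1 ...
```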
Building on this research, Cao et al. [39] proposed a target detection and tracking method based on the YOLO algorithm and a PID algorithm, using the new-generation high-performance embedded processor NVIDIA Jetson TX2 in combination with the Pixhawk 4 flight control processor. The PID algorithm performs UAV flight control, and the YOLO algorithm is used to identify objects, extract pixel coordinates, and then convert the pixel coordinates to actual coordinates, where the pixel coordinates are the coordinates of the target object relative to the camera image and the actual coordinates are the relative coordinates of the target in a spatial coordinate system with the camera lens as the origin. Experimental results showed that the method can effectively detect flight targets and perform real-time tracking tasks. Doukhi et al. [40] used a UAV equipped with an NVIDIA Jetson TX2 high-performance embedded processor and a PID controller. They deployed the YOLOv3 algorithm on the embedded processor so that the YOLO-based target detection algorithm could intuitively guide the UAV to track the detected target, while the PID controller controlled the UAV’s flight. Experimental results showed that the proposed method successfully achieves visual-SLAM-based localization and UAV tracking flight through a fisheye camera alone, without external positioning sensors or GPS signals (see Figure 12). Afifi et al. [41] proposed a robust framework for multiscene pedestrian detection that uses YOLOv3 object detection as the backbone detector (see Figure 13) and runs on the NVIDIA Jetson TX2 embedded processor onboard the UAV. Experimental results from multiple outdoor pedestrian detection scenarios showed that the proposed framework outperforms the YOLOv3 algorithm in mAP and FPS as the computational resources of the embedded processor increase. To facilitate the development of the new generation of YBUT, Zhao et al. [42] improved YOLOv3-tiny, achieving an 86.1% decrease in model size, a 19.2% increase in AP50, and a detection speed 2.96 times faster than YOLOv3. The experimental results demonstrated that the improved algorithm is more suitable for low-end embedded processors in UAV target detection applications.
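The coordinate conversion and control loop described above can be sketched as follows, under a pinhole-camera assumption; the PID gains, camera intrinsics, and variable names are illustrative, not those of the cited systems.

```python
class PID:
    """Textbook PID controller, used here to steer the UAV toward the target."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral, self.prev_err = 0.0, 0.0

    def step(self, err, dt):
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def pixel_to_camera(u, v, depth, fx, fy, cx, cy):
    """Pinhole back-projection: pixel (u, v) at known depth Z -> camera-frame
    (X, Y, Z), with the camera lens as the coordinate origin."""
    return ((u - cx) * depth / fx, (v - cy) * depth / fy, depth)

# Illustrative loop: drive the detected box center toward the image center
pid_yaw = PID(kp=0.4, ki=0.01, kd=0.05)
# err_x = box_center_u - image_width / 2
# yaw_rate_cmd = pid_yaw.step(err_x, dt=0.05)
```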
Although the new generation of YBUT can simplify operational steps, improve operational integrity, and increase the adaptability of technology applications, it remains difficult to obtain satisfactory performance in complex applications given the limited computing power of existing embedded processors. At the same time, high-performance embedded processors have developed slowly and may not match the computing performance of high-performance computer processors for some time. Therefore, most researchers still prefer applications in which a UAV collects image or video data and a high-performance computer deploying YOLO performs object detection [43]. Whichever of the two approaches researchers take, each study and application drives YBUT research forwards, so that YBUT continues to be understood, accepted, and used by researchers in other fields.

4. Practical Applications of YBUT in Several Fields

In recent years, UAV load capacity, one of the key points of UAV technology development, has improved considerably, which provides the basis for carrying professional equipment with embedded YOLO algorithms for object identification. The YOLO algorithm has been very widely used for object identification and detection in various fields, while the method of carrying embedded processors on UAVs for aerial object identification and detection has only just started to become popular. With the development of UAV technology and advances in algorithm performance, YBUT applications are expected to spread widely in daily life and production in the coming years. Examples of YBUT applications in engineering, transportation, agriculture, automation, and other fields are outlined below, and these application methods and approaches are discussed.

4.1. Related Applications in the Field of Engineering

Engineering is a core activity of everyday production and an important source of economic value. Within engineering operations, manual labor is both an important means of increasing productivity and a hindrance to it. Highly efficient large-scale manual operations in engineering are bound to produce higher output, but overall production can also be affected by manual errors. With the advent of the industrial age, large-scale machine production gradually replaced manual production, resulting in an exponential increase in output and a gradual reduction in costs. However, certain special jobs still rely on manual work, such as checking the power components of transmission lines and monitoring industrial instrumentation data. Although these jobs are not very difficult, the work is tedious, and it is very easy for staff to become fatigued and negligent, with serious consequences. With the progress of UAV technology and the YOLO algorithm, some problems in engineering can be addressed by using a machine instead of manual labor or by manual operation of a machine, which can considerably alleviate the labor pressure on staff.
In engineering applications, YBUT has been successfully used and can, to a certain extent, replace people in some operations. The more mature research fields in which YBUT has been applied are transmission line inspection [44], building surface inspection [45], moving target tracking [46], gauge display reading [47], photovoltaic module inspection [48], and building identification and classification [49]. According to the current survey of YBUT application research in the engineering field, researchers favor the direction of transmission line inspection. Objects such as power line poles [50], insulators [51], electrical components [52], distribution line poles [53], transmission towers [54], bird nests [55], and breakers [56] can be accurately identified, classified, and located in complex environments. For example, Bao et al. [57] proposed an end-to-end parallel mixed attention YOLO detection network (PMA-YOLO), collecting transmission line vibration damper data with UAVs and creating a dataset to train and test the model; the results showed that the model can detect abnormal vibration dampers with an accuracy of 93.8% (see Figure 14). The successful detection and classification of such equipment and facilities on transmission lines lays the technical basis for the construction of future intelligent power systems.
Recently, Alsanad et al. [58] proposed an improved YOLOv3 algorithm for small-UAV detection in low-altitude airspace; experiments showed that the improved model can effectively detect low-altitude UAVs in complex environments (see Figure 15) and can be successfully applied in the anti-drone research field to manage low-altitude UAVs. The proposed method further enhanced the low-altitude small-UAV detection performance of YBUT relative to previous studies [59,60,61,62]. Other information regarding YBUT applications in the engineering field is shown in Table 2.
YBUT is of great value to the engineering field and to social productivity. Although many engineering problems remain to be resolved and many traditional manual methods remain to be improved, YOLO object detection continues to become more accurate and faster, and UAVs are becoming more convenient and safer; if YBUT continues to be used to develop innovations in the engineering field, it can create still more value there.

4.2. Related Applications in the Field of Transportation

With the expansion of human activity and mobility, dependence on transport for daily travel is increasing. This has led to a dramatic increase in the extent of road networks and the number of vehicles over the last few decades. With so many roads and vehicles, their management becomes very important. Legislatures have established various traffic regulations to constrain their use and ensure stable public order, but monitoring compliance accurately and effectively remains a persistent problem. Although cameras cover streets and alleys, they cannot detect every violation of the law, let alone ensure that penalties are imposed.
To further manage and constrain the various modes of transportation in life, several attempts have been made in the field of transportation with YBUT. For example, Feng et al. [74] proposed a YOLOv3-based method for UAV detection (see Figure 16). Omar et al. [75] proposed an aerial image vehicle detection method based on the YOLOv4 algorithm (see Figure 17), and Liu et al. [76] proposed a method for the automatic detection and tracking of vehicles in an urban environment by UAVs based on the YOLOv4 and DeepSORT algorithms. These studies have yielded excellent results in motorized and non-motorized vehicle recognition and classification tasks based on datasets of air traffic images and have also enabled the automatic detection and tracking of urban vehicles. The accurate identification and classification of motor vehicles and non-motorized vehicles allows for accurate restraint of their behavior according to road management rules in intelligent traffic management, while the automatic detection and tracking of vehicles can provide assistance in the effective punishment of violations. The fundamental applied research on YBUT in urban traffic further accelerates the intelligent management of urban traffic and contributes to the creation of a civilized city.
Both urban traffic management applications and urban road management are important directions for the application of YBUT technology. Silva et al. [77] designed a distributed UAV platform deploying YOLOv4 to detect road damage (see Figure 18). Zhao et al. [78] proposed a YOLOv3-based algorithm for UAV highway center mark detection, YOLO-Highway (see Figure 19). Recently, Ma et al. [79] proposed a new method for road damage detection based on YBUT, which experimentally showed better performance than previous similar studies and further promoted the application of road damage detection technology in urban road management. The intelligent management of urban traffic is not only the management of motor vehicles and non-motor vehicles but also the management of urban roads. The widespread application of YBUT in the field of traffic greatly promotes the process of intelligent management and has great significance for the convenience of future residents’ lives. The expansion of the application of YBUT in urban road management is another step forward in the promotion of intelligent urban traffic management. Other information regarding YBUT applications in the transportation sector is shown in Table 3.
The above shows that researchers have made considerable research progress in this area and demonstrates the great potential of YOLO-based UAV application technology in the transport sector. With the support of this technology, not only can the cost of traffic video acquisition and processing be significantly reduced, but the spatial flexibility of traffic supervision is also enhanced. Although fewer researchers have experimented in this area, it is unlikely that the application of this technology is limited to scenarios, such as vehicle inspection and road detection; there must exist many more applications that are more beneficial to people’s everyday lives.

4.3. Related Applications in the Field of Agriculture

In agriculture, failure to detect early symptoms of pests and diseases can lead to major pest and disease disasters and severe economic losses. When preventing or treating pests and diseases, excessive use of pesticides can also pollute the environment and reduce crop yields. Wild vegetables, which are not commonly encountered, often grow in sites with lush vegetation; they are less productive but highly nutritious, and finding them has always been a serious challenge. Drones can perform some of this agricultural work, with the YOLO algorithm assisting in the process, which can greatly improve efficiency and save time.
In this area of agriculture, many tricky jobs already have new solutions based on YBUT. With the continuous development and extension of YBUT, it is now possible to detect different targets and features among large plant species, such as in dead tree detection [96], pine wilt nematode disease detection [97,98,99] (see Figure 20), pine wilt detection [100], oil palm tree fruit detection [101], and other tasks. Additionally, YBUT can be applied in analyses involving small plants, such as in weed detection around peas and strawberries [102] (see Figure 21), field wheat phenotype monitoring [103], and tomato germinator detection [104]. Moving targets, such as animals, can also be detected, classified, and counted with high accuracy [105] (see Figure 22). Other information on YBUT applications in the agricultural sector is given in Table 4.
In this section, we provide an overview of the applications of YBUT in agriculture, and these exploratory applications point the way for expanding YOLO-based UAV applications in agriculture. Although the technology is still at the beginning stage in agriculture and many issues have not yet been resolved, such as dataset collection and sharing and stable YOLO algorithms that are more suitable for applications, we believe that YBUT can aid in the development of smart agriculture and has a broad scope of development in the agricultural field.

4.4. Related Applications in the Field of Automation

The production method of the future is automated production, with machines completely replacing manual labor. In everyday production, most operations require human control of machines, while some of the more technologically advanced production operations have already been automated. Where staff are involved in production, their main task is to control the machine, i.e., to adjust its working status according to the real-time operational situation. Computer technology, which can now make decisions in place of humans, combined with YOLO-based object detection, which can monitor the status of an operation in real time and provide feedback, can replace staff control of machines to a certain extent. If both technologies are applied on drone platforms, it may be possible to reduce the labor pressure on workers and increase the productivity of some industries.
To automate the use of YBUT in various applications, numerous researchers have developed different supporting technologies. After many studies, the technology for detecting, tracking, and avoiding specific targets has matured and is now largely automated [115,116,117,118]. Notably, YBUT has been effectively used for the detection and localization of pedestrians [119,120,121,122,123]. Moreover, with the rise of unmanned mobility concepts, certain applications have been rapidly automated. Kraft et al. [124] proposed a YOLOv4-based method for locating litter in parks by using drones. The experimental results showed that the drones can detect litter in a fixed area and collect litter location data while marking litter locations on a map for easy cleaning by sweepers (see Figure 23). In the future, the system could also work with other equipment to locate and automatically sweep up litter, greatly reducing the workload of sweepers. Liao et al. [125] proposed a UAV-based marine litter detection system that uses a UAV with an improved YOLO algorithm for marine litter detection; the system transmits the results to a ground-based monitoring platform via the internet to assist government agencies in implementing management plans (see Figure 24). Other information regarding YBUT applications in the automation sector is shown in Table 5.
The successful application of YBUT in the field of automation demonstrates a viable path towards an unmanned and automated future. The research information obtained from existing exploratory practical applications provides a credible reference for reducing the pressure on human labor and increasing productivity in the future. In the coming period, YBUT research in the automation field should focus on expanding the types of operations to which the technology can be applied, developing specialist drones, developing high-performance YOLO algorithms suitable for embedded environments, and developing visual, convenient control systems.

4.5. Related Applications in Other Fields

In addition to the main areas of YBUT application discussed above, some researchers have explored completely new areas, experimented with new methods, and used these methods to promote and enhance the applicability and usefulness of YBUT. Wyder et al. [130] integrated YBUT with visual servoing algorithms to successfully achieve the autonomous detection and tracking of moving targets in a GPS-limited environment. Quan, Herrmann et al. [131] proposed Project Vulture, an intelligent human-subject location system for UAVs based on the YOLO algorithm; the system showed higher sensitivity than peer systems in mountain rescue operations. Similar work has been carried out by Kashihara et al. [132] and Sambolek and Ivasic-Kos [133]. Arnold et al. [134] investigated object classification and reactive group behavior in a dispersed autonomous heterogeneous swarm of UAVs deployed with YOLO; in their approach, one UAV identifies specific targets, and the other UAVs adapt their behavior accordingly. The experimental results showed that the system still performs well at 25 m from the building. Jing et al. [135] proposed a neural network based on YOLOv5s-ViT-BiFPN that can assess damage to rural houses after natural disasters using drone images (see Figure 25). Information regarding YBUT applications in other areas is shown in Table 6.
YBUT has been used successfully in various fields, which has greatly contributed to the development of the industry and the advancement of technology for the benefit of society. From the above overview of the various areas, there is much more value to be created by this technology. To create more value, we can improve and optimize existing applications, expand the idea with examples of successful applications, and use this to apply the technology to more areas.

5. Development Prospects

With the rapid development of YOLO-based object detection technology and special UAV research, the YOLO-based UAV industry has set off a technological boom with multifield applications and multidirectional development. With the assistance of a variety of cutting-edge technologies, it is possible to improve productivity and quality of life and to create economic benefits while creating good ecological, environmental, and social benefits. YBUT shows increasingly obvious value and potential as it develops.

5.1. Improving the Quality of UAV Datasets and Training YOLO Algorithms Suitable for Aerial Imagery

In the practical application of YBUT, accuracy and speed are often high in ground-based tests but low in aerial UAV operations, possibly because the datasets used by recent target detection algorithms are unsuitable for aerial imagery [155,156,157,158]. Therefore, when collecting and selecting datasets to create models that perform as well as possible in the UAV operating environment, the following should be noted: (1) When performing image acquisition of the target, the acquisition equipment should be kept as consistent as possible, with the same equipment used from start to finish so that images of the same resolution are obtained; this helps ensure that image content is not distorted by inconsistencies in image size during training. (2) The dataset should be collected under as many conditions as possible, e.g., different camera angles, weather conditions, light intensities, target poses, target positions, and target backgrounds. (3) When annotating the bounding-box category and coordinate information of targets, the area of background content within each box should be reduced as much as possible while ensuring that all the target content lies within the box. When annotating boxes of multiple categories, the area of overlapping boxes should be minimized so that the algorithm does not merge two targets into one.
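When annotating as recommended above, labels are commonly stored in the normalized YOLO text format (one file per image, one line per target); the sketch below converts a pixel-coordinate box into this format, with a hypothetical directory layout and example values.

```python
def to_yolo_label(class_id, x1, y1, x2, y2, img_w, img_h):
    """Convert a pixel-coordinate box to the normalized YOLO label format:
    'class x_center y_center width height', all values in [0, 1]."""
    xc = (x1 + x2) / 2.0 / img_w
    yc = (y1 + y2) / 2.0 / img_h
    w = (x2 - x1) / img_w
    h = (y2 - y1) / img_h
    return f"{class_id} {xc:.6f} {yc:.6f} {w:.6f} {h:.6f}"

# Hypothetical layout: one .txt file per image, one line per annotated target
with open("dataset/labels/frame_0001.txt", "w") as f:
    f.write(to_yolo_label(0, 120, 80, 260, 310, img_w=1920, img_h=1080) + "\n")
```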
At the same time, to accelerate research progress on YOLO-based UAV object recognition technology, it is recommended that developers who create well-performing models describe the content and application performance of the datasets used and upload them to the community for sharing.

5.2. Research into Object Detection Algorithms Suitable for UAV-Embedded Processors

In YBUT research, the mobile processors that UAVs can carry have limited hardware resources, so existing YOLO algorithms cannot simply be ported to them with good results. Therefore, lightweight target detection algorithms should be investigated for resource-limited mobile processors, or existing YOLO algorithms should be improved through performance optimization or network pruning [159,160,161,162,163,164]. Liu et al. [165] proposed a Slice-Concat structure based on YOLOv3 and YOLOv3-SPP, which can improve target detection speed simply by changing the width and height of the uniform input dataset. Zhang et al. [166] proposed an intelligent approach for UAVs that combines machine learning, traditional algorithms, and intelligent AI algorithms: the YOLOv3 algorithm senses the locations of objects in the environment and classifies them, and AI is then used to evaluate the working state. Experiments showed that the method has high computational speed and recognition accuracy, as well as good generality, portability, and scalability, and the authors proposed a new development direction for future UAV technology.
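As an illustration of one widely used pruning criterion (L1-norm filter ranking, which is not necessarily the method of the studies cited above), the sketch below scores a convolution layer’s filters and selects the weakest ones as pruning candidates, assuming PyTorch is available.

```python
import torch
import torch.nn as nn

def filters_to_prune(conv: nn.Conv2d, keep_ratio: float = 0.7):
    """Rank a conv layer's filters by L1 norm and return the indices of the
    weakest ones -- a common criterion for channel pruning of YOLO backbones."""
    l1 = conv.weight.detach().abs().sum(dim=(1, 2, 3))  # one score per output filter
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    order = torch.argsort(l1, descending=True)
    return order[n_keep:].tolist()                      # prune the lowest-scoring filters

conv = nn.Conv2d(64, 128, kernel_size=3)
print(filters_to_prune(conv, keep_ratio=0.75))          # e.g., 32 candidate filters
```

After pruning, the slimmed network is typically fine-tuned on the original dataset to recover most of the lost accuracy.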
To quickly promote YBUT and facilitate learning and application by interested researchers and industry beginners, it is recommended that peer researchers who have successfully implemented lightweight UAV models disclose their optimization methods and model source code and provide detailed explanations of the optimized parts and the optimized network structure.

5.3. Developing Professional, Stable, and Reliable UAVs in Combination with the YOLO Inspection Environment

Given the many areas in which YBUT is developing and the issues with UAVs themselves, UAVs with high professionalism, high environmental adaptability, stability, and reliability should be further developed and promoted [167,168,169,170,171,172]. The overall design of UAVs should be matched to the field of operations, while considering the environment, to improve the professionalism of UAV operations and to ensure their adaptability and stability. For the UAV power system, the core components of power motors or engines should be developed, the service life of these components improved, and the overall weight of the aircraft reduced to improve the practicality of UAVs. For UAV onboard equipment, multisensor fusion technology should continue to be developed and applied autonomously. For UAV safety, safe-flight algorithms should be developed, and the loss-of-control self-protection system should be improved to realize effective obstacle avoidance, self-protection on loss of control, fault self-testing, and loss-of-control warning functions. Moreover, all parameters of the UAV itself need to be monitored to protect users’ property.

5.4. Enhancing the Security of YBUT for Multiple Application Scenarios

The development of YBUT and its widespread use in various fields have led to the technology being gradually recognized by researchers, but in the pursuit of rapid technological development, safety issues are often overlooked. In the daily application of YBUT, UAVs mainly exchange data with ground control terminals via wireless communication, which is vulnerable to interference and intrusion, potentially causing problems such as loss of UAV control and leakage of data [173]. In many application scenarios, if a UAV is hacked and loses flight control, it poses a threat not only to the UAV itself but also to the surrounding environment, and it may even endanger the personal safety of the operator. To prevent such security issues, it is vital to enhance the data security of YBUT. Both the storage and transmission of data and the transmission of UAV motion control commands should be the main targets for security enhancements.
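As one illustrative mitigation, the command and telemetry link can be encrypted with a pre-shared symmetric key; the sketch below uses the Python cryptography library’s Fernet scheme, with key distribution left out of scope and the command string purely hypothetical.

```python
from cryptography.fernet import Fernet

# One illustrative mitigation: symmetric encryption of the UAV-to-ground
# telemetry/command link (secure key distribution is out of scope here)
key = Fernet.generate_key()       # shared between UAV and ground station beforehand
channel = Fernet(key)

command = b"SET_WAYPOINT 41.40338 2.17403 ALT 60"  # hypothetical command
token = channel.encrypt(command)  # what actually travels over the radio link
assert channel.decrypt(token) == command
```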

5.5. Popularizing YBUT Knowledge, Training Technical Application Talents, and Improving Relevant Laws, Regulations, and Codes of Practice

Popularizing knowledge of UAVs and YOLO algorithms and training versatile talent in UAV control and YOLO algorithm application should be the role of higher education institutions, research units, relevant enterprises, and group organizations. Improving relevant laws, regulations, and codes of practice is an inescapable responsibility of the relevant legislative bodies in the face of new technological developments and applications. Although YBUT is developing rapidly, achieving fully autonomous unmanned drone operation will take time, so talent in drone control should be cultivated to improve the overall level of the industry and the scope of production and use in daily life. At the same time, to keep up with the rapid iteration of YOLO algorithm versions, knowledge of YOLO algorithm applications should be popularized, and the ability of relevant personnel to apply YOLO algorithms should be improved, to bring out the diversity of YOLO-based UAV operations. To further protect the legal rights of users and others, YBUT must be applied in strict accordance with relevant laws and regulations, and operators must be trained to ensure its correct use.

6. Conclusions

In any period, social progress needs advanced productivity as a basis, and every advancement needs time to develop. When an emerging field becomes popular, the field then gathers most of the current resources to develop it so that it rapidly progresses and spreads to other fields. Then, having been fully integrated with other fields, it is presented to people in the way of practical applications to address the needs of life so that people benefit.
In this literature review, we demonstrate that the combination of deep learning YOLO algorithms and UAV technology can be of great use in the future, and we attempt to introduce and promote the technology to attract the attention of more researchers. First, we describe the development of YBUT, including the early developments and the subsequent leaps in the application of YOLO algorithms in conjunction with UAV technology. Second, to promote YBUT, we present the main areas in which researchers have applied the technology and the recent state of research, as well as explorations and experiments in certain new areas. It is clear from this review that YOLO-based object detection algorithms could be a key enabler for future drone applications, allowing drones to provide better productivity and greater convenience.
Currently, UAV technology and YOLO-based object detection are relatively well established in their respective fields, and their cross-fertilization into new technologies is becoming an increasingly important area. The results show a high degree of advantageous complementarity between UAV-derived aerial platforms and YOLO object detection algorithms. However, the application methodology and performance of YBUT need to be further enhanced. So far, YBUT has seen more applications in engineering, transportation, agriculture, and automation and less practice in other fields; the diffusion of the technology remains a challenge. The future development of the technology needs to take four issues into account. First, the actual detection performance of ground-acquired datasets applied directly to training UAV-based object detection algorithms is not satisfactory, and dedicated high-quality datasets need to be acquired. Second, deploying existing YOLO algorithms on mobile processors through optimization can meet current exploratory research goals, but this is still a long way from industrializing YBUT, and dedicated algorithms need to be developed for the UAV hardware environment. Third, for future applications in more areas, specialist drones should be developed for specific use environments. Fourth, the timely development of talent for the development and application of YBUT is an effective way to rapidly promote the technology. To some extent, the rapid diffusion of YBUT, with the continuous identification of new problems and needs during that diffusion and the addressing of those problems and needs, can also contribute to technological progress.

Author Contributions

Conceptualization, C.C. and Z.Z.; methodology, C.C.; software, S.F.; validation, C.C., Z.Z. and S.G.; formal analysis, T.X.; investigation, C.C.; resources, C.C.; data curation, Z.Z.; writing—original draft preparation, C.C. and Z.Z.; writing—review and editing, W.Y.; visualization, C.C. and Z.Z.; supervision, Y.L.; project administration, T.X. and W.Y.; funding acquisition, T.X., W.Y. and Y.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the General Program of Liaoning Provincial Educational Department (LJKMZ20221059), the Key Tackling Program of Liaoning Provincial Educational Department (LSNZD202005), and the 111 Project (D18019).

Data Availability Statement

Not applicable.

Acknowledgments

We sincerely thank Dongxu Su, Yihan Liu, Hongyang Zhou, and Ziqi Yu from Shenyang Agricultural University for supporting this work.

Conflicts of Interest

The authors declare no conflict of interest.

Nomenclature

Acronym: Definition
UAV: Unmanned aerial vehicle
YOLO: You Only Look Once
YBUT: YOLO-based UAV technology
UAVs: Unmanned aerial vehicles
FPS: Frames per second
mAP: Mean average precision
Faster R-CNN: Faster region-based convolutional neural network
RFCN: Region-based fully convolutional network
SNIPER: Scale Normalization for Image Pyramids with Efficient Resampling
IoU: Intersection over union
F1: Harmonic mean of precision and recall
AP: Average precision
VPU: Vision processing unit
PID: Proportional-integral-derivative
SLAM: Simultaneous localization and mapping
GPS: Global Positioning System
DeepSORT: Deep Simple Online and Realtime Tracking

References

  1. Fan, B.; Li, Y.; Zhang, R.; Fu, Q. Review on the technological development and application of UAV systems. Chin. J. Electron. 2020, 29, 199–207. [Google Scholar] [CrossRef]
  2. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 27–30 June 2016; pp. 779–788. [Google Scholar]
  3. Jiang, P.Y.; Ergu, D.; Liu, F.Y.; Cai, Y.; Ma, B. A review of yolo algorithm developments. In Proceedings of the 8th International Conference on Information Technology and Quantitative Management (ITQM)—Developing Global Digital Economy after COVID-19, Chengdu, China, 9–11 July 2021; pp. 1066–1073. [Google Scholar]
  4. Ahmad, F.; Qiu, B.; Dong, X.; Ma, J.; Huang, X.; Ahmed, S.; Chandio, F.A. Effect of operational parameters of UAV sprayer on spray deposition pattern in target and off-target zones during outer field weed control application. Comput. Electron. Agric. 2020, 172, 105305. [Google Scholar] [CrossRef]
  5. Gašparović, M.; Zrinjski, M.; Barković, D.; Radočaj, D. An automatic method for weed mapping in oat fields based on UAV imagery. Comput. Electron. Agric. 2020, 173, 105385. [Google Scholar] [CrossRef]
  6. Chu, L.; Li, X.; Xu, J.; Neiat, A.G.; Liu, X. A holistic service provision strategy for drone-as-a-service in MEC-based UAV delivery. In Proceedings of the IEEE International Conference on Web Services (ICWS)/IEEE World Congress on Services (IEEE SERVICES), Chicago, IL, USA, 5–11 September 2021; pp. 669–674. [Google Scholar]
  7. Nath, N.D.; Cheng, C.S.; Behzadan, A.H. Drone mapping of damage information in GPS-Denied disaster sites. Adv. Eng. Inform. 2022, 51, 101450. [Google Scholar] [CrossRef]
  8. Abate, A.F.; De Maio, L.; Distasi, R.; Narducci, F. Remote 3D face reconstruction by means of autonomous unmanned aerial vehicles. Pattern Recognit. Lett. 2021, 147, 48–54. [Google Scholar] [CrossRef]
  9. Su, D.; Yao, W.; Yu, F.; Liu, Y.; Zheng, Z.; Wang, Y.; Xu, T.; Chen, C. Single-neuron PID UAV variable fertilizer application control system based on a weighted coefficient learning correction. Agriculture 2022, 12, 1019. [Google Scholar] [CrossRef]
  10. Li, Y.; Dai, Z. Abnormal behavior detection in crowd scene using YOLO and Conv-AE. In Proceedings of the 33rd Chinese Control and Decision Conference (CCDC), Kunming, China, 22–24 May 2021; pp. 1720–1725. [Google Scholar]
  11. Yu, J.; Zhang, W. Face mask wearing detection algorithm based on improved YOLO-v4. Sensors 2021, 21, 3263. [Google Scholar] [CrossRef]
  12. Rivero-Palacio, M.; Alfonso-Morales, W.; Caicedo-Bravo, E. Mobile application for anemia detection through ocular conjunctiva images. In Proceedings of the IEEE Colombian Conference on Applications of Computational Intelligence (ColCACI), Cali, Colombia, 26–28 May 2021. [Google Scholar]
  13. Liu, Y.; Hong, W. Target detection based on DB-YOLO in road environment. In Proceedings of the 33rd Chinese Control and Decision Conference (CCDC), Kunming, China, 22–24 May 2021; pp. 1496–1501. [Google Scholar]
  14. Azimjonov, J.; Ozmen, A. A real-time vehicle detection and a novel vehicle tracking systems for estimating and monitoring traffic flow on highways. Adv. Eng. Inform. 2021, 50, 101393. [Google Scholar] [CrossRef]
  15. Lv, X.; Lian, X.; Tan, L.; Song, Y.; Wang, C. HPMC: A multi-target tracking algorithm for the IoT. Intell. Autom. Soft Comput. 2021, 28, 513–526. [Google Scholar] [CrossRef]
  16. Chen, W.; Zhang, J.; Guo, B.; Wei, Q.; Zhu, Z. An apple detection method based on Des-YOLO v4 algorithm for harvesting robots in complex environment. Math. Probl. Eng. 2021, 2021, 7351470. [Google Scholar] [CrossRef]
  17. Buchholz, K.; Statista. Commercial Drones Are Taking Off. 2019. Available online: https://www.statista.com/chart/17201/commecial-drones-projected-growth/ (accessed on 1 February 2023).
  18. Redmon, J.; Farhadi, A. YOLO9000: Better, faster, stronger. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 7263–7271. [Google Scholar]
  19. Redmon, J.; Farhadi, A. Yolov3: An incremental improvement. arXiv 2018, arXiv:1804.02767. [Google Scholar]
  20. Bochkovskiy, A.; Wang, C.Y.; Liao, H.Y.M. Yolov4: Optimal speed and accuracy of object detection. arXiv 2020, arXiv:2004.10934. [Google Scholar]
  21. Li, C.; Li, L.; Jiang, H.; Weng, K.; Geng, Y.; Li, L.; Ke, Z.; Li, Q.; Cheng, M.; Nie, W.; et al. YOLOv6: A single-stage object detection framework for industrial applications. arXiv 2022, arXiv:2209.02976. [Google Scholar]
  22. Wang, C.-Y.; Bochkovskiy, A.; Liao, H.-Y.M. YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. arXiv 2022, arXiv:2207.02696. [Google Scholar]
  23. Sahin, O.; Ozer, S. YOLODrone: Improved YOLO architecture for object detection in drone images. In Proceedings of the 44th International Conference on Telecommunications and Signal Processing (TSP), Brno, Czech Republic, 26–28 July 2021; pp. 361–365. [Google Scholar]
  24. Tan, L.; Lv, X.; Lian, X.; Wang, G. YOLOv4_Drone: UAV image target detection based on an improved YOLOv4 algorithm. Comput. Electr. Eng. 2021, 93, 107261. [Google Scholar] [CrossRef]
  25. Zhang, Z.; Lu, X.; Cao, G.; Yang, Y.; Jiao, L.; Liu, F. ViT-YOLO: Transformer-based YOLO for object detection. In Proceedings of the 18th IEEE/CVF International Conference on Computer Vision (ICCV), Montreal, QC, Canada, 11–17 October 2021; pp. 2799–2808. [Google Scholar]
  26. Koay, H.V.; Chuah, J.H.; Chow, C.-O.; Chang, Y.-L.; Yong, K.K. YOLO-RTUAV: Towards real-time vehicle detection through aerial images with low-cost edge devices. Remote Sens. 2021, 13, 4196. [Google Scholar] [CrossRef]
  27. Wang, P.; Wu, L.; Qi, J.; Dai, J. Unmanned aerial vehicles object detection based on image haze removal under sea fog conditions. IET Image Process. 2022, 16, 2709–2721. [Google Scholar]
  28. Yang, F.; Zhang, X.; Liu, B. Video object tracking based on YOLOv7 and DeepSORT. arXiv 2022, arXiv:2207.12202. [Google Scholar]
  29. Lin, Y.; Chen, T.; Liu, S.; Cai, Y.; Shi, H.; Zheng, D.; Lan, Y.; Yue, X.; Zhang, L. Quick and accurate monitoring peanut seedlings emergence rate through UAV video and deep learning. Comput. Electron. Agric. 2022, 197, 106938. [Google Scholar] [CrossRef]
  30. Khang, N.; Nhut, T.H.; Phat, C.N.; Khanh-Duy, N.; Nguyen, D.V.; Nguyen, T.V. Detecting objects from space: An evaluation of deep-learning modern approaches. Electronics 2020, 9, 583. [Google Scholar]
  31. Ammar, A.; Koubaa, A.; Benjdira, B. Deep-learning-based automated palm tree counting and geolocation in large farms from aerial geotagged images. Agronomy 2021, 11, 1458. [Google Scholar] [CrossRef]
  32. Jiang, S.; Luo, B.; Liu, J.; Zhang, Y.; Zhang, L. UAV-based vehicle detection by multi-source images. In Proceedings of the 2nd CCF Chinese Conference on Computer Vision (CCCV), Tianjin, China, 12–14 October 2017. [Google Scholar]
  33. Xu, Z.; Shi, H.; Li, N.; Xiang, C.; Zhou, H. Vehicle detection under UAV based on optimal dense YOLO method. In Proceedings of the 5th International Conference on Systems and Informatics (ICSAI), Nanjing, China, 10–12 November 2018; pp. 407–411. [Google Scholar]
  34. Ruan, W.; Wang, H.; Kou, Z.; Su, Z. Drogue detection and location for UAV autonomous aerial refueling based on deep learning and vision. In Proceedings of the IEEE CSAA Guidance, Navigation and Control Conference (CGNCC), Xiamen, China, 10–12 August 2018. [Google Scholar]
  35. Yang, Z.; Huang, Z.; Yang, Y.; Yang, F.; Yin, Z. Accurate specified-pedestrian tracking from unmanned aerial vehicles. In Proceedings of the 18th IEEE International Conference on Communication Technology (IEEE ICCT), Chongqing, China, 8–11 October 2018; pp. 1256–1260. [Google Scholar]
  36. Zhang, D.; Shao, Y.; Mei, Y.; Chu, H.; Zhang, X.; Zhan, H.; Rao, Y. Using YOLO-based pedestrian detection for monitoring UAV. In Proceedings of the 10th International Conference on Graphics and Image Processing (ICGIP), Chengdu, China, 12–14 December 2018. [Google Scholar]
  37. Alam, M.S.; Natesha, B.V.; Ashwin, T.S.; Guddeti, R.M.R. UAV based cost-effective real-time abnormal event detection using edge computing. Multimed. Tools Appl. 2019, 78, 35119–35134. [Google Scholar] [CrossRef]
  38. Dimithe, C.O.B.; Reid, C.; Samata, B. Offboard machine learning through edge computing for robotic applications. In Proceedings of the IEEE SoutheastCon Conference, St Petersburg, FL, USA, 19–22 April 2018. [Google Scholar]
  39. Cao, M.; Chen, W.; Li, Y. Research on detection and tracking technology of quad-rotor aircraft based on open source flight control. In Proceedings of the 39th Chinese Control Conference (CCC), Shenyang, China, 27–29 July 2020; pp. 6509–6514. [Google Scholar]
  40. Doukhi, O.; Hossain, S.; Lee, D.-J. Real-time deep learning for moving target detection and tracking using unmanned aerial vehicle. J. Inst. Control. Robot. Syst. 2020, 26, 295–301. [Google Scholar] [CrossRef]
  41. Afifi, M.; Ali, Y.; Amer, K.; Shaker, M.; Elhelw, M. Robust real-time pedestrian detection on embedded devices. In Proceedings of the 13th International Conference on Machine Vision, Rome, Italy, 2–6 November 2020. [Google Scholar]
  42. Zhao, L.; Zhang, Q.; Peng, B.; Liu, Y. Faster object detector for drone-captured images. J. Electron. Imaging 2022, 31, 043033. [Google Scholar] [CrossRef]
  43. Zheng, A.; Fu, Y.; Dong, M.; Du, X.; Chen, Y.; Huang, J. Interface identification of automatic verification system based on deep learning. In Proceedings of the 2021 IEEE 16th Conference on Industrial Electronics and Applications (ICIEA), Chengdu, China, 1–4 August 2021; pp. 46–49. [Google Scholar]
  44. Ohta, H.; Sato, Y.; Mori, T.; Takaya, K.; Kroumov, V. Image acquisition of power line transmission towers using UAV and deep learning technique for insulators localization and recognition. In Proceedings of the 23rd International Conference on System Theory, Control and Computing (ICSTCC), Sinaia, Romania, 9–11 October 2019; pp. 125–130. [Google Scholar]
  45. Han, Q.; Liu, X.; Xu, J. Detection and location of steel structure surface cracks based on unmanned aerial vehicle images. J. Build. Eng. 2022, 50, 104098. [Google Scholar] [CrossRef]
  46. Cintas, E.; Ozyer, B.; Simsek, E. Vision-based moving UAV tracking by another UAV on low-cost hardware and a new ground control station. IEEE Access 2020, 8, 194601–194611. [Google Scholar] [CrossRef]
  47. Li, C.; Zheng, D.; Liu, L.; Zheng, X. A UAV-based machine vision algorithm for industrial gauge detecting and display reading. In Proceedings of the 5th Asia-Pacific Conference on Intelligent Robot Systems (ACIRS), Singapore, 17–19 July 2020; pp. 109–115. [Google Scholar]
  48. Starzynski, J.; Zawadzki, P.; Haranczyk, D. Machine learning in solar plants inspection automation. Energies 2022, 15, 5966. [Google Scholar] [CrossRef]
  49. Kim, J.S.; Young, H.I. Analysis of building object detection based on the YOLO neural network using UAV images. J. Korean Soc. Surv. Geod. Photogramm. Cartogr. 2021, 39, 381–392. [Google Scholar]
  50. Zhang, S.; Chen, B.; Wang, R.; Wang, J.; Zhong, L.; Gao, B. Unmanned Aerial Vehicle (UAV) vision-based detection of power line poles by CPU-based deep learning method. In Proceedings of the 9th IEEE Annual International Conference on Cyber Technology in Automation, Control, and Intelligent Systems (IEEE-CYBER), Suzhou, China, 29 July–2 August 2019; pp. 1630–1634. [Google Scholar]
  51. Sadykova, D.; Pernebayeva, D.; Bagheri, M.; James, A. IN-YOLO: Real-time detection of outdoor high voltage insulators using UAV imaging. IEEE Trans. Power Deliv. 2020, 35, 1599–1601. [Google Scholar] [CrossRef]
  52. Chen, H.; He, Z.; Shi, B.; Zhong, T. Research on recognition method of electrical components based on YOLO V3. IEEE Access 2019, 7, 157818–157829. [Google Scholar] [CrossRef]
  53. Chen, B.; Miao, X. Distribution line pole detection and counting based on YOLO using UAV inspection line video. J. Electr. Eng. Technol. 2020, 15, 997. [Google Scholar] [CrossRef]
  54. Mo, Y.; Xie, R.; Pan, Q.; Zhang, B. Automatic power transmission towers detection based on the deep learning algorithm. In Proceedings of the 2nd International Conference on Computer Engineering and Intelligent Control (ICCEIC), Chongqing, China, 12–14 November 2021; pp. 11–15. [Google Scholar]
  55. Zhang, Z.; He, G. Recognition of bird nests on power transmission lines in aerial images based on improved YOLOv4. Front. Energy Res. 2022, 10, 435. [Google Scholar] [CrossRef]
  56. Zheng, H.; Ping, Y.; Cui, Y.; Li, J. Intelligent diagnosis method of power equipment faults based on single-stage infrared image target detection. IEEJ Trans. Electr. Electron. Eng. 2022, 17, 1706–1716. [Google Scholar] [CrossRef]
  57. Bao, W.; Ren, Y.; Wang, N.; Hu, G.; Yang, X. Detection of abnormal vibration dampers on transmission lines in UAV remote sensing images with PMA-YOLO. Remote Sens. 2021, 13, 4134. [Google Scholar] [CrossRef]
  58. Alsanad, H.R.; Sadik, A.Z.; Ucan, O.N.; Ilyas, M.; Bayat, O. YOLO-V3 based real-time drone detection algorithm. Multimed. Tools Appl. 2022, 81, 26185–26198. [Google Scholar] [CrossRef]
  59. Hu, Y.; Wu, X.; Zheng, G.; Liu, X. Object detection of UAV for anti-UAV based on improved YOLO v3. In Proceedings of the 38th Chinese Control Conference (CCC), Guangzhou, China, 27–30 July 2019; pp. 8386–8390. [Google Scholar]
  60. Yuan, X.; Xia, J.; Wu, J.; Shi, J.; Deng, L. Low altitude small UAV detection based on YOLO model. In Proceedings of the 39th Chinese Control Conference (CCC), Shenyang, China, 27–29 July 2020; pp. 7362–7366. [Google Scholar]
  61. Madasamy, K.; Shanmuganathan, V.; Kandasamy, V.; Lee, M.Y.; Thangadurai, M. OSDDY: Embedded system-based object surveillance detection system with small drone using deep YOLO. EURASIP J. Image Video Process. 2021, 2021, 1–14. [Google Scholar] [CrossRef]
  62. Cetin, E.; Barrado, C.; Pastor, E. Improving real-time drone detection for counter-drone systems. Aeronaut. J. 2021, 125, 1871–1896. [Google Scholar] [CrossRef]
  63. Sousa, A.D.P.d.; Sousa, G.C.L.d.; Maués, L.M.F. Using digital image processing and Unmanned Aerial Vehicle (UAV) for identifying ceramic cladding detachment in building facades. Ambiente Construído 2022, 22, 199–213. [Google Scholar] [CrossRef]
  64. Wang, J.; Jiang, S.; Song, W.; Yang, Y. A comparative study of small object detection algorithms. In Proceedings of the 38th Chinese Control Conference (CCC), Guangzhou, China, 27–30 July 2019; pp. 8507–8512. [Google Scholar]
  65. Han, J.; Yang, Z.; Xu, H.; Hu, G.; Zhang, C.; Li, H.; Lai, S.; Zeng, H. Search like an eagle: A cascaded model for insulator missing faults detection in aerial images. Energies 2020, 13, 713. [Google Scholar] [CrossRef]
  66. Yan, J.; Hao, Y. Recognition method of electrical components based on improved YOLOv3. In Proceedings of the 2nd International Conference on Artificial Intelligence and Information Systems (ICAIIS), Chongqing, China, 28–30 May 2021. [Google Scholar]
  67. Liu, C.; Wu, Y.; Liu, J.; Sun, Z. Improved YOLOv3 network for insulator detection in aerial images with diverse background interference. Electronics 2021, 10, 771. [Google Scholar] [CrossRef]
  68. Kumar, P.; Batchu, S.; Swamy, S.N.; Kota, S.R. Real-time concrete damage detection using deep learning for high rise structures. IEEE Access 2021, 9, 112312–112331. [Google Scholar] [CrossRef]
  69. Tu, R.; Zhu, Z.; Bai, Y.; Gao, M.; Ge, Z. Key parts of transmission line detection using improved YOLO v3. Int. Arab J. Inf. Technol. 2021, 18, 747–754. [Google Scholar]
  70. Ding, C.; Lu, L.; Wang, C.; Ding, C. Design, sensing, and control of a novel UAV platform for aerial drilling and screwing. IEEE Robot. Autom. Lett. 2021, 6, 3176–3183. [Google Scholar] [CrossRef]
  71. Yang, Z.; Xu, Z.; Wang, Y. Bidirection-fusion-YOLOv3: An improved method for insulator defect detection using UAV image. IEEE Trans. Instrum. Meas. 2022, 71, 1–8. [Google Scholar] [CrossRef]
  72. Kim-Phuong, P.; Thai-Hoc, L.; Trung-Thanh, N.; Ngoc-Long, L.; Huu-Hung, N.; Van-Phuc, H. Multi-model deep learning drone detection and tracking in complex background conditions. In Proceedings of the International Conference on Advanced Technologies for Communications (ATC), Ho Chi Minh City, Vietnam, 14–16 October 2021; pp. 189–194. [Google Scholar]
  73. Wang, X.; Li, W.; Guo, W.; Cao, K. SPB-YOLO: An efficient real-time detector for unmanned aerial vehicle images. In Proceedings of the 3rd International Conference on Artificial Intelligence in Information and Communication (IEEE ICAIIC), Jeju Island, South Korea, 13–16 April 2021; pp. 99–104. [Google Scholar]
  74. Feng, R.; Fan, C.; Li, Z.; Chen, X. Mixed road user trajectory extraction from moving aerial videos based on convolution neural network detection. IEEE Access 2020, 8, 43508–43519. [Google Scholar] [CrossRef]
  75. Omar, W.; Oh, Y.; Chung, J.; Lee, I. Aerial dataset integration for vehicle detection based on YOLOv4. Korean J. Remote Sens. 2021, 37, 747–761. [Google Scholar]
  76. Liu, X.; Zhang, Z. A Vision-based target detection, tracking, and positioning algorithm for unmanned aerial vehicle. Wirel. Commun. Mob. Comput. 2021, 2021, 1–12. [Google Scholar] [CrossRef]
  77. Silva, L.A.; San Blas, H.S.; Garcia, D.P.; Mendes, A.S.; Gonzalez, G.V. An architectural multi-agent system for a pavement monitoring system with pothole recognition in UAV images. Sensors 2020, 20, 6205. [Google Scholar] [CrossRef]
  78. Zhao, Z.; Han, J.; Song, L. YOLO-Highway: An improved highway center marking detection model for unmanned aerial vehicle autonomous flight. Math. Probl. Eng. 2021, 2021, 1–14. [Google Scholar] [CrossRef]
  79. Ma, D.; Fang, H.; Wang, N.; Zhang, C.; Dong, J.; Hu, H. Automatic detection and counting system for pavement cracks based on PCGAN and YOLO-MF. IEEE Trans. Intell. Transp. Syst. 2022, 23, 22166–22178. [Google Scholar] [CrossRef]
  80. Kim, J.M.; Sekwon, H.; Chae, J.H.; Do, M. Road crack detection based on object detection algorithm using unmanned aerial vehicle image. J. Korea Inst. Intell. Transp. Syst. 2019, 18, 155–163. [Google Scholar] [CrossRef]
  81. Sharma, R.; Patel, K.; Shah, S.; Aibin, M. Aerial footage analysis using computer vision for efficient detection of points of interest near railway tracks. Aerospace 2022, 9, 370. [Google Scholar] [CrossRef]
  82. Krump, M.; Russ, M.; Stuetz, P. Deep learning algorithms for vehicle detection on UAV platforms: First investigations on the effects of synthetic training. In Proceedings of the 6th International Conference on Modelling and Simulation for Autonomous Systems (MESAS), Palermo, Italy, 29–31 October 2019; pp. 50–70. [Google Scholar]
  83. Luo, X.; Tian, X.; Zhang, H.; Hou, W.; Leng, G.; Xu, W.; Jia, H.; He, X.; Wang, M.; Zhang, J. Fast automatic vehicle detection in UAV images using convolutional neural networks. Remote Sens. 2020, 12, 1994. [Google Scholar] [CrossRef]
  84. Hassan, S.A.; Han, S.H.; Shin, S.Y. Real-time road cracks detection based on improved deep convolutional neural network. In Proceedings of the IEEE Canadian Conference on Electrical and Computer Engineering (CCECE), London, ON, Canada, 3 August–1 September 2020. [Google Scholar]
  85. Chung, Q.M.; Le, T.D.; Dang, T.V.; Vo, N.D.; Nguyen, T.V.; Khang, N. Data augmentation analysis in vehicle detection from aerial videos. In Proceedings of the RIVF International Conference on Computing and Communication Technologies (RIVF), RMIT University, Ho Chi Minh City, Vietnam, 14–15 October 2020; pp. 367–369. [Google Scholar]
  86. Li, Y.; Chen, Y.; Yuan, S.; Liu, J.; Zhao, X.; Yang, Y.; Liu, Y. Vehicle detection from road image sequences for intelligent traffic scheduling. Comput. Electr. Eng. 2021, 95, 107406. [Google Scholar] [CrossRef]
  87. Chen, Y.; Zhao, D.; Er, M.J.; Zhuang, Y.; Hu, H. A novel vehicle tracking and speed estimation with varying UAV altitude and video resolution. Int. J. Remote Sens. 2021, 42, 4441–4466. [Google Scholar] [CrossRef]
  88. Rampriya, R.S.; Suganya, R.; Nathan, S.; Perumal, P.S. A comparative assessment of deep neural network models for detecting obstacles in the real time aerial railway track images. Appl. Artif. Intell. 2022, 36, 2018184. [Google Scholar] [CrossRef]
  89. Gupta, P.; Pareek, B.; Singal, G.; Rao, D.V. Edge device based military vehicle detection and classification from UAV. Multimed. Tools Appl. 2022, 81, 19813–19834. [Google Scholar] [CrossRef]
  90. Golyak, I.S.; Anfimov, D.R.; Fufurin, I.L.; Nazolin, A.L.; Bashkin, S.V.; Glushkov, V.L.; Morozov, A.N. Optical multi-band detection of unmanned aerial vehicles with YOLO v4 convolutional neural network. In Proceedings of the SPIE Future Sensing Technologies Conference, Online, 9–13 November 2020. [Google Scholar]
  91. Emiyah, C.; Nyarko, K.; Chavis, C.; Bhuyan, I. Extracting vehicle track information from unstabilized drone aerial videos using YOLOv4 common object detector and computer vision. In Proceedings of the 6th Future Technologies Conference (FTC), Online, 28–29 October 2021; pp. 232–239. [Google Scholar]
  92. Luo, X.; Wu, Y.; Zhao, L. YOLOD: A target detection method for UAV aerial imagery. Remote Sens. 2022, 14, 3240. [Google Scholar] [CrossRef]
  93. Feng, J.; Yi, C. Lightweight detection network for arbitrary-oriented vehicles in UAV imagery via global attentive relation and multi-path fusion. Drones 2022, 6, 108. [Google Scholar] [CrossRef]
  94. Chen, Z.; Cao, L.; Wang, Q. YOLOv5-based vehicle detection method for high-resolution UAV images. Mob. Inf. Syst. 2022, 2022, 1828848. [Google Scholar] [CrossRef]
  95. Luo, X.; Wu, Y.; Wang, F. Target detection method of UAV aerial imagery based on improved YOLOv5. Remote Sens. 2022, 14, 5063. [Google Scholar] [CrossRef]
  96. Wang, X.; Zhao, Q.; Jiang, P.; Zheng, Y.; Yuan, L.; Yuan, P. LDS-YOLO: A lightweight small object detection method for dead trees from shelter forest. Comput. Electron. Agric. 2022, 198, 107035. [Google Scholar] [CrossRef]
  97. Sun, Z.; Ibrayim, M.; Hamdulla, A. Detection of pine wilt nematode from drone images using UAV. Sensors 2022, 22, 4704. [Google Scholar] [CrossRef]
  98. Wu, B.; Liang, A.; Zhang, H.; Zhu, T.; Zou, Z.; Yang, D.; Tang, W.; Li, J.; Su, J. Application of conventional UAV-based high-throughput object detection to the early diagnosis of pine wilt disease by deep learning. For. Ecol. Manag. 2021, 486, 118986. [Google Scholar] [CrossRef]
  99. Zhou, Z.; Yang, X. Pine wilt disease detection in UAV-CAPTURED images. Int. J. Robot. Autom. 2022, 37, 37–43. [Google Scholar] [CrossRef]
  100. Jintasuttisak, T.; Edirisinghe, E.; Elbattay, A. Deep neural network based date palm tree detection in drone imagery. Comput. Electron. Agric. 2022, 192, 106560. [Google Scholar] [CrossRef]
  101. Junos, M.H.; Khairuddin, A.S.M.; Thannirmalai, S.; Dahari, M. Automatic detection of oil palm fruits from UAV images using an improved YOLO model. Visual Comput. 2022, 38, 2341–2355. [Google Scholar] [CrossRef]
  102. Khan, S.; Tufail, M.; Khan, M.T.; Khan, Z.A.; Anwar, S. Deep learning-based identification system of weeds and crops in strawberry and pea fields for a precision agriculture sprayer. Precis. Agric. 2021, 22, 1711–1727. [Google Scholar] [CrossRef]
  103. Zhao, J.; Yan, J.; Xue, T.; Wang, S.; Qiu, X.; Yao, X.; Tian, Y.; Zhu, Y.; Cao, W.; Zhang, X. A deep learning method for oriented and small wheat spike detection (OSWSDet) in UAV images. Comput. Electron. Agric. 2022, 198, 107087. [Google Scholar] [CrossRef]
  104. Egi, Y.; Hajyzadeh, M.; Eyceyurt, E. Drone-computer communication based tomato generative organ counting model using YOLO V5 and deep-sort. Agriculture 2022, 12, 1920. [Google Scholar] [CrossRef]
  105. Xie, Y.; Jiang, J.; Bao, H.; Zhai, P.; Zhao, Y.; Zhou, X.; Jiang, G. Recognition of big mammal species in airborne thermal imaging based on YOLO V5 algorithm. Integr. Zool. 2022, 18, 333–352. [Google Scholar] [CrossRef]
  106. Priya, R.D.; Devisurya, V.; Anitha, N.; Kalaivaani, N.; Keerthana, P.; Kumar, E.A. Automated cattle classification and counting using hybridized mask R-CNN and YOLOv3 algorithms. In Proceedings of the 21st International Conference on Intelligent Systems Design and Applications (ISDA), Online, 13–15 December 2021; pp. 358–367. [Google Scholar]
  107. Ulhaq, A.; Adams, P.; Cox, T.E.; Khan, A.; Low, T.; Paul, M. Automated detection of animals in low-resolution airborne thermal imagery. Remote Sens. 2021, 13, 3276. [Google Scholar] [CrossRef]
  108. Petso, T.; Jamisola, R.S., Jr.; Mpoeleng, D.; Bennitt, E.; Mmereki, W. Automatic animal identification from drone camera based on point pattern analysis of herd behaviour. Ecol. Inform. 2021, 66, 101485. [Google Scholar] [CrossRef]
  109. Guzel, M.; Sin, B.; Turan, B.; Kadioglu, I. Real-time detection of wild mustard (Sinapis arvensis L.) with deep learning (YOLO-v3). Fresenius Environ. Bull. 2021, 30, 12197–12203. [Google Scholar]
  110. Hashim, W.; Eng, L.S.; Alkawsi, G.; Ismail, R.; Alkahtani, A.A.; Dzulkifly, S.; Baashar, Y.; Hussain, A. A Hybrid vegetation detection framework: Integrating vegetation indices and convolutional neural network. Symmetry 2021, 13, 2190. [Google Scholar] [CrossRef]
  111. Idrissi, M.; Hussain, A.; Barua, B.; Osman, A.; Abozariba, R.; Aneiba, A.; Asyhari, T. Evaluating the forest ecosystem through a semi-autonomous quadruped robot and a hexacopter UAV. Sensors 2022, 22, 5497. [Google Scholar] [CrossRef] [PubMed]
  112. Jemaa, H.; Bouachir, W.; Leblon, B.; Bouguila, N. Computer vision system for detecting orchard trees from UAV images. In Proceedings of the 24th ISPRS Congress on Imaging Today, Foreseeing Tomorrow, Nice, France, 6–11 June 2022; pp. 661–668. [Google Scholar]
  113. dos Santos, A.; Biesseck, B.J.G.; Latte, N.; Santos, I.C.d.L.; dos Santos, W.P.; Zanetti, R.; Zanuncio, J.C. Remote detection and measurement of leaf-cutting ant nests using deep learning and an unmanned aerial vehicle. Comput. Electron. Agric. 2022, 198, 107071. [Google Scholar] [CrossRef]
  114. Puliti, S.; Astrup, R. Automatic detection of snow breakage at single tree level using YOLOv5 applied to UAV imagery. Int. J. Appl. Earth Obs. Geoinf. 2022, 112, 102946. [Google Scholar] [CrossRef]
  115. Chang-Jin, S. A study on pedestrians tracking using low altitude UAV. Trans. Korean Inst. Electr. Eng. P 2018, 67, 227–232. [Google Scholar]
  116. Barisic, A.; Car, M.; Bogdan, S. Vision-based system for a real-time detection and following of UAV. In Proceedings of the International Workshop on Research, Education and Development of Unmanned Aerial Systems (RED UAS), Cranfield University, Cranfield, UK, 25–27 November 2019; pp. 156–159. [Google Scholar]
  117. Li, J.-M.; Chen, C.W.; Cheng, T.-H. Estimation and tracking of a moving target by unmanned aerial vehicles. In Proceedings of the American Control Conference (ACC), Philadelphia, PA, USA, 10–12 July 2019; pp. 3944–3949. [Google Scholar]
  118. Huang, Z.-Y.; Lai, Y.-C. Image-based sense and avoid of small scale UAV using deep learning approach. In Proceedings of the International Conference on Unmanned Aircraft Systems (ICUAS), Athens, Greece, 1–4 September 2020; pp. 545–550. [Google Scholar]
  119. Jin, C.-J.; Shi, X.; Hui, T.; Li, D.; Ma, K. The automatic detection of pedestrians under the high-density conditions by deep learning techniques. J. Adv. Transp. 2021, 2021, 1–11. [Google Scholar] [CrossRef]
  120. Zhang, X.; Li, N.; Zhang, R. An improved lightweight network MobileNetv3 Based YOLOv3 for pedestrian detection. In Proceedings of the IEEE International Conference on Consumer Electronics and Computer Engineering (ICCECE), Guangzhou, China, 15–17 January 2021; pp. 114–118. [Google Scholar]
  121. Boudjit, K.; Ramzan, N. Human detection based on deep learning YOLO-v2 for real-time UAV applications. J. Exp. Theor. Artif. Intell. 2022, 34, 527–544. [Google Scholar] [CrossRef]
  122. Shao, Y.; Zhang, X.; Chu, H.; Zhang, X.; Zhang, D.; Rao, Y. AIR-YOLOv3: Aerial Infrared Pedestrian Detection via an Improved YOLOv3 with Network Pruning. Appl. Sci. 2022, 12, 3627. [Google Scholar] [CrossRef]
  123. Kainz, O.; Gera, M.; Michalko, M.; Jakab, F. Experimental solution for estimating pedestrian locations from UAV imagery. Appl. Sci. 2022, 12, 9485. [Google Scholar] [CrossRef]
  124. Kraft, M.; Piechocki, M.; Ptak, B.; Walas, K. Autonomous, onboard vision-based trash and litter detection in low altitude aerial images collected by an unmanned aerial vehicle. Remote Sens. 2021, 13, 965. [Google Scholar] [CrossRef]
  125. Liao, Y.-H.; Juang, J.-G. Real-time UAV trash monitoring system. Appl. Sci. 2022, 12, 1838. [Google Scholar] [CrossRef]
  126. Liu, M.; Wang, X.; Zhou, A.; Fu, X.; Ma, Y.; Piao, C. UAV-YOLO: Small object detection on unmanned aerial vehicle perspective. Sensors 2020, 20, 2238. [Google Scholar] [CrossRef]
  127. Wang, L.; Ai, J.; Zhang, L.; Xing, Z. Design of airport obstacle-free zone monitoring UAV system based on computer vision. Sensors 2020, 20, 2475. [Google Scholar] [CrossRef] [PubMed]
  128. Kong, H.; Chen, Z.; Yue, W.; Ni, K. Improved YOLOv4 for pedestrian detection and counting in UAV images. Comput. Intell. Neurosci. 2022, 2022, 6106853. [Google Scholar] [CrossRef] [PubMed]
  129. Maharjan, N.; Miyazaki, H.; Pati, B.M.; Dailey, M.N.; Shrestha, S.; Nakamura, T. Detection of river plastic using UAV sensor data and deep learning. Remote Sens. 2022, 14, 3049. [Google Scholar] [CrossRef]
  130. Wyder, P.M.; Chen, Y.-S.; Lasrado, A.J.; Pelles, R.J.; Kwiatkowski, R.; Comas, E.O.A.; Kennedy, R.; Mangla, A.; Huang, Z.; Hu, X.; et al. Autonomous drone hunter operating by deep learning and all-onboard computations in GPS-denied environments. PLoS ONE 2019, 14, e0225092. [Google Scholar] [CrossRef]
  131. Quan, A.; Herrmann, C.; Soliman, H. Project Vulture: A prototype for using drones in search and rescue operations. In Proceedings of the 15th Annual International Conference on Distributed Computing in Sensor Systems (DCOSS), Athens, Greece, 29–31 May 2019; pp. 619–624. [Google Scholar]
  132. Kashihara, S.; Wicaksono, M.A.; Fall, D.; Niswar, M. Supportive information to find victims from aerial video in search and rescue operation. In Proceedings of the IEEE International Conference on Internet of Things and Intelligence System (IoTaIS), Bali, Indonesia, 5–7 November 2019; pp. 56–61. [Google Scholar]
  133. Sambolek, S.; Ivasic-Kos, M. Automatic person detection in search and rescue operations using deep CNN detectors. IEEE Access 2021, 9, 37905–37922. [Google Scholar] [CrossRef]
  134. Arnold, R.; Abruzzo, B.; Korpela, C. Towards a heterogeneous swarm for object classification. In Proceedings of the IEEE National Aerospace and Electronics Conference (NAECON), Dayton, OH, USA, 15–19 July 2019; pp. 139–147. [Google Scholar]
  135. Jing, Y.; Ren, Y.; Liu, Y.; Wang, D.; Yu, L. Automatic extraction of damaged houses by earthquake based on improved YOLOv5: A case study in Yangbi. Remote Sens. 2022, 14, 382. [Google Scholar] [CrossRef]
  136. Ajmera, Y.; Singh, S.P. Autonomous UAV-based target search, tracking and following using reinforcement learning and YOLOFlow. In Proceedings of the IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Khalifa University, Abu Dhabi, United Arab Emirates, 4–6 November 2020; pp. 15–20. [Google Scholar]
  137. Sudholz, A.; Denman, S.; Pople, A.; Brennan, M.; Amos, M.; Hamilton, G. A comparison of manual and automated detection of rusa deer (Rusa timorensis) from RPAS-derived thermal imagery. Wildl. Res. 2022, 49, 46–53. [Google Scholar] [CrossRef]
  138. Opromolla, R.; Inchingolo, G.; Fasano, G. Airborne visual detection and tracking of cooperative UAVs exploiting deep learning. Sensors 2019, 19, 4332. [Google Scholar] [CrossRef] [PubMed]
  139. Merizalde, D.; Morillo, P. Real-time social distancing detection approach using YOLO and unmanned aerial vehicles. In Proceedings of the 2nd International Conference on Smart Technologies, Systems and Applications (SmartTech-IC), Quito, Ecuador, 1–3 December 2021; pp. 114–127. [Google Scholar]
  140. Kim, D.; Liu, M.; Lee, S.; Kamat, V.R. Remote proximity monitoring between mobile construction resources using camera-mounted UAVs. Autom. Constr. 2019, 99, 168–182. [Google Scholar] [CrossRef]
  141. Hong, S.-J.; Han, Y.; Kim, S.-Y.; Lee, A.-Y.; Kim, G. Application of deep-learning methods to bird detection using unmanned aerial vehicle imagery. Sensors 2019, 19, 1651. [Google Scholar] [CrossRef]
  142. Arola, S.; Akhloufi, M.A. Vision-based deep learning for UAVs collaboration. In Proceedings of the Conference on Unmanned Systems Technology XXI, Baltimore, MD, USA, 16–18 April 2019. [Google Scholar]
  143. Zheng, R.; Yang, R.; Lu, K.; Zhang, S. A search and rescue system for maritime personnel in disaster carried on unmanned aerial vehicle. In Proceedings of the 18th International Symposium on Distributed Computing and Applications for Business Engineering and Science (DCABES), Wuhan, China, 8–10 November 2019; pp. 43–47. [Google Scholar]
  144. Silvirianti; Shin, S.Y. UAV based search and rescue with honeybee flight behavior in forest. In Proceedings of the 5th International Conference on Mechatronics and Robotics Engineering (ICMRE), Rome, Italy, 16–19 February 2019; pp. 182–187. [Google Scholar]
  145. Zhang, X.; Shi, Z.; Wu, Z.; Liu, J. Sea surface ships detection method of UAV based on improved YOLOv3. In Proceedings of the 11th International Conference on Graphics and Image Processing (ICGIP), Zhejiang Gongshang University, Hangzhou, China, 12–14 October 2019. [Google Scholar]
  146. Medeiros, A.C.S.; Ratsamee, P.; Orlosky, J.; Uranishi, Y.; Higashida, M.; Takemura, H. UAV target-selection: 3D pointing interface system for large-scale environment. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Xian, China, 30 May–5 June 2021; pp. 3963–3969. [Google Scholar]
  147. Sarosa, M.; Muna, N.; Rohadi, E. Detection of natural disaster victims using You Only Look Once (YOLO). In Proceedings of the 5th Annual Applied Science and Engineering Conference (AASEC), Bandung, Indonesia, 20–21 April 2020. [Google Scholar]
  148. Rizk, M.; Slim, F.; Charara, J. Toward AI-Assisted UAV for human detection in search and rescue missions. In Proceedings of the International Conference on Decision Aid Sciences and Application (DASA), Sakheer, Bahrain, 7–8 December 2021. [Google Scholar]
  149. Qi, D.; Li, Z.; Ren, B.; Lei, P.; Yang, X. Detection and tracking of a moving target for UAV based on machine vision. In Proceedings of the 7th International Conference on Control, Automation and Robotics (ICCAR), Singapore, 23–26 April 2021; pp. 173–178. [Google Scholar]
  150. Panigrahi, S.; Maski, P.; Thondiyath, A. Deep learning based real-time biodiversity analysis using aerial vehicles. In Proceedings of the 9th International Conference on Robot Intelligence Technology and Applications (RiTA), KAIST, Daejeon, South Korea, 16–17 December 2021; pp. 401–412. [Google Scholar]
  151. Wang, Z.; Zhang, X.; Li, J.; Luan, K. A YOLO-based target detection model for offshore unmanned aerial vehicle data. Sustainability 2021, 13, 12980. [Google Scholar] [CrossRef]
  152. Tanwar, S.; Gupta, R.; Patel, M.M.; Shukla, A.; Sharma, G.; Davidson, I.E. Blockchain and AI-empowered social distancing scheme to combat COVID-19 situations. IEEE Access 2021, 9, 129830–129840. [Google Scholar] [CrossRef]
  153. Gromada, K.; Siemiatkowska, B.; Stecz, W.; Plochocki, K.; Wozniak, K. Real-time object detection and classification by UAV equipped with SAR. Sensors 2022, 22, 2068. [Google Scholar] [CrossRef]
  154. Bahhar, C.; Ksibi, A.; Ayadi, M.; Jamjoom, M.M.; Ullah, Z.; Soufiene, B.O.; Sakli, H. Wildfire and Smoke Detection Using Staged YOLO Model and Ensemble CNN. Electronics 2023, 12, 228. [Google Scholar] [CrossRef]
  155. Narayanan, P.; Borel-Donohue, C.; Lee, H.; Kwon, H.; Rao, R. A real-time object detection framework for aerial imagery using deep neural networks and synthetic training images. In Proceedings of the Conference on Signal Processing, Sensor/Information Fusion, and Target Recognition XXVII, Orlando, FL, USA, 16–19 April 2018. [Google Scholar]
  156. Borel-Donohue, C.; Young, S.S. Image quality and super resolution effects on object recognition using deep neural networks. In Proceedings of the Conference on Artificial Intelligence and Machine Learning for Multi-Domain Operations Applications, Baltimore, MD, USA, 15–17 April 2019. [Google Scholar]
  157. Krump, M.; Stuetz, P. UAV based vehicle detection with synthetic training: Identification of performance factors using image descriptors and machine learning. In Proceedings of the 7th International Conference on Modelling and Simulation for Autonomous Systems (MESAS), Prague, Czech Republic, 21 October 2020; pp. 62–85. [Google Scholar]
  158. Laurito, G.; Fraser, B.; Rosser, K. Airborne localisation of small UAS using visual detection: A field experiment. In Proceedings of the IEEE Symposium Series on Computational Intelligence (SSCI), Canberra, Australia, 1–4 December 2020; pp. 1435–1443. [Google Scholar]
  159. Xing, C.; Liang, X.; Yang, R. Compact one-stage object detection network. In Proceedings of the 8th IEEE International Conference on Computer Science and Network Technology (ICCSNT), Dalian, China, 20–22 November 2020; pp. 115–118. [Google Scholar]
  160. Zhang, J.; Wang, P.; Zhao, Z.; Su, F. Pruned-YOLO: Learning efficient object detector using model pruning. In Proceedings of the 30th International Conference on Artificial Neural Networks (ICANN), Bratislava, Slovakia, 14–17 September 2021; pp. 34–45. [Google Scholar]
  161. Wan, X.; Yu, J.; Tan, H.; Wang, J. LAG: Layered objects to generate better anchors for object detection in aerial images. Sensors 2022, 22, 3891. [Google Scholar] [CrossRef] [PubMed]
  162. Shen, X.; Shi, G.; Ren, H.; Zhang, W. Biomimetic vision for zoom object detection based on improved vertical grid number YOLO algorithm. Front. Bioeng. Biotechnol. 2022, 10, 847. [Google Scholar] [CrossRef] [PubMed]
  163. Jiang, C.; Ren, H.; Ye, X.; Zhu, J.; Zeng, H.; Nan, Y.; Sun, M.; Ren, X.; Huo, H. Object detection from UAV thermal infrared images and videos using YOLO models. Int. J. Appl. Earth Obs. Geoinf. 2022, 112, 102912. [Google Scholar] [CrossRef]
  164. Jawaharlalnehru, A.; Sambandham, T.; Sekar, V.; Ravikumar, D.; Loganathan, V.; Kannadasan, R.; Khan, A.A.; Wechtaisong, C.; Haq, M.A.; Alhussen, A.; et al. Target object detection from Unmanned Aerial Vehicle (UAV) images based on improved YOLO algorithm. Electronics 2022, 11, 2343. [Google Scholar] [CrossRef]
  165. Liu, X.; Wu, J. Finetuned YOLOv3 for getting four times the detection speed. In Proceedings of the 14th International Conference on Knowledge Science, Engineering, and Management (KSEM), Tokyo, Japan, 14–16 August 2021; pp. 512–521. [Google Scholar]
  166. Zhang, T.; Hu, X.; Xiao, J.; Zhang, G. A Machine learning method for vision-based unmanned aerial vehicle systems to understand unknown environments. Sensors 2020, 20, 3245. [Google Scholar] [CrossRef]
  167. Qin, Z.; Wang, W.; Dammer, K.-H.; Guo, L.; Cao, Z. Ag-YOLO: A real-time low-cost detector for precise spraying with case study of palms. Front. Plant Sci. 2021, 12, 2974. [Google Scholar] [CrossRef]
  168. Ho, M.J. Development of small multi-copter system for indoor collision avoidance flight. J. Aerosp. Syst. Eng. 2021, 15, 102–110. [Google Scholar]
  169. Xing, Z.; Zhang, L.; Ai, J. Research of key technologies for multi-rotor UAV automatic aerial recovery system. Electron. Lett. 2022, 58, 277–279. [Google Scholar] [CrossRef]
  170. Nepal, U.; Eslamiat, H. Comparing YOLOv3, YOLOv4 and YOLOv5 for autonomous landing spot detection in faulty UAVs. Sensors 2022, 22, 464. [Google Scholar] [CrossRef]
  171. Lalak, M.; Wierzbicki, D. Automated detection of atypical aviation obstacles from UAV images using a YOLO algorithm. Sensors 2022, 22, 6611. [Google Scholar] [CrossRef]
  172. Lu, M.; Chen, H.; Lu, P. Perception and avoidance of multiple small fast moving objects for quadrotors with only low-cost RGBD camera. IEEE Robot. Autom. Lett. 2022, 7, 11657–11664. [Google Scholar] [CrossRef]
  173. Wang, C.-N.; Yang, F.-C.; Vo, N.T.M.; Nguyen, V.T.T. Wireless communications for data security: Efficiency assessment of cybersecurity industry—A promising application for UAVs. Drones 2022, 6, 363. [Google Scholar] [CrossRef]
Figure 1. Global commercial drone annual sales and sales statistics [17].
Figure 2. YOLOv7-based UAV technology architecture diagram (BN: batch normalization layer; AF: activation function layer).
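Figure 2 abbreviates the two layers that follow each convolution in the architecture diagram (BN and AF). As an illustration only, the short PyTorch sketch below shows the conventional Conv-BN-AF unit that YOLO-style backbones stack; the SiLU activation, channel counts, and class name are assumptions for this sketch, not the exact YOLOv7 block.

```python
import torch
from torch import nn


class ConvBNAct(nn.Module):
    """Convolution -> batch normalization (BN) -> activation function (AF):
    the basic unit stacked throughout YOLO-style backbones. Illustrative
    sketch only; not the exact YOLOv7 block depicted in Figure 2."""

    def __init__(self, c_in: int, c_out: int, k: int = 3, s: int = 1):
        super().__init__()
        self.conv = nn.Conv2d(c_in, c_out, k, s, padding=k // 2, bias=False)
        self.bn = nn.BatchNorm2d(c_out)  # BN layer
        self.act = nn.SiLU()             # AF layer (SiLU is common in recent YOLOs)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.conv(x)))


# Example: one 640x640 RGB frame passed through a single block.
x = torch.randn(1, 3, 640, 640)
y = ConvBNAct(3, 32)(x)
print(y.shape)  # torch.Size([1, 32, 640, 640])
```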
Figure 3. Literature search methods.
Figure 4. Literature survey methodology.
Figure 5. Number of papers published in top journals and conferences (2017–2022).
Figure 6. Survey of application areas for YBUT: (a) popular areas of interest in English-language journals; (b) popular areas of interest in Chinese-language journals.
Figure 7. Flowchart of the proposed method by Jiang et al. [32].
Figure 8. Dense YOLO network model [33].
Figure 9. Drogue detection method [34].
Figure 10. Four-rotor monitoring UAV [36].
Figure 11. Complete system design by Alam et al. [37].
Figure 12. Software architecture for deep-learning-based motion control [40]. The red circles in the diagram represent the input RGB images in the YOLOv3 algorithm, the orange circles represent the calculation process of the YOLOv3 algorithm, and the blue circles represent the target and bounding box data detected by the YOLOv3 algorithm.
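To make the data flow in Figure 12 concrete (RGB frame in, YOLOv3 inference, bounding-box data out to motion control), the following is a minimal, self-contained Python sketch of such a loop. The detector stub, image dimensions, and controller gains are all hypothetical placeholders; the cited system [40] uses its own YOLOv3 inference and deep-learning-based control stack.

```python
import random

IMG_W, IMG_H = 640, 480  # assumed camera resolution for this sketch


def detect_target(frame):
    """Stand-in for a YOLOv3 inference call. Returns one bounding box as
    (x_center, y_center, width, height) in pixels, or None if nothing is
    detected. Hypothetical stub; replace with a real detector."""
    return (random.uniform(0, IMG_W), random.uniform(0, IMG_H), 80.0, 120.0)


def control_from_bbox(bbox, k_yaw=0.002, k_pitch=0.002):
    """Proportional (P-only) controller: command rates that push the
    detected box toward the image center. Gains are illustrative."""
    x, y, _, _ = bbox
    err_x = x - IMG_W / 2  # horizontal pixel error
    err_y = y - IMG_H / 2  # vertical pixel error
    return k_yaw * err_x, k_pitch * err_y  # (yaw-rate, pitch-rate) commands


frame = None  # placeholder for an RGB frame from the UAV camera
bbox = detect_target(frame)
if bbox is not None:
    yaw_cmd, pitch_cmd = control_from_bbox(bbox)
    print(f"yaw={yaw_cmd:+.3f}, pitch={pitch_cmd:+.3f}")
```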
Figure 13. Workflow of the pedestrian detection framework [41].
Figure 14. Experimental results of the PMA-YOLO network for the detection of anomalous vibration dampers [57]. Ground truth boxes are shown in yellow; prediction boxes for "rusty", "defective", and "normal" dampers are shown in red, blue, and green, respectively.
Figure 15. Results of the improved YOLOv3 algorithm for UAV detection [58].
Figure 16. Vehicle detection results based on urban road videos [74].
Figure 17. (a) UAV-acquired images; (b) UAV image detection results [75].
Figure 18. Road damage detection results [77].
Figure 19. Detection results for road signs in various environmental conditions [78].
Figure 20. (a,b) Original images of the diseased-tree detection region; (c,d) detection results of the MobileNetv2-YOLOv4 algorithm for the same region [97].
Figure 21. Weed identification results for pea crop area and strawberry crop area [102].
Figure 22. Detection counts of reindeer and sika deer using the improved YOLOv5s model [105].
Figure 23. Results of UAV dataset detection using YOLOv4 [124].
Figure 24. Results of UAV litter detection at Badouzi fishing port [125].
Figure 25. YOLOv5s-ViT-BiFPN algorithm for detecting damaged houses [135].
Table 1. Specific subject directions of the screened articles.

| No. | Research Fields | No. | Research Fields | No. | Research Fields | No. | Research Fields |
|---|---|---|---|---|---|---|---|
| 1 | Engineering | 12 | Transportation | 23 | Plant Sciences | 34 | Architecture |
| 2 | Computer Science | 13 | Optics | 24 | Forestry | 35 | Behavioral Sciences |
| 3 | Automation Control Systems | 14 | Physical Sciences Other Topics | 25 | Physical Geography | 36 | Biodiversity Conservation |
| 4 | Communication | 15 | Geology | 26 | Spectroscopy | 37 | Geography |
| 5 | Instruments Instrumentation | 16 | Environmental Sciences Ecology | 27 | Geochemistry Geophysics | 38 | Neurosciences Neurology |
| 6 | Robotics | 17 | Energy Fuels | 28 | Materials Science | 39 | Parasitology |
| 7 | Business Economics | 18 | Construction Building Technology | 29 | Operations Research Management Science | 40 | Radiology Nuclear Medicine Medical Imaging |
| 8 | Mathematics | 19 | Agriculture | 30 | Zoology | 41 | Mechanics |
| 9 | Imaging Science Photographic Technology | 20 | Mathematical Computational Biology | 31 | Science Technology Other Topics | | |
| 10 | Telecommunications | 21 | Physics | 32 | Remote Sensing | | |
| 11 | Chemistry | 22 | Acoustics | 33 | Fisheries | | |
Table 2. Overview of papers that explicitly address YBUT in engineering.

| YOLO Models | Reference | Object | Metric | Paper Type | Sensors | Purpose |
|---|---|---|---|---|---|---|
| YOLOv2 | Sousa et al. (2022) [63] | Ceramic detachment | Precision 99%, Recall 98% | Journal | Cameras | Building surface inspection |
| YOLOv3 | Wang et al. (2019) [64] | Small objects | Accuracy 85% | Conference paper | Cameras | Algorithmic research |
| YOLOv3 | Han et al. (2020) [65] | Insulators | Precision 92.1%, Recall 92.2% | Journal | Cameras | Transmission line inspection |
| YOLOv3 | Yan et al. (2021) [66] | Electrical components | N/A | Conference paper | Cameras | Transmission line inspection |
| YOLOv3 | Liu et al. (2021) [67] | Insulators | Precision 94%, Recall 96% | Journal | Cameras | Transmission line inspection |
| YOLOv3 | Kumar et al. (2021) [68] | Concrete damage | Accuracy 94.24% | Journal | Cameras | Building surface inspection |
| YOLOv3 | Tu et al. (2021) [69] | Power towers and insulators | Accuracy 88% | Journal | Cameras | Transmission line inspection |
| YOLOv3 | Ding et al. (2021) [70] | Holes and bolts | N/A | Journal | Cameras | Aerial manipulation platform |
| YOLOv3 | Yang et al. (2022) [71] | Insulators | mAP 94% | Journal | Cameras | Transmission line inspection |
| YOLOv4 | Kim-Phuong et al. (2021) [72] | UAVs | Accuracy 87.37% | Conference paper | Cameras | Moving target tracking |
| YOLOv5 | Wang et al. (2021) [73] | Small objects | mAP 81.1% | Conference paper | Cameras | Algorithmic research |
Table 3. Overview of papers that explicitly address YBUT in transportation.

| YOLO Models | Reference | Object | Metric | Paper Type | Sensors | Purpose |
|---|---|---|---|---|---|---|
| YOLOv2 | Kim et al. (2019) [80] | Road cracks | mAP 33% | Journal | Cameras | Road safety inspection |
| YOLOv2 | Sharma et al. (2022) [81] | Railway tracks | Precision 74%, Accuracy 85%, mAP 70.7% | Journal | Cameras | Road safety inspection |
| YOLOv3 | Krump et al. (2019) [82] | Vehicles | mAP 64.4% | Conference paper | Cameras | Algorithmic research |
| YOLOv3 | Luo et al. (2020) [83] | Vehicles | mAP 97.49% | Journal | Cameras | Algorithmic research |
| YOLOv3 | Hassan et al. (2020) [84] | Road cracks | Accuracy 92%, mAP 90% | Conference paper | Cameras | Road safety inspection |
| YOLOv3 | Chung et al. (2020) [85] | Vehicles | mAP 35.08% | Conference paper | Cameras | Algorithmic research |
| YOLOv3 | Li et al. (2021) [86] | Vehicles | N/A | Journal | Cameras | Traffic management |
| YOLOv3 | Chen et al. (2021) [87] | Vehicles | mAP 50.05% | Journal | Cameras | Vehicle tracking and speed estimation |
| YOLOv3 | Rampriya et al. (2022) [88] | Obstacles on the railway track | Precision 70.68%, Accuracy 70.83%, Recall 73.64% | Journal | Cameras | Road safety inspection |
| YOLOv3 | Gupta et al. (2022) [89] | Military vehicles | N/A | Journal | Cameras | Military vehicle detection and classification |
| YOLOv4 | Golyak et al. (2020) [90] | Vehicles | N/A | Conference paper | Cameras, thermal imager | Detection of unmanned vehicles |
| YOLOv4 | Emiyah et al. (2021) [91] | Vehicles | N/A | Conference paper | Cameras | Vehicle detection and counting |
| YOLOv4 | Luo et al. (2022) [92] | Vehicles | mAP 71.97% | Journal | Cameras | Algorithmic research |
| YOLOv5 | Feng and Yi (2022) [93] | Vehicles | mAP 89.74% | Journal | Cameras | Traffic management |
| YOLOv5 | Chen et al. (2022) [94] | Vehicles | Precision 91.9%, Recall 82.5%, mAP 89.6% | Journal | Cameras | Traffic management |
| YOLOv5 | Luo et al. (2022) [95] | Vehicles | mAP 85.35% | Journal | Cameras | Algorithmic research |
Table 4. Overview of papers that explicitly address YBUT in agriculture.

| YOLO Models | Reference | Object | Metric | Paper Type | Sensors | Purpose |
|---|---|---|---|---|---|---|
| YOLOv3 | Priya et al. (2021) [106] | Cattle | N/A | Conference paper | Cameras | Livestock management |
| YOLOv3 | Ulhaq et al. (2021) [107] | Animals | mAP 87.1% | Journal | Thermal imager | Animal management |
| YOLOv3 | Petso et al. (2021) [108] | Animals | F1 96% | Journal | Cameras | Wildlife monitoring |
| YOLOv3 | Guzel et al. (2021) [109] | Wild mustard | Precision 45–99% | Journal | Cameras | Crop protection |
| YOLOv3 | Hashim et al. (2021) [110] | Vegetation | Accuracy 84% | Journal | Multispectral camera | Hybrid vegetation detection |
| YOLOv5 | Idrissi et al. (2022) [111] | Burrow, deadwood, pine, grass, oak, wood, fire, pedestrian | mAP 44.3% | Journal | Cameras | Evaluating the forest ecosystem |
| YOLOv5 | Jemaa et al. (2022) [112] | Orchard trees | Precision 91% | Conference paper | Cameras | Orchard tree management |
| YOLOv5 | dos Santos et al. (2022) [113] | Leaf-cutting ants | Accuracy 98.45% | Journal | Cameras | Optimizing the use of pesticides |
| YOLOv5 | Puliti and Astrup (2022) [114] | Tree damage | Precision 76%, Recall 78% | Journal | Cameras | Evaluating the forest ecosystem |
Table 5. Overview of papers that explicitly address YBUT in automation.

| YOLO Models | Reference | Object | Metric | Paper Type | Sensors | Purpose |
|---|---|---|---|---|---|---|
| YOLOv3 | Liu et al. (2020) [126] | Small objects | mAP 72.54% | Journal | Cameras | Small object detection |
| YOLOv3 | Wang et al. (2020) [127] | UAVs | N/A | Journal | Cameras | Airport obstacle-free zone monitoring UAV system |
| YOLOv4 | Kong et al. (2022) [128] | Pedestrians | mAP 39.32% | Journal | Cameras | Pedestrian detection and counting |
| YOLOv5 | Maharjan et al. (2022) [129] | River plastic | N/A | Journal | Cameras | Plastic waste management |
Table 6. Overview of papers that explicitly address YBUT in other fields.

| YOLO Models | Reference | Object | Metric | Paper Type | Sensors | Purpose |
|---|---|---|---|---|---|---|
| YOLOv1 | Ajmera and Singh (2020) [136] | Missing victim | N/A | Conference paper | Cameras | Urban search and rescue |
| YOLOv1 | Sudholz et al. (2022) [137] | Rusa deer | N/A | Journal | Cameras, thermal imager | Detection and monitoring of invasive species |
| YOLOv2 | Opromolla et al. (2019) [138] | UAVs | N/A | Journal | Cameras | Vision-based detection and tracking of cooperative UAVs |
| YOLOv2 | Merizalde and Morillo (2021) [139] | Pedestrians | Recall 90% | Conference paper | Cameras | Real-time social distancing detection |
| YOLOv3 | Kim et al. (2019) [140] | Mobile construction resources | Accuracy 97.43% | Journal | Cameras | Protecting construction workers |
| YOLOv3 | Hong et al. (2019) [141] | Birds | N/A | Journal | Cameras | Wildlife monitoring |
| YOLOv3 | Arola and Akhloufi (2019) [142] | UAVs | N/A | Conference paper | Cameras | Collaborative UAV research |
| YOLOv3 | Zheng et al. (2019) [143] | Distress personnel | N/A | Conference paper | Cameras | Search and rescue system for maritime personnel |
| YOLOv3 | Silvirianti et al. (2019) [144] | UAV flight behavior | Accuracy 83% | Conference paper | Cameras | Search and rescue for people in distress in the forest |
| YOLOv3 | Zhang et al. (2019) [145] | Sea surface ships | N/A | Conference paper | Cameras | Algorithmic research |
| YOLOv3 | Medeiros et al. (2021) [146] | Human posture | N/A | Conference paper | Cameras | Human posture guidance system |
| YOLOv3 | Sarosa et al. (2020) [147] | Victims of natural disasters | Accuracy 89% | Conference paper | Cameras | Search and rescue for victims of natural disasters |
| YOLOv3 | Rizk et al. (2021) [148] | Humans | Accuracy 78.78% | Conference paper | Cameras | Search and rescue for victims of natural disasters |
| YOLOv3 | Qi et al. (2021) [149] | Moving targets | N/A | Conference paper | Cameras | Moving target detection and tracking |
| YOLOv3 | Panigrahi et al. (2021) [150] | Wildlife | mAP 95% | Conference paper | Cameras | Biodiversity analysis |
| YOLOv3 | Wang et al. (2021) [151] | Offshore small targets | Precision 92.7%, Recall 92.06%, mAP 95.58% | Journal | Cameras | Algorithmic research |
| YOLOv3 | Tanwar et al. (2021) [152] | Pedestrians | N/A | Journal | Cameras | Real-time social distancing detection |
| YOLOv5 | Gromada et al. (2022) [153] | Military targets | N/A | Journal | Cameras, synthetic aperture radar | Algorithmic research |
| YOLOv5 | Bahhar et al. (2023) [154] | Wildfire and smoke | mAP 85.8% | Journal | Cameras | Forest fire detection |