Review

UAV Platforms for Data Acquisition and Intervention Practices in Forestry: Towards More Intelligent Applications

1 Key Lab of State Forestry Administration for Forestry Equipment and Automation, School of Technology, Beijing Forestry University, Beijing 100083, China
2 College of Mechanical and Electrical Engineering, North China Institute of Science and Technology, Langfang 065201, China
3 Department of Mechanical Engineering, New Mexico Tech, Socorro, NM 87801, USA
4 Department of Mechanical and Aerospace Engineering, New Mexico State University, Las Cruces, NM 88003, USA
* Author to whom correspondence should be addressed.
Aerospace 2023, 10(3), 317; https://doi.org/10.3390/aerospace10030317
Submission received: 28 December 2022 / Revised: 11 March 2023 / Accepted: 16 March 2023 / Published: 22 March 2023
(This article belongs to the Special Issue Applications of Drones (Volume II))

Abstract: Unmanned aerial vehicle (UAV) systems for forestry applications have expanded in recent decades and offer great economic benefits. They have proven more appealing than traditional platforms in several respects, such as repeat rate, spatial resolution, and accuracy. This paper consolidates the state-of-the-art unmanned systems in the forestry field, with a major focus on UAV systems and heterogeneous platforms, applied in a variety of forestry applications, such as wood production, tree quantification, disease control, wildfire management, wildlife conservation, and species classification. This review also examines practical applications in multiple forestry environments, including wild and managed forests, grassland, urban green parks, and stockyards. Special forest environments and terrains place customized demands on unmanned systems. The challenges of deploying unmanned systems are analyzed in terms of environmental characterization, maneuverability and mobility improvement, and global regulatory interpretation. To better apply UAV systems to forestry, future directions are analyzed with respect to mobility enhancement and customized sensory adaptation, which need to be further developed to synchronize all possible agents into automatically functioning systems for forestry exploration.

1. Introduction

Over the past decades, extensive studies of multiple unmanned platforms have been conducted regarding intervention and data acquisition applications in forestry [1,2]. Despite advances in traditional methods, forestry practices remain challenging to perform because of the harsh environment [3]. These traditional crewed-dominant methods have drawbacks in maneuverability, real-time operation and valid range, deployability, and cost-effectiveness [4]. To address these issues, the forestry industry demands more mechanized approaches [5]. Various merits, including repeat rate, spatial resolution and accuracy, low cost, and great maneuverability, have made unmanned aerial vehicle systems appealing alternatives for diversifying forestry approaches [6,7]. With the rapid development of the forestry industry, applying unmanned aerial vehicle systems with different automation levels is receiving increased attention worldwide. Unmanned systems that integrate aerial, ground, and heterogeneous platforms are deployed under this circumstance, which greatly satisfies the demands of current forestry applications [8,9].
Introducing unmanned systems into the forestry industry has significantly improved the automation level of field applications [10,11]. Compared with conventional forestry approaches, such as crewed surveys, watchtower observation, and piloted operation, unmanned platforms, including UAVs and mobile robots, present significant advantages [12,13]. They can be deployed quickly and repeatedly under most weather conditions, offering more robust availability than satellites and manned aircraft [14]. Their cost-effectiveness enables UAVs to exert their potential in forested tasks [15]. Typical small UAVs applied in practice are piloted remotely by crews rather than carrying an operating crew onboard, as traditional vehicles do [16,17].
Forestry applications performed by various unmanned platforms extend from fundamental intervention operations, such as harvesting [18], forwarding, logging, and sorting [19], to more comprehensive management, such as data acquisition practices, disease control [20], wildfire management [10], smart transportation [21], and wildlife conservation [22]. Various merits, including cost-effectiveness, productivity, efficiency, and great maneuverability, have made unmanned systems appealing alternatives for diversifying forestry approaches [23,24]. Ecke et al. regard UAVs as a principal tool for forest resource detection, but note that multi-temporal, long-term monitoring remains clearly insufficient and that hyperspectral sensors are still underused [25]. Aydin and Akhloufi et al. designed frameworks that combine drones and unmanned systems to assist people in wildfire sensing and suppression tasks [26,27]. In addition, Yu et al. controlled UAVs to fly or hover over specific areas and retrieve relevant data in real time by monitoring the onboard sensors; such systems can be of great help in monitoring wildland fires and in post-fire damage assessment [28,29,30]. Guerrero-Sánchez et al. proposed a filtered observer-based IDA-PBC strategy for trajectory tracking of a quadrotor, which can deal with noisy output measurements and uncertainties in the dynamics [31]. UAVs are increasingly used to collect data for forest insect pest and disease (FIPD) monitoring [20]. Traditional manual ground monitoring exploits the phototaxis of pest insects [32], whereas UAV systems provide high-spatial-resolution images that expand spatial coverage, reduce response times, and lower the cost of pest monitoring in forest areas [33]. Adão et al. proposed a forest pest control approach based on drone-borne hyperspectral sensors [34]. Eugenio et al. presented six studies on forest health monitoring and other forestry applications [35]. The feasibility of UAVs in the forestry industry is reviewed throughout these comprehensive studies, which thereby adds substantial value as a forestry reference [36,37,38].
However, most of these reviews exclusively place major focus on a single and independent platform as their studied objectives. In practical forestry missions, multiple platforms and cooperative strategies are utilized to achieve satisfying effects, but the merits of applying multiple platforms are not documented. Without synthesizing all platforms into a comprehensive system, considerable benefits and advances would be overlooked or ignored. To this end, this review aims to provide a comprehensive reference by sorting real-world applications.
In summary, the main contributions of this paper are as follows:
  • We consolidate the state-of-the-art unmanned systems in the forestry field with a major focus on UAV systems and heterogeneous platforms.
  • Methodology and application under multiple forestry environments are reviewed, including wood production, tree quantification, disease control, wildfire management, and wildlife conservation.
  • The challenges of UAV system deployment are analyzed from environmental characterization, maneuverability and mobility improvement, and global regulatory interpretation.
  • The future directions are analyzed in terms of mobility enhancement and customized sensory adaptation, which need to be further developed for synchronizing all possible agents into automatic functioning systems for forestry exploration.
The rest of this article is organized as follows: main unmanned systems in forestry are described in Section 2, with a major focus on automated machines, mobile robotics, UAV, and heterogeneous strategies. Case studies of forestry applications are presented in Section 3. Discussions regarding current status and challenges are analyzed in Section 4. Lastly, conclusions are drawn by presenting comments with respect to future directions of further development.

2. Forestry Unmanned Platform

Different types of unmanned platforms for forestry applications are described in this section including unmanned ground vehicles (UGVs) and UAVs. In some forestry applications, the hybrid concept or the collaboration of aerial and ground units is needed to improve mission efficiency. In the following subsections, the utilization of robotic platforms is elaborately discussed.

2.1. Unmanned Aerial Vehicles

UAVs are flying vehicles that can be remotely controlled from a ground station or by operators, with no pilots on board [39,40]. UAVs were initially invented to support warfare for military requirements [41,42] and were then gradually applied to non-military and private sectors through diversified research methods [43]. UAVs are regarded as a promising alternative for diverse missions. While unmanned ground vehicles are used exclusively in ground scenarios and their perception capabilities at a distance are limited, UAVs bring easy deployability, ideal spatial resolution and accuracy, and a diversity of sensors from the aerial perspective. Early development of unmanned aerial vehicles focused on automatic surveillance and tactical data acquisition [44]. Later, the deployment of UAVs for photogrammetry applications became prevalent as early as 2000, driven by affordable GPS/INS (Global Positioning System/Inertial Navigation System) technologies [45]. Recently, design, manufacturing, and guidance, navigation, and control (GNC) technologies have become increasingly advanced, and a wide range of UAVs has been developed for situations where traditional approaches are difficult, impossible, or dangerous [46,47].
UAVs for forestry tasks can be roughly divided into helicopter, fixed-wing (e.g., glider or high-wing), and rotary-wing (single-rotor, coaxial, quadcopter, hexacopter, etc.) platforms according to their mechanical structure and take-off and landing characteristics. Their distinctive differences lie in valid range and stability [48]. A comparison between different types of UAVs in terms of payload, wind resistance, minimum speed, flight autonomy, portability, and landing distance is needed. Typical prototypes of fixed-wing and rotary-wing UAVs are shown in Figure 1.
Fixed-wing systems are advantageous in speed and valid range, while multi-rotor UAVs present better maneuverability [49]. The merit of fixed-wing platforms lies in their long endurance and high cruising speed when implementing missions [50]. For forestry applications, fixed-wing UAVs are preferable when the potential work area is large, thanks to their speed and valid range.
The centroid motion equations of a fixed-wing UAV consist of a dynamic equation and a kinematic equation. The dynamic equation of the UAV's centroid motion relates the centroid acceleration to the applied forces. In vector form:
$$m\frac{d\mathbf{V}}{dt} = \mathbf{P} + \mathbf{A} + \mathbf{G}$$
where $\mathbf{V}$ is the velocity vector of the UAV, $\mathbf{G}$ is the gravitational force, $\mathbf{P}$ is the engine thrust, and $\mathbf{A}$ is the aerodynamic force.
The kinematic equation of the UAV can be expressed in vector form as:
$$\frac{d\mathbf{R}}{dt} = \mathbf{V}$$
where $\mathbf{R}$ is the centroid position vector of the UAV and $\mathbf{V}$ is the velocity vector.
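As an illustration of how these two vector equations can be propagated in simulation, the following minimal Python sketch integrates the point-mass model with forward Euler steps. The thrust, aerodynamic force, mass, and time step are placeholder values chosen only for the example, not parameters from any cited study.

```python
import numpy as np

def propagate_point_mass(R0, V0, P, A, m, dt=0.01, steps=1000):
    """Forward-Euler integration of m*dV/dt = P + A + G and dR/dt = V."""
    G = np.array([0.0, 0.0, -9.81 * m])     # gravity in a z-up world frame
    R, V = R0.astype(float), V0.astype(float)
    trajectory = [R.copy()]
    for _ in range(steps):
        accel = (P + A + G) / m              # total applied force over mass
        V = V + accel * dt                   # velocity update (dynamic equation)
        R = R + V * dt                       # position update (kinematic equation)
        trajectory.append(R.copy())
    return np.array(trajectory)

# Example: level flight at 20 m/s with thrust balancing drag and lift balancing weight
traj = propagate_point_mass(
    R0=np.array([0.0, 0.0, 100.0]),
    V0=np.array([20.0, 0.0, 0.0]),
    P=np.array([15.0, 0.0, 0.0]),             # engine thrust [N]
    A=np.array([-15.0, 0.0, 9.81 * 4.5]),     # drag + lift [N]
    m=4.5)                                     # UAV mass [kg]
print(traj[-1])                                # final centroid position
```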
Limitations of fixed-wing UAVs include the requirement for a launcher or runway. Rotary-wing UAVs have rotors attached to a fixed central mast. They can take off and land vertically, hover, and fly in any direction [51], and they offer a higher level of maneuverability and flexibility [52,53]. In small forestry areas, multi-rotors are more suitable than fixed-wing UAVs because they can take off and land vertically, especially in confined spaces [54]. However, their maintenance and power requirements are higher than those of fixed-wing platforms, and their flight speeds and valid coverage are also less competitive.
The dynamic equations of the quadrotor can be derived by the Newton–Euler method, including the position and attitude:
$$\begin{aligned}
\ddot{x} &= U_1(\sin\theta\cos\psi\cos\phi + \sin\psi\sin\phi)/m \\
\ddot{y} &= U_1(\sin\theta\sin\psi\cos\phi - \cos\psi\sin\phi)/m \\
\ddot{z} &= (U_1\cos\theta\cos\phi - mg)/m \\
\ddot{\phi} &= U_2/I_x + \dot{\theta}\dot{\psi}(I_y - I_z)/I_x \\
\ddot{\theta} &= U_3/I_y + \dot{\phi}\dot{\psi}(I_z - I_x)/I_y \\
\ddot{\psi} &= U_4/I_z + \dot{\phi}\dot{\theta}(I_x - I_y)/I_z
\end{aligned}$$
where $x, y, z$ form the position vector, $\phi, \theta, \psi$ are the attitude (roll, pitch, yaw) angles, $I_x, I_y, I_z$ are the moments of inertia about the three body axes, $U_1$ is the total lift generated by the quadrotor, $[U_2, U_3, U_4]$ are the control torques, and $m$ is the mass of the UAV.
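A minimal sketch of the same Newton–Euler model in code form is given below; it simply evaluates the six accelerations for a given attitude, body rates, and control input. The mass and inertia values are illustrative assumptions, not parameters of any specific quadrotor discussed here.

```python
import numpy as np

def quadrotor_accelerations(state, U, m=1.2, g=9.81,
                            Ix=0.011, Iy=0.011, Iz=0.021):
    """Return [x_dd, y_dd, z_dd, phi_dd, theta_dd, psi_dd] from the
    Newton-Euler equations, given attitude, body rates, and controls U1..U4."""
    phi, theta, psi, p, q, r = state      # roll, pitch, yaw and their rates
    U1, U2, U3, U4 = U                    # total lift and three control torques
    x_dd = U1 * (np.sin(theta) * np.cos(psi) * np.cos(phi)
                 + np.sin(psi) * np.sin(phi)) / m
    y_dd = U1 * (np.sin(theta) * np.sin(psi) * np.cos(phi)
                 - np.cos(psi) * np.sin(phi)) / m
    z_dd = (U1 * np.cos(theta) * np.cos(phi) - m * g) / m
    phi_dd   = U2 / Ix + q * r * (Iy - Iz) / Ix
    theta_dd = U3 / Iy + p * r * (Iz - Ix) / Iy
    psi_dd   = U4 / Iz + p * q * (Ix - Iy) / Iz
    return np.array([x_dd, y_dd, z_dd, phi_dd, theta_dd, psi_dd])

# Hover check: level attitude, zero rates, lift equal to weight -> accelerations ~0
print(quadrotor_accelerations([0, 0, 0, 0, 0, 0], [1.2 * 9.81, 0, 0, 0]))
```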
One of the key technologies of UAV is how to design a reasonable control method to replace the role of the pilot in the human–machine system. According to the different control methods, UAV systems can be divided into the following three categories [55].
The base-station-controlled UAV is also called a Remotely Piloted Vehicle (RPV) [56]. During flight, an operator at the ground base station must continuously send operation instructions to the controlled UAV. In essence, base-station-controlled UAVs are radio-controlled aircraft with complex structures. Because radio control technology is limited in range, pure base-station control is rarely used for unmanned operation today.
Semi-autonomous control refers to a control mode combining base-station control and preset programs. The base station can take control authority over the UAV at any time, and some key actions during the flight need to be commanded by the base station. In general, the UAV can fly and perform relevant actions according to the pre-programmed settings.
Due to the performance limitations of onboard processors, researchers often choose to send the data to a ground base station for processing and then back to the UAV to guide its movement. Imdoukh et al. developed a fire-resistant drone that can search for survivors and locate them in the shortest possible time [57]. The drone is designed to fly with a fire extinguisher and a camera, allowing the operator to view the environment and take manual control in special situations. Semi-autonomous control methods allow a UAV to explore an unknown environment and complete two-dimensional environment modeling. The ground station uses different algorithm modules (modeling, positioning, and obstacle avoidance) to process the data and sends the results back to the UAV to complete positioning and path planning [58,59]. Since there is generally no manual participation in this process, it can also be regarded as a kind of intelligent control, but the communication performance between the ground processor and the UAV has a great impact on the robustness and autonomy of this control mode.
A fully autonomously controlled UAV is also known as an intelligent UAV. It can perform specific tasks completely autonomously without human instructions. A complete intelligent UAV system can monitor its own state, collect environmental information, analyze data, and respond accordingly. An intelligent UAV should at least include sensor, SLAM, and path-planning modules. In addition, an aerodynamic disturbance control module can be added so that the UAV adapts to complex environments.
Steenbeek et al. proposed a solution for 3D mapping applications that only uses images in input, and integrates SLAM and CNN-based Single Image Depth Estimation methods to densify and scale the data [60]. In autonomous navigation tasks without external GNSS assistance, Dong et al. used visual localization to assist odometer integration and image matching to achieve vehicle localization. A dual-rate Kalman filter is adopted to fuse the data of vision, inertial navigation, and odometry [61,62].
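To make the data-fusion idea concrete, the following sketch shows a dual-rate linear Kalman filter in one dimension: fast odometry/inertial predictions at every step and slower visual position corrections every few steps. It is a simplified illustration of the fusion principle, not the filter implementation of Dong et al.; the noise parameters and rates are assumptions.

```python
import numpy as np

def dual_rate_kf(odom_vel, vision_pos, dt=0.02, vision_every=10,
                 q_proc=0.05, r_vis=0.5):
    """1-D position filter: predict with odometry velocity at a high rate,
    correct with visual position fixes at a lower rate."""
    x, P = 0.0, 1.0                       # state (position) and its variance
    estimates = []
    for k, v in enumerate(odom_vel):
        # Prediction with odometry/inertial velocity (high rate)
        x = x + v * dt
        P = P + q_proc
        # Correction with a visual localization fix (low rate)
        if k % vision_every == 0 and k // vision_every < len(vision_pos):
            z = vision_pos[k // vision_every]
            K = P / (P + r_vis)           # Kalman gain
            x = x + K * (z - x)
            P = (1.0 - K) * P
        estimates.append(x)
    return np.array(estimates)

# Synthetic example: roughly 1 m/s forward motion with noisy odometry and vision fixes
vel = 1.0 + 0.1 * np.random.randn(200)
vis = np.arange(20) * 0.2 + 0.05 * np.random.randn(20)
print(dual_rate_kf(vel, vis)[-1])
```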
The Department of High Altitude Long Endurance (DHALE) of NASA Vehicle Systems Program (VSP) proposed a quantitative method to evaluate the autonomy of unmanned aerial vehicles (UAVs) [63]. The hierarchy and meaning of this method are clearer, and it has better practical operability, as shown in Table 1 [64].
At present, some problems remain in research on UAV autonomous control. Most research results are applied only to the experimental scenarios designed in the respective works, and their robustness has not been verified in real environments. Moreover, in different application fields, such as forest search and rescue or outdoor tracking, the design architecture and sensors of the corresponding intelligent UAV are not exactly the same. These problems require specific consideration and design for different scenarios.

2.2. Unmanned Ground Vehicles

For heavy operations, such as tree harvesting, preliminary ancillary equipment, such as brush cutters and chainsaws, was already in use around 100 years ago (1916–1917), being light and convenient enough to handle. Over the past 50 years, forestry has been evolving from manual instrumentation to increasingly mechanized operation, and the development of forestry automation has gradually replaced past manual work methods [65]. Since 1980, mechanized harvesting systems have become commonplace and have played an increasingly dominant role in large-scale forest operations [66]. Harvesters with different automation levels are utilized in the modern timber harvesting industry, namely, remote supervision, semi-automatic harvesters, and automatic shuttles [67]. Because fully automatic machines are difficult to realize, semi-automatic systems and teleoperated solutions are the mainstream options. Remote supervision, or teleoperation, is a concept similar to remote control, in which human operators directly control the systems in a remote manner [68]. Semi-automatic harvesters are controlled from manned systems, such as forwarders. A prototype of a typical semi-automatic harvester is the “Besten” system [69].
The systems deployed in harvesting practices must consider the slope of forestry terrain as a major factor. Additionally, the most suitable harvester system depends on the harvesting method, such as whole-tree or full-tree (FT) cutting, tree-length (TL), and cut-to-length (CTL). Given the missions to be completed, robotic systems for harvesting purposes can be categorized into five groups based on their mechanization levels, according to Gerasimov [70], as shown in Table 2.
An unmanned ground vehicle (UGV) is an automated ground platform that performs missions without human controllers aboard [71]. Beyond harvesting applications, UGVs present more desirable mobility than harvester systems. With sensors and actuators onboard, a UGV can work as an effective tool for field applications, especially under hazardous conditions [72]. Such vehicles were first developed, with limited effort, in the 1960s for exploration operations. As remotely operated platforms, UGVs have made unique contributions to tree counting, tree disease control, wildfire management, and heavy applications thanks to their payload and perception capabilities. Tang et al. investigated a SLAM-aided positioning solution with point clouds collected by a small-footprint LIDAR [73]. Zhang et al. designed a 2D-LIDAR-based tracked robot for forest information collection [74]. In [75], legged robots, as shown in Figure 2d, are used for environmental monitoring tasks in the Amazon, as the rainforest is an unstructured and inaccessible environment. Freitas et al. designed a forest robot with a separate robotic arm that can monitor the air and water in the forest [76]. In Figure 2, some manufactured prototypes with different locomotion capabilities are shown.
Given different locomotion components, mobile platforms can be divided into multiple categories, such as wheeled, legged, tracked, spherical, and hybrid mechanisms. Wheeled robotic platforms are acknowledged as efficient solutions on flat ground, as wheels offer high speed and smooth motion. However, their efficiency may decrease in challenging terrain where obstacles are larger than the wheel radius [77]. Tracked vehicles exert lower ground pressure than other platforms because their ground contact areas are larger [78]. In forestry applications, walking mobility is often required because the terrain is rarely benign. Legged mobility is most desirable because it adapts well to rough terrain; however, it may make operation less efficient when the terrain is not too steep. Locomotion modes of ground mobile robots in an unstructured environment are summarized in Table 3 [79].
As a compromise between these two notable solutions, one option is the mixed integration of both legged and wheeled mobility functions [80]. In this regard, hybrid articulated robots present desirable characteristics by uniting various modes, giving them adaptability and agility under varying requirements. They are advantageous alternatives because they offer more degrees of freedom than conventional robotic platforms that passively adapt to terrain changes. By mounting multiple sub-systems, such as active suspension mechanisms or transformable components, these mechanisms are capable of increasing stability in practice.
Wheel-on-leg robots are highly hybrid robotic platforms that can effectively traverse both uneven and benign terrain through the combined configuration of wheel and leg components. By integrating leg and wheel locomotion modes, hybrid wheel-on-leg robots are capable of maneuvering and negotiating unstructured terrain. In this regard, wheel-on-leg rovers are proposed as complex maneuvering alternatives that integrate the merits of both wheeled and legged robotics, offering competitive adaptability and efficiency for practical operations on unstructured terrain that demands flexible mobility. Ground mobile platforms have significantly improved the performance of challenging field applications in forestry. The complex surfaces of forestry land pose great challenges for efficient mobility strategies, which require reliable and flexible structures under harsh conditions.

2.3. Collaboration of Multi-Hybrid Robot Platforms

When working individually, aerial systems can achieve large-scale missions, but their endurance and load capacity are limited. Ground platforms are capable of completing heavy-load and near-distance detection tasks, but they may encounter accessibility problems in practical forestry environments, and their ability to negotiate obstacles and their valid coverage are typically limited [81]. Heterogeneous strategies combining different unmanned platforms therefore make forestry applications more competitive. Heterogeneous robotic systems can realize a more successful implementation in a closely connected manner. For instance, cooperative use of a UAV-to-UGV team can offer comprehensive support for missions including exploration, surveillance, detection, search and rescue, and tracking [82]. By communicating and sharing acquired information, the completeness of forestry practices is improved and the accuracy better guaranteed. Power-tethering UAVs to UGVs is another concept that can be considered for forestry applications. In Figure 3, schematic views of heterogeneous UAVs and UGVs are presented. The main aim of deploying collaborative multi-robot systems is to offer sound solutions for forestry maintenance, reconnaissance, and surveillance missions by integrating all advanced options.
In the air-ground cooperation system, the UAV acts as a sensor to collect, transmit, detect, and track target-related information, while the UGV plans a path based on the information transmitted by the UAV and feeds back the real-time state of the road for further modification. Stentz et al. demonstrated that data collected in advance by the UAV can improve the planning efficiency of the UGV, with the disadvantage that the map cannot be reused multiple times and its preservation period is short [83]. Subsequently, Kelly and Vandapel et al. [70,84] proposed a multi-robot cooperative system to solve these problems, in which the ground navigation capability of the UGV is improved by obtaining pre-collected data from the UAV. With the continuous progress of autonomous exploration technology, UAVs can collect ground images, correct them by image processing, and automatically build a map of the environment so that UGVs can avoid obstacles and perform tasks [85]. Kaslin et al. [86] proposed an elevation-map-based localization method for UGVs, which allows UGVs to find their relative position in the reference map provided by the UAV without relying on GPS positioning sensors.
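A toy version of the elevation-map-based localization idea is sketched below: the UGV's locally sensed elevation patch is slid over the UAV-derived reference map and matched by sum of squared differences. This is a generic template-matching stand-in for the method of Kaslin et al., shown on synthetic data only.

```python
import numpy as np

def locate_in_elevation_map(reference, patch):
    """Exhaustively match a local elevation patch against a UAV-derived
    reference elevation map; return the (row, col) of the best SSD score."""
    H, W = reference.shape
    h, w = patch.shape
    best, best_pos = np.inf, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            ssd = np.sum((reference[r:r + h, c:c + w] - patch) ** 2)
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# Synthetic test: cut a patch out of a random terrain map and re-locate it
rng = np.random.default_rng(0)
ref = rng.random((60, 60))                               # UAV elevation map
true_pos = (22, 35)
local = ref[22:30, 35:43] + 0.01 * rng.random((8, 8))    # noisy UGV observation
print(locate_in_elevation_map(ref, local), "vs true", true_pos)
```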
The collaborative strategy uses robotics as its core component and combines both aerial and ground systems with the expected capabilities. Ground robots are used fully or semi-autonomously to foster ground intervention tasks, while the aerial alternatives work as assisting agents to locate possible operation areas, supported by their competent supervision performance over wide forest areas. A typical real-world case is the SEMFIRE project presented by the University of Coimbra, Portugal [87]. In this project, the tracked mobile robot Ranger and the aerial vehicle Scouts are applied to complete a series of tasks, including primary environment reconnaissance, surrounding mapping, region-of-interest identification, and surveillance. The ground mobile platform works as a transportation tool carrying the aerial alternatives and sends commands after reaching the target site. Apart from transportation, the Ranger platform is capable of conducting forest debris cleaning tasks. When one mission is completed, the flying robots land autonomously on the ground mobile vehicle and are ready for the next mission according to the operators' instructions. The diagram of the SEMFIRE multi-robot project can be seen in Figure 4, which shows how each sub-system works in this forestry maintenance mission.
The design and practice of multi-robot systems take advantage of intelligent robotics from different application fields and integrate heterogeneous deployments into a comprehensive architecture, which contributes to coverage availability, traversability, and technical advancement in forestry applications.

3. Forestry Applications

After decades of development and implementation, extensive cases have been studied covering various aspects of robotic operational systems in forestry applications. A variety of forestry applications can be addressed by UAVs and/or UGVs, such as tree quantification, disease control, wildfire management, tree estimation, and wildlife monitoring, as shown in Figure 5. In the following subsections, each of the mentioned cases is discussed in detail, and the related techniques are described.

3.1. Wood Production

A basic forestry application accomplished by classic robotic platforms is wood harvesting, which covers harvesting and associated operations, such as thinning, debarking, and cut-to-length tasks, among others. Building on this preliminary utilization, robotic systems with higher degrees of automation have been applied in the forestry industry, and their application domains have expanded in recent years, ranging from tree harvesting and log transport operations to forest monitoring, surveillance, and comprehensive management. After years of rapid development, the harvester-forwarder system has been validated as an efficient automation option in terms of profit and productivity [88]. In Scandinavian countries, the percentage of unmanned-system harvesting reached more than 95% in the late 1990s [89]. At present, the degree of mechanization has reached almost 100% [90].
The teleoperation method freed forestry workers from steep terrain and the hazards of heavy machines through remote control. Its feasibility has been demonstrated in previous cases in this domain [91]. The most classic project was MECHANT, developed by Halme at the TKK Automation Technology Laboratory [92]. Based on this research project, Oy (formerly part of John Deere) developed a walking harvester prototype, a remotely controlled six-legged platform. Further exploration conducted by TKK put more emphasis on the possibility of combining hybrid locomotion modes in forestry scenarios. The representative prototype was WorkPartner, a mobile centaur-like robot for field environment operations [93]. The hybrid system enables WorkPartner to combine legged and wheeled motion features, a mobility performance that is greatly desired in forestry practice.
Harvesting systems encompass a set of automatic machines designed to execute wood harvesting tasks. The whole workflow includes tree felling and processing (FP), bunching (B), and extraction (E) stages [94]. Tree felling operations are the preliminary phase of the wood extraction work cycle. To this end, fellers are utilized to conduct tree felling tasks and, additionally, to assist in subsequent delimbing and extraction operations. For instance, they have been developed to automatically measure stem diameters and lengths in real time during tree cutting. Magagnotti et al. conducted a time study of hybrid poplar plantations: clear-cutting operations were carried out in both single-tree and multi-tree processing modes using a dual-machine truncating harvest system with a hybrid design, and higher productivity (+8%) was achieved in multi-stem handling mode [84]. Lindroos et al. recorded data on the position of the harvester head relative to the machine and analyzed these data to improve forestry operations and related processes [95]. Mechanized CTL systems are capable of processing wood into logs, and typical CTL harvester prototypes have been demonstrated through extensive research worldwide. Additionally, cut-to-shred harvesters are used for sugarcane and bamboo plantations [96,97]. Forwarders deal with the transportation of harvested logs from the field site to the sorting yard or warehouse [98].
The transportation application after timber harvesting is another major task in the wood production process. In addition to ground-based operations, the airship method was developed as it offers a new perspective in diversifying operational transportation methods [99]. For economic and ecological concerns, a new aerial transportation strategy was presented after comparing UAV and manned helicopter platforms. This UAV logging method was demonstrated as an economically and environmentally competitive alternative to conduct wood transportation tasks, especially in the boreal Canadian forest.

3.2. Tree Quantification

The role of robotic systems has become increasingly competitive in tree quantification applications, as they offer better cost-effectiveness than conventional methods. The targets of tree counting missions can be both live plants in the forest and log stacks in factory stockyards. The detection of live trees aims to acquire tree height and canopy density parameters. For log volume estimation, however, the process is quite different, as the stacking factor must be determined before identifying the log measurements of the area of interest. Midhun et al. [100] evaluated the applicability of low-cost consumer-grade cameras attached to UAVs and the structure-from-motion (SfM) algorithm for automatic individual tree detection (ITD), using a local-maxima-based algorithm on UAV-derived canopy height models (CHMs), although the approach was only tested on small field plots. Charton et al. [101] conducted a test of remotely piloted wood volume estimation in a Brazilian sawmill yard. The oblique imaging technique is regarded as a promising tool for remote sensing missions due to its potential relative to the traditional vertical imaging method. Lin et al. [102] proposed a novel algorithm for individual tree detection in Sundsberge urban areas, aiming to apply it directly onboard UAVs for analysis. Parameters such as tree height and diameter at breast height are important information for growth condition evaluation and resource conservation. Executing quantification tasks before performing ground-based surveys and reconnaissance missions would greatly improve efficiency [50]. In 2016, Jan et al. [103] carried out a UAV-based mission to measure tree height in central Europe. Another tree height estimation study was presented by Bridal et al. [104], in which a lightweight (<0.70 kg) fixed-wing UAV carrying a consumer-grade camera was deployed; the correlation rate reached 94%, and the error was 28 cm. Three-dimensional forest measurement and inventory deal with obtaining various forest attributes. Airborne laser scanning (ALS) [105,106], terrestrial laser scanning (TLS) [107], and mobile laser scanning (MLS) [108] are the mainstream aerial and ground techniques for data acquisition. Pierzchala et al. [109] used a Superdroid 4WD IG 52 mobile platform to collect 3D data with the help of multiple sensing accessories, including LIDAR, a stereo camera, and GPS equipment; the local forest map was generated using the graph-based Simultaneous Localization and Mapping algorithm (graph-SLAM). Views of the typical reconstructed point cloud for the forestry environment and the stem cylinder map are shown in Figure 6.
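The local-maxima idea behind this kind of individual tree detection can be sketched in a few lines: CHM pixels that equal the maximum within a moving window and exceed a minimum height are treated as candidate treetops. This is a generic illustration using SciPy's maximum filter with arbitrary window size and height threshold, not the exact algorithm of any study cited above.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def detect_treetops(chm, window=5, min_height=2.0):
    """Return (row, col) indices of local maxima of a canopy height model
    that exceed a minimum height threshold."""
    local_max = maximum_filter(chm, size=window) == chm   # pixel equals window max
    candidates = local_max & (chm > min_height)           # reject low vegetation
    return np.argwhere(candidates)

# Synthetic CHM with two Gaussian "crowns"
yy, xx = np.mgrid[0:50, 0:50]
chm = 12 * np.exp(-((xx - 15) ** 2 + (yy - 20) ** 2) / 40.0) \
    + 9 * np.exp(-((xx - 35) ** 2 + (yy - 30) ** 2) / 30.0)
print(detect_treetops(chm))    # approximately [[20, 15], [30, 35]]
```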

3.3. Disease Control

Forest pest insects and diseases affect wood production yield and environmental sustainability, and outbreaks cause losses to both economies and ecosystems. Affected plants tend to show characteristic signatures, such as different canopy shapes and colors, which can be captured in high-resolution aerial images. Generally, typical robotic operations can be divided into two types, namely monitoring work for decision-making and spraying operations. For decision-making purposes, early monitoring before an outbreak spreads widely is essential for maintaining the balance and health of the forest, especially at a large scale. Aerial spraying using unmanned vehicles is more desirable because no pilot or crew is onboard during the spraying process; this unmanned application protects forestry workers from prolonged pesticide exposure, which can cause serious harm to human laborers [110].
Multi-spectral image information acquired by UAVs has been demonstrated to be effective in identifying physiological abnormality, even at quite an early stage. A forest health monitoring study using a UAV platform was conducted by Dash et al. [111] in the Central North Island, New Zealand; the weighted kappa value reached 0.694, indicating good accuracy in detecting potential disease. Zhang et al. [112] built a multi-spectral imaging platform for recognizing pest-affected regions in the Yunnan pine forests of Yunnan Province; the overall classification precision was 94.01%, with a kappa index of 0.92. Imaging methods for forestry purposes have evolved from multi-spectral to hyperspectral over the past years. The hyperspectral method is more effective than multi-spectral imaging due to its improved performance at spectral and spatial scales. Roope et al. [113] applied a miniaturized UAV with hyperspectral imaging sensors to investigate a spruce bark beetle outbreak in Lahti, Southern Finland. The overall accuracy of identifying tree health conditions was 76% (Cohen's kappa 0.60) when classifying the targets into healthy, infested, and dead. The classification results of tree health conditions are shown in Figure 7.
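The accuracy metrics quoted in these studies (overall accuracy and Cohen's kappa) can be computed from a class confusion matrix as in the short sketch below; the example matrix is invented purely to show the calculation and is not data from the cited works.

```python
import numpy as np

def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows = reference classes, columns = predicted classes)."""
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    po = np.trace(confusion) / n                                  # observed agreement
    pe = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / n ** 2   # chance agreement
    return po, (po - pe) / (1.0 - pe)

# Hypothetical healthy / infested / dead confusion matrix
cm = [[50, 6, 2],
      [8, 40, 5],
      [1, 4, 30]]
oa, kappa = accuracy_and_kappa(cm)
print(f"overall accuracy = {oa:.2f}, kappa = {kappa:.2f}")
```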

3.4. Wildfire Management

Forest fires have become one of the most hazardous natural disasters worldwide in past decades. Over 100,000 wildfires occur annually in the United States, damaging more than 9,000,000 acres [114]. According to a World Bank report [115], the economic loss caused by wildfires in Indonesia reached over USD 16.1 billion in 2015, equivalent to 1.9 percent of annual GDP. Forestry unmanned platforms for wildfire management are designed to prevent forest fires and to decrease potential economic losses and human injuries [116]. Wildfire control applications can be divided into three stages according to the occurrence level, namely wildfire prevention before a fire, fire disaster control in time, and post-fire management. Unmanned systems can make unique contributions throughout this whole cycle.
UAVs have recently been utilized as a valuable asset in wildfire monitoring. Pastor [117] developed the Sky-Eye platform, a helicopter-based UAV for wildfire surveillance purposes, capable of offering tactical support for both day/night surveillance and post-fire hot-spot detection tasks. The capability of the Ikhana UAV as an effective tool was studied by the National Aeronautics and Space Administration (NASA) and the US Forest Service through extensive flight missions in three fire seasons during 2006–2010, as shown in Table 4 [118,119,120]. It is worth noting that the Ikhana was the first civil UAV platform to receive Certificate of Airworthiness (COA) support during the Southern California firestorms in 2007 [121]. The University of Cincinnati proposed a Surveillance for Intelligent Emergency Response Robotic Aircraft (SIERRA) strategy and validated its performance through site implementations [122]. Toward the same goal, several successful implementations have been conducted for wildfire management over past years [123,124]. Figure 8 shows views of a flight mission in the burning area, hot-spot detection and identification using image fusion, and Ikhana.
Although the above cases contribute to wildfire management in forestry, they deployed independent robots as working platforms. It is more efficient to utilize multiple robotic systems, since this strategy combines the locomotion and perception strengths of multiple platforms. Toward more advanced forestry applications, deploying multiple UAVs for forest fire fighting has been pursued as an innovative method [125]. A heterogeneous pairing of UAVs and ground vehicles for forest fire management has also been studied [126]. Ghamry et al. [127] tested the valid-coverage advantages of UAVs together with the payload merits of UGVs. Post-fire applications can offer valuable information and data on recovery conditions in forest areas, assisting experts in evaluating sustainability and recovery quality. Chu et al. [128] reviewed remote sensing-based evaluation after forest fire disturbances; the assessment focuses on post-fire recovery, including burned forest area mapping, burn severity assessment, and recovery tracking.
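One widely used ingredient of such post-fire assessments is the differenced Normalized Burn Ratio (dNBR) computed from near-infrared and shortwave-infrared bands. The sketch below shows that calculation on synthetic reflectance arrays; the severity thresholds are illustrative defaults, not values taken from the studies cited above.

```python
import numpy as np

def dnbr(nir_pre, swir_pre, nir_post, swir_post, eps=1e-6):
    """Differenced Normalized Burn Ratio: NBR_pre - NBR_post,
    where NBR = (NIR - SWIR) / (NIR + SWIR)."""
    nbr_pre = (nir_pre - swir_pre) / (nir_pre + swir_pre + eps)
    nbr_post = (nir_post - swir_post) / (nir_post + swir_post + eps)
    return nbr_pre - nbr_post

def severity_class(d):
    """Map a dNBR value to a coarse burn-severity label (illustrative thresholds)."""
    if d < 0.1:
        return "unburned/low"
    if d < 0.27:
        return "moderate-low"
    if d < 0.66:
        return "moderate-high"
    return "high"

# Synthetic 2x2 reflectance tiles before and after a fire
d = dnbr(np.array([[0.5, 0.5], [0.5, 0.5]]), np.array([[0.2, 0.2], [0.2, 0.2]]),
         np.array([[0.45, 0.30], [0.20, 0.10]]), np.array([[0.22, 0.30], [0.35, 0.40]]))
print([[severity_class(v) for v in row] for row in d])
```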
The SEMFIRE project is another notable case that applies a heterogeneous deployment of multi-purpose forestry robots to debris maintenance tasks. Forest debris encompasses combustible materials accumulated through a range of procedures, such as tree pruning, mowing, raking, and disposal [129], and its presence can lead to wildfires. Therefore, thorough cleaning of fuel debris is an effective measure to prevent wildfire in forests. In this three-year project, multiple agents, including the tracked mobile robot Ranger and the aerial vehicle Scouts, are deployed for environment reconnaissance, surrounding mapping, region-of-interest (ROI) identification, and surveillance tasks. The Ranger platform transports the aerial alternatives to the target site, while the flying Scouts perform environment sensing to locate potential ROIs. Apart from transportation and sensing, the Ranger platform is capable of conducting forest debris cleaning tasks through tree cutting and vegetation mowing with its forestry mulcher attachment. Additionally, one or more Scouts can perform status estimation of engaged agents and objective supervision. When one mission is completed, the flying robots land autonomously on the ground mobile vehicle and are ready for the next mission according to the operators' instructions. The deployment of Ranger and Scout is illustrated in Figure 9.

3.5. Wildlife Conservation

It is important for experts to obtain detailed information about wildlife density, population, behavior, patterns, and tracks. Remote operations using unmanned platforms are becoming increasingly popular owing to their target-friendly working features, and extensive innovative efforts have been made in wildlife monitoring with the help of unmanned technologies over the past decades. Abundant research has been conducted in wildlife monitoring; current examples include tiger [130], koala [22], sea eagle [131], monkey [132], and thrush [133] monitoring. UAVs, Wireless Sensor Networks (WSN), and combinations of multiple methods have been utilized. Artificial Intelligence (AI) techniques have been applied in the processing steps, which greatly improves the accuracy of species detection and matching [134]. Sensor nodes positioned on the ground in wild environments are easily affected by multiple factors due to near-ground disturbance effects, which usually cause low monitoring accuracy. Badescu [130] developed a monitoring strategy combining UAV and WSN systems to overcome the defects of traditional methods. Nicolas [134] conducted an experiment on animal detection in the Kuzukus wildlife reserve. Large animals were chosen to be monitored for two major reasons, namely their outstanding visibility against the background and great pixel availability. The identification results are shown in Figure 10.
UAV systems have revolutionized wildlife monitoring and protection, providing new opportunities for wildlife specialists to inexpensively survey relatively large areas [135]. UAV systems can utilize the airborne sensor platform to monitor wildlife for an accurate estimation of species abundance [136]. UAVs equipped with digital and thermal image sensors can record high-resolution videos and capture images closer to the animal [137]. Jones et al. collected wildlife video and image data from over 30 tasks and concluded that UAV systems can overcome the safety and cost issues associated with manned aircraft for wildlife monitoring [138]. UAV systems have been shown to be effective in conducting wildlife monitoring investigations [139,140].
Tree species classification and inventory research using remote sensing methods has been active over the past forty years [141]. Many innovative techniques are applied to tree species classification, such as remote sensing-assisted approaches for data collection and artificial neural networks for post-training. In a multi-spectral survey conducted by Rossana [142] in Northern Italy, the feasibility of using UAVs for tree species classification was investigated and discussed. This practice proved the feasibility and efficiency of distinguishing different bushes, trees, and non-vegetated areas (bare soil, concrete, shadow, etc.). Notably, the supervised classification achieved a higher overall accuracy (80%) than the unsupervised classification (50%). Apart from tree species inventory applications in forests, UAVs have also been utilized for grassland species classification. Lu [143] carried out a study to investigate the composition of a test site using a Tarot T15 UAV platform consisting of a GPS module for location sensing, an IMU (inertial measurement unit) module for flight position measurement, and an Ardupilot Mission Planner (AMP) for ground station control. Lisein et al. [144] applied a fixed-wing platform with a focus on the combination of SfM (structure from motion) [145] and photogrammetry methods in order to model the forest canopy surface from low-altitude aerial images; in their work, the elevation of vegetation was determined by combining a co-registered LIDAR with a digital terrain model.
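The gap between supervised and unsupervised classification reported above can be illustrated with a minimal scikit-learn sketch that classifies per-pixel band values with a random forest and, for comparison, clusters them with k-means. The synthetic spectra and class names below are assumptions that merely stand in for real multispectral data and are unrelated to the cited survey.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic 4-band pixel spectra for three cover classes (bush, tree, bare soil)
rng = np.random.default_rng(1)
means = np.array([[0.3, 0.4, 0.2, 0.6],     # bush
                  [0.2, 0.5, 0.15, 0.8],    # tree
                  [0.5, 0.45, 0.5, 0.3]])   # bare soil
X = np.vstack([m + 0.05 * rng.standard_normal((200, 4)) for m in means])
y = np.repeat([0, 1, 2], 200)

# Supervised: random forest trained on labelled pixels
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
print("supervised accuracy:", accuracy_score(yte, rf.predict(Xte)))

# Unsupervised: k-means clusters, which still need to be mapped to classes afterwards
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", np.bincount(clusters))
```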

4. Discussion

4.1. Current Status

Onboard sensors of robotic platforms are continuously being improved. The sensors carried by different robotic systems are determined by their specific work requirements, including high spatial and temporal resolution [146]. A great diversity of functional sensors can be mounted on forestry robots, ranging from various cameras (digital, multispectral, infrared, NIR (near-infrared), SWIR (short-wave infrared), and TIR (thermal infrared)) [48] and laser scanners [147] to multi-/hyperspectral sensors [148] and LIDAR technologies [149], as described in Table 5.
Generally, UAVs are equipped with different sensors that can exceed twenty in number, including accelerometers, magnetometers, gyroscopes (IMU), and GPS [150]. These sensors are speed or inertial/angular measurement devices and are used to obtain data with the exclusive aim of controlling the UAVs during navigation.
Video cameras are systems broadly used in UAVs for data acquisition and intervention practices in forestry, which include RGB cameras, professional multispectral, hyperspectral, and thermal cameras [151]. They are also known as passive sensors. The acquired images will be associated with the corresponding GPS position, altitude, and direction of UAVs [25].
RGB cameras are able to perceive visible light in the 390–700 nm band of the electromagnetic spectrum, and fine spatial resolution can be obtained by RGB cameras even at relatively high flying heights. It is common practice to separate the color channels to work with the individual bandwidths, for example by adopting different types of filters [152]. Li et al. used a red-green-blue (RGB) camera to extract overstory and forest-floor pixels using 3D structure-from-motion (SfM) point clouds and 2D superpixel segmentation; this approach offers a way to cope with the high price of LIDAR [153].
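As a concrete example of working with separated RGB channels, the sketch below computes the excess-green index (ExG = 2g − r − b on per-pixel chromatic coordinates), a common way to separate vegetation from background in UAV RGB imagery. The threshold value is an illustrative assumption, not one used in the cited studies.

```python
import numpy as np

def excess_green(rgb):
    """Excess-green index from an H x W x 3 RGB image in [0, 1]:
    ExG = 2g - r - b computed on per-pixel chromatic coordinates."""
    total = rgb.sum(axis=2, keepdims=True) + 1e-6
    r, g, b = np.moveaxis(rgb / total, 2, 0)       # chromatic coordinates
    return 2 * g - r - b

def vegetation_mask(rgb, threshold=0.1):
    """Boolean mask of likely vegetation pixels (threshold is illustrative)."""
    return excess_green(rgb) > threshold

# Synthetic image: left half greenish (vegetation), right half grey (soil/concrete)
img = np.zeros((4, 8, 3))
img[:, :4] = [0.2, 0.6, 0.2]
img[:, 4:] = [0.5, 0.5, 0.5]
print(vegetation_mask(img).astype(int))
```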
Multispectral cameras can simultaneously acquire multiple optical spectral bands (usually three or more), extending beyond visible light to infrared and ultraviolet. A common method is to combine various optical filters or splitters with a variety of photographic films to receive the light signals radiated or reflected in different narrow spectral bands, yielding several pictures of the target in different bands [154]. Shin et al. used multispectral UAV imagery to analyze burn severity after forest fires and accurately grade burned surfaces [155].
Hyperspectral sensors employ a sophisticated technique that can capture and analyze point-by-point spectra over a spatial region. Because a single object has unique spectral features at different spatial locations, substances that are not visually distinguishable can be detected [156]. Hyperspectral images consist of narrower bands (10–20 nm), and an image may have hundreds or thousands of bands. The spectral information fully reflects differences in the physical structure and chemical composition inside a sample; these characteristics give hyperspectral imaging unique advantages in assessing wood product quality. Yang et al. integrated a crop growth model with a random forest model to estimate crop yield from hyperspectral imagery, and the UAV hyperspectral data were found to significantly improve retrieval accuracy [157].
Thermal sensors detect infrared energy emitted by objects in a non-contact manner and convert it into electrical signals [158]. Thermal images show the temperature distribution through colored pictures. Differences between thermal and infrared sensors are due to emitted and reflected energy, respectively. Infrared sensors detect objects at night by actively emitting infrared light that increases the brightness of the outside world by tens of thousands of times, then generates thermal images and temperature values on the display. Still et al. demonstrated the feasibility and potential of thermal imaging to measure vegetation surface temperature at various scales [159].
Light Detection and Ranging (LIDAR) systems are active remote sensing devices that measure distances by probing the environment with pulses projected onto targets [160]. The signals are backscattered by the objects, and part of the energy returns to the sensor. The LIDAR system records the elapsed time between transmission and reception and combines it with positional information to obtain a detailed point cloud containing intensity and elevation measurements. These systems are widely mounted on UAVs to build environment models, avoid obstacles, or navigate. Forest health indicators, such as crown density and pattern distribution, can be derived from LIDAR-based point clouds [161,162].
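The ranging principle described here reduces to distance = c·Δt/2, and simple structural metrics can then be derived from the resulting point heights. The sketch below illustrates both steps on synthetic return times and heights; the percentile metrics are a generic example, not a specific study's indicator set.

```python
import numpy as np

C = 299_792_458.0          # speed of light [m/s]

def range_from_time(elapsed_s):
    """Target distance from the two-way time of flight of a LIDAR pulse."""
    return C * np.asarray(elapsed_s) / 2.0

def canopy_height_metrics(point_heights_m):
    """Generic height percentiles and a simple canopy-cover fraction
    (share of returns above 2 m) from LIDAR point heights."""
    h = np.asarray(point_heights_m)
    return {"p50": np.percentile(h, 50),
            "p95": np.percentile(h, 95),
            "cover_gt_2m": np.mean(h > 2.0)}

# Example: pulses returning after ~200-400 ns correspond to 30-60 m ranges
print(range_from_time([2.0e-7, 4.0e-7]))
# Example: synthetic normalized point heights over a forest plot
heights = np.concatenate([np.random.uniform(0, 1, 300),      # ground / understory
                          np.random.uniform(5, 22, 700)])     # canopy returns
print(canopy_height_metrics(heights))
```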
In addition, Synthetic Aperture Radar (SAR) is also an active observation system, which can be installed on a flight platform for observation throughout the day [163]. Sonar systems use sound waves to detect and identify airborne objects; they are rarely used on UAVs for remote sensing but are used widely in navigation [164]. Tanveer et al. considered a UAV equipped with a biosonar sensor that mimics echolocation; this sensor provides a lightweight and low-cost alternative to other widely used sensors, such as video cameras and LIDAR systems.
Most of the summarized auxiliary and specific sensors have been applied in real-world forestry practices and investigated in the above forestry cases. By mounting external sensors, robotics applied in forestry are enabled to obtain desired capabilities in situation perception, which contributes to their operational performance improvement in forestry data acquisition applications. High-quality data acquired by UAVs in practical investigations include high-resolution images, audio samples, and other formats, which can provide valuable assistance for forestry operators in the further analysis process.
The configuration of multiple imaging cameras is determined by actual need and driven by advances in sensing techniques. A large variety of cameras are used in all cases studied above, ranging from compact RGB digital camera, thermal camera, still imaging camera to customized camera. In Table 6, practical cases using UAV systems are outlined.
Specialized adaptation of robotic systems is desired to meet predetermined goals in supporting field tasks. Zhang et al. [123] used a customized imaging component that obtains imagery with different light-blocking filters: by replacing the typical NIR-light-blocking filter with a red-light-blocking filter, NIR-green-blue false color images are generated instead of red-green-blue images. The modified component contributes to obtaining ideal imagery for further processing. Commercial UAVs are used in almost all cases studied in this paper; consumer-grade UAV platforms, such as the eBee by SenseFly, the S800 EVO, the Phantom 3 and Phantom 3 Pro by DJI, micro UAVs, etc., are widely used. However, most commercial UAV platforms can hardly meet component demands arising from practical needs because their specifications are fixed. It would be more appropriate for researchers to customize the utilized device instead of directly using commercial platforms. On the other hand, an improved payload would allow battery storage to be upgraded, contributing to longer flight missions. Apart from the above options, the UAV type is worth considering when planning practical missions. Rotary-wing UAVs are more popular in practical applications due to their better operational characteristics: in forestry field applications, a rotary-wing UAV can take off and land vertically, which is more suitable and advantageous than fixed-wing craft.

4.2. Challenges in Forestry

The advantages of applying unmanned systems in forestry have reached a new level, and their individual shortcomings can be mitigated through combined deployment. Despite the many signs of progress achieved so far, certain gaps and strains still exist in both technical and regulatory aspects, which require further research attention. The challenges are analyzed in terms of environmental characterization, maneuverability and mobility improvement, and regulatory interpretation.
(1)
Environmental uncertainty
In wild working scenarios, uncertainties and variations increase the difficulty of performing pre-assigned operations and acquiring ideal materials for further processing [166]. The practical conditions of applying unmanned systems can be analyzed from the following two aspects. The first is natural variation in wild environments; major examples are illumination conditions, wind turbulence, and humidity. The other is the highly varying objectives at field sites, such as the unpredictable paths of live creatures or the spreading of forest disasters. When acquiring field data in mountainous areas, illumination may affect photogrammetric quality because lighting conditions change greatly across different time periods [167]. Datasets acquired from previous field experiments can be seen in Figure 11.
(2)
Maneuverability and utilization of UAVs
To realize better maneuverability and utilization, it is crucial to ensure fundamental abilities on the aerial craft and leave post-processing functions to their ground backups; therefore, UAVs need to maintain miniature shapes and sizes. The highly unstructured terrain in the forestry environment poses challenges for efficient mobility strategies. To better suit the forestry working environment, a set of unmanned systems capable of maneuvering over harsh terrain and traversing obstacles in forests has been proposed, with multiple locomotion modes developed to offer the best adaptability for various terrain scenarios. Efficient speeds for ideal work efficiency and the ability to clamber over obstacles by switching into traversing modes on harsh terrain are both desired. In this respect, more reliable and flexible mechanisms for harsh conditions need to be further explored.
The complex surfaces of forestry land pose great challenges for efficient mobility strategies, which require reliable and flexible structures under harsh conditions. Rocks, fallen trees, muddy slopes, and bushes all increase the difficulty of practical exploration with mobile platforms. In forestry applications, walking mobility is often required because the terrain is rarely benign; however, legged mobility may make operation less efficient when the terrain is not too steep. As a compromise between the two notable solutions, one alternative is the mixed integration of both legged and wheeled mobility functions. In this regard, hybrid articulated robots present desirable characteristics by uniting various modes, giving them adaptability and agility under varying requirements. Future development of automation platforms for forestry practices needs to improve locomotion functions to make platforms flexible and suitable for highly unstructured environments.
(3)
Supervision and regulations
Among all the constraints that hinder practical UAV deployment, the major non-technical concern is aviation regulations for aerial vehicles [168]. With the rapid development of unmanned aerial platforms, proper regulation of the newly dominant systems has become a major challenge for aviation regulatory authorities in regions such as Australia, the EU, and China. In 2002, Australia published its very first regulatory rules for civil applications [169]. Guided by the European Commission, the European Aviation Safety Agency (EASA) was founded with the aim of executing proper management and administration for the effective utilization of aerial UAVs [170,171]. EASA released a regulatory statement in August 2009 aimed at a series of UAV applications [172,173], and in 2015 it gave detailed supervision of general management and technical development aspects. In 2019, a Commission Implementing Regulation (EU) formulated the rules and procedures for the operation of unmanned aircraft [174]. In the same year, the Commission Delegated Regulation (EU) on unmanned aircraft systems and on third-country operators of unmanned aircraft systems was also published [175]. JARUS published guidelines on Specific Operations Risk Assessment (SORA) [176,177], which recommend a single set of technical, safety, and operational requirements for all aspects linked to the safe operation of Unmanned Aircraft Systems (UAS); SORA is regarded as the state-of-the-art risk assessment approach for all BVLOS UAS operations. At the Chinese level, the Civil Aviation Administration of China (CAAC) provides development supervision and regulation of civil aerial UAVs; its affiliated aviation standards office published operational regulation statements for lightweight UAS in December 2015, giving direct references for executive issues.

4.3. Future Recommendations

Forest operations and management applications present new concerns in both technical and non-technical aspects when forestry robotic systems are applied. Numerous prototypes have been proposed by researchers, but most fully automatic robots are still at an early stage, and few have been successfully implemented or are commercially available. To better apply robotic systems in forestry, multiple underdeveloped areas must be further explored regarding sub-system capability enhancement and customized adaptation, so that the related sub-systems can be synchronized into fully functional complex systems.
The variety of target objects demands that more accurate recognition and detection techniques be employed in forestry applications. In this respect, the desired capabilities of forestry robots include more adaptive and intelligent algorithms for better performance. More robust systems with tight integration of environmental sensing, real-time decision making, and planning are essential for forestry field applications. In future research, ground vehicles must improve their obstacle-surmounting and obstacle-avoidance performance in wild forested environments. Research should also focus on optimizing the algorithm framework and improving recognition accuracy, and hardware reliability, such as the sensors and controllers of forest UAVs, should be improved. Intelligent software and algorithm design can then be integrated to realize the recognition of multiple and dynamic targets.
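As a hedged illustration of how such recognition pipelines are typically exercised, the following Python sketch runs a generic, COCO-pretrained object detector over a single UAV frame. The model choice, file name, and confidence threshold are placeholders rather than a forestry-specific method from the reviewed studies; a deployed system would substitute a model trained on forest imagery.

```python
# Minimal sketch (assumptions: a generic COCO-pretrained detector stands in for a
# forestry-specific model; the image path and threshold are illustrative).
# Requires torchvision >= 0.13 for the weights="DEFAULT" argument.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image


def detect_targets(image_path: str, score_thresh: float = 0.5):
    """Run an off-the-shelf detector on one UAV frame and return kept detections."""
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    model.eval()
    img = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        out = model([img])[0]
    keep = out["scores"] >= score_thresh
    return out["boxes"][keep], out["labels"][keep], out["scores"][keep]


# Example (hypothetical file name):
# boxes, labels, scores = detect_targets("uav_frame_0001.jpg")
```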
For highly unstructured terrain, more flexible ground platforms are desired to accomplish complex forest missions under dynamic scenarios. Where GPS can hardly satisfy accurate localization requirements, simultaneous localization and mapping (SLAM) methods may improve sub-system performance. Poor communication quality in forested environments is another major limiting factor, because tall canopies block signals during unmanned operations. The risk of losing signals must therefore be addressed to guarantee more reliable data transmission, and the communication efficiency of any proposed system must be evaluated to balance low cost against high accuracy and simplicity against efficiency.
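One simple and widely used mitigation for intermittent links is onboard store-and-forward buffering, in which observations are queued locally and retransmitted once connectivity returns. The Python sketch below illustrates this pattern only; the send_record() transport, retry count, and back-off delay are hypothetical placeholders rather than part of any cited system.

```python
# Minimal sketch (illustrative): store-and-forward buffering so field records survive
# temporary link loss under canopy. send_record() is a placeholder transport that
# randomly fails to emulate signal blocking.
import random
import time
from collections import deque


def send_record(record: dict) -> bool:
    """Placeholder uplink; randomly fails to emulate canopy signal blocking."""
    return random.random() > 0.4


def run_uplink(records, max_retries_per_cycle: int = 3):
    buffer = deque(records)              # unsent observations persist in the buffer
    while buffer:
        record = buffer[0]
        for _ in range(max_retries_per_cycle):
            if send_record(record):
                buffer.popleft()         # confirmed delivery, drop from buffer
                break
        else:
            time.sleep(1.0)              # back off, then retry on the next pass


if __name__ == "__main__":
    obs = [{"id": i, "ndvi": 0.6 + 0.01 * i} for i in range(5)]
    run_uplink(obs)
    print("all records delivered")
```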
Additionally, the cooperative methodologies between aerial and ground vehicles should be upgraded towards a more closely linked and intelligent stage, and the robustness and reliability of the applied platforms require extensive experiments to verify their performance. The heterogeneity and complementarity of UAVs and UGVs in terms of dynamics, speed, and communication enable air-ground cooperative systems to accomplish missions more efficiently than individual robots. The key advances are to embed more intelligent computing algorithms into collaborative systems and to develop new distributed frameworks.
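To make the cooperation idea concrete, the following Python sketch shows one of the simplest possible allocation schemes, in which targets detected by a UAV scout are greedily assigned to the nearest UGV. The platform names, coordinates, and greedy rule are illustrative assumptions, not a distributed framework from the cited works.

```python
# Minimal sketch (illustrative): greedy, distance-based assignment of UAV-detected
# targets to ground vehicles. Names and coordinates are placeholders.
import math
from typing import Dict, List, Tuple

Point = Tuple[float, float]


def assign_waypoints(ugv_positions: Dict[str, Point],
                     targets: List[Point]) -> Dict[str, List[Point]]:
    """Assign each UAV-detected target to the closest UGV."""
    plan: Dict[str, List[Point]] = {name: [] for name in ugv_positions}
    for target in targets:
        nearest = min(ugv_positions,
                      key=lambda name: math.dist(ugv_positions[name], target))
        plan[nearest].append(target)
    return plan


if __name__ == "__main__":
    ugvs = {"ugv_a": (0.0, 0.0), "ugv_b": (100.0, 50.0)}
    detections = [(10.0, 5.0), (90.0, 60.0), (55.0, 20.0)]
    print(assign_waypoints(ugvs, detections))
```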
It is also necessary to combine artificial intelligence (AI) technology with data acquisition and intervention practices in forests, including deep convolutional neural networks (CNNs), deep reinforcement learning (DRL), imitation learning (IL), and meta-learning, and, furthermore, to study automatic compilation technology for high-precision UAV remote sensing images. At the same time, spectral reflectance image databases should be constructed with AI support to improve the accuracy of forest resource investigation and monitoring, so that forest diseases, wildfires, and wildlife can be identified accurately.
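As a minimal sketch of the CNN-based direction mentioned above, the following PyTorch snippet defines a small classifier for multispectral image patches. The number of bands, patch size, class count, and architecture are all illustrative assumptions rather than a validated forestry model.

```python
# Minimal sketch (assumptions: 5-band multispectral patches and 4 illustrative
# classes such as healthy / diseased / burned / other; sizes are placeholders).
import torch
import torch.nn as nn


class SpectralPatchCNN(nn.Module):
    """Small CNN mapping a multispectral image patch to a class label."""

    def __init__(self, in_bands: int = 5, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_bands, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.LazyLinear(n_classes))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))


if __name__ == "__main__":
    model = SpectralPatchCNN()
    patches = torch.randn(8, 5, 64, 64)   # batch of 64x64 patches, 5 bands
    logits = model(patches)
    print(logits.shape)                   # torch.Size([8, 4])
```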
In further development, fuel performance, emission specifications, and the corresponding bearing capacity on forest soils are major concerns. For forestry to advance, barriers related to policy and regulatory issues must also be reduced; researchers and stakeholders in the forest industry demand a more systematic and appropriate regulatory framework.

5. Conclusions

Unmanned air vehicle (UAV) systems have steadily contributed to a variety of forestry practices, and their technical feasibility in the forest industry has been validated by past research efforts. By reviewing recent studies, this work addressed the multiple advantages of using unmanned alternatives for forestry purposes. Unmanned systems, including UGVs, UAVs, and heterogeneous platforms, and their application scenarios were presented, and the locomotion modes of robotic systems in unstructured environments were compared. UAV systems have improved the coverage, passability, and technical progress of forestry applications. Different cases of systems used in the forestry industry, including tree quantification, disease control, wildfire management, tree estimation, and wildlife monitoring, were studied; the associated technologies were discussed; and the continuing development of onboard sensors for UAV platforms was examined. Furthermore, the current status and future development of mobility enhancement and customized sensory adaptation were discussed in detail.
Applying unmanned systems in forestry has raised these advantages to a new level while avoiding the separate drawbacks of individual platforms. Heterogeneous air-ground unmanned systems can effectively improve productivity, maneuverability, spatial resolution, and real-time and valid-range performance, but several gaps and constraints still need to be studied. (1) The complex terrain structure of forests and the limited endurance of robots make it difficult to adapt to large-scale intervention tasks. (2) Owing to environmental uncertainty, the stability of robots needs to be further strengthened. (3) The dynamic nature of forest ecosystems requires timely and repeated data acquisition and intervention practices, yet multitemporal and long-term monitoring cannot be sufficiently performed because of limitations in communication reliability and battery capacity. (4) Although unmanned systems are a great tool in many domains, regulatory constraints may make them either impossible or excessively expensive to use, especially for individuals and small businesses. To better apply robotic systems in forestry, multiple underdeveloped areas must be further explored regarding sub-system capability enhancement, environment characterization, maneuverability, and mobility improvement.

Author Contributions

Conceptualization, H.S. and H.Y.; methodology, J.Z.; investigation, J.Z.; writing—original draft preparation, H.S. and H.Y.; writing—review and editing, M.H. and A.A.; visualization, A.A. All authors have read and agreed to the published version of the manuscript.

Funding

This study was financially supported by the Natural Science Foundation of Beijing Municipality under Grant No. 6192019, and the Science and Technology Project of Hebei Education Department under Grant No. QN2021312.

Data Availability Statement

Data available on request from the authors.

Acknowledgments

All individuals have consented to the acknowledgement.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hamedianfar, A.; Mohamedou, C.; Kangas, A.; Vauhkonen, J. Deep learning for forest inventory and planning: A critical review on the remote sensing approaches so far and prospects for further applications. Forestry 2022, 95, 451–465. [Google Scholar] [CrossRef]
  2. Bechar, A.; Vigneault, C. Agricultural robots for field operations. Part 2: Operations and systems. Biosyst. Eng. 2017, 153, 110–128. [Google Scholar] [CrossRef]
  3. Idrissi, M.; Hussain, A.; Barua, B.; Osman, A.; Abozariba, R.; Aneiba, A.; Asyhari, T. Evaluating the Forest Ecosystem through a Semi-Autonomous Quadruped Robot and a Hexacopter UAV. Sensors 2022, 22, 5497. [Google Scholar] [CrossRef] [PubMed]
  4. Ishii, K.; Hayashi, E.; Bin Misron, N.; Thornton, B. Special Issue on Advanced Robotics in Agriculture, Forestry and Fisheries. J. Robot. Mechatron. 2018, 30, 163–164. [Google Scholar] [CrossRef]
  5. Bechar, A.; Vigneault, C. Agricultural robots for field operations: Concepts and components. Biosyst. Eng. 2016, 149, 94–111. [Google Scholar] [CrossRef]
  6. Zhu, Y.; Kan, J. Prediction of the lateral stability of a forestry chassis with an articulated body and fitted with luffing wheel-legs. Biosyst. Eng. 2022, 224, 143–160. [Google Scholar] [CrossRef]
  7. Reid, D.A.; Hassan, M.A. Response of In-Stream Wood to Riparian Timber Harvesting: Field Observations and Long-Term Projections. Water Resour. Res. 2020, 56, e2020WR027077. [Google Scholar] [CrossRef]
  8. Visser, R.; Obi, O.F. Automation and Robotics in Forest Harvesting Operations: Identifying Near-Term Opportunities. Croat. J. For. Eng. 2021, 42, 13–24. [Google Scholar] [CrossRef]
  9. Yu, Z.; Zhang, Y.; Jiang, B.; Yu, X. Fault-Tolerant Time-Varying Elliptical Formation Control of Multiple Fixed-Wing UAVs for Cooperative Forest Fire Monitoring. J. Intell. Robot. Syst. 2021, 101, 48. [Google Scholar] [CrossRef]
  10. Tranchitella, M.; Fujikawa, S.; Ng, T.L.; Yoel, D.; Tatum, D.; Roy, P.; Mazel, C.; Herwitz, S.; Hinkley, E. Using Tactical Unmanned Aerial Systems to Monitor and Map Wildfires. AIAA J. 2013, 5, 381–388. [Google Scholar]
  11. Zhang, Y.; Zhang, Y.; Yu, Z. A Solution for Searching and Monitoring Forest Fires Based on Multiple UAVs. In Proceedings of the 2019 International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, USA, 11–14 June 2019; pp. 661–666. [Google Scholar]
  12. Sudhakar, S.; Vijayakumar, V.; Kumar, C.S.; Priya, V.; Ravi, L.; Subramaniyaswamy, V. Unmanned Aerial Vehicle (UAV) based Forest Fire Detection and monitoring for reducing false alarms in forest-fires. Comput. Commun. 2020, 149, 1–16. [Google Scholar] [CrossRef]
  13. Inoue, T.; Nagai, S.; Yamashita, S.; Fadaei, H.; Ishii, R.; Okabe, K.; Taki, H.; Honda, Y.; Kajiwara, K.; Suzuki, R. Unmanned aerial survey of fallen trees in a deciduous broadleaved forest in eastern Japan. PLoS ONE 2014, 9, e109881. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  14. Guoshuai, Z. Application and Demand Analysis of Unmanned Aerial Vehicle Remote Sensing in Forestry. J. Fujian For. Sci. Technol. 2017, 44, 136–140. [Google Scholar]
  15. Hassaan, O.; Nasir, A.K.; Roth, H.; Khan, M.F. Precision Forestry: Trees Counting in Urban Areas Using Visible Imagery based on an Unmanned Aerial Vehicle. IFAC Pap. Online 2016, 49, 16–21. [Google Scholar] [CrossRef]
  16. Christensen, B.R. Use of UAV or remotely piloted aircraft and forward-looking infrared in forest, rural and wildland fire management: Evaluation using simple economic analysis. N. Z. J. For. Sci. 2015, 45, 16. [Google Scholar] [CrossRef] [Green Version]
  17. Tan, L.T.; Tan, K.H. Alternative air vehicles for sterile insect technique aerial release. J. Appl. Entomol. 2013, 137, 126–141. [Google Scholar] [CrossRef]
  18. Schweier, J.; Spinelli, R.; Magagnotti, N.; Becker, G. Mechanized coppice harvesting with new small-scale feller-bunchers: Results from harvesting trials with newly manufactured felling heads in Italy. Biomass Bioenergy 2015, 72, 85–94. [Google Scholar] [CrossRef]
  19. McEachran, Z.P.; Karwan, D.L.; Slesak, R.A. Direct and Indirect Effects of Forest Harvesting on Sediment Yield in Forested Watersheds of the United States. JAWRA J. Am. Water Resour. Assoc. 2020, 57, 1–31. [Google Scholar] [CrossRef]
  20. Duarte, A.; Borralho, N.; Cabral, P.; Caetano, M. Recent Advances in Forest Insect Pests and Diseases Monitoring Using UAV-Based Data: A Systematic Review. Forests 2022, 13, 911. [Google Scholar] [CrossRef]
  21. Urbina-Brito, N.; Guerrero-Sánchez, M.E.; Valencia-Palomo, G.; Hernández-González, O.; López-Estrada, F.R.; Hoyo-Montaño, J.A. A Predictive Control Strategy for Aerial Payload Transportation with an Unmanned Aerial Vehicle. Mathematics 2021, 9, 1822. [Google Scholar] [CrossRef]
  22. Gonzalez, L.F.; Montes, G.A.; Puig, E.; Johnson, S.; Mengersen, K.; Gaston, K.J. Unmanned Aerial Vehicles (UAVs) and Artificial Intelligence Revolutionizing Wildlife Monitoring and Conservation. Sensors 2016, 16, 97. [Google Scholar] [CrossRef] [Green Version]
  23. Jiao, Z.; Zhang, Y.; Mu, L.; Xin, J.; Jiao, S.; Liu, H.; Liu, D. A YOLOv3-based Learning Strategy for Real-time UAV-based Forest Fire Detection. In Proceedings of the 2020 Chinese Control and Decision Conference (CCDC), Hefei, China, 22–24 August 2020; pp. 4963–4967. [Google Scholar]
  24. Ju, C.; Son, H.I. Modeling and Control of Heterogeneous Agricultural Field Robots Based on Ramadge–Wonham Theory. IEEE Robot. Autom. Lett. 2020, 5, 48–55. [Google Scholar] [CrossRef]
  25. Ecke, S.; Dempewolf, J.; Frey, J.; Schwaller, A.; Endres, E.; Klemmt, H.-J.; Tiede, D.; Seifert, T. UAV-Based Forest Health Monitoring: A Systematic Review. Remote Sens. 2022, 14, 3205. [Google Scholar] [CrossRef]
  26. Aydin, B.; Selvi, E.; Tao, J.; Starek, M.J. Use of fire-extinguishing balls for a conceptual system of drone-assisted wildfire fighting. Drones 2019, 3, 17. [Google Scholar] [CrossRef] [Green Version]
  27. Akhloufi, M.A.; Couturier, A.; Castro, N.A. Unmanned aerial vehicles for wildland fires: Sensing, perception, cooperation and assistance. Drones 2021, 5, 15. [Google Scholar] [CrossRef]
  28. Ivanova, S.; Prosekov, A.; Kaledin, A. A Survey on Monitoring of Wild Animals during Fires Using Drones. Fire 2022, 5, 60. [Google Scholar] [CrossRef]
  29. Kumar, M.; Cohen, K.; Hom Chaudhuri, B. Cooperative control of multiple uninhabited aerial vehicles for monitoring and fighting wildfires. J. Aerosp. Comput. Inf. Commun. 2011, 8, 1–16. [Google Scholar] [CrossRef]
  30. Rocha, A.M.; Casau, P.; Cunha, R. A Control Algorithm for Early Wildfire Detection Using Aerial Sensor Networks: Modeling and Simulation. Drones 2022, 6, 44. [Google Scholar] [CrossRef]
  31. Guerrero-Sánchez, M.E.; Hernández-González, O.; Valencia-Palomo, G.; López-Estrada, F.R.; Rodríguez-Mata, A.E.; Garrido, J. Filtered observer-based ida-pbc control for trajectory tracking of a quadrotor. IEEE Access 2021, 9, 114821–114835. [Google Scholar] [CrossRef]
  32. Lee, K.W.; Park, J.K. Economic evaluation of unmanned aerial vehicle for forest pest monitoring. J. Korea Acad. Ind. Coop. Soc. 2019, 20, 440–446. [Google Scholar]
  33. Poley, L.G.; McDermid, G.J. A Systematic Review of the Factors Influencing the Estimation of Vegetation Aboveground Biomass Using Unmanned Aerial Systems. Remote Sens. 2020, 12, 1052. [Google Scholar] [CrossRef] [Green Version]
  34. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J.J. Hyperspectral Imaging: A Review on UAV-Based Sensors, Data Processing and Applications for Agriculture and Forestry. Remote Sens. 2017, 9, 1110. [Google Scholar] [CrossRef] [Green Version]
  35. Eugenio, F.C.; Schons, C.T.; Mallmann, C.L.; Schuh, M.S.; Fernandes, P.; Badin, T.L. Remotely piloted aircraft systems and forests: A global state of the art and future challenges. Can. J. For. Res. 2020, 50, 705–716. [Google Scholar] [CrossRef]
  36. Torresan, C.; Berton, A.; Carotenuto, F.; Di Gennaro, S.F.; Gioli, B.; Matese, A.; Miglietta, F.; Vagnoli, C.; Zaldei, A.; Wallace, L. Forestry applications of UAVs in Europe: A review. Int. J. Remote Sens. 2017, 38, 2427–2447. [Google Scholar] [CrossRef]
  37. Gambella, F.; Sistu, L.; Piccirilli, D.; Corposanto, S.; Caria, M.; Arcangeletti, E.; Proto, A.R.; Chessa, G.; Pazzona, A. Forest and UAV: A bibliometric review. Contemp. Eng. Sci. 2016, 9, 1359–1370. [Google Scholar] [CrossRef]
  38. Zhang, J.; Yan, H.; Hu, C.; Li, T.; Yu, M. Application and future development of unmanned aerial vehicle in forestry. J. For. Eng. 2019, 4, 8–16. [Google Scholar]
  39. Yildiz, K.; Eken, S.; Kaya, M.O. Optimal Control Procedure Application for Dynamic Response of Adaptive Aircraft Wings Modeled as Thin-Walled Composite Beams. Appl. Mech. Mater. 2015, 798, 292–296. [Google Scholar]
  40. Bella, S.; Belbachir, A.; Belalem, G. A hybrid air-sea cooperative approach combined with a swarm trajectory planning method. Paladyn J. Behav. Robot. 2020, 11, 118–139. [Google Scholar] [CrossRef]
  41. Muskardin, T.; Coelho, A.; Della Noce, E.R.; Ollero, A.; Kondak, K. Energy-Based Cooperative Control for Landing Fixed-Wing UAVs on Mobile Platforms Under Communication Delays. IEEE Robot. Autom. Lett. 2020, 5, 5081–5088. [Google Scholar] [CrossRef]
  42. Nex, F.; Remondino, F. UAV for 3D mapping applications: A review. Appl. Geomat. 2014, 6, 1–15. [Google Scholar] [CrossRef]
  43. Zinovieva, I.S.; Artemchuk, V.O.; Iatsyshyn, A.V.; Popov, O.O.; Kovach, V.O.; Iatsyshyn, A.V.; Romanenko, Y.O.; Radchenko, O.V. The use of online coding platforms as additional distance tools in programming education. J. Phys. Conf. Ser. 2021, 1840, 012029. [Google Scholar] [CrossRef]
  44. Banu, T.P.; Borlea, G.F.; Banu, C. The Use of Drones in Forestry. J. Environ. Sci. Eng. 2016, 5, 557–562. [Google Scholar]
  45. Mesas-Carrascosa, F.J.; Notario-García, M.D.; de Larriva, J.E.M.; de la Orden, M.S.; Porras, A.G.F. Validation of measurements of land plot area using UAV imagery. Int. J. Appl. Earth Obs. Geoinf. 2014, 33, 270–279. [Google Scholar] [CrossRef]
  46. Hassanalian, M.; Abdelkefi, A. Classifications, applications, and design challenges of drones: A review. Prog. Aerosp. Sci. 2017, 91, 99–131. [Google Scholar] [CrossRef]
  47. Hassanalian, M.; Khaki, H.; Khosravi, M. A new method for design of fixed wing micro air vehicle. Proc. Inst. Mech. Eng. Part G J. Aerosp. Eng. 2015, 229, 837–850. [Google Scholar] [CrossRef]
  48. Tang, L.; Shao, G. Drone remote sensing for forestry research and practices. J. For. Res. 2015, 26, 791–797. [Google Scholar] [CrossRef]
  49. Whitehead, K.; Hugenholtz, C.H.; Myshak, S.; Brown, O.; LeClair, A.; Tamminga, A.; Barchyn, T.E.; Moorman, B.; Eaton, B. Remote sensing of the environment with small unmanned aircraft systems (UASs), part 1: A review of progress and challenges. J. Unmanned Veh. Syst. 2014, 2, 69–85. [Google Scholar] [CrossRef]
  50. Ubina, N.A.; Cheng, S.-C. A Review of Unmanned System Technologies with Its Application to Aquaculture Farm Monitoring and Management. Drones 2022, 6, 12. [Google Scholar] [CrossRef]
  51. Goodbody, T.R.; Coops, N.C.; Marshall, P.L.; Tompalski, P.; Crawford, P. Unmanned aerial systems for precision forest inventory purposes: A review and case study. For. Chron. 2017, 93, 71–81. [Google Scholar] [CrossRef] [Green Version]
  52. Rasmussen, J.; Nielsen, J.; Garcia-Ruiz, F.; Christensen, S.; Streibig, J.C. Potential uses of small unmanned aircraft systems (UAS) in weed research. Weed Res. 2013, 53, 242–248. [Google Scholar] [CrossRef]
  53. Stone, C.; Mohammed, C. Application of remote sensing technologies for assessing planted forests damaged by insect pests and fungal pathogens: A review. Curr. For. Rep. 2017, 3, 75–92. [Google Scholar] [CrossRef]
  54. Launchbury, R. Unmanned Aerial Vehicles in Forestry. For. Chron. 2014, 90, 418–420. [Google Scholar] [CrossRef]
  55. Gupta, S.G.; Ghonge, D.; Jawandhiya, P.M. Review of unmanned aircraft system (UAS). Int. J. Adv. Res. Comput. Eng. Technol. IJARCET 2013, 2, 1646–1658. [Google Scholar] [CrossRef]
  56. Austin, R. Unmanned Aircraft Systems: UAVS Design, Development and Deployment; John Wiley & Sons: Hoboken, NJ, USA, 2011. [Google Scholar]
  57. Imdoukh, A.; Shaker, A.; Al-Toukhy, A.; Kablaoui, D.; El-Abd, M. Semi-autonomous indoor firefighting UAV. In Proceedings of the 2017 18th International Conference on Advanced Robotics (ICAR), Hong Kong, China, 10–12 July 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 310–315. [Google Scholar]
  58. Walter, V.; Spurný, V.; Petrlík, M.; Báča, T.; Žaitlík, D.; Saska, M. Extinguishing of ground fires by fully autonomous UAVs motivated by the MBZIRC 2020 competition. In Proceedings of the 2021 International Conference on Unmanned Aircraft Systems (ICUAS), Athens, Greece, 15–18 June 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 787–793. [Google Scholar]
  59. Ge, C.; Ma, X.; Liu, Z. A semi-autonomous distributed blockchain-based framework for UAVs system. J. Syst. Archit. 2020, 107, 101728. [Google Scholar] [CrossRef]
  60. Steenbeek, A.; Nex, F. CNN-based dense monocular visual SLAM for real-time UAV exploration in emergency conditions. Drones 2022, 6, 79. [Google Scholar] [CrossRef]
  61. Dong, J.; Ren, X.; Han, S.; Luo, S. UAV vision aided INS/odometer integration for land vehicle autonomous navigation. IEEE Trans. Veh. Technol. 2022, 71, 4825–4840. [Google Scholar] [CrossRef]
  62. Berenguer, Y.; Payá, L.; Valiente, D.; Peidró, A.; Reinoso, O. Relative altitude estimation using omnidirectional imaging and holistic descriptors. Remote Sens. 2019, 11, 323. [Google Scholar] [CrossRef] [Green Version]
  63. Moses, P.L.; Rausch, V.L.; Nguyen, L.T.; Hill, J.R. NASA hypersonic flight demonstrators—Overview, status, and future plans. Acta Astronaut. 2004, 55, 619–630. [Google Scholar] [CrossRef]
  64. Zhang, T.; Lu, W.; Li, Y. Intelligent unmanned aerial vehicle review. Aviat. Manuf. Technol. 2013, 12, 32–35. [Google Scholar]
  65. Lideskog, H.; Karlberg, M.; Bergsten, U. Development of a Research Vehicle Platform to Improve Productivity and Value-extraction in Forestry. Procedia CIRP 2015, 38, 68–73. [Google Scholar] [CrossRef] [Green Version]
  66. Tavares, T.D.O.; de Oliveira, B.R.; Silva, V.D.A.; Pereira da Silva, R.; Dos Santos, A.F.; Okida, E.S. The times, movements and operational efficiency of mechanized coffee harvesting in sloped areas. PLoS ONE 2019, 14, e0217286. [Google Scholar] [CrossRef]
  67. Nasiruddin Khilji, T.; Lopes Amaral Loures, L.; Rezazadeh Azar, E. Distress Recognition in Unpaved Roads Using Unmanned Aerial Systems and Deep Learning Segmentation. J. Comput. Civ. Eng. 2021, 35, 04020061. [Google Scholar] [CrossRef]
  68. Gázquez, J.A.; Castellano, N.N.; Manzano-Agugliaro, F. Intelligent low cost telecontrol system for agricultural vehicles in harmful environments. J. Clean. Prod. 2016, 113, 204–215. [Google Scholar] [CrossRef]
  69. Koszewnik, A.; Oldziej, D. Performance assessment of an energy harvesting system located on a copter. Eur. Phys. J. Spec. Top. 2019, 228, 1677–1692. [Google Scholar] [CrossRef] [Green Version]
  70. Gerasimov, Y.; Sokolov, A. Ergonomic evaluation and comparison of wood harvesting systems in Northwest Russia. Appl. Ergon. 2014, 45, 318–338. [Google Scholar] [CrossRef]
  71. Vandapel, N.; Donamukkala, R.R.; Hebert, M. Unmanned Ground Vehicle Navigation Using Aerial Ladar Data. Int. J. Robot. Res. 2006, 25, 31–51. [Google Scholar] [CrossRef] [Green Version]
  72. Saputra, H.M.; Mirdanies, M. Controlling Unmanned Ground Vehicle Via 4 Channel Remote Control. Energy Procedia 2015, 68, 381–388. [Google Scholar] [CrossRef] [Green Version]
  73. Tang, J.; Chen, Y.; Kukko, A.; Kaartinen, H.; Jaakkola, A.; Khoramshahi, E.; Hakala, T.; Hyyppä, J.; Holopainen, M.; Hyyppä, H. SLAM-aided stem mapping for forest inventory with small-footprint mobile LiDAR. Forests 2015, 6, 4588–4606. [Google Scholar] [CrossRef] [Green Version]
  74. Zhang, C.; Yong, L.; Chen, Y.; Zhang, S.; Ge, L.; Wang, S.; Li, W. A rubber-tapping robot forest navigation and information collection system based on 2D LiDAR and a gyroscope. Sensors 2019, 19, 2136. [Google Scholar] [CrossRef] [Green Version]
  75. CSIRO. Amazon 360: Testing Self-Navigation in a Novel Landscape. 2020. Available online: https://algorithm.data61.csiro.au/amazon-360-testing-self-navigation-in-a-novel-landscape/ (accessed on 21 August 2018).
  76. Freitas, G.; Gleizer, G.; Lizarralde, F.; Hsu, L.; dos Reis, N.R.S. Kinematic reconfigurability control for an environmental mobile robot operating in the amazon rain forest. J. Field Robot. 2010, 27, 197–216. [Google Scholar] [CrossRef]
  77. Sun, T.; Xiang, X.; Su, W.; Wu, H.; Song, Y. A transformable wheel-legged mobile robot: Design, analysis and experiment. Robot. Auton. Syst. 2017, 98, 30–41. [Google Scholar] [CrossRef]
  78. Ismoilov, A.; Sellgren, U.; Andersson, K.; Löfgren, B. A comparison of novel chassis suspended machines for sustainable forestry. J. Terramechanics 2015, 58, 59–68. [Google Scholar] [CrossRef]
  79. Bruzzone, L.; Quaglia, G. Review article: Locomotion systems for ground mobile robots in unstructured environments. Mech. Sci. 2012, 3, 49–62. [Google Scholar] [CrossRef] [Green Version]
  80. Pijuan, J.; Comellas, M.; Nogués, M.; Roca, J.; Potau, X. Active bogies and chassis levelling for a vehicle operating in rough terrain. J. Terramechanics 2012, 49, 161–171. [Google Scholar] [CrossRef]
  81. Grocholsky, B.; Keller, J.; Kumar, V.; Pappas, G. Cooperative air and ground surveillance. Robot. Autom. Mag. IEEE 2006, 13, 16–25. [Google Scholar] [CrossRef] [Green Version]
  82. Quintin, F.; Iovino, S.; Savvaris, A.; Tsourdos, A. Use of Co-operative UAVs to Support/Augment UGV Situational Awareness and/or Inter-Vehicle Communications. IFAC Pap. Online 2017, 50, 8037–8044. [Google Scholar] [CrossRef]
  83. Sakai, S.; Iida, M.; Osuka, K.; Umeda, M. Design and control of a heavy material handling manipulator for agricultural robots. Auton. Robot. 2008, 25, 189–204. [Google Scholar] [CrossRef] [Green Version]
  84. Stentz, A.; Kelly, A.; Rander, P.; Herman, H.; Amidi, O.; Mandelbaum, R.; Salgian, G.; Pedersen, J. Real-Time, Multi-Perspective Perception for Unmanned Ground Vehicles. Ph.D. Thesis, Carnegie Mellon University, Pittsburgh, PA, USA, 2003. [Google Scholar]
  85. Tanveer, M.H.; Zhu, H.; Ahmed, W.; Thomas, A.; Imran, B.M.; Salman, M. Mel-spectrogram and deep cnn based representation learning from bio-sonar implementation on UAVs. In Proceedings of the 2021 International Conference on Computer, Control and Robotics (ICCCR), Shanghai, China, 8–10 January 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 220–224. [Google Scholar]
  86. Käslin, R.; Fankhauser, P.; Stumm, E.; Taylor, Z.; Mueggler, E.; Delmerico, J.; Scaramuzza, D.; Siegwart, R.; Hutter, M. Collaborative localization of aerial and ground robots through elevation maps. In Proceedings of the 2016 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Lausanne, Switzerland, 23–27 October 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 284–290. [Google Scholar]
  87. Couceiro, M.S.; Portugal, D.; Ferreira, J.F.; Rocha, R.P. SEMFIRE: Towards a new generation of forestry maintenance multi-robot systems. In Proceedings of the 2019 IEEE/SICE International Symposium on System Integrations (SII 2019), Paris, France, 14–16 January 2019; IEEE: Piscataway, NJ, USA, 2019. [Google Scholar]
  88. Magagnotti, N.; Spinelli, R.; Kärhä, K.; Mederski, P.S. Multi-tree cut-to-length harvesting of short-rotation poplar plantations. Eur. J. For. Res. 2020, 140, 345–354. [Google Scholar] [CrossRef]
  89. Bergerman, M.; Billingsley, J.; Reid, J.; van Henten, E. Robotics in Agriculture and Forestry. In Springer Handbook of Robotics; Springer International Publishing: Cham, Switzerland, 2016. [Google Scholar]
  90. Li, X.; Li, W.; Yang, Q.; Yan, W.; Zomaya, A.Y. An Unmanned Inspection System for Multiple Defects Detection in Photovoltaic Plants. IEEE J. Photovolt. 2020, 10, 568–576. [Google Scholar] [CrossRef]
  91. Salmanova, R. Distribution of Species of the Orchidaceae Juss. in the Shrubbery and Forest Vegetation of the Nakhchivan Autonomous Republic. Bull. Sci. Pract. 2020, 6, 62–68. [Google Scholar] [CrossRef]
  92. Chen, Z.; Liu, J.; Gao, F. Real-time gait planning method for six-legged robots to optimize the performances of terrain adaptability and walking speed. Mech. Mach. Theory 2022, 168, 104545. [Google Scholar] [CrossRef]
  93. Halme, A.; Luksch, T.; Ylönen, S. Biomimicing motion control of the WorkPartner robot. Ind. Robot. Int. J. 2004, 31, 209–217. [Google Scholar] [CrossRef]
  94. Nakahata, C.; Uemura, R.; Saito, M.; Kanetsuki, K.; Aruga, K. Estimating harvest costs and projecting quantifies of logging residues for small-scale forestry in Nasushiobara, Tochigi Prefecture, Japan. For. Res. 2014, 12, 965–974. [Google Scholar]
  95. Lindroos, O.; Ringdahl, O.; La Hera, P.; Hohnloser, P.; Hellström, T.H. Estimating the Position of the Harvester Head—A Key Step towards the Precision Forestry of the Future? Croat. J. For. Eng. 2015, 36, 147–164. [Google Scholar]
  96. Guerra, S.P.S.; Oguri, G.; Junior, H.D.J.E.; de Melo, R.X.; Spinelli, R. Mechanized harvesting of bamboo plantations for energy production: Preliminary tests with a cut-and-shred harvester. Energy Sustain. Dev. 2016, 34, 62–66. [Google Scholar] [CrossRef]
  97. Pongpat, P.; Gheewala, S.H.; Silalertruksa, T. An assessment of harvesting practices of sugarcane in the central region of Thailand. J. Clean. Prod. 2017, 142, 1138–1147. [Google Scholar] [CrossRef]
  98. Xu, J.; Meng, Q.; Wu, J.; Zheng, J.X.; Zhang, X.; Sharma, S. Efficient and Lightweight Data Streaming Authentication in Industrial Automation and Control Systems. IEEE Trans. Ind. Inform. 2020, 17, 4279–4287. [Google Scholar] [CrossRef]
  99. Adamski, W.; Pazderski, D.; Herman, P. Robust 3D tracking control of an underactuated autonomous airship. IEEE Robot. Autom. Lett. 2020, 5, 4281–4288. [Google Scholar] [CrossRef]
  100. Mohan, M.; Silva, C.A.; Klauberg, C.; Jat, P.; Catts, G.; Cardil, A.; Hudak, A.T.; Dia, M. Individual Tree Detection from Unmanned Aerial Vehicle (UAV) Derived Canopy Height Model in an Open Canopy Mixed Conifer Forest. Forests 2017, 8, 340. [Google Scholar] [CrossRef] [Green Version]
  101. Locks, C.J.; Ferreira, M.E.; Rufino, L.; Chaves, J.H. Estimating wood volume in sawmill yards of Brazilian Amazon by Remotely Piloted Aircraft Systems. In Proceedings of the XVIII Simpósio Brasileiro de Sensoriamento Remoto, São Paulo, Brazil, 28–31 May 2017. [Google Scholar]
  102. Lin, Y.; Jiang, M.; Yao, Y.; Zhang, L.; Lin, J. Use of UAV oblique imaging for the detection of individual trees in residential environments. Urban For. Urban Green. 2015, 14, 404–412. [Google Scholar] [CrossRef]
  103. Dempewolf, J.; Nagol, J.; Hein, S.; Thiel, C.; Zimmermann, R. Measurement of Within-Season Tree Height Growth in a Mixed Forest Stand Using UAV Imagery. Forests 2017, 8, 231. [Google Scholar] [CrossRef] [Green Version]
  104. Birdal, A.C.; Avdan, U.; Türk, T. Estimating tree heights with images from an unmanned aerial vehicle. Geomat. Nat. Hazards Risk 2017, 8, 1144–1156. [Google Scholar] [CrossRef] [Green Version]
  105. Łukasik, A.; Szuszkiewicz, M.; Wanic, T.; Gruba, P. Three-dimensional model of magnetic susceptibility in forest topsoil: An indirect method to discriminate contaminant migration. Environ. Pollut. 2021, 273, 116491. [Google Scholar] [CrossRef] [PubMed]
  106. Lindberg, E.; Holmgren, J. Individual tree crown methods for 3D data from remote sensing. Curr. For. Rep. 2017, 3, 19–31. [Google Scholar] [CrossRef] [Green Version]
  107. Liang, X.; Kankare, V.; Hyyppä, J.; Wang, Y.; Kukko, A.; Haggrén, H.; Yu, X.; Kaartinen, H.; Jaakkola, A.; Guan, F.; et al. Terrestrial laser scanning in forest inventories. Isprs J. Photogramm. Remote Sens. 2016, 115, 63–77. [Google Scholar] [CrossRef]
  108. Li, Q.; Zheng, J.; Zhou, H.; Shu, Y.; Xu, B. Three-dimensional green biomass measurement for individual tree using mobile two-dimensional laser scanning. J. Nanjing For. Univ. 2018, 1, 130–135. [Google Scholar]
  109. Pierzchała, M.; Giguère, P.; Astrup, R. Mapping forests using an unmanned ground vehicle with 3D LiDAR and graph-SLAM. Comput. Electron. Agric. 2018, 145, 217–225. [Google Scholar] [CrossRef]
  110. Faiçal, B.S.; Freitas, H.; Gomes, P.H.; Mano, L.Y.; Pessin, G.; de Carvalho, A.C.; Krishnamachari, B.; Ueyama, J. An adaptive approach for UAV-based pesticide spraying in dynamic environments. Comput. Electron. Agric. 2017, 138, 210–223. [Google Scholar] [CrossRef]
  111. Dash, J.P.; Watt, M.S.; Pearse, G.D.; Heaphy, M.; Dungey, H.S. Assessing very high resolution UAV imagery for monitoring forest health during a simulated disease outbreak. Int. J. Photogramm. Remote Sens. 2017, 131, 1–14. [Google Scholar] [CrossRef]
  112. Junguo, Z.; Huanqing, H.; Chunhe, H.; Youqing, L. Identification Method of Pinus yunnanensis Pest Area Based on UAV Multispectral Images. Trans. Chin. Soc. Agric. Mach. 2018, 49, 249–255. [Google Scholar]
  113. Näsi, R.; Honkavaara, E.; Lyytikäinen-Saarenmaa, P.; Blomqvist, M.; Litkey, P.; Hakala, T.; Viljanen, N.; Kantola, T.; Tanhuanpää, T.; Holopainen, M. Using UAV-Based Photogrammetry and Hyperspectral Imaging for Mapping Bark Beetle Damage at Tree-Level. Remote Sens. 2015, 7, 15467–15493. [Google Scholar] [CrossRef] [Green Version]
  114. Verma, N.; Singh, D. Analysis of cost-effective sensors: Data Fusion approach used for Forest Fire Application. Mater. Today Proc. 2020, 24, 2283–2289. [Google Scholar] [CrossRef]
  115. Glauber, A.J.; Moyer, S.; Adriani, M.; Gunawan, I. The Cost of Fire. World Bank Other Oper. Stud. 2016, 1–8. Available online: https://openknowledge.worldbank.org/entities/publication/391e19f4-18f7-56c2-8c28-bb89cc9e7d6f (accessed on 2 December 2022).
  116. Hesseln, H. Wildland fire prevention: A review. Curr. For. Rep. 2018, 4, 178–190. [Google Scholar] [CrossRef]
  117. Pastor, E.; Barrado, C.; Royo, P.; Santamaria, E.; Lopez, J.; Salami, E. Architecture for a helicopter-based unmanned aerial systems wildfire surveillance system. Geocarto Int. 2011, 26, 113–131. [Google Scholar] [CrossRef]
  118. Singh, V.; Sharma, N.; Singh, S. A review of imaging techniques for plant disease detection. Artif. Intell. Agric. 2020, 4, 229–242. [Google Scholar] [CrossRef]
  119. Ambrosia, V.G.; Wegener, S.S.; Sullivan, D.V.; Buechel, S.W.; Dunagan, S.E.; Brass, J.A.; Stoneburner, J.; Schoenung, S.M. Demonstrating UAV-Acquired Real-Time Thermal Data over Fires. Photogramm. Eng. Remote Sens. 2015, 69, 391–402. [Google Scholar] [CrossRef]
  120. Ambrosia, V.G.; Wegener, S.; Zajkowski, T.; Sullivan, D.V.; Buechel, S.; Enomoto, F.; Lobitz, B.; Johan, S.; Brass, J.; Hinkley, E. The Ikhana unmanned airborne system (UAS) western states fire imaging missions: From concept to reality (2006–2010). Geocarto Int. 2011, 26, 85–101. [Google Scholar] [CrossRef]
  121. Watts, A.C.; Ambrosia, V.G.; Hinkley, E.A. Unmanned Aircraft Systems in Remote Sensing and Scientific Research: Classification and Considerations of Use. Remote Sens. 2012, 4, 1671–1692. [Google Scholar] [CrossRef] [Green Version]
  122. Charvat, R.; Ozburn, R.; Bushong, S.; Cohen, K.; Kumar, M. SIERRA Team Flight of Zephyr UAS at West Virginia Wild Land Fire Burn. In Proceedings of the Infotech@Aerospace 2012, Garden Grove, CA, USA, 19–21 June 2012. [Google Scholar]
  123. Zhang, L.; Wang, B.; Peng, W.; Li, C.; Lu, Z.; Guo, Y. Forest fire detection solution based on UAV aerial data. Int. J. Smart Home 2015, 9, 239–250. [Google Scholar] [CrossRef]
  124. Cruz, H.; Eckert, M.; Meneses, J.; Martínez, J.F. Efficient Forest Fire Detection Index for Application in Unmanned Aerial Systems (UASs). Sensors 2016, 16, 893. [Google Scholar] [CrossRef] [Green Version]
  125. Merino, L.; Caballero, F.; Martínez-de Dios, J.R.; Ferruz, J.; Ollero, A. A cooperative perception system for multiple UAVs: Application to automatic detection of forest fires. J. Field Robot. 2006, 23, 165–184. [Google Scholar] [CrossRef]
  126. Gayathri Devi, K.; Yasoda, K.; Roy, M.N. Automatic Firefighting System Using Unmanned Aerial Vehicle. In International Conference on Artificial Intelligence for Smart Community; Springer: Singapore, 2022; pp. 1023–1031. [Google Scholar]
  127. Ghamry, K.A.; Dong, Y.; Kamel, M.A.; Zhang, Y. Real-Time Autonomous Take-off, Tracking and Landing of UAV on a Moving UGV Platform. In Proceedings of the 2016 24th Mediterranean Conference on Control and Automation (MED), Athens, Greece, 21–24 June 2016; IEEE: Piscataway, NJ, USA, 2016. [Google Scholar]
  128. Chu, T.; Guo, X. Remote Sensing Techniques in Monitoring Post-Fire Effects and Patterns of Forest Recovery in Boreal Forest Regions: A Review. Remote Sens. 2014, 6, 470–520. [Google Scholar] [CrossRef] [Green Version]
  129. Dennis, F.C. Fire-Resistant Landscaping; Colorado State University: Fort Collins, CO, USA, 2012; Volume 6, pp. 1–4. [Google Scholar]
  130. Badescu, A.M.; Cotofana, L. A wireless sensor network to monitor and protect tigers in the wild. Ecol. Indic. 2015, 57, 447–451. [Google Scholar] [CrossRef]
  131. Andrew, M.E.; Shephard, J.M. Semi-automated detection of eagle nests: An application of very high-resolution image data and advanced image analyses to wildlife surveys. Remote Sens. Ecol. Conserv. 2017, 3, 66–80. [Google Scholar] [CrossRef] [Green Version]
  132. Radiansyah, S.; Kusrini, M.D.; Prasetyo, L.B. Quadcopter applications for wildlife monitoring. IOP Conf. Ser. Earth Environ. Sci. 2017, 54, 012066. [Google Scholar] [CrossRef] [Green Version]
  133. Tremblay, J.A.; Desrochers, A.; Aubry, Y.; Pace, P.; Bird, D.M. A low-cost technique for radio-tracking wildlife using a small standard unmanned aerial vehicle. J. Unmanned Veh. Syst. 2017, 5, 102–108. [Google Scholar] [CrossRef] [Green Version]
  134. Rey, N.; Volpi, M.; Joost, S.; Tuia, D. Detecting animals in African Savanna with UAVs and the crowds. Remote Sens. Environ. 2017, 200, 341–351. [Google Scholar] [CrossRef] [Green Version]
  135. Fust, P.; Loos, J. Development perspectives for the application of autonomous, unmanned aerial systems (UASs) in wildlife conservation. Biol. Conserv. 2020, 241, 108380. [Google Scholar] [CrossRef]
  136. Bondi, E.; Dey, D.; Kapoor, A.; Piavis, J.; Shah, S.; Fang, F.; Dilkina, B.; Hannaford, R.; Iyer, A.; Joppa, L.; et al. Airsim-w: A simulation environment for wildlife conservation with UAVs. In Proceedings of the 1st ACM SIGCAS Conference on Computing and Sustainable Societies, Menlo Park/San Jose, CA, USA, 20–22 June 2018; pp. 1–12. [Google Scholar]
  137. Thapa, G.J.; Thapa, K.; Thapa, R.; Jnawali, S.R.; Wich, S.A.; Poudyal, L.P.; Karki, S. Counting crocodiles from the sky: Monitoring the critically endangered gharial (Gavialis gangeticus) population with an unmanned aerial vehicle (UAV). J. Unmanned Veh. Syst. 2018, 6, 71–82. [Google Scholar] [CrossRef] [Green Version]
  138. Jones, G.P.I.V.; Pearlstine, L.G.; Percival, H.F. An assessment of small unmanned aerial vehicles for wildlife research. Wildl. Soc. Bull. 2006, 34, 750–758. [Google Scholar] [CrossRef]
  139. Mangewa, L.J.; Ndakidemi, P.A.; Munishi, L.K. Integrating UAV Technology in an Ecological Monitoring System for Community Wildlife Management Areas in Tanzania. Sustainability 2019, 11, 6116. [Google Scholar] [CrossRef] [Green Version]
  140. Mangewa, L.J.; Ndakidemi, P.A.; Alward, R.D.; Kija, H.K.; Bukombe, J.K.; Nasolwa, E.R.; Munishi, L.K. Comparative Assessment of UAV and Sentinel-2 NDVI and GNDVI for Preliminary Diagnosis of Habitat Conditions in Burunge Wildlife Management Area, Tanzania. Earth 2022, 3, 769–787. [Google Scholar] [CrossRef]
  141. Fassnacht, F.E.; Latifi, H.; Stereńczak, K.; Modzelewska, A.; Lefsky, M.; Waser, L.T.; Straub, C.; Ghosh, A. Review of studies on tree species classification from remotely sensed data. Remote Sens. Environ. 2016, 186, 64–87. [Google Scholar] [CrossRef]
  142. Gini, R.; Passoni, D.; Pinto, L.; Sona, G. Use of Unmanned Aerial Systems for multi-spectral survey and tree classification: A test in a park area of northern Italy. Eur. J. Remote Sens. 2017, 16, 251–269. [Google Scholar]
  143. Lu, B.; He, Y. Species classification using Unmanned Aerial Vehicle (UAV)-acquired high spatial resolution imagery in a heterogeneous grassland. ISPRS J. Photogramm. Remote Sens. 2017, 128, 73–85. [Google Scholar] [CrossRef]
  144. Lisein, J.; Pierrot-Deseilligny, M.; Bonnet, S.; Lejeune, P. A Photogrammetric Workflow for the Creation of a Forest Canopy Height Model from Small Unmanned Aerial System Imagery. Forests 2013, 4, 922–944. [Google Scholar] [CrossRef] [Green Version]
  145. Iglhaut, J.; Cabo, C.; Puliti, S.; Piermattei, L.; O’Connor, J.; Rosette, J. Structure from motion photogrammetry in forestry: A review. Curr. For. Rep. 2019, 5, 155–168. [Google Scholar] [CrossRef] [Green Version]
  146. Lucieer, A.; Robinson, S.; Turner, D.; Harwin, S.; Kelcey, J. Using a Micro-UAV for Ultra-High Resolution Multi-Sensor Observations of Antarctic Moss Beds. In Proceedings of the XXII Isprs Congress 2012, Melbourne, Australia, 25 August–1 September 2012. [Google Scholar]
  147. Hellström, T.; Hohnloser, P.; Ringdahl, O. Tree Diameter Estimation Using Laser Scanner; Umeå Universitet: Umeå, Sweden, 2014; pp. 1–15. [Google Scholar]
  148. Esposito, F.; Rufino, G.; Moccia, A.; Donnarumma, P.; Esposito, M.; Magliulo, V. An Integrated Electro-Optical Payload System for Forest Fires Monitoring from Airborne Platform. In Proceedings of the 2007 Aerospace Conference, Big Sky, MT, USA, 3–10 March 2007; IEEE: Piscataway, NJ, USA, 2007. [Google Scholar]
  149. Suárez, J.C.; Smith, S.; Bull, G.; Malthus, T.J.; Donoghue, D.; Knox, D. The use of remote sensing techniques in operational forestry. J. Jpn. Soc. Surg. Hand 2005, 17, 190–193. [Google Scholar]
  150. Pajares, G. Overview and Current Status of Remote Sensing Applications Based on Unmanned Aerial Vehicles (UAVs). Photogramm. Eng. Remote Sens. 2015, 81, 281–330. [Google Scholar] [CrossRef] [Green Version]
  151. Nebiker, S.; Lack, N.; Abächerli, M.; Läderach, S. Light-weight multispectral UAV sensors and their capabilities for predicting grain yield and detecting plant diseases. In Proceedings of the International Archives of the Photogrammetry, Remote Sensing & Spatial Information Sciences, Prague, Czech Republic, 12–19 July 2016; p. 41. [Google Scholar]
  152. Balestrieri, E.; Daponte, P.; De Vito, L.; Picariello, F.; Tudosa, I. Sensors and Measurements for UAV Safety: An Overview. Sensors 2021, 21, 8253. [Google Scholar] [CrossRef]
  153. Li, L.; Chen, J.; Mu, X.; Li, W.; Yan, G.; Xie, D.; Zhang, W. Quantifying Understory and Overstory Vegetation Cover Using UAV-Based RGB Imagery in Forest Plantation. Remote Sens. 2020, 12, 298. [Google Scholar] [CrossRef] [Green Version]
  154. Dalponte, M.; Marzini, S.; Solano-Correa, Y.T.; Tonon, G.; Vescovo, L.; Gianelle, D. Mapping forest windthrows using high spatial resolution multispectral satellite images. Int. J. Appl. Earth Obs. Geoinf. 2020, 93, 102206. [Google Scholar] [CrossRef]
  155. Shin, J.-I.; Seo, W.-W.; Kim, T.; Park, J.; Woo, C.-S. Using UAV Multispectral Images for Classification of Forest Burn Severity—A Case Study of the 2019 Gangneung Forest Fire. Forests 2019, 10, 1025. [Google Scholar] [CrossRef] [Green Version]
  156. Shoot, C.; Andersen, H.-E.; Moskal, L.M.; Babcock, C.; Cook, B.D.; Morton, D.C. Classifying Forest Type in the National Forest Inventory Context with Airborne Hyperspectral and Lidar Data. Remote Sens. 2021, 13, 1863. [Google Scholar] [CrossRef]
  157. Yang, S.; Hu, L.; Wu, H.; Ren, H.; Qiao, H.; Li, P.; Fan, W. Integration of crop growth model and random forest for winter wheat yield estimation from UAV hyperspectral imagery. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 6253–6269. [Google Scholar] [CrossRef]
  158. Maffei, C.; Lindenbergh, R.; Menenti, M. Combining multi-spectral and thermal remote sensing to predict forest fire characteristics. ISPRS J. Photogramm. Remote Sens. 2021, 181, 400–412. [Google Scholar] [CrossRef]
  159. Still, C.; Powell, R.; Aubrecht, D.; Kim, Y.; Helliker, B.; Roberts, D.; Richardson, A.D.; Goulden, M. Thermal imaging in plant and ecosystem ecology: Applications and challenges. Ecosphere 2019, 10, e02768. [Google Scholar] [CrossRef] [Green Version]
  160. Coops, N.C.; Tompalski, P.; Goodbody, T.R.; Queinnec, M.; Luther, J.E.; Bolton, D.K.; White, J.C.; Wulder, M.A.; van Lier, O.R.; Hermosilla, T. Modelling lidar-derived estimates of forest attributes over space and time: A review of approaches and future trends. Remote Sens. Environ. 2021, 260, 112477. [Google Scholar] [CrossRef]
  161. Meng, P.; Wang, H.; Qin, S.; Li, X.; Song, Z.; Wang, Y.; Yang, Y.; Gao, J. Health assessment of plantations based on LiDAR canopy spatial structure parameters. Int. J. Digit. Earth 2022, 15, 712–729. [Google Scholar] [CrossRef]
  162. Kamoske, A.G.; Dahlin, K.M.; Stark, S.C.; Serbin, S.P. Leaf area density from airborne LiDAR: Comparing sensors and resolutions in a temperate broadleaf forest ecosystem. For. Ecol. Manag. 2019, 433, 364–375. [Google Scholar] [CrossRef]
  163. Tanase, M.A.; Villard, L.; Pitar, D.; Apostol, B.; Petrila, M.; Chivulescu, S.; Leca, S.; Borlaf-Mena, I.; Pascu, I.S.; Dobre, A.C.; et al. Synthetic aperture radar sensitivity to forest changes: A simulations-based study for the Romanian forests. Sci. Total Environ. 2019, 689, 1104–1114. [Google Scholar] [CrossRef] [PubMed]
  164. Wang, R.; Liu, Y.; Müller, R. Detection of passageways in natural foliage using biomimetic sonar. Bioinspiration Biomim. 2022, 17, 056009. [Google Scholar] [CrossRef]
  165. Puliti, S.; Saarela, S.; Gobakken, T.; Ståhl, G.; Næsset, E. Combining UAV and Sentinel-2 auxiliary data for forest growing stock volume estimation through hierarchical model-based inference. Remote Sens. Environ. 2018, 204, 485–497. [Google Scholar] [CrossRef]
  166. Goodbody, T.R.H.; Coops, N.C.; White, J.C. Digital Aerial Photogrammetry for Updating Area-Based Forest Inventories: A Review of Opportunities, Challenges, and Future Directions. Curr. For. Rep. 2019, 5, 55–75. [Google Scholar] [CrossRef] [Green Version]
  167. Burdziakowski, P.; Bobkowska, K. UAV Photogrammetry under Poor Lighting Conditions—Accuracy Considerations. Sensors 2021, 21, 3531. [Google Scholar] [CrossRef] [PubMed]
  168. Elsayed, M.; Mohamed, M. The impact of airspace regulations on unmanned aerial vehicles in last-mile operation. Transp. Res. Part D: Transp. Environ. 2020, 87, 102480. [Google Scholar] [CrossRef]
  169. CASA. Unmanned Aircraft and Rockets: Model Aircraft. Advisory Circular AC-101-3; Civil Aviation Safety Authority: Canberra, Australia, 2002.
  170. Celt, V.; Jurakić, G.; Mađer, M.; Toćić, H. Unmanned Aircraft Systems—Successful Usage Limited by the Regulation? In Proceedings of the International Symposium on Engineering Geodesy-Sig, Varaždin, Croatia, 20–22 May 2016. [Google Scholar]
  171. Salvano, D.P. Unmanned Aircraft Systems (UAS)—Regulatory Policy and Processes: A Moving Landscape—A US Perspective; John Wiley & Sons, Ltd.: Hoboken, NJ, USA, 2016. [Google Scholar]
  172. Joosen, R.; Haverland, M.; de Bruijn, E. Shaping EU agencies’ rulemaking: Interest groups, national regulatory agencies and the European Union Aviation Safety Agency. Comp. Eur. Politics 2022, 20, 441–442. [Google Scholar] [CrossRef]
  173. Lamon, M. Remotely Piloted Aircraft Systems: The future of aviation. Rev. Derecho Transp. Terr. Marítimo Aéreo Multimodal 2022, 29, 151–167. [Google Scholar]
  174. C/2019/3824; Commission Implementing Regulation (EU) 2019/947 on the Rules and Procedures for the Operation of Unmanned Aircraft. European Commission: Brussels, Belgium, 2019; p. 4571. Available online: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32019R0947 (accessed on 24 May 2019).
  175. C/2019/182; Commission Delegated Regulation (EU) 2019/945 of 12 March 2019 on Unmanned Aircraft Systems and on Third-Country Operators of Unmanned Aircraft System. European Commission: Brussels, Belgium, 2019; pp. 1–40. Available online: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:32019R0945 (accessed on 12 March 2019).
  176. Jarus. JARUS Guidelines on Specific Operations Risk Assessment (SORA); SORA STS-01; Jarus: Rome, Italy, 2019; Available online: http://jarus-rpas.org/content/jar-doc-06-sora-package (accessed on 6 March 2019).
  177. Jarus. JARUS Guidelines on Specific Operations Risk Assessment (SORA); SORA Version 2.5; Jarus: Rome, Italy, 2022; Available online: http://jarus-rpas.org/jarus-external-consultation-sora-version (accessed on 2 December 2022).
Figure 1. Typical prototypes of fixed-wing and rotary wing UAVs. (a) Single rotor UAV, (b) Multi-rotor UAV, (c) Fixed-wing UAV, (d) Hybrid UAV.
Figure 2. Different types of UGVs with (a) wheels (Figure from [73]) (b) tracks (Figure from [74]) (c) legs (Figure from [75]) (d) arm robot (Figure from [76]).
Figure 3. Schematic views of collaborations between UAVs and UGVs (Figure from [82]).
Figure 4. Diagram overview of forestry multi-robot strategy (Figure from [87]).
Figure 5. Major unmanned system-supported applications in forestry.
Figure 6. Typical applications of Tree quantification. (a) Individual tree detection (Figure from [100]). (b) Result of image segmentation (Figure from [102]). (c) Tree height growth measurements (Figure from [103]). (d) Mobile platform (Figure from [109]).
Figure 7. Classification results of tree health conditions (Figures from [113]). (a) RGB, (b) images in background.
Figure 8. (a) Hot-spot detection and identification using image fusion (Figure from [117]), (b) Views of flight mission in burning area (Figure from [10]), and (c) Forest fire detections from aerial perspective (Figure from [124]).
Figure 9. Deployment and workflow of Ranger-Scout multi-robot system. ➀ Multi-purpose Ranger ➁ Scouts (Figure from [87]).
Figure 10. Identification result of wildlife. (a) Result footage for kangaroos (Figure from [22]), (b) Image inverted (Figure from [22]), (c) White-bellied Sea Eagles (Figure from [132]), (d) Proboscis Monkey (Figure from [132]).
Figure 11. Views of (ac) illumination variations in performing wild red deer monitoring task in the forest, Liaoning Province, and (df) dynamic variations of target objectives (highly dynamic wildlife creatures in the wild environment).
Table 1. Defined autonomy level of the aircraft system by NASA.
Grades | Control Mode | Description | Characteristic
0 | Remote control | Flight by operator on the ground (100% control time) | Manually controlled aircraft
1 | Simple automation | Perform tasks under the supervision of an operator with the assistance of an automatic control device (80% control time) | Autopilot instrument
2 | Remote automatic operation | Execute preprogrammed tasks (50% control time) | Fly according to the preset waypoints
3 | Semi-automatic | Perform complex tasks autonomously; has environmental awareness; makes routine decisions (20% control time) | Automatic takeoff/landing; the task can continue after the link is broken
4 | Fully autonomous control | Have broad situational awareness and the ability and authority to make comprehensive decisions (<5% control time) | Automatic task planning; has the ability to cooperate with other units or systems
5 | Cooperative control | Multiple UAVs work in teams | Learning by itself and has the ability of self-organization and coordination
Table 2. Harvesting systems for various mechanization levels and tasks.
Categories | Application | Utilized Equipment | Harvesting
Motor-manual (MM) full tree | Tree felling, extraction, and processing | Chainsaw, cable skidder, delimber | FT
Fully mechanized (FM) full tree | Tree felling, extraction, and processing | Feller, buncher, skidder, delimber/processor | FM, FT
Motor-manual tree length | Tree felling, delimbering, and extraction | Chainsaw, skidder | MM, TL
Motor-manual cut-to-length | Tree felling, extraction, and processing | Chainsaw, forwarder | MM, CTL
Fully mechanized cut-to-length | Tree felling, extraction, and processing | Harvester, forwarder | FM, CTL
Table 3. Locomotion feature comparison of ground vehicles (NA: Not available).
Feature | Definition | Wheeled | Tracked | Legged
Maximum speed | Maximum speed on flat and compact surfaces in the absence of obstacles | High | Medium/high | Low/medium
Obstacle crossing | Capability of crossing obstacles with random shapes in unstructured environments (e.g., rocks) | Low | Medium/high | High
Step/stair climbing | Capability of climbing up single steps and stairs in environments structured for humans | Low | Medium | High
Slope climbing | Capability of climbing compact slopes with a sufficient friction coefficient (>0.5) | Low/medium | High | Medium/high
Walking capability (soft terrain) | Capability of walking on soft and yielding terrains (e.g., sand) | Low | High | Low/medium
Walking capability (uneven terrain) | Capability of walking on uneven terrains (e.g., grassy ground, rocky ground) | Low | Medium/high | High
Energy efficiency | Energy efficiency in normal operating conditions, on flat and compact terrains | High | Medium | Low/medium
Mechanical complexity | Level of complexity of the mechanical architecture | Low | Low | High
Control complexity | Level of complexity of the control system (hardware and software) | Low | Low | High
Technology readiness | Level of maturity of the necessary enabling technologies | Full | Full | Full/in progress
Table 4. Ikhana (and its predecessor, Altair) during the 2006–2009 fire seasons [121].
Year | Aircraft | Flights | Hours | Wildfire Mission
2006 | Altair | 4 | 68 | Mono Lake Prescribed Fire
2007 | Ikhana | 12 | 89 | Columbine, Hardscrabble
2008 | Ikhana | 4 | 21 | North Mountain, American River
2009 | Ikhana | 2 | 11 | Piute, Station Fire
Table 5. UAV onboard sensors: auxiliary and specific [150].
Auxiliary:
  • GPS
  • IMU
  • Gyroscopes
  • Accelerometers
  • Altimeters
  • Video stabilizer
  • Image transmitter
  • Communication antennas (VHF, UHF)
  • Communication modems
Specific:
  • Video cameras (visible spectrum): EOS, stereoscopic, omnidirectional, fisheye lens.
  • Thermal cameras
  • Infrared cameras
  • FLIR
  • LIDAR (Laser scanner)
  • Multi-Hyperspectral (HyperUAS)
  • Irradiance
  • Radar/SAR
  • Radiometer (multi-frequency)
  • Infrared spectroscopy
  • Electronic nose
  • VCSEL
  • WMS
  • Ultraviolet spectrometer
  • Multi-gas detector
  • Sonar
  • Smartphone
  • Particle counters (optical, condensation)
  • Photometer, aethalometer
  • Aerosol sampling
  • Probes (temperature, humidity, and pressure)
  • Cloud droplet spectrometer
  • Pyranometer
  • Electrostatic collector
  • Radiation gauge
  • Magnetic sensor
  • Ultraviolet flame detector
  • Gas/smoke detector
Table 6. UAVs samples used for forestry applications (NA: Not Available).
Sample | Platform | Type | Objective | Application | Country | Area | Camera
Midhun [100] | DJI Phantom 3 | Rotary-wing | Live plant | Individual tree detection | USA | 32 ha | Compact RGB digital
Charton [100] | Swinglet CAM | Fixed-wing | Stack and logs in sawmill | Wood volume extraction | Canada | 50 ha | Digital RGB
Lin [102] | Microdrone Md4-200 | Rotary-wing | Urban forestry and greening | Individual tree detection | Finland | NA | Digital compact
Dash [111] | Aeronavics SkyJib | Rotary-wing | Pinus radiata D. Don trees | Assessing forest health | New Zealand | 2.7 ha | MicaSense RedEdge 3
Dempewolf [103] | DJI Phantom 3 Pro | Rotary-wing | Deciduous species | Tree height measurement | Germany | 2 ha | RGB camera
Birdal [104] | eBee | Fixed-wing | Black and Scots pines | Tree height estimating | Turkey | 1 ha | Canon IXUS 127 HS
Luis [22] | DJI S800 | Rotary-wing | Koala, deer, and kangaroo | Wildlife monitoring | Australia | NA | Thermal camera/RGB camera
Gini [136] | Microdrone MD4-200 | Rotary-wing | Deciduous species | Tree classification | Italy | 1 ha | Pentax Optio A40/Sigma DP1
Puliti [165] (2018) | eBee | Fixed-wing | Deciduous species | Stock volume estimation | Norway | 7300 ha | Canon IXUS/ELPH
Nicolas [134] | eBee | Fixed-wing | Large mammals | Wildlife monitoring | Namibia | 10,300 ha | Compact camera
Pierrot [144] (Lisein J. et al., 2013) | Gatewing X100 | Fixed-wing | Deciduous species | Forest canopy modelling | Belgium | 200 ha | Ricoh GR3 still camera
Michael [10] | The Vector P | Fixed-wing | Wildfires | Wildfire monitoring | USA | NA | Color/infrared cameras