Review

Decomposition and Modeling of the Situational Awareness of Unmanned Aerial Vehicles for Advanced Air Mobility

by Sorelle Audrey Kamkuimo 1,*, Felipe Magalhaes 1, Rim Zrelli 1, Henrique Amaral Misson 1, Maroua Ben Attia 2 and Gabriela Nicolescu 1
1 Computer Engineering and Software Engineering Department, Polytechnique Montreal, Montreal, QC H3T 1J4, Canada
2 Humanitas Solutions, 276 Rue Saint-Jacques #300, Montreal, QC H2Y 1N3, Canada
* Author to whom correspondence should be addressed.
Drones 2023, 7(8), 501; https://doi.org/10.3390/drones7080501
Submission received: 14 June 2023 / Revised: 27 July 2023 / Accepted: 27 July 2023 / Published: 1 August 2023
(This article belongs to the Special Issue Conceptual Design, Modeling, and Control Strategies of Drones-II)

Abstract

The use of unmanned aerial vehicles (UAVs) is governed by strict regulatory frameworks that prioritize safety. To guarantee safety, it is necessary to acquire and maintain situational awareness (SA) throughout the operation. Existing Canadian regulations require pilots to operate their aircraft within the visual line-of-sight. Therefore, the task of acquiring and maintaining SA primarily falls to the pilots. However, the development of aerial transport is entering a new era with the adoption of a highly dynamic and complex system known as advanced air mobility (AAM), which involves UAVs operating autonomously beyond the visual line-of-sight. SA must therefore be acquired and maintained primarily by each UAV through specific technologies and procedures. In this paper, we review these technologies and procedures in order to decompose the SA of the UAV in the AAM. We then use the system modeling language to provide a high-level structural and behavioral representation of the AAM as a system with the UAV as its main entity. In a case study, we analyze one of the flagship UAVs of our industrial partner. Results show that this UAV does not have all of the technologies and methodologies necessary to achieve all of the identified SA goals for the safety of the AAM. This work is a theoretical framework intended to contribute to the realization of the AAM, and we also expect it to impact the future design and utilization of UAVs.

1. Introduction

The transportation industry in Canada is undergoing a technological revolution following the development and adoption of the unmanned aircraft system (UAS) [1]. A UAS is made up of an unmanned aerial vehicle (UAV) communicating through suitable devices with operators located at a ground control station (GCS) [2]. Their exploitation is expected to result in significant economic benefits due to their involvement in various economic sectors, ranging from personal operation to more advanced operations including search and rescue, infrastructure inspections, land mapping, environmental management, crisis management and monitoring [3]. The use of UAS is governed by strict regulatory frameworks that prioritize safety [4]. To guarantee safety, it is necessary to acquire and maintain situational awareness (SA) throughout the operation [5,6]. Existing Canadian government regulations mainly concern UASs operating in the visual line-of-sight (VLOS), i.e., within maximum horizontal and vertical limits of 500 m and 120 m, respectively [5,7]. During these operations, the pilot must keep an imaginary and uninterrupted straight line between him or herself and the UAV, in order to control its position in relation to its surroundings. In certain scenarios where weather or environmental conditions lead to a loss of visibility within the spatial limits defined for VLOS, the pilot can be assisted by other operators, generally referred to as visual observers [5]. Their task is to maintain SA in operating contexts where the SA of the primary pilot is not optimal. They communicate in real time with the main pilot to transmit information relevant to the operation [5]. The presence of visual observers can also allow operations to be conducted in extended visual line-of-sight (EVLOS), which is defined by an extension of the authorized spatial limits for VLOS [7,8,9].
Reflecting this focus on VLOS and EVLOS, past decompositions of SA for UAS operations placed the human at the center of the operation, making them the primary entity responsible for decision making [6]. However, aerial transport is entering a new era, in which UAVs will be able to serve large urban, suburban and rural areas through the application of advanced air mobility (AAM) [1]. AAM involves performing operations beyond the visual line-of-sight (BVLOS), i.e., beyond the spatial limits which define VLOS and EVLOS [1,10]. Figure 1 illustrates the spatial limits of VLOS, EVLOS and BVLOS.
AAM is characterized by the adoption of a highly dynamic and complex air transport system, in which a network of UAVs is required to operate autonomously to accomplish their missions [10]. The development of a regulatory structure that covers all aspects of AAM requires a deep analysis of the problem. At the request of the National Aeronautics and Space Administration (NASA), the National Academies of Sciences, Engineering, and Medicine (NASEM) conducted a study to assess the issues and challenges associated with the AAM vision [10]. The NASEM study found that, in addition to the regulatory and societal issues, close attention should be paid to the technical and technological aspects of the safe management of AAM [10]. In anticipation of these needs, NASA has set up the UAS traffic management system: a platform that takes advantage of digitalization to facilitate automatic air traffic management [11,12]. The UAS traffic management system assumes that UAVs operating in AAM are intended to be autonomous, i.e., capable of making decisions and taking actions while communicating directly with each other, with the GCS and with other airspace occupants, in order to achieve operational safety requirements. Safety issues can only be addressed if the task of acquiring and maintaining SA is not limited to the operators located at the GCS, but is also carried out individually by each UAV [10]. This includes responsibility for making decisions and performing actions. There is, therefore, a need to analyze and understand the key aspects of SA, as they relate to AAM. Such an analysis is based on the principle that AAM is a dynamic context in which each autonomous UAV must adjust itself according to its environment in order to ensure its own safety, that of other occupants of the airspace and that of other elements of the environment, in the air or on the ground [10]. The purpose of this article is, therefore, to propose a theoretical framework that represents a SA model associated with the UAV in AAM. With this work, we intend to contribute to research which will allow the realization of AAM, and we also expect to impact the safety of the current and future UAV operations on two main levels. Firstly, the SA decomposition contained in this paper may be used as a safety checklist for UAV users before BVLOS operations; and secondly, this contribution may inform the establishment or the updating of regulations relating to current BVLOS operations, and future operations in the AAM system.
This paper is structured as follows. We begin by discussing SA in Section 2. Then, in Section 3, we review the technologies and procedures that are currently used or that should be developed to facilitate the autonomy of UAVs. From there, we extract SA information relevant to the safety of UAV operations and present it in a goal-directed task analysis (GDTA) diagram (see Section 4). In Section 5, we use the system modeling language (SysML) to produce a high-level representation of the AAM. To this end, we present an overall structural vision of the AAM system in a block definition diagram (BDD) which highlights all of the entities involved in the system and their relationship to the UAV. We continue with an activity diagram in which we model the overall behavior that UAVs in AAM should exhibit, in order to achieve the SA goals described in Section 4. In Section 6, we present a case study in which we analyze a UAV to determine whether the technologies it uses are sufficient to achieve all of the SA requirements necessary for safe operations. We then discuss this analysis and make some suggestions for the further use of this work. Finally, we end this paper with a conclusion in which we present some directions to be explored in future research.

2. Situational Awareness

The concept of situational awareness (SA) applies in operational conditions which require, from one or more entities, the perception, filtering and organization of information, in order to guide decision making [13]. This occurs in complex adaptive systems characterized by “a dynamic network of entities acting simultaneously while continuously reacting to each other’s actions” [14]. The proper functioning of such systems requires communication and well-developed coordination between the various entities that constitute them. Interactions are, therefore, carried out according to the knowledge that each entity has of the system and the environment at each moment. The work of Endsley et al. [15] on SA in dynamic systems has long been considered a point of reference in this field, and specifies that the process of acquiring and maintaining SA takes place at three levels [15,16]:
  • Perception of the status, attributes and dynamics of the elements of the environment that are relevant to understanding a specific situation.
  • Understanding of the meaning and importance of the elements perceived in the first stage, depending on the situation and the intended goals.
  • Projection which facilitates proactive decision making and the anticipation of possible consequences through prediction of the future state of the situation according to the dynamics of the elements perceived and understood at the two previous levels [15,16].
These three levels of SA guide the decisions to be made and the actions to be taken to achieve the goals of the entity concerned. Figure 2 presents a simplified representation of the SA model devised by Endsley et al. [15].
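To make the three levels concrete, the following minimal Python sketch (our own illustration, not taken from [15]) treats SA acquisition as a pipeline over a simplified one-dimensional airspace: raw readings are perceived as tracked elements, comprehended as separation distances, and projected forward in time.

```python
# A minimal sketch of Endsley's three SA levels as a processing pipeline.
# The element names, units and 1-D simplification are illustrative only.
from dataclasses import dataclass

@dataclass
class Element:
    name: str          # e.g., "intruder aircraft"
    position: float    # simplified 1-D position, metres
    velocity: float    # metres per second

def perceive(raw_readings: list[dict]) -> list[Element]:
    """Level 1: turn raw sensor readings into tracked elements."""
    return [Element(r["name"], r["pos"], r["vel"]) for r in raw_readings]

def comprehend(elements: list[Element], own_position: float) -> dict[str, float]:
    """Level 2: derive situation-relevant meaning (here, separation distance)."""
    return {e.name: abs(e.position - own_position) for e in elements}

def project(elements: list[Element], own_position: float,
            horizon_s: float) -> dict[str, float]:
    """Level 3: predict future separation from current dynamics."""
    return {e.name: abs((e.position + e.velocity * horizon_s) - own_position)
            for e in elements}

readings = [{"name": "balloon", "pos": 120.0, "vel": -4.0}]
elements = perceive(readings)
print(comprehend(elements, own_position=0.0))   # {'balloon': 120.0}
print(project(elements, 0.0, horizon_s=10.0))   # {'balloon': 80.0}
```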
In this paper, we address the issue of SA in AAM from a safety perspective. In fact, the basis for the AAM vision as proposed by the NASEM ranks safety as the highest-priority consideration in the design and planning of AAM systems [10]. As NASA states, the AAM is positioned to be an alternative to everyday transport [17]. The main aircraft operating in AAM, autonomous UAVs, have limitations in size and mass, ranging from small cargo-carrying drones to passenger-carrying air taxis [17]. These aircraft carry out short-range missions in urban and rural areas not usually accessible to traditional airplanes [17,18]. Such missions include aerial observation, crop monitoring and treatment, package delivery, people transport, aerial photography, agriculture, mapping and inspection of buildings and power lines [17]. Given that such missions will be integrated into daily life activities and environments, foregrounding safety would promote the success of the AAM and its acceptance at the economic and social levels [10]. Thus, as an autonomous entity in the AAM system, a UAV must maintain “a safe flight operational state despite the absence of a highly qualified onboard flight crew” [10]. This implies that the UAV must always be aware of what is happening throughout the situational context of the operation. To that end, it needs to perceive and understand the dynamics of the AAM system in order to make effective decisions and take necessary actions without human oversight.
Thus, to better address the issue of SA in AAM from a safety perspective, we assume that each autonomous UAV must consider itself to be the central entity in the system. As such, it is responsible for ensuring its own safety and that of other airspace occupants. To do so, it must be equipped with a set of appropriate technologies and procedures which allow it to acquire and maintain good SA. In the next section, we analyze the technologies and procedures currently being developed for BVLOS operations, which will therefore be necessary in AAM.

3. Analysis of Technologies and Procedures for beyond Visual Line-of-Sight Operations

Recent work has reviewed the technologies and procedures used to maintain control over operations beyond visual line-of-sight (BVLOS), and to capture, analyze and convey information relevant to the safety of BVLOS operations [7,8,9,19,20]. We have analyzed these technologies and procedures and categorized them into four main groups: command and control of the UAV, detect and avoid (DAA), weather detection and knowledge of the state of the UAV. We now describe each of these groups.

3.1. Command and Control of the UAV

Although UAVs must be autonomous in AAM, the presence of one or more qualified humans at the ground control station (GCS) remains essential to maintain the possibility of commanding and controlling the UAV, so that they can act quickly when unexpected events occur that are beyond the control of the UAV. An example of such an event would be a cyber attack on the UAV’s system. Indeed, given the current challenges related to cybersecurity and the fact that AAM UAVs will be software-dependent, the AAM system is expected to be a tempting target for attacks [10]. If the software of a UAV is attacked, that UAV may behave unsafely, and human intervention will become necessary to maintain the safety of the AAM. Ensuring this requires command-and-control (C2) links to be maintained between the GCS and the autonomous UAV throughout the duration of the operation [21]. The main C2 links are the autopilot–GCS link, the manual radio control (RC)–GCS link, and the first-person view (FPV) link [19].
The autopilot–GCS communication link allows the human located at the GCS to maintain control over the general mission parameters of the UAV, such as its spatial location, trajectory and mission progress [22,23]. The RC–GCS link allows the human to retain the possibility of taking manual control of the aircraft at all times [24]. If one of these links is lost, an emergency procedure such as return-to-land (RTL) or a flight termination system (FTS) may be triggered [19,25,26]. The RTL procedure commands the UAV to return when the autopilot–GCS link has been lost for a certain period of time (about ten seconds) [19]. The goal of the RTL procedure is to try to re-establish this link and prevent the FTS procedure from being triggered [19,25,26]. If the autopilot–GCS link is lost for too long (about two minutes, for example) and the RC–GCS link is also lost, the FTS procedure stops the UAV’s motor and leads to the safe destruction of the aircraft [19,26]. These procedures help to prevent the UAV from flying outside authorized limits without supervision. Authorized flight limits are defined by a geo-fence function, which fixes the minimum and maximum altitudes of the flight and delimits the spatial area of operation using a set of GPS points around the intended flight area [19]. FTS and RTL procedures can also be manually triggered by the human at the GCS.
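As a rough illustration of this logic, the following Python sketch encodes the link-loss timeouts and the geo-fence check described above; the function names, fence format and exact thresholds are our own illustrative choices, not part of any real autopilot API.

```python
# A simplified sketch (ours, not a real autopilot API) of the link-loss
# logic described above: a sustained autopilot-GCS outage triggers RTL, and
# a prolonged outage combined with a lost RC-GCS link triggers FTS.
RTL_TIMEOUT_S = 10.0    # approximate outage duration before RTL [19]
FTS_TIMEOUT_S = 120.0   # approximate outage duration before FTS [19,26]

def select_emergency_action(autopilot_link_down_s: float,
                            rc_link_up: bool) -> str:
    """Return the procedure the UAV should run given current link status."""
    if autopilot_link_down_s >= FTS_TIMEOUT_S and not rc_link_up:
        return "FTS"        # stop the motor; safe destruction of the aircraft
    if autopilot_link_down_s >= RTL_TIMEOUT_S:
        return "RTL"        # return to land and try to re-establish the link
    return "CONTINUE"

def inside_geofence(lat: float, lon: float, alt_m: float, fence: dict) -> bool:
    """Check that the flight stays inside the geo-fenced operating volume."""
    return (fence["min_alt"] <= alt_m <= fence["max_alt"]
            and fence["lat_min"] <= lat <= fence["lat_max"]
            and fence["lon_min"] <= lon <= fence["lon_max"])

fence = {"min_alt": 0, "max_alt": 120,
         "lat_min": 45.49, "lat_max": 45.52,
         "lon_min": -73.62, "lon_max": -73.58}
print(select_emergency_action(12.0, rc_link_up=True))    # RTL
print(select_emergency_action(150.0, rc_link_up=False))  # FTS
print(inside_geofence(45.50, -73.60, 80.0, fence))       # True
```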
The FPV link is used to keep the human visually aware of the situation around the UAV [27,28]. The UAV is equipped with a camera that transmits a video stream to the GCS. This video stream allows the human at the GCS to perceive the scene as if they were on board the aircraft [27]. The human at the GCS is thus provided with information including (but not limited to) the aircraft’s position, altitude, speed, direction and power consumption [19]. FPV allows the human at the GCS to visually monitor the environment around the aircraft in order to, for example, avoid possible dangers by taking manual control of the aircraft when necessary [29]. To mitigate collision risks, recently produced UAVs integrate technologies and procedures which allow them to detect and avoid obstacles.

3.2. Detect and Avoid

During an operation, each UAV must ensure its own safety and that of other airspace occupants [30]. To do so, it must be equipped with technological devices and functions which allow it to detect objects in a limited spatial area in order to anticipate any risk of collision [19,30]. Commonly used technologies include radar for detecting non-cooperative traffic obstacles (buildings, hot air balloons, etc.) and automatic dependent surveillance-broadcast (ADS-B) for cooperative traffic detection [19,31,32]. For cooperative traffic, sensors allow the UAV to self-determine and broadcast its own spatial information, such as its position and altitude, while cooperating with other objects in the operating context to receive their information [19,33,34]. The detect and avoid (DAA) process can be summarized in three main steps: sense, detect and avoid [33]. The sense step is carried out by sensors which collect the spatial locations of obstacles, their speed and their rate of acceleration or deceleration [35]. Based on this information, the obstacles likely to collide with the UAV are identified in the detect step. In the avoid step, the UAV can reroute, or trigger the FTS procedure described in Section 3.1 above.
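A toy sketch of these three steps for a single obstacle might look as follows; the two-dimensional closest-point-of-approach test and the 50 m safety radius are illustrative assumptions, not values taken from the cited works.

```python
# A toy sketch of the three DAA steps (sense, detect, avoid) for one
# obstacle in 2-D, using a closest-point-of-approach test.
import math

def sense(own: dict, obstacle: dict):
    """Sense: relative position and velocity of the obstacle."""
    rx, ry = obstacle["x"] - own["x"], obstacle["y"] - own["y"]
    rvx, rvy = obstacle["vx"] - own["vx"], obstacle["vy"] - own["vy"]
    return rx, ry, rvx, rvy

def detect(rx, ry, rvx, rvy, safe_radius_m=50.0) -> bool:
    """Detect: will the obstacle come within the safety radius?"""
    rel_speed_sq = rvx**2 + rvy**2
    if rel_speed_sq == 0.0:
        return math.hypot(rx, ry) < safe_radius_m
    t_cpa = max(0.0, -(rx * rvx + ry * rvy) / rel_speed_sq)
    d_cpa = math.hypot(rx + rvx * t_cpa, ry + rvy * t_cpa)
    return d_cpa < safe_radius_m

def avoid(collision_predicted: bool) -> str:
    """Avoid: reroute on a predicted conflict, otherwise continue."""
    return "REROUTE" if collision_predicted else "CONTINUE"

own = {"x": 0, "y": 0, "vx": 10, "vy": 0}
intruder = {"x": 200, "y": 5, "vx": -10, "vy": 0}
print(avoid(detect(*sense(own, intruder))))  # REROUTE
```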

3.3. Detection of Weather Conditions

The impact of meteorological conditions on operations is a crucial element of AAM. Weather conditions such as clouds, mist, fog or rain can degrade the quality of information transmitted in FPV, resulting in poor SA [6]. In some cases, infrared cameras can improve SA around the UAV [36]. Other weather conditions such as low temperatures and strong winds can lead to batteries draining rapidly or the deterioration of electronic components [37]. Strong winds can compromise the safety of an operation by negatively influencing the stability of the UAV and the DAA [6,19,20,37]. This is because BVLOS operations include small vehicles with limited power, as well as low maximum take-off mass [6,19,20,37]. For SA of weather conditions to be effective, Jacob et al. [20] specifically proposed that aerial vehicles should be equipped with real-time predictive capabilities, in order to detect wind speed and direction. Then, weather conditions unfavorable to the continuation of operations could be detected, and the UAV could take a safe action such as waiting in a secure area or executing an RTL or FTS procedure [37]. In order to execute these corrective actions, it is imperative for the UAV to be in a good working condition. We highlight this issue in the next section.
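As a simple illustration of such weather-driven decision logic, the rule set below maps detected conditions to the corrective actions discussed above; the thresholds are invented for the example and are not regulatory or manufacturer values.

```python
# An illustrative rule set (thresholds are ours) for mapping detected
# weather to the corrective actions discussed above.
def weather_action(wind_speed_ms: float, visibility_m: float,
                   max_wind_ms: float = 12.0,
                   min_visibility_m: float = 500.0) -> str:
    """Pick a safe action when conditions degrade."""
    if wind_speed_ms > 1.5 * max_wind_ms:
        return "FTS"                # conditions well beyond platform limits
    if wind_speed_ms > max_wind_ms:
        return "RTL"                # head home before control margins erode
    if visibility_m < min_visibility_m:
        return "WAIT_IN_SAFE_AREA"  # hold until fog or mist clears
    return "CONTINUE"

print(weather_action(wind_speed_ms=14.0, visibility_m=2000.0))  # RTL
print(weather_action(wind_speed_ms=5.0, visibility_m=300.0))    # WAIT_IN_SAFE_AREA
```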

3.4. Awareness of the State of the UAV

The malfunction of one of the components of the UAV can represent a major risk for the safety of the AAM. Thus, in addition to the technologies and procedures for monitoring the external environment described above, the UAV must also be aware of its own state at every moment before and during the flight. Both the human [6] and the UAV itself must ensure the proper functioning of all of its essential components (batteries, motor, cameras, propellers, wheels, etc.) [38]. If one of these components is malfunctioning, remedial actions such as RTL or FTS procedures must be taken quickly to maintain the safety of the AAM.
Beyond the four categories of SA described in Section 3.1, Section 3.2, Section 3.3 and Section 3.4, the UAV must also have information on the characteristics of the operating area. More specifically, the UAV must know the spatial operating limits so that it can respect the boundaries of the authorized airspace, the topography and relief, and the maximum flight altitude achievable in the area, as well as the presence of external objects (water bodies, airfields, manned aircraft, and reserve landing sites for precautionary or emergency landings) [4,5].
In this section, we have analyzed the technologies and procedures necessary for BVLOS operations. This analysis allowed us to extract four general SA goals that each UAV should accomplish in the AAM. These goals are associated with the four groups of identified technologies and procedures (UAV command and control, DAA, weather detection and UAV status). In the next section, we represent these requirements in a decomposition of UAV SA from the AAM safety perspective.

4. Decomposition of UAV-Related Situational Awareness in Advanced Air Mobility

In this section, before presenting our UAV-related SA decomposition according to the AAM vision, we discuss a previous decomposition of SA, as it relates to UAV flight operations in general [6].

4.1. Related Work

An SA decomposition related to UAV operations has been previously published which approached SA from a human–UAV interaction perspective [6]. The SA requirements identified by that decomposition concerned the spatial location of the UAV vis-à-vis other airspace objects, the weather conditions around the UAV, the state of the UAV in terms of its logic (RTL and FTS), the UAV components (the camera, for example), the mission information, the commands necessary to direct the UAV and the UAV’s ability to execute those commands. Emphasis was placed on the SA of the human vis-à-vis the UAV. Indeed, the only requirements associated with the UAV were that it understood the commands received remotely from the pilot, and that it executed the pre-programmed safety procedures (RTL and FTS), if necessary.
This examination of SA in a flight operation context clearly separates the SA of the human from that of the UAV, considering the human as the central entity in the operation and the entity mainly responsible for acquiring and maintaining SA to ensure flight safety. The underlying logic gives the human primary responsibility for making decisions and taking actions. We speculate that such an approach to SA in the context of UAV operations was due to the UAV technologies available at that time. Today’s technology has made possible more autonomous UAVs that can perceive their environment themselves, analyze it, make decisions and act. This can potentially solve the problem of human error when acquiring SA during operations. Another SA analysis should therefore be performed which considers the UAV as the main entity and emphasizes its self-localization in the airspace, its communication with the GCS and its means of making decisions and undertaking actions to ensure the safety of BVLOS operations. Such an analysis is the focus of this work and is presented in the next section.

4.2. UAV-Related Situational Awareness in Advanced Air Mobility

Recent technological advances and the AAM vision mean that it is no longer a question of dividing the responsibility for SA between the UAV and the human, but of attributing it entirely to the UAV, which reports to the human located at the GCS. In addition, the UAV operating in AAM must be responsible for decision making and taking consequent actions, under the supervision of the human at the GCS. Thus, in this section, we present a new decomposition of the SA goals that UAVs operating in AAM must achieve to ensure safe operation. To perform this decomposition, we first consider the SA goals outlined in Section 4.1. In addition, we extract other SA requirements for consideration from the analysis of the technologies and procedures necessary for BVLOS detailed in Section 3, particularly DAA and communication with the GCS.
Following the approach proposed by Endsley et al. [13], designing for SA first requires defining the goals attached to the different tasks of the entity of interest. These goals can be presented in a goal-directed task analysis (GDTA) diagram. The GDTA diagram identifies and prioritizes goals and decisions, along with the elementary SA requirements necessary to resolve them [13]. The root of the GDTA tree is the main goal of the studied entity, the internal nodes are a hierarchy of sub-goals (represented by rectangular shapes) or decisions (represented by hexagonal shapes) and the leaves are the elementary SA requirements necessary for the achievement of these sub-goals and, consequently, the main goal. The goal, sub-goals and SA requirements are defined in relation to the other entities in the system [13].
As we have emphasized, the UAV is the central entity of the AAM system. Its main goal is to ensure the safety of the AAM. To achieve this, it must be able to: (1) communicate with the GCS, (2) detect and avoid obstacles, (3) monitor weather conditions, and (4) be aware of its state. We illustrate these four sub-goals in Figure 3, which represents the first level of the GDTA.
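For readers who prefer a computational view, the GDTA tree can be captured in a small data structure. The sketch below is our own illustration, with the first sub-goal expanded and the wording of its decision and leaf requirement invented for the example.

```python
# A minimal data structure (our own sketch) for a GDTA tree: goals and
# decisions are internal nodes; elementary SA requirements are leaves.
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    kind: str                       # "goal", "decision" or "requirement"
    children: list["Node"] = field(default_factory=list)

# First level of the GDTA (Figure 3), with one branch expanded.
gdta = Node("Ensure the safety of the AAM", "goal", [
    Node("1. Communicate with the GCS", "goal", [
        Node("How to ensure good communication with the GCS?", "decision", [
            Node("Status of the autopilot-GCS, RC-GCS and FPV links",
                 "requirement"),
        ]),
    ]),
    Node("2. Detect and avoid obstacles", "goal"),
    Node("3. Monitor weather conditions", "goal"),
    Node("4. Be aware of its state", "goal"),
])

def requirements(node: Node) -> list[str]:
    """Collect the elementary SA requirements (leaves) under a node."""
    if node.kind == "requirement":
        return [node.label]
    return [r for child in node.children for r in requirements(child)]

print(requirements(gdta))
# ['Status of the autopilot-GCS, RC-GCS and FPV links']
```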
Figure 4 illustrates the decomposition of the SA sub-goal (1): communicate with the GCS. To achieve this sub-goal, the UAV must be able to perceive the status of its C2 links (autopilot–GCS, RC–GCS, and FPV). If one link is malfunctioning, it must take corrective action to maintain the safety of the AAM (by adjusting its behavior, rerouting, or performing RTL or FTS procedures).
Figure 5 shows the decomposition of the SA goal (2): detect and avoid obstacles. This sub-goal is broken down into a decision (represented by a hexagonal shape) and two other sub-goals denoted as sub-goals 2.1 and 2.2. The UAV must decide how to ensure that the technologies necessary to detect and avoid cooperative and non-cooperative traffic (ADS-B and radar, respectively) are working properly. If there is a malfunction, it will take corrective action (RTL or FTS) to maintain the safety of the AAM.
In Figure 6 and Figure 7 it is assumed that radar and ADS-B are working as expected. In Figure 6, the UAV, thus, acquires its own spatial information (position, speed, altitude and acceleration).
In Figure 7 the UAV receives spatial information about obstacles. It then evaluates whether there is a risk of collision based on that information. If there is, it will execute a corrective action (adjusting its behavior, rerouting, or performing RTL or FTS procedures), in order to maintain the safety of the AAM.
In Figure 8 we decompose SA sub-goal (3): monitor weather conditions. To achieve this sub-goal, the UAV must check whether the weather radar is functioning correctly. If it is not the case, it must undertake a remediation action (rerouting or initiating an FTS or RTL procedure).
In Figure 9, it is assumed that the weather radar is functioning as expected. The UAV must then monitor the wind and other significant weather conditions such as cloud, mist, fog and rain. If at least one weather condition is likely to compromise the safety of the AAM, the UAV must execute a corrective action (waiting above a safe area or initiating an RTL or FTS procedure) to maintain the safety of the AAM.
In Figure 10, we decompose the UAV SA sub-goal (4): to be aware of its state. To achieve this sub-goal, the UAV must monitor the status of its components (e.g., its batteries, motor, propellers or FPV camera). If one or more components are malfunctioning, it must perform a corrective action to maintain the safety of the AAM (by initiating an RTL or FTS procedure).
Following this decomposition of the UAV’s SA into a goal and several sub-goals, we next propose an abstract model of the structure and behavior of the AAM system which places the UAV at its center.

5. High-Level SysML Modeling of the Advanced Air Mobility

System modeling language (SysML) is a general-purpose modeling language that can be used “to create and visualize models that represent many different aspects of a system” both at the hardware and software level [39,40]. SysML consists of a set of diagrams that can be used to represent the system requirements, structure and behavior, or to specify “analysis and verification cases used to analyse and verify the system” [40]. We use SysML to represent the structure and the behavior of the AAM system from a SA perspective by performing high-level and simplified modeling of that operating context, in which the UAV is considered to be the basic entity of the system. The structure of the AAM system is represented through a block definition diagram and the behavior, through an activity diagram.

5.1. Block Definition Diagram for the Advanced Air Mobility

Block definition diagrams (BDDs) are used to represent the structural organization of a system [40]. In a BDD, relationships between entities are represented by arrows. Arrows with a filled diamond at one end represent composition relationships (i.e., the block at the filled-diamond end of the arrow is composed of the block at the other end). Arrows with an empty diamond at one end indicate aggregation relationships (i.e., the block at the empty-diamond end of the arrow may contain the block at the other end). Arrows with a triangle at one end represent specialization relationships (i.e., the block at the triangle end is a generalization of the block at the other end). Simple arrows represent independent associations between two blocks.
Figure 11 is a BDD of the AAM system. Since the UAV is the main entity in the AAM system, our modeling of the structure of the system consists of representing its main physical components and the subsystems which enable it to achieve SA goals (the DAA management system, weather detection system, autopilot and the C2 links management system). The UAV’s SA depends on the reciprocity of the relationship between the UAV and the other components of the system. These other components include obstacles, weather, GCS, area of operation and landing points.
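The composition and association relationships of Figure 11 can be mirrored directly in code. In the sketch below (ours, heavily simplified), the UAV owns its SA subsystems (composition: they live and die with it), while the GCS exists independently and is merely associated with the UAV.

```python
# A structural sketch (ours, simplified from Figure 11) of the BDD
# relationships: composition for the UAV's subsystems, association for
# the independently existing GCS.
class DAASystem: ...
class WeatherDetectionSystem: ...
class Autopilot: ...
class C2LinksManager: ...

class GroundControlStation: ...

class UAV:
    def __init__(self, gcs: GroundControlStation):
        # Composition: subsystems are created and owned by the UAV.
        self.daa = DAASystem()
        self.weather = WeatherDetectionSystem()
        self.autopilot = Autopilot()
        self.c2 = C2LinksManager()
        # Association: the GCS is passed in and exists on its own.
        self.gcs = gcs

uav = UAV(GroundControlStation())
```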

5.2. Activity Diagram of the UAV in the Advanced Air Mobility

Activity diagrams are used to describe the behavior of a system in terms of control logic and actions [40]. An activity diagram consists of an initial node (represented by a solid circle), actions (represented by rectangles with rounded corners), control flows between actions (represented by arrows), decisions/merges (represented by diamonds) and one or more final nodes (represented by an empty circle containing a solid one). An activity diagram can also contain fork nodes, which split an action into a set of parallel sub-actions, each of which produces a token at the end of its execution. For the action initiated at the fork node to be considered complete, all of the tokens produced by the parallel sub-actions must converge at a join node, which produces the result of the action. Both fork and join nodes are represented by a thick bar.
Figure 12 represents, at a high level, the SA activities that the UAV must perform to ensure the safety of the AAM. The decisions which lead to these actions are based on the information shared between the blocks of the UAV. The actions are performed repeatedly and in real time until the operation is completed.
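The fork/join behavior of Figure 12 can be approximated by the following Python sketch, in which the four monitoring sub-activities run concurrently and synchronize before each flight step; the stub monitors and the fixed number of steps are illustrative simplifications of the real, open-ended loop.

```python
# A behavioural sketch (ours) of the fork/join loop in Figure 12: the four
# SA sub-activities run in parallel, synchronize at a final join, and the
# loop repeats until the operation is complete.
import asyncio

async def monitor_c2_links() -> bool: return True    # stubs standing in
async def monitor_threats() -> bool: return True     # for the four real
async def monitor_weather() -> bool: return True     # sub-activities
async def monitor_uav_state() -> bool: return True

async def sa_loop(operation_steps: int) -> None:
    airborne = False
    for _ in range(operation_steps):
        # Fork: run the four monitors concurrently; join: await all tokens.
        results = await asyncio.gather(monitor_c2_links(), monitor_threats(),
                                       monitor_weather(), monitor_uav_state())
        if not all(results):
            print("issue detected: call a sequence of generic actions")
            continue
        if not airborne:
            print("take off"); airborne = True
        else:
            print("fly")
    print("operation completed: land")

asyncio.run(sa_loop(operation_steps=3))
```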

5.2.1. The Sub-Activity Call a Sequence of Generic Actions

In Figure 13, we describe the generic actions underlying the sub-activity Call a sequence of generic actions. These are generic resolution actions that can be taken to resolve detected problems. We represent the sub-activity in this way because, according to the decomposition of SA performed in Section 4, the problems that may arise during an operation share the same set of possible solutions. These solutions are divided into two main categories: solutions that could lead to the interruption of the operation, and solutions that lead to the continuation of the operation. The actions in the second category consist, depending on the situation, of adjusting the behavior of the UAV, rerouting the UAV, or having the UAV wait above a safe area. The actions in the first category are the RTL and FTS emergency procedures.
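A minimal encoding of this two-way categorization might look as follows; the enumeration values and helper function are our own illustrative names.

```python
# A small sketch (ours) of the two categories of generic resolution
# actions: those that keep the operation going and those that end it.
from enum import Enum

class GenericAction(Enum):
    ADJUST_BEHAVIOR = "continue"
    REROUTE = "continue"
    WAIT_IN_SAFE_AREA = "continue"
    RTL = "interrupt"
    FTS = "interrupt"

def interrupts_operation(action: GenericAction) -> bool:
    return action.value == "interrupt"

print(interrupts_operation(GenericAction.REROUTE))  # False
print(interrupts_operation(GenericAction.FTS))      # True
```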

5.2.2. The Sub-Activity Monitor C2 Links Status

Figure 14 is a representation of SA actions and decisions relating to the UAV’s C2 links management subsystem. The main test to guide decision making concerns the status of each of the three links supported by this subsystem (i.e., RC–GCS, autopilot–GCS and FPV). If one or more of these links is lost or is not functioning correctly, an appropriate sequence of generic actions must be performed to resolve the issue in order to maintain the safety of the AAM. If all of the links are working correctly, the activity continues to the Final Join node to wait for synchronization with the parallel sub-activities.

5.2.3. The Sub-Activity Monitor Threats

Figure 15 presents SA actions and decisions relating to the DAA system. The first action of this sub-activity consists of evaluating the status of the radar and ADS-B systems. A decision is then made according to the status of these systems. If one or more of them is malfunctioning, an appropriate sequence of generic actions must be performed to resolve the issue, in order to maintain the safety of the AAM. However, if everything works correctly, the token produced by the DAA fork node splits into two parallel actions: the self-detection of the UAV’s spatial information and the detection of the spatial location of obstacles. Once this is completed, the resulting tokens synchronize at the DAA join node, and the resulting token is used to decide whether there is a risk of collision. If there is a risk of collision, an appropriate sequence of generic actions must be performed to resolve the issue, in order to maintain the safety of the AAM. If there is no risk of collision, the activity continues to the Final join node to wait for synchronization with the parallel sub-activities.

5.2.4. The Sub-Activity Monitor Weather Conditions

Figure 16 represents the actions and SA decisions related to the weather management system. The principle is the same as in the previous sub-activity. The first action consists of evaluating the status of the weather radar. A decision is then made according to this status. If the radar is malfunctioning, an appropriate sequence of generic actions must be performed to resolve the problem. However, if everything works correctly, the token produced by the Weather detection fork node splits into parallel actions that check parameters relating to the wind and to other weather conditions. If at least one of the weather conditions detected is unfavorable for the operation, an appropriate sequence of generic actions must be performed to resolve the issue. On the other hand, if weather conditions are generally favorable, the activity continues at the Weather detection join node and the token is sent to the Final join node to wait for synchronization with the parallel sub-activities.

5.2.5. The Sub-Activity Monitor the State of the UAV

Figure 17 represents the SA actions and decisions related to the state of the UAV. Here, the decision concerns the operation of one or more components of the UAV. If one or more components are at risk of compromising the safety of the operation, an appropriate sequence of generic actions must be performed to resolve the problem. If all of the components work correctly, the activity continues to the Final join node to wait for synchronization with the parallel sub-activities.
When the Final join node receives all the tokens from the four parallel sub-activities, it performs the synchronization and passes the resulting token to the next step. This transmission means that all necessary checks and adjustments have been made and the operation can continue. The next action is either the Fly action if the UAV has already taken off, or the take off action if not. At any time, the UAV will check if the operation has been completed and will land if it has; if it has not, the UAV will continuously verify that the safety of the AAM is not being compromised by checking that it has resolved the four main SA goals. The loop continues in this way until the operation is completed.
This activity diagram presents, at a high level, the internal process that the UAV must perform to consistently achieve the SA goals necessary for the safety of an operation.
In the next section, we present a case study as an example application of the work in this paper. In that case study, we analyze a specific UAV to verify whether the technologies it uses allow it to achieve the four SA goals described throughout this paper.

6. Case Study

For this research project we are working with an industrial partner which uses several types of UAVs, including the DJI Matrice 300 RTK released in May 2020. In this section, we analyze the DJI Matrice 300 RTK to determine whether the technologies it uses are sufficient to achieve the SA goals necessary for UAV operation in the AAM.

6.1. Analysis of the DJI Matrice 300 RTK

The dimensions of the DJI Matrice 300 RTK, when unfolded and excluding propellers, are 810 × 670 × 430 mm (L × W × H). It has a maximum takeoff weight of 9 kg and can operate for a maximum flight time of 55 min at a maximum speed of 23 m/s in manual mode and 17 m/s in automatic mode. It can operate at temperatures between −20 °C and 50 °C (−4 °F to 122 °F) and has a maximum wind resistance of 15 m/s (12 m/s when taking off or landing). Table 1 shows the technologies used by this UAV to acquire and maintain SA according to the four main SA goals described above.

6.2. Results and Discussion

The specifications of the DJI Matrice 300 RTK indicate that it has the technologies necessary to autonomously achieve SA goals related to the command and control of the UAV and the DAA. The specifications for the remote controller describe the radio frequencies used and other associated characteristics such as the maximum transmitting distance and the maximum effective isotropic radiated power. The 2.4000–2.4835 GHz radio frequency is commonly used for manual radio control from GCS [19], and the 5.725–5.850 GHz radio frequency is used for the real-time airborne video transmission associated with FPV [19]. The DAA specifications described relate to the vision system and the sensors available to scan objects around the UAV in order to gain their spatial information [41]. Some physical parts are also described in the specifications list, including the batteries, propellers, and wheelbase.
However, we found some technological shortcomings. Firstly, the autopilot–GCS C2 link is not explicitly described. Secondly, the available specifications do not mention any technology for the detection of weather conditions. We note for example that there is a specification describing maximum wind resistance of the UAV, but there is no description of technologies for monitoring the characteristics of the wind. Thirdly, although the specifications detail some of the physical parts of the UAV, no description is provided of the procedures employed by the UAV to self-check the status of these physical parts (e.g., procedures to check battery level).
We have devised some hypotheses that may explain each of these shortcomings. With regard to the lack of a description of the autopilot–GCS link, we assume that, since this is a UAV suitable for BVLOS operations, it must have an autopilot, and that the autopilot’s link with the GCS can be implicitly attested to by the statuses of the other C2 links (RC–GCS and FPV). Nevertheless, the authors of [19] specify that a 900 MHz C2 link is generally used for the autopilot–GCS link. The lack of a description of the procedures for evaluating the state of physical components could be because they are integrated, with this information provided immediately to the GCS during the operation (https://www.youtube.com/watch?v=7VOvYxGUYMU, accessed on 13 July 2023). Finally, the absence of weather detection technologies may be due to the current unavailability of technologies, such as miniaturized weather radar systems, which fit the size, weight and power requirements of small UAVs [20].
Considering the above results, it is important for a pilot using the DJI Matrice 300 RTK to monitor and assess, at all times during the operation, the characteristics of wind, cloud, mist, fog and rain. This could be conducted through the FPV or by analyzing other SA elements (for example, UAV instability or rapid battery drain could be caused by high winds or low temperatures). A lack of pilot attention to these uncovered SA requirements could have disastrous consequences. Indeed, in a previous study in which the human was the main entity responsible for acquiring and maintaining SA, the authors performed an experiment whose objective was to examine the incidents encountered during operations in order to identify the SA faults that caused them [6]. According to the results, incidents were associated with poor acquisition of SA by the human operator; in particular, the human lacked good SA of the UAV’s spatial location, the mission information, the weather conditions around the UAV, and the logic of the UAV. These incidents included: the UAV becoming stuck in orbit, confusion when trying to avoid collisions, imprecise tracking of spatial locations, inconsistency between the camera and control displays, operational information being misremembered by the human, and UAV crashes.

7. Conclusions

As the AAM vision develops, the future of aerial transport promises to be dense, complex and highly dynamic. As a result, operations could quickly prove to be dangerous for UAVs, other occupants of the airspace, people participating in the operation, civilians, and material goods on the ground. Major safety issues must be foregrounded and fully considered. During operations, the task of acquiring and maintaining SA must be carried out by the UAV and reported to the human at the GCS, as the UAV is the main autonomous entity of the AAM system. This requires technologies and procedures that allow the UAV to self-locate in the airspace, communicate with the GCS and other airspace occupants, and autonomously make decisions and take actions to effectively manage the operation. In this article, we have analyzed these technologies and procedures and we have presented, in a GDTA, a decomposition of the SA of a UAV in the AAM. We have then carried out SysML modeling to represent, at a high level, the structure and behavior of the AAM system while considering the UAV as its central entity.
We have also presented a case study in which we analyzed the DJI Matrice 300 RTK, one of the flagship UAVs of our industrial partner. This analysis had two main objectives: to present an example of the exploitation of this work in current operations, and to determine whether the analyzed UAV has technologies which could allow it to acquire and maintain good SA so that it could be deemed safe for the AAM. This analysis allowed us to highlight that the UAV did not have any technology to achieve the SA goal related to weather conditions. Thus, our conclusion is that the DJI Matrice 300 RTK cannot achieve all of the SA goals necessary to ensure the safety of the AAM. We have therefore suggested that, when performing a BVLOS operation with that UAV, the user must ensure that the uncovered SA goals are achieved by other means.
The work in this paper takes place upstream, in the design and planning phase of the AAM, and is intended to provide a theoretical framework to guide reflection towards achieving the safety objective for the realization of the AAM. We believe that the decomposition of SA performed in this paper advances research in the UAV field. We also consider that this study can be used by UAV designers to identify the SA safety requirements that a UAV must cover, in order to include appropriate technologies to satisfy these requirements. In addition, we propose that a section in the general specifications of UAVs should be clearly focused on the description of SA specifications. Such a section should individually identify each SA goal and clearly specify the technologies which the UAV uses (or does not use) to achieve it. We also recommend that users assess the SA goals that can be achieved by each UAV before beginning an operation. This will allow them to pay particular attention to the goals which are not achieved during the operation. Such an assessment will also involve taking into consideration the air risk classes involved in the operation, in order to measure the associated risk ratio [42]. Finally, we believe that this work may add value to the information required to develop regulations for BVLOS operations, which are currently only authorized on a case-by-case basis. The next major step of our research will be to implement the SA module to be included in the UAV’s embedded system.
Starting from the high-level modeling presented in this paper, we will add appropriate details to produce lower-level models using more robust formal methods suitable for the verification and validation of safety-critical systems. This will be conducted either by adapting the transformation approaches already existing in the literature [43,44,45,46,47], or by proposing a new transformation approach. Future research should also focus on analyzing distributed SA, in which UAVs work together towards a common goal as a swarm. In addition, for the evaluation of SA in an operational context, it would be interesting to propose an approach for generating various scenarios simulating situations that may be encountered when undertaking an operation using a single UAV or a swarm of UAVs.

Author Contributions

Under the supervision of G.N. and M.B.A., S.A.K., R.Z. and H.A.M. conducted the study. S.A.K. was in charge of conceptualization, methodology and writing the original draft. Review and editing were performed by F.M. and S.A.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by Mitacs (grant No. MITACS IT12973).

Data Availability Statement

No data available.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Transport Canada. Transport Canada’s Drone Strategy to 2025; Transport Canada: Ottawa, ON, Canada, 2021.
  2. Puliti, S.; Ørka, H.O.; Gobakken, T.; Næsset, E. Inventory of small forest areas using an unmanned aerial system. Remote Sens. 2015, 7, 9632–9654.
  3. Sweeney, N. Civilian Drone Use in Canada. 2017. Available online: https://lop.parl.ca/sites/PublicWebsite/default/en_CA/ResearchPublications/201723E? (accessed on 22 April 2022).
  4. Government of Canada. Canadian Aviation Regulations (SOR/96-433); Government of Canada: Ottawa, ON, Canada, 2022.
  5. Transport Canada. Aeronautical Information Manual; Transport Canada: Ottawa, ON, Canada, 2021.
  6. Drury, J.L.; Riek, L.; Rackliffe, N. A decomposition of UAV-related situation awareness. In Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction, Salt Lake City, UT, USA, 2–3 March 2006.
  7. Davies, L.; Bolam, R.C.; Vagapov, Y.; Anuchin, A. Review of unmanned aircraft system technologies to enable beyond visual line of sight (BVLOS) operations. In Proceedings of the 2018 X International Conference on Electrical Power Drive Systems (ICEPDS), Novocherkassk, Russia, 3–6 October 2018; IEEE: New York, NY, USA, 2018.
  8. Bloise, N.; Primatesta, S.; Antonini, R.; Fici, G.P.; Gaspardone, M.; Guglieri, G.; Rizzo, A. A survey of unmanned aircraft system technologies to enable safe operations in urban areas. In Proceedings of the 2019 International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, USA, 11–14 June 2019; IEEE: New York, NY, USA, 2019.
  9. Politi, E.; Panagiotopoulos, I.E.; Varlamis, I.; Dimitrakopoulos, G. A Survey of UAS Technologies to Enable beyond Visual Line of Sight (BVLOS) Operations. In Proceedings of the 7th International Conference on Vehicle Technology and Intelligent Transport Systems (VEHITS), Prague, Czech Republic, 28–30 April 2021.
  10. National Academies of Sciences, Engineering and Medicine. Advancing Aerial Mobility: A National Blueprint; National Academies Press: Washington, DC, USA, 2020.
  11. Prevot, T.; Rios, J.; Kopardekar, P.; Robinson, J.E., III; Johnson, M.; Jung, J. UAS traffic management (UTM) concept of operations to safely enable low altitude flight operations. In Proceedings of the 16th AIAA Aviation Technology, Integration, and Operations Conference, Washington, DC, USA, 13–17 June 2016.
  12. Blake, T. What Is Unmanned Aircraft Systems Traffic Management; NASA: Washington, DC, USA, 2021.
  13. Endsley, M.R.; Bolté, B.; Jones, D.G. Designing for Situation Awareness: An Approach to User-Centered Design; CRC Press: Boca Raton, FL, USA, 2003.
  14. Vankipuram, M.; Kahol, K.; Cohen, T.; Patel, V.L. Toward automated workflow analysis and visualization in clinical environments. J. Biomed. Inform. 2011, 44, 432–440.
  15. Endsley, M.R. Toward a theory of situation awareness in dynamic systems. Hum. Factors 1995, 37, 32–64.
  16. Jane, G.V. Human Performance and Situation Awareness Measures; CRC Press: Boca Raton, FL, USA, 2019.
  17. NASA. Advanced Air Mobility: What Is AAM? Student Guide; NASA: Washington, DC, USA, 2020.
  18. Goyal, R.; Cohen, A. Advanced Air Mobility: Opportunities and Challenges Deploying eVTOLs for Air Ambulance Service. Appl. Sci. 2022, 12, 1183.
  19. Fang, S.X.; O’Young, S.; Rolland, L. Development of small UAS beyond-visual-line-of-sight (BVLOS) flight operations: System requirements and procedures. Drones 2018, 2, 13.
  20. Jacob, J.; Chilson, P.B.; Houston, A.L.; Pinto, J.O.; Smith, S. Real-time Weather Awareness for Enhanced Advanced Aerial Mobility Safety Assurance. In Proceedings of the AGU Fall Meeting Abstracts, Online, Everywhere, 1–17 December 2020.
  21. Vasile, P.; Cioacă, C.; Luculescu, D.; Luchian, A.; Pop, S. Consideration about UAV command and control. Ground Control Station. In Proceedings of the 5th International Scientific Conference SEA-CONF 2019, Constanta, Romania, 17–18 May 2019; IOP Publishing: Bristol, UK, 2019.
  22. Unmanned Systems Technology. UAV Autopilot Systems. 13 January 2022. Available online: https://www.unmannedsystemstechnology.com/expo/uav-autopilot-systems/ (accessed on 30 May 2022).
  23. Nickols, F.; Lin, Y.J. Creating Precision Robots: A Project-Based Approach to the Study of Mechatronics and Robotics; Butterworth-Heinemann: Oxford, UK, 2018.
  24. Stevenson, J.D.; O’Young, S.; Rolland, L. Assessment of alternative manual control methods for small unmanned aerial vehicles. J. Unmanned Veh. Syst. 2015, 3, 73–94.
  25. Ro, K.; Oh, J.-S.; Dong, L. Lessons learned: Application of small UAV for urban highway traffic monitoring. In Proceedings of the 45th AIAA Aerospace Sciences Meeting and Exhibit, Reno, NV, USA, 8–11 January 2007.
  26. Stansbury, R.; Wilson, T.; Tanis, W. A technology survey of emergency recovery and flight termination systems for UAS. In Proceedings of the AIAA Infotech@Aerospace Conference, Seattle, WA, USA, 6–9 April 2009.
  27. Kim, D.-H.; Go, Y.-G.; Choi, S.-M. An aerial mixed-reality environment for first-person-view drone flying. Appl. Sci. 2020, 10, 5436.
  28. Smolyanskiy, N.; Gonzalez-Franco, M. Stereoscopic first person view system for drone navigation. Front. Robot. AI 2017, 4, 11.
  29. Transport Canada. Flying Your Drone Safely and Legally; Transport Canada: Ottawa, ON, Canada, 2020.
  30. Lyu, H. Detect and avoid system based on multi sensor fusion for UAV. In Proceedings of the 2018 International Conference on Information and Communication Technology Convergence (ICTC), Jeju, Republic of Korea, 17–19 October 2018; IEEE: New York, NY, USA, 2018.
  31. Minucci, F.; Vinogradov, E.; Pollin, S. Avoiding collisions at any (low) cost: ADS-B like position broadcast for UAVs. IEEE Access 2020, 8, 121843–121857.
  32. Shadab, N.; Xu, H. A systematic approach to mission and scenario planning for UAVs. In Proceedings of the 2016 Annual IEEE Systems Conference (SysCon), Orlando, FL, USA, 18–21 April 2016; IEEE: New York, NY, USA, 2016.
  33. Fasano, G.; Accado, D.; Moccia, A.; Moroney, D. Sense and avoid for unmanned aircraft systems. IEEE Aerosp. Electron. Syst. Mag. 2016, 31, 82–110.
  34. Zimmerman, J. ADS-B 101: What It Is and Why You Should Care. Air Facts Journal, January 2013. Available online: https://airfactsjournal.com/2013/01/ads-b-101-what-it-is-and-why-you-should-care/ (accessed on 12 June 2023).
  35. Lim, C.; Li, B.; Ng, E.M.; Liu, X.; Low, K.H. Three-dimensional (3D) dynamic obstacle perception in a detect-and-avoid framework for unmanned aerial vehicles. In Proceedings of the 2019 International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, USA, 11–14 June 2019; IEEE: New York, NY, USA, 2019.
  36. Müller, T. Robust drone detection for day/night counter-UAV with static VIS and SWIR cameras. In Proceedings of the Ground/Air Multisensor Interoperability, Integration, and Networking for Persistent ISR VIII, Anaheim, CA, USA, 10–13 April 2017; SPIE: Bellingham, WA, USA, 2017.
  37. National Civil Aviation Agency—Brazil. Drones and Meteorology; National Civil Aviation Agency: Brasilia, Brazil, 2020.
  38. Dai, X.; Ke, C.; Quan, Q.; Cai, K.-Y. RFlySim: Automatic test platform for UAV autopilot systems with FPGA-based hardware-in-the-loop simulations. Aerosp. Sci. Technol. 2021, 114, 106727.
  39. Object Management Group. OMG System Modeling Language (SysML); Object Management Group: Milford, MA, USA, 2023.
  40. Holt, J.; Perry, S. What is SysML? In SysML for Systems Engineering: A Model-Based Approach; The Institution of Engineering and Technology: London, UK, 2018; p. 110.
  41. Transport Canada. Regulations for Remotely Piloted Aircraft Systems (Civilian Drones); Transport Canada: Ottawa, ON, Canada, 2022.
  42. Transport Canada. Advisory Circular (AC) No. 903-001. 2021. Available online: https://tc.canada.ca/en/aviation/reference-centre/advisory-circulars/advisory-circular-ac-no-903-001 (accessed on 26 February 2023).
  43. Ali, S.; Basit-Ur-Rahim, M.A.; Arif, F. Formal verification of internal block diagram of SysML for modeling real-time system. In Proceedings of the 2015 IEEE/ACIS 16th International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD), Takamatsu, Japan, 1–3 June 2015; IEEE: New York, NY, USA, 2015.
  44. Rahim, M.; Hammad, A.; Boukala-Ioualalen, M. Towards the formal verification of SysML specifications: Translation of activity diagrams into modular Petri nets. In Proceedings of the 2015 3rd International Conference on Applied Computing and Information Technology/2nd International Conference on Computational Science and Intelligence, Okayama, Japan, 12–16 July 2015; IEEE: New York, NY, USA, 2015.
  45. Salunkhe, S.; Berglehner, R.; Rasheeq, A. Automatic transformation of SysML model to Event-B model for railway CCS application. In Proceedings of the International Conference on Rigorous State-Based Methods, Ulm, Germany, 9–11 June 2021; Springer: Cham, Switzerland, 2021.
  46. Szmuc, W.; Szmuc, T. Towards embedded systems formal verification translation from SysML into Petri nets. In Proceedings of the 2018 25th International Conference “Mixed Design of Integrated Circuits and Systems” (MIXDES), Gdynia, Poland, 21–23 June 2018; IEEE: New York, NY, USA, 2018.
  47. Wang, H.; Zhong, D.; Zhao, T.; Ren, F. Integrating model checking with SysML in complex system safety analysis. IEEE Access 2019, 7, 16561–16571.
Figure 1. Illustration of spatial limits of visual line-of-sight (VLOS), extended visual line-of-sight (EVLOS) and beyond visual line-of-sight (BVLOS). The maximum horizontal and vertical limits around the pilot are 500 m and 120 m, respectively [5,7]. EVLOS operations involve additional visual observers to cover the space in which the pilot cannot perceive the unmanned aerial vehicle (UAV) due to the limits of the human vision range or critical weather or environmental conditions [1,5,7]. In EVLOS, communication must be maintained between the pilot and the participating visual observers at all times during the operation. An operation beyond the VLOS and EVLOS constraints is referred to as a BVLOS operation, because neither the pilot nor any observer has a direct visual line-of-sight to the UAV [9,10].
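As a purely illustrative aid (not part of the original figure or the regulations), the following sketch encodes the VLOS spatial envelope described above; the function name and structure are our own, and the EVLOS/BVLOS determination would additionally depend on the presence of visual observers, which this toy check does not model.

```python
# Illustrative sketch only: checks a position against the VLOS spatial
# limits cited above (500 m horizontal, 120 m vertical). EVLOS/BVLOS
# classification also depends on visual observers, not modeled here.

VLOS_MAX_HORIZONTAL_M = 500.0
VLOS_MAX_VERTICAL_M = 120.0

def within_vlos_limits(horizontal_m: float, vertical_m: float) -> bool:
    """Return True if the UAV is inside the VLOS spatial envelope."""
    return (horizontal_m <= VLOS_MAX_HORIZONTAL_M
            and vertical_m <= VLOS_MAX_VERTICAL_M)

print(within_vlos_limits(350.0, 100.0))  # True: inside the VLOS envelope
print(within_vlos_limits(900.0, 100.0))  # False: beyond the horizontal limit
```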
Figure 2. A simplified representation of the SA model devised by Endsley et al. [13]. This model is an iterative process during which SA acquisition depends on the state of the environment [13]. Decision making is based on SA and according to the goals of the entity concerned. Decisions can lead to actions that modify the state of the environment in the interest of achieving the intended goals.
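To make the loop in Figure 2 concrete, the following minimal sketch renders the perceive–comprehend–project cycle with toy stubs; all function names, data, and thresholds are illustrative placeholders of our own, not part of Endsley's model.

```python
# Hypothetical rendering of the iterative SA loop in Figure 2. The three SA
# levels are stubbed with toy logic; every name and value is a placeholder.

def perceive(env):           # SA level 1: perceive elements of the environment
    return {"intruder_range_m": env["intruder_range_m"]}

def comprehend(elements):    # SA level 2: understand their meaning
    return {"collision_risk": elements["intruder_range_m"] < 50.0}

def project(meaning):        # SA level 3: project near-future status
    return "avoid" if meaning["collision_risk"] else "continue"

def step(env):
    decision = project(comprehend(perceive(env)))  # SA feeds decision making
    if decision == "avoid":                        # decisions lead to actions...
        env["intruder_range_m"] += 100.0           # ...that modify the environment
    return decision

env = {"intruder_range_m": 30.0}
print(step(env))  # 'avoid': the action increases separation
print(step(env))  # 'continue': the new state no longer implies risk
```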
Figure 3. First level of the GDTA of the UAV in the AAM system. The main goal of the UAV is to ensure the safety of the AAM system. To achieve this goal, it should (1) ensure good communication with the ground control station, (2) detect and avoid obstacles, (3) monitor weather conditions, and (4) be aware of its own state.
Figure 4. Decomposition of the first sub-goal of the UAV’s GDTA: (1) communicate with the GCS. The hexagonal shape represents the decision about how to ensure good communication with the GCS. The answer to this question leads to the description of the three SA levels associated with communication with the GCS.
Figure 5. Decomposition of the second sub-goal of the UAV’s goal-directed task analysis: (2) detect and avoid obstacles. The hexagonal shape represents the decision about how to ensure that obstacles are detected. The answer to this question leads to the description of the three SA levels associated with obstacle detection technologies. The shapes denoted as 2.1 and 2.2 represent two other sub-goals which must be considered in order to detect obstacles.
Figure 6. Decomposition of sub-goal 2.1 of the UAV’s goal-directed task analysis. The decomposition of this sub-goal allows the identification of SA requirements relating to the spatial information of the UAV, which is necessary to ensure the safety of the AAM.
Figure 7. Decomposition of sub-goal 2.2 of the UAV’s GDTA. The achievement of this sub-goal requires three other sub-goals to be achieved. The decomposition of sub-goal 2.2.1 allows spatial information relating to other flying obstacles to be identified, which is necessary to assess the risk of collision. The decomposition of sub-goal 2.2.2 allows identification of information relating to nearby fixed obstacles in order to assess the risk of collision with them. The decomposition of sub-goal 2.2.3 allows identification of the information necessary to assess the proximity between the UAV and people on the ground.
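The risk assessment in sub-goal 2.2.1 hinges on the relative position and velocity of a flying obstacle. The paper does not prescribe an algorithm, but a standard closest-point-of-approach (CPA) check, sketched below with assumed example inputs, illustrates how such spatial information can be turned into a collision-risk estimate.

```python
import numpy as np

# Illustrative CPA check for sub-goal 2.2.1. This is one standard way to
# turn relative position/velocity into a collision-risk estimate; it is
# not the authors' prescribed method.

def time_and_distance_at_cpa(rel_pos, rel_vel):
    """rel_pos, rel_vel: 3D intruder state relative to the UAV (m, m/s)."""
    v2 = np.dot(rel_vel, rel_vel)
    if v2 == 0.0:                  # no relative motion: range stays constant
        return 0.0, float(np.linalg.norm(rel_pos))
    t_cpa = max(0.0, -np.dot(rel_pos, rel_vel) / v2)   # 0 if already diverging
    d_cpa = float(np.linalg.norm(rel_pos + t_cpa * rel_vel))
    return t_cpa, d_cpa

rel_pos = np.array([200.0, 50.0, 0.0])   # metres (example values)
rel_vel = np.array([-20.0, 0.0, 0.0])    # metres/second (converging)
t, d = time_and_distance_at_cpa(rel_pos, rel_vel)
print(f"CPA in {t:.1f} s at {d:.1f} m")  # avoidance could trigger below a threshold
```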
Figure 8. Decomposition of the third sub-goal of the UAV’s GDTA: (3) monitor weather conditions. The hexagonal shape on the left is a decision which leads to the identification of SA requirements related to weather detection technologies. The hexagonal shape on the right is a decision which identifies SA requirements related to weather conditions likely to affect the operation.
Figure 9. Decomposition of the decision about weather conditions to be monitored. These are wind, cloud, mist, fog, and rain.
Figure 10. Decomposition of the fourth sub-goal of the UAV’s goal-directed task analysis: (4) monitor the state of the UAV. This sub-goal allows the identification of the main internal elements of the UAV that should be monitored during the operation.
Figure 11. A high-level block definition diagram representing the structure of the advanced air mobility with the UAV as the central entity. Blue blocks represent the UAV’s components and green blocks represent external components.
Figure 12. High-level modeling of the UAV behavior in the AAM system, shown in an activity diagram. Before taking off, the UAV must verify that it has achieved all of its SA goals. To do so, it must follow sequences of parallel sub-activities which allow it to maintain SA and ensure the safety of the AAM system. The first step is followed by a fork node (labeled Initial Fork Node) which represents the main goal: ensuring the safety of the AAM system. From this node, parallel branches lead to the sub-activities Monitor C2 links status, Monitor threats, Monitor weather conditions, and Monitor the state of the UAV, which are related, respectively, to the C2 management system, the DAA management system, the weather detection system, and the system responsible for monitoring the state of the UAV. If the execution of one of these four actions leads to a need for remediation, the process continues through the sub-activity Call a sequence of generic actions, and an appropriate remediation action is executed. These generic actions and each of the four other sub-activities are described below.
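A minimal concurrency sketch of this behavior follows. It is our own illustrative code, not the authors' implementation: four monitors run in parallel from the initial fork, and any monitor that detects a problem routes it to the generic remediation sequence of Figure 13. All names and the check logic are placeholders.

```python
import threading, queue, time

# Illustrative sketch of the Figure 12 activity diagram: four parallel
# monitoring sub-activities forked at start, with detected problems routed
# to a single generic-remediation sequence (Figure 13).

problems = queue.Queue()

def monitor(name, check, period=0.1, cycles=3):
    for _ in range(cycles):
        if not check():
            problems.put(name)   # a need for remediation was detected
        time.sleep(period)

checks = {                       # stand-ins for the four SA goals
    "C2 links": lambda: True,
    "Obstacles (DAA)": lambda: False,   # simulate a detected threat
    "Weather": lambda: True,
    "UAV state": lambda: True,
}

threads = [threading.Thread(target=monitor, args=(n, c)) for n, c in checks.items()]
for t in threads: t.start()      # the initial fork node: parallel branches
for t in threads: t.join()

while not problems.empty():      # 'Call a sequence of generic actions'
    print("Remediate:", problems.get())
```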
Figure 13. Modeling of the sub-activity Call a sequence of generic actions, containing the sequence of actions to resolve a problem detected during the verification of the situational awareness requirements.
Figure 14. Modeling of the sub-activity Monitor C2 links status containing the sequence of actions to achieve the first SA goal.
Figure 15. Modeling of the sub-activity Detect and avoid obstacles containing the sequence of actions to achieve the second SA goal.
Figure 16. Modeling of the sub-activity Monitor weather conditions containing the sequence of actions to achieve the third SA goal.
Figure 17. Modeling of the sub-activity Monitor the state of the UAV containing the sequence of actions to achieve the fourth SA goal.
Table 1. Analysis of the DJI Matrice 300 RTK according to the four main SA goals described above. The SA goals are identified in the left column. In the right column, we analyze the technologies present in the specifications and we associate them with each of these goals.
UAV: Matrice 300 RTK

C2 links management system (autopilot, manual radio control, FPV):
  • Operating frequencies: 2.4000–2.4835 GHz (commonly used for the RC–GCS link [19]) and 5.725–5.850 GHz (commonly used for the FPV link [19])
  • Max transmitting distance (unobstructed, free of interference): 15 km (FCC 1)
  • EIRP 2: 29.5 dBm in the 2.4000–2.4835 GHz band and 28.5 dBm in the 5.725–5.850 GHz band (FCC)

DAA management system:
  • Vision system with forward/backward/left/right and upward/downward obstacle sensing ranges of 0.7–40 m and 0.6–30 m, respectively, and forward/backward/downward and left/right/upward fields of view (FOVs) of 65° (H), 50° (V) and 75° (H), 60° (V), respectively
  • Infrared ToF sensing system 3 with an obstacle sensing range of 0.1–8 m and an FOV of 30° (±15°)
  • Top and bottom auxiliary lights with an effective lighting distance of 5 m

Weather management system: Not described.

Underlying physical parts:
  • An external LiPo battery, the WB37 Intelligent Battery, with a charging time (using the BS60 Intelligent Battery Station) of 70 min at 15 °C to 45 °C or 130 min at 0 °C to 15 °C, and an operating temperature of −4 °F to 104 °F (−20 °C to 40 °C)
  • An 18650 Li-ion built-in battery (5000 mAh @ 7.2 V) with a charging time of 135 min (using a 12 V/2 A USB charger) and an operating temperature of −4 °F to 122 °F (−20 °C to 50 °C)
  • Propellers
  • Diagonal wheelbase of 895 mm

1 FCC: Federal Communications Commission of the USA, an authority responsible for radio services; the information in the drone datasheet relates to US airspace. https://www.sir-apfelot.de/en/drone-range-remote-control-ce-fcc-srrc-20548/ (accessed on 13 July 2022). 2 EIRP: maximum effective isotropic radiated power. https://afar.net/tutorials/fcc-rules/ (accessed on 13 July 2022). 3 The infrared time-of-flight (ToF) sensing system determines the distance and depth of an object using a light signal sent by the sensor and reflected by the object back towards the sensor. https://www.pocket-lint.com/phones/news/147024-what-is-a-time-of-flight-camera-and-which-phones-have-it; https://www.terabee.com/time-of-flight-principle/ (accessed on 13 July 2022).
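The time-of-flight principle in footnote 3 reduces to a single formula: distance equals the speed of light times the round-trip travel time of the light pulse, divided by two. A quick illustrative check follows; the example timing value is ours, not from the Matrice 300 datasheet.

```python
# ToF ranging principle from footnote 3: the sensor emits a light pulse,
# the object reflects it back, and distance = c * t_round_trip / 2.

C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_s: float) -> float:
    return C * round_trip_s / 2.0

print(f"{tof_distance_m(53.4e-9):.2f} m")  # a ~53.4 ns round trip gives ~8 m,
                                           # the top of the sensor's 0.1-8 m range
```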
