Article
Peer-Review Record

High-Altitude Precision Landing by Smartphone Video Guidance Sensor and Sensor Fusion

by Joao Leonardo Silva Cotta 1,*, Hector Gutierrez 2, Ivan R. Bertaska 3, John P. Inness 3 and John Rakoczy 3
Reviewer 2:
Reviewer 3: Anonymous
Submission received: 18 November 2023 / Revised: 18 January 2024 / Accepted: 19 January 2024 / Published: 25 January 2024

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

The work is very interesting and presented in a descriptive way. It emphasizes aiding navigation without addressing the control area. Although the title mentions guidance, this subject is not a priority in the work. Very little is said about the control system, although it is essential for the accomplishment of the mission. Important points, such as the use of multiple sampling rates, were not addressed (Figure 8). Possible transients due to the switches described in the algorithms were not even mentioned. Even though the system uses a complex Kalman filter, very little is said about its design and the difficulties of tuning it; only the final result is shown. As it stands, the reader learns what the work is about but cannot take advantage of the knowledge.

Author Response

Thanks for your valuable comments. Please refer to the attached document for our answers.

Author Response File: Author Response.pdf

Reviewer 2 Report

Comments and Suggestions for Authors

1. The main focus of the paper is on sensor fusion, using the novel SVGS system as an auxiliary positioning unit. A Kalman filter algorithm is used to merge SVGS measurements with the Pixhawk 4 flight controller's IMU data, making the aircraft's landing safer and more accurate. Compared to the latest automatic landing technology based on infrared beacons, it provides better attitude estimation during landing; compared to the latest high-altitude landing technologies based on sensors such as LIDAR, the proposed technique also shows significant advantages. What are the essential differences and advantages compared with vSLAM? (A minimal illustrative fusion sketch is given after this list.)

2. In addition, it is hoped that a video link or a set of pictures of the experimental landing flights can be added to the article.
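For context on the type of fusion described in point 1, a minimal loosely coupled sketch is given below. It assumes a linear constant-velocity model for a single axis, IMU-driven prediction at a hypothetical 100 Hz, and a hypothetical 30 Hz SVGS position fix; it is illustrative only and is not the filter implemented in the manuscript.

    import numpy as np

    # Minimal loosely coupled fusion sketch (illustrative only).
    # State: position and velocity along one axis.
    dt = 1.0 / 100.0                             # IMU prediction step (assumed 100 Hz)
    F = np.array([[1.0, dt], [0.0, 1.0]])        # constant-velocity transition
    B = np.array([[0.5 * dt**2], [dt]])          # acceleration input (from IMU)
    H = np.array([[1.0, 0.0]])                   # SVGS measures position only
    Q = np.diag([1e-4, 1e-3])                    # process noise (assumed)
    R = np.array([[0.05**2]])                    # SVGS noise, about 5 cm (assumed)

    x = np.zeros((2, 1))                         # initial state [position, velocity]
    P = np.eye(2)                                # initial covariance

    def predict(x, P, accel):
        """Propagate the state with an IMU acceleration sample."""
        x = F @ x + B * accel
        P = F @ P @ F.T + Q
        return x, P

    def update(x, P, svgs_pos):
        """Correct the prediction with an SVGS position fix."""
        y = np.array([[svgs_pos]]) - H @ x       # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        return x, P

    x, P = predict(x, P, accel=0.1)              # every IMU sample
    x, P = update(x, P, svgs_pos=2.5)            # whenever an SVGS fix arrives

A full implementation would fuse all three axes (and attitude) jointly and account for sensor latency.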

Author Response

Thanks for your valuable comments. Please refer to the attached document for our answers.

Author Response File: Author Response.pdf

Reviewer 3 Report

Comments and Suggestions for Authors

The authors presented a technique for landing at high altitudes using beacons and smartphones in the absence of GPS or external infrastructure, demonstrating the functionality of the proposed system through experiments. However, the description of the system configuration, operation, and experimental details needs reinforcement, and other parts of the manuscript also need enhancement. More specifically, the following areas require improvement:

  1. Introduction and Literature Review: These sections are too brief; to enhance readability and comprehension, the content of the Introduction and the Literature Review (or Related Works) should be expanded.
  2. System Operation: The detailed explanation of the system configuration and operation is lacking. The authors seem to take a simplistic approach, and improvements are needed in the following aspects:
    • The use of GPS on a PX4-based drone contradicts the claim of achieving precise landing without GPS. A PX4-based flight controller fundamentally uses GPS information to acquire the local and global position of the drone.
    • The delay between SVGS and the flight controller seems considerable.
    • Information regarding the appearance of beacons at different altitudes or distances in smartphone images is needed.
  3. Drone Movement and Scanning: Clarification is needed on how the distance of 0.5j was determined for drone movement during zigzag scanning. Additionally, information on the area a smartphone can capture in a single shot at a specific altitude is required (a rough footprint estimate of the kind expected is sketched after this list).
  4. Excessive Detail: The authors cover the implementation and operation of the proposed system in too broad a scope, resulting in insufficient explanation of the proposed aspects. Specific points for improvement include:
    • In the mission sequence (Figures 4 and 5), only the mission performance sequence is necessary. Other sequences could be removed.
    • In the SVGS interconnection diagram (Figure 6), the components related to power and propulsion are not necessary to describe the proposed system.
    • If no modifications were made to the PX4 flight stack, details on the PX4 motion controller may not be necessary. Otherwise, the authors must address the differences between the default version of the flight controller and the authors' version.
  5. Flow Charts and Figures: Simplify and improve the readability of flow charts and ensure that figures align with the discussed content. Specific suggestions include:
    • The size of text in Figure 11 is too small.
    • Figures 11 and 12 may not need a full page each. Also, the figures could be enhanced with more explanation within the manuscript.
  6. Experimentation: Additional details and analysis are needed for the experiments, including:
    • Confirmation of the system's ability to land without GPS. For example, the authors should modify the PX4 flight stack to disable the GPS functionality, or physically disconnect the GPS.
    • The experiment configuration should be presented within the manuscript.
    • Further analysis of IR lock errors compared to SVGS errors.
    • Clarification of the apparent discrepancy between the reported altitude and the actual landing location in Figures 17, 18, and 19.
  7. Conclusion: The authors presented their proposed system and its results over 20 pages of the manuscript but concluded with only one short paragraph. The conclusion should provide more than a listing of results; it should present the strengths of the proposed system and additional insights, which would improve the quality of the manuscript.
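To indicate the kind of single-shot coverage information requested in point 3, a rough nadir-view footprint estimate of the following form would suffice; the field-of-view angles used below are hypothetical placeholders, not the smartphone's actual specifications.

    import math

    # Ground footprint of a downward-looking camera at altitude h (nadir view).
    # Field-of-view angles are assumed placeholders, not the phone's specs.
    def footprint(h_m, hfov_deg=65.0, vfov_deg=50.0):
        """Return (width, height, area) of the imaged ground patch in meters."""
        w = 2.0 * h_m * math.tan(math.radians(hfov_deg) / 2.0)
        h = 2.0 * h_m * math.tan(math.radians(vfov_deg) / 2.0)
        return w, h, w * h

    # With these assumed angles, at 100 m altitude the footprint is
    # roughly 127 m x 93 m, i.e. about 1.2e4 square meters.
    print(footprint(100.0))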

Author Response

Thanks for your valuable comments. Please refer to the attached document for our answers.

Author Response File: Author Response.pdf

Round 2

Reviewer 3 Report

Comments and Suggestions for Authors

The authors addressed the reviewer's comments within the revised manuscript. However, several points are not addressed appropriately. Comments on the authors' replies are as follows:

 

Q1

The authors addressed the reviewer's comment regarding Q1.

 

Q2

The authors mentioned that they addressed the comment about the blob detection algorithm and GPS. However, the beacon appearance is neither about the blob detection algorithm nor about the resolution of the captured image. The appearance of the beacon refers to how the size of the blob in the captured image varies with the distance between the blob and the camera (or smartphone).

The authors propose a novel landing guidance system that operates in a GPS-less environment, yet the experiment was conducted with a UAV based on a PX4 flight controller connected to an RTK GPS, which provides highly precise position information. The experimental results therefore appear to be affected by the high accuracy of the RTK GPS connected to the PX4-based flight controller, with no modification to the internal flight control algorithm or loop. In my former comment, I asked about the usage of GPS and the modification of the flight stack, but the authors did not address this comment.

 

Q3

The authors mentioned that they selected the value of 0.5j through trial and error. However, the authors must provide justification for choosing 0.5j. To demonstrate that 0.5j is suitable for the system, the authors need to compare the results obtained with 0.5j to those with other values (such as 1j, 2j, 0.25j). Without such a comparison, it appears that the authors chose 0.5j arbitrarily.

 

Q4

The authors revised Figures 4, 5, and 6 correctly. However, they added Figure 8 without any modification of the PX4 motion controller; therefore, Figure 8 must be removed. For the same reason, Figure 7 must be removed as well.

 

Q5

The authors addressed the reviewer's comment regarding Q5.

 

Q6

The suggestion that landing without the support of systems like GPS is almost impossible is understood as a claim by the authors that the landing technique using SVGS readings, as proposed, is practically unfeasible. Furthermore, the authors mention that SVGS readings take precedence over GPS, but the PX4 flight stack's control loop already uses GPS information, and a PX4-based flight controller uses GPS position information whenever a receiver is connected. Therefore, the proposed experimental configuration cannot be considered a completely GPS-less environment. Moreover, if GPS RTK is activated, the error in the position information handled by the PX4 flight stack is expected to be less than 10 cm, indicating substantial GPS support.

Therefore, to demonstrate the effectiveness of SVGS reading, it seems necessary to examine the impact of SVGS latency on accuracy in an environment without GPS. It is recommended to use a standard GPS with RTK turned off, connected to PX4, and an independent RTK GPS installed on the UAV for ground truth.

Also, from Figure 9 it appears that only one GPS module is attached to the drone. If this GPS has RTK functionality enabled, the position information handled internally by PX4 would be very accurate, with an expected error of less than 10 cm. Notably, the update rate of a standard GPS can be increased up to 10 Hz, while the authors state that the rate of the proposed SVGS readings is 30 Hz. If SVGS readings are effective, they should be able to perform as well as a typical GPS.

 

Also, the authors claimed that the experimental configuration is presented, but it is not visible; only the experimental results are shown. The experimental configuration should include information such as the following:

e.g., the UAV initiated landing from specific coordinates (for example, (0, 0, 0)), with landing points designated as (x1, y1, z1), (x2, y2, z2), and (x3, y3, z3); three experiments were conducted with the given landing points.

 

Moreover, in Figures 14 and 15, the environments for the IR-Lock and SVGS experiments appear different, requiring an explanation. While the authors mentioned initiating landing from an altitude of 100 m, it is necessary to specify the starting position on the x-y plane.

 

From the experimental results in Figures 17, 18, and 19, the movement on the x-y plane appears to be less than 1 meter. This implies that the experiment demonstrates only a very limited aspect of the proposed technique. Therefore, it appears necessary to conduct experiments in environments where the drone moves significantly on the x-y plane (or performs zigzag scans).

Additionally, displaying the overall trajectory of the proposed experiment in 2D (x-y plane) or 3D scatter plots would greatly aid in understanding the functioning and performance of the system proposed by the authors.
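As an indication of the kind of plot intended, a minimal matplotlib sketch is given below; the trajectory arrays are hypothetical placeholders rather than data from the manuscript.

    import numpy as np
    import matplotlib.pyplot as plt

    # Hypothetical landing trajectories (placeholders, not the paper's data):
    # one from the fused estimate, one from raw SVGS readings.
    t = np.linspace(0.0, 60.0, 300)
    fused = np.column_stack((2.0 * np.cos(0.1 * t),
                             2.0 * np.sin(0.1 * t),
                             100.0 - 100.0 * t / 60.0))
    svgs = fused + np.random.normal(scale=0.2, size=fused.shape)

    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    ax.plot(fused[:, 0], fused[:, 1], fused[:, 2], label="Sensor fusion")
    ax.scatter(svgs[::10, 0], svgs[::10, 1], svgs[::10, 2],
               s=8, label="SVGS readings")
    ax.set_xlabel("X (m)")
    ax.set_ylabel("Y (m)")
    ax.set_zlabel("Z (m)")
    ax.legend()
    plt.show()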

Furthermore, an explanation is needed regarding the meaning of the colors in the scatter plots in Figures 17, 18, and 19 (red, magenta, cyan, green).

 

In Figure 16, Flight 1 appears to be partially cut off, suggesting that adjustments are necessary (possibly adjusting the x-axis of the graph).

 

Q7

The authors addressed the reviewer's comment regarding Q7.

Author Response

Please review the attached file for a detailed reply to all comments.

Author Response File: Author Response.pdf

Round 3

Reviewer 3 Report

Comments and Suggestions for Authors

The authors appropriately addressed all comments except for the following ones.

Q2.

The authors did not correctly understand my comment. As mentioned in Algorithm 1, the authors are supposed to use the captured blobs to find the most suitable combination. However, when capturing the blobs with a smartphone at a distance of 100 meters, if multiple blobs appear as one, the algorithm proposed by the authors may not function correctly. Therefore, I commented that 'the appearance of the beacon in this context means how the size of the blob in the image relates to the distance between the blob and the camera (or smartphone).' This is not a discussion about the state. However, the authors did not respond appropriately to my previous comments; a response is needed on this matter.

Q6.
The authors correctly addressed all comments except for the one related to the 3D scatter plot. Displaying the F, L, and U components of the FLU frame separately, as the drone's landing path is currently presented in the paper, is beneficial for readers. However, for a better understanding of the system's operation and of the drone's landing trajectory as proposed by the authors, it is also necessary to present the FLU components in a 3D graph with XYZ coordinates. The graph should include both the trajectory based on the sensor fusion and the SVGS readings.

Author Response

Q2. Thank you for the additional explanation. As you mention, SVGS requires the correct discrimination of four separate blobs, which is only possible within a certain range of distances between target and camera. To operate over a large range, this is resolved by providing targets of different dimensions that cover different range intervals during the mission; in this case, three targets of different sizes are used, as described in Figure 10 and Table 3. Additional text has been included after Figure 10 to clarify this point.
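As a rough illustration of why multiple target sizes are needed, the apparent blob size can be estimated with a simple pinhole model, as in the sketch below; the focal length and target sizes are assumed placeholder values, not the smartphone's calibration or the dimensions in Table 3.

    # Pinhole-model estimate of blob size in pixels vs. range.
    # Focal length and target sizes are assumed values, not those of the
    # smartphone or the targets in Table 3.
    F_PX = 3000.0                       # focal length in pixels (assumed)

    def blob_px(target_m, range_m, f_px=F_PX):
        """Apparent size of a target (in pixels) at a given range."""
        return f_px * target_m / range_m

    for target in (0.05, 0.30, 1.00):   # small / medium / large target (m)
        for rng in (2.0, 20.0, 100.0):  # close, mid, far range (m)
            print(f"target {target:.2f} m at {rng:5.1f} m -> "
                  f"{blob_px(target, rng):6.1f} px")

With these assumed values, the largest target still spans tens of pixels at 100 m while the smallest spans only a pixel or two, which is why each target size covers its own range interval.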

Q6. A new figure has been included (Figure 20) that shows the landing trajectory as a 3D plot, as you recommended. This figure provides more clarity on the landing maneuver; thank you for this suggestion.

 
