Vision-Based UAV Navigation

A special issue of Aerospace (ISSN 2226-4310). This special issue belongs to the section "Aeronautics".

Deadline for manuscript submissions: closed (31 May 2023)

Special Issue Editor


Dr. Sungwook Jung
Guest Editor
Autonomous IoT Research Center, Korea Electronics Technology Institute, Seongnam, Gyeonggi 13509, Korea
Interests: multi-sensor SLAM; autonomous navigation; visual-inertial odometry

Special Issue Information

Dear Colleagues,

Recently, the unmanned aerial vehicle (UAV) industry has experienced rapid growth, and UAVs now play an important role in missions such as search and rescue, environmental monitoring, security surveillance, transportation, and inspection. Sensor-based navigation for UAVs and other robots is an active research topic in academia and at research institutes, because state estimation and active exploration using sensor information are essential indoors and in other GPS-denied environments.

This Special Issue aims to advance the state of knowledge in vision-based UAV navigation. Original research articles and reviews are welcome. Research areas may include (but are not limited to) the following:

  • Visual–inertial odometry/visual SLAM on UAVs;
  • Various optic-sensor-based technologies (event camera, thermal camera, etc.);
  • Multi-robot collaborative V-SLAM for autonomous navigation;
  • Deep-learning-based navigation;
  • Sensor fusion (vision, LiDAR, UWB, etc.);
  • Applications in real fields (subterranean, warehouses, etc.).

Dr. Sungwook Jung
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website; once registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Aerospace is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • visual–inertial odometry
  • visual SLAM
  • visual navigation for UAVs
  • multi-robot navigation
  • multi-sensor fusion
  • state estimation

Published Papers (1 paper)


Research

27 pages, 3739 KiB  
Article
Visual Navigation Algorithm for Night Landing of Fixed-Wing Unmanned Aerial Vehicle
by Zhaoyang Wang, Dan Zhao and Yunfeng Cao
Aerospace 2022, 9(10), 615; https://doi.org/10.3390/aerospace9100615 - 17 Oct 2022
Cited by 5
Abstract
In recent years, visual navigation has been considered an effective mechanism for achieving autonomous landing of unmanned aerial vehicles (UAVs). Nevertheless, owing to the limitations of visual cameras, the effectiveness of visual algorithms is significantly constrained by lighting conditions. Therefore, a novel vision-based navigation scheme is proposed for night-time autonomous landing of fixed-wing UAVs. First, because low-light images make the runway difficult to detect, a visible and infrared image fusion strategy is adopted: objective functions relating the fused image to the visible image and to the infrared image are established, the fusion problem is cast as minimization of the objective function, and the optimal solution is obtained by gradient descent to produce the fused image. Second, to improve runway detection in the enhanced image, a detection algorithm based on an improved Faster region-based convolutional neural network (Faster R-CNN) is proposed: the runway ground-truth boxes of the dataset are statistically analyzed, and the size and number of anchors are redesigned to match the runway detection setting based on the analysis results. Finally, a method is proposed to estimate the relative attitude and position of the UAV with respect to the landing runway: new coordinate reference systems are established, and six landing parameters (three attitude angles and three position components) are computed by Orthogonal Iteration (OI). Simulation results reveal that the proposed algorithm achieves a 1.85% improvement in AP on runway detection, and the reprojection errors of rotation and translation for pose estimation are 0.675 and 0.581%, respectively.
(This article belongs to the Special Issue Vision-Based UAV Navigation)
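The fusion step described in the abstract, minimizing an objective that balances fidelity to the visible and infrared images by gradient descent, can be sketched as follows. This is a minimal illustrative sketch, not the authors' formulation: the quadratic objective, the weights `w_vis`/`w_ir`, and the step size are all assumptions.

```python
import numpy as np

def fuse_images(visible, infrared, w_vis=0.5, w_ir=0.5, lr=0.2, steps=200):
    """Minimize E(F) = w_vis*||F - V||^2 + w_ir*||F - I||^2 by gradient descent.

    visible, infrared: float arrays of equal shape (e.g. grayscale images in [0, 1]).
    Returns the fused image F at the (approximate) minimizer of E.
    """
    F = visible.astype(float).copy()  # initialize from the visible image
    for _ in range(steps):
        # Gradient of the quadratic objective with respect to F
        grad = 2.0 * w_vis * (F - visible) + 2.0 * w_ir * (F - infrared)
        F -= lr * grad  # descent step
    return F
```

For this purely quadratic objective the minimizer is simply the weighted mean of the two inputs; the paper's actual objective presumably includes richer terms (e.g. gradient or structure fidelity), which the same descent loop would accommodate with a different `grad` expression.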