Multi-Sensor Systems and Data Fusion in Remote Sensing II

A special issue of Remote Sensing (ISSN 2072-4292). This special issue belongs to the section "Remote Sensing Image Processing".

Deadline for manuscript submissions: 15 August 2024 | Viewed by 7374

Special Issue Editors

Prof. Dr. Piotr Kaniewski
Guest Editor
Faculty of Electronics, Military University of Technology, 00-908 Warsaw, Poland
Interests: multi-sensor data fusion; statistical estimation; integrated navigation systems; simultaneous localization and mapping; synthetic aperture radars; unmanned aerial systems

Dr. Stefano Mattoccia
Guest Editor
Department of Computer Science and Engineering (DISI), University of Bologna, Viale Risorgimento 2, 40136 Bologna, Italy
Interests: computer vision; machine learning; 3D vision; embedded computer vision

Dr. Fabio Tosi
Guest Editor
Department of Computer Science and Engineering (DISI), University of Bologna, Viale Risorgimento 2, 40136 Bologna, Italy
Interests: computer vision; machine learning; 3D vision; embedded computer vision

Special Issue Information

Dear Colleagues,

Remote sensing today is developing at a rapid pace, driven by technological progress in many interconnected fields: the emergence of new sensors and of refined, more capable traditional sensors; the development of increasingly sophisticated space, aerial, and ground platforms on which to mount them; and advances in signal- and data-processing algorithms. Progress in radar, optoelectronic, acoustic, magnetic, chemical, and other sensors is truly stunning. Although these sensors are ever more sensitive and accurate, with improved resolutions, data rates, and dynamic ranges, they still have intrinsic deficiencies and limitations. Multi-sensor systems with joint processing of their signals or data have long been considered an effective way to reduce these disadvantages and to make the best use of each sensor's strengths, leading to a synergy effect. The emergence of new types of cutting-edge sensors creates an excellent opportunity for scientists and engineers to propose and develop new, more capable integrated multi-sensor systems.

At the same time, users' demands and expectations keep growing with respect to the size of the observed area or volume, data resolution, accuracy, speed of operation, and functionality of remote sensing systems. Extended frequency bands, the improved resolutions and data rates of new sensors, and the increasingly common use of systems composed of many spatially distributed sensors all increase the influx of data in contemporary multi-sensor systems. These facts pose new challenges for data fusion algorithms, which must often employ the newest techniques from big data mining, statistical estimation, artificial intelligence, and other recently emerged concepts. A fresh insight into the newest developments in multi-sensor systems and data fusion would therefore be of great interest to the remote sensing community.

We invite you to submit theoretical or application-oriented papers presenting new developments including, but not limited to, the following topics (a small illustrative fusion sketch follows the list):

  • Multi-sensor remote-sensing systems in Earth science, environmental monitoring, robotics, transportation, industrial process monitoring, security, and military applications
  • Unconventional multi-sensor solutions
  • Spatially distributed networks of sensors
  • Distributed signal and data processing
  • Multi-sensor data fusion at the raw-data, feature, and decision levels
  • Statistical estimation in remote sensing
  • Artificial intelligence in remote sensing
  • Big data processing in remote sensing
  • Machine learning in remote sensing
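
As a small illustrative example of the statistical-estimation theme above, the following Python sketch fuses two independent measurements of the same quantity by inverse-variance weighting, the minimum-variance linear unbiased combination. The sensor values and variances are invented for illustration:

    def fuse(z1, var1, z2, var2):
        """Minimum-variance fusion of two independent measurements."""
        w1, w2 = 1.0 / var1, 1.0 / var2           # weight = inverse variance
        z_fused = (w1 * z1 + w2 * z2) / (w1 + w2)
        var_fused = 1.0 / (w1 + w2)               # always below min(var1, var2)
        return z_fused, var_fused

    # Hypothetical example: a coarse radar range and a precise lidar range.
    z, v = fuse(z1=103.0, var1=9.0, z2=100.5, var2=1.0)
    print(f"fused = {z:.2f} m, variance = {v:.2f} m^2")   # fused = 100.75 m

The fused variance is always smaller than that of the better sensor alone, which is the basic synergy effect the paragraph above refers to.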

Prof. Dr. Piotr Kaniewski
Dr. Stefano Mattoccia
Dr. Fabio Tosi
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • remote sensing
  • multi-sensor systems
  • multi-sensor data fusion
  • sensor networks
  • multi-sensor signal processing
  • multi-sensor data processing
  • artificial intelligence
  • big data mining
  • machine learning
  • distributed processing

Published Papers (6 papers)


Research

22 pages, 38947 KiB  
Article
Infrared–Visible Image Fusion through Feature-Based Decomposition and Domain Normalization
by Weiyi Chen, Lingjuan Miao, Yuhao Wang, Zhiqiang Zhou and Yajun Qiao
Remote Sens. 2024, 16(6), 969; https://doi.org/10.3390/rs16060969 - 10 Mar 2024
Viewed by 653
Abstract
Infrared–visible image fusion is valuable across various applications due to the complementary information that it provides. However, the current fusion methods face challenges in achieving high-quality fused images. This paper identifies a limitation in the existing fusion framework that affects the fusion quality: modal differences between infrared and visible images are often overlooked, resulting in the poor fusion of the two modalities. This limitation implies that features from different sources may not be consistently fused, which can impact the quality of the fusion results. Therefore, we propose a framework that utilizes feature-based decomposition and domain normalization. This decomposition method separates infrared and visible images into common and unique regions. To reduce modal differences while retaining unique information from the source images, we apply domain normalization to the common regions within the unified feature space. This space can transform infrared features into a pseudo-visible domain, ensuring that all features are fused within the same domain and minimizing the impact of modal differences during the fusion process. Noise in the source images adversely affects the fused images, compromising the overall fusion performance. Thus, we propose the non-local Gaussian filter. This filter can learn the shape and parameters of its filtering kernel based on the image features, effectively removing noise while preserving details. Additionally, we propose a novel dense attention mechanism in the feature extraction module, enabling the network to understand and leverage inter-layer information. Our experiments demonstrate a marked improvement in fusion quality with our proposed method.
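
As a conceptual aside (not the authors' learned framework): the general idea of decomposing source images before fusing them can be illustrated with a classical base/detail split, where a Gaussian blur stands in for the "common" component and the residual for the "unique" detail. A minimal Python sketch, with random arrays standing in for real images:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def decompose_and_fuse(ir, vis, sigma=5.0):
        """Split each image into a smooth 'common' base and a 'unique' detail
        layer; average the bases, keep the stronger detail per pixel."""
        base_ir, base_vis = gaussian_filter(ir, sigma), gaussian_filter(vis, sigma)
        detail_ir, detail_vis = ir - base_ir, vis - base_vis
        fused_base = 0.5 * (base_ir + base_vis)
        fused_detail = np.where(np.abs(detail_ir) > np.abs(detail_vis),
                                detail_ir, detail_vis)
        return fused_base + fused_detail

    ir = np.random.rand(128, 128).astype(np.float32)    # stand-in infrared image
    vis = np.random.rand(128, 128).astype(np.float32)   # stand-in visible image
    fused = decompose_and_fuse(ir, vis)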

20 pages, 1608 KiB  
Article
Multi-Source T-S Target Recognition via an Intuitionistic Fuzzy Method
by Chuyun Zhang, Weixin Xie, Yanshan Li and Zongxiang Liu
Remote Sens. 2023, 15(24), 5773; https://doi.org/10.3390/rs15245773 - 18 Dec 2023
Viewed by 554
Abstract
To realize aerial target recognition in a complex environment, we propose a multi-source Takagi–Sugeno (T-S) intuitionistic fuzzy rules method (MTS-IFRM). In the proposed method, to improve the robustness of the training process of the model, the features of the aerial targets are classified as the input results of the corresponding T-S target recognition model. The intuitionistic fuzzy approach and ridge regression method are used in the consequent identification, which constructs a regression model. To train the premise parameter and reduce the influence of data noise, novel intuitionistic fuzzy C-regression clustering based on dynamic optimization is proposed. Moreover, a modified adaptive weight algorithm is presented to obtain the final outputs, which improves the classification accuracy of the corresponding model. Finally, the experimental results show that the proposed method can effectively recognize the typical aerial targets in error-free and error-prone environments, and that its performance is better than other methods proposed for aerial target recognition.
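
For readers unfamiliar with T-S models: a first-order Takagi–Sugeno system blends linear rule consequents by fuzzy firing strengths. The Python sketch below is a generic two-rule textbook example, not the MTS-IFRM itself; the membership centers, widths, and consequent coefficients are invented:

    import numpy as np

    def gauss(x, c, s):
        """Gaussian membership function with center c and width s."""
        return np.exp(-0.5 * ((x - c) / s) ** 2)

    def ts_infer(x, rules):
        """First-order T-S inference: weighted average of linear consequents."""
        w = np.array([gauss(x, c, s) for (c, s, _, _) in rules])   # firing strengths
        y = np.array([a * x + b for (_, _, a, b) in rules])        # rule outputs
        return float(np.sum(w * y) / np.sum(w))

    # Two invented rules: (center, width, slope, intercept).
    rules = [(0.0, 3.0, 1.0, 0.0),      # IF x near 0  THEN y = x
             (10.0, 3.0, -0.5, 12.0)]   # IF x near 10 THEN y = -0.5x + 12
    print(ts_infer(5.0, rules))         # midway input blends both consequents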

30 pages, 20140 KiB  
Article
Comparative Analysis of Pixel-Level Fusion Algorithms and a New High-Resolution Dataset for SAR and Optical Image Fusion
by Jinjin Li, Jiacheng Zhang, Chao Yang, Huiyu Liu, Yangang Zhao and Yuanxin Ye
Remote Sens. 2023, 15(23), 5514; https://doi.org/10.3390/rs15235514 - 27 Nov 2023
Cited by 2 | Viewed by 1121
Abstract
Synthetic aperture radar (SAR) and optical images often present different geometric structures and texture features for the same ground object. Fusing SAR and optical images can effectively integrate their complementary information, thus better meeting the requirements of remote sensing applications such as target recognition, classification, and change detection, and realizing the collaborative utilization of multi-modal images. In order to select appropriate methods to achieve high-quality fusion of SAR and optical images, this paper conducts a systematic review of current pixel-level fusion algorithms for SAR and optical image fusion. Subsequently, eleven representative fusion methods, including component substitution (CS) methods, multiscale decomposition (MSD) methods, and model-based methods, are chosen for a comparative analysis. In the experiment, we produce a high-resolution SAR and optical image fusion dataset (named YYX-OPT-SAR) covering three different types of scenes: urban, suburban, and mountain. This dataset and a publicly available medium-resolution dataset are used to evaluate these fusion methods against three different kinds of evaluation criteria: visual evaluation, objective image quality metrics, and classification accuracy. In terms of the evaluation using image quality metrics, the experimental results show that MSD methods can effectively avoid the negative effects of SAR image shadows on the corresponding areas of the fusion result, in contrast with CS methods, while model-based methods exhibit relatively poor performance. Among all of the fusion methods compared, the non-subsampled contourlet transform (NSCT) method presents the best fusion results. In the evaluation using image classification, most experimental results show that the overall classification accuracy after fusion is better than that before fusion. This indicates that optical-SAR fusion can improve land classification, with the gradient transfer fusion (GTF) method yielding the best classification results among all of these fusion methods.
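
To make the CS family concrete, here is a minimal Python sketch of an additive component-substitution scheme: the optical intensity component is mean/std-matched to the SAR image and the substitution difference is injected into each band. This is a generic illustration, far simpler than the eleven methods the paper compares; the arrays are random stand-ins for co-registered images:

    import numpy as np

    def cs_fuse(optical_rgb, sar):
        """Additive component substitution: swap a matched SAR image
        in for the optical intensity component."""
        intensity = optical_rgb.mean(axis=2)
        # Match SAR to the intensity's mean/std so the substitution
        # does not shift the radiometry.
        sar_matched = (sar - sar.mean()) / (sar.std() + 1e-9)
        sar_matched = sar_matched * intensity.std() + intensity.mean()
        # Inject the substitution difference into every optical band.
        return optical_rgb + (sar_matched - intensity)[..., None]

    optical = np.random.rand(64, 64, 3)   # stand-in optical image
    sar = np.random.rand(64, 64)          # stand-in co-registered SAR image
    fused = cs_fuse(optical, sar)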

45 pages, 42285 KiB  
Article
Fusion of Identification Information from ESM Sensors and Radars Using Dezert–Smarandache Theory Rules
by Tadeusz Pietkiewicz
Remote Sens. 2023, 15(16), 3977; https://doi.org/10.3390/rs15163977 - 10 Aug 2023
Cited by 1 | Viewed by 792
Abstract
This paper presents a method of fusion of identification (attribute) information provided by two types of sensors: combined primary and secondary (IFF) surveillance radars and ESM (electronic support measures) sensors. In the first section, the basic taxonomy of attribute identification is adopted in accordance with the standards STANAG 1241 ed. 5 and STANAG 1241 ed. 6 (draft). These standards provide the following basic values of attribute identification: FRIEND, HOSTILE, NEUTRAL, and UNKNOWN, plus the additional values ASSUMED FRIEND and SUSPECT. The theoretical basis of this work is the Dezert–Smarandache theory (DSmT) of inference. This paper presents and applies in practice six information-fusion rules proposed by DSmT, i.e., the proportional conflict redistribution rules PCR1 through PCR6, for combining identification information from different ESM sensors and radars. This paper demonstrates the rules of determining attribute information by an ESM sensor equipped with a database of radar emitters. It is proposed that each signal vector sent by the ESM sensor contain an extension specifying a randomized identification declaration (hypothesis), i.e., a basic belief assignment (BBA). This paper also presents a model for determining the basic belief assignment for a combined primary and secondary radar. Results of applying the PCR rules to combine sensor information in different scenarios of a radio-electronic situation (deterministic and Monte Carlo) are presented in the final part of this paper. They confirm the legitimacy of using Dezert–Smarandache theory for information fusion from primary radars, secondary radars, and ESM sensors.
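
For a flavor of how a PCR rule redistributes conflicting mass, the following Python sketch implements the two-source PCR5 combination over mutually exclusive singleton declarations. This is a simplified Shafer-model illustration, not the author's radar/ESM fusion code, and the BBA values are invented:

    from itertools import product

    def pcr5(m1, m2):
        """Two-source PCR5 over mutually exclusive singleton hypotheses."""
        hyps = sorted(set(m1) | set(m2))
        fused = {h: 0.0 for h in hyps}
        for a, b in product(hyps, repeat=2):
            ma, mb = m1.get(a, 0.0), m2.get(b, 0.0)
            if a == b:
                fused[a] += ma * mb               # agreeing mass stays on a
            elif ma + mb > 0.0:
                # Partial conflict ma*mb is split back proportionally to the
                # two masses that caused it (the PCR5 redistribution).
                fused[a] += ma ** 2 * mb / (ma + mb)
                fused[b] += mb ** 2 * ma / (ma + mb)
        return fused

    # Invented BBAs from a radar and an ESM sensor over three declarations.
    m_radar = {"FRIEND": 0.7, "HOSTILE": 0.2, "NEUTRAL": 0.1}
    m_esm   = {"FRIEND": 0.5, "HOSTILE": 0.4, "NEUTRAL": 0.1}
    print(pcr5(m_radar, m_esm))   # fused masses still sum to 1.0

Unlike Dempster's rule, no conflicting mass is discarded by normalization; it flows back to the hypotheses that generated the conflict.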

18 pages, 12775 KiB  
Article
Fiber Optic Acoustic Sensing to Understand and Affect the Rhythm of the Cities: Proof-of-Concept to Create Data-Driven Urban Mobility Models
by Luz García, Sonia Mota, Manuel Titos, Carlos Martínez, Jose Carlos Segura and Carmen Benítez
Remote Sens. 2023, 15(13), 3282; https://doi.org/10.3390/rs15133282 - 26 Jun 2023
Cited by 3 | Viewed by 1744
Abstract
In the framework of massive sensing and smart sustainable cities, this work presents an urban distributed acoustic sensing testbed in the vicinity of the School of Technology and Telecommunication Engineering of the University of Granada, Spain. After positioning the sensing technology with respect to the state of the art of similar existing approaches, the results of the monitoring experiment are described. Details of the sensing scenario, the basic types of events that can be distinguished automatically, initial noise removal actions, and frequency and signal complexity analyses are provided. The experiment, used as a proof of concept, shows the enormous potential of this sensing technology to generate data-driven urban mobility models. To support this, examples of preliminary traffic density analysis and average speed calculation for buses, cars, and pedestrians in the testbed's neighborhood are presented, together with the incidental recording of a local earthquake. Challenges, benefits, and future research directions of this sensing technology are pointed out.
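
As a toy illustration of the speed-estimation idea (not the testbed's actual processing chain): with two DAS channels a known distance apart along the fiber, the lag of the peak cross-correlation between their signals gives a travel time, hence a speed. A Python sketch with synthetic pulses and assumed sampling rate and channel spacing:

    import numpy as np

    fs = 1000.0            # assumed sampling rate, Hz
    channel_dist = 10.0    # assumed along-fiber distance between channels, m

    t = np.arange(0.0, 4.0, 1.0 / fs)
    sig_a = np.exp(-0.5 * ((t - 1.0) / 0.05) ** 2)   # event at channel A, t = 1.0 s
    sig_b = np.exp(-0.5 * ((t - 1.5) / 0.05) ** 2)   # same event at B, t = 1.5 s

    xcorr = np.correlate(sig_b, sig_a, mode="full")
    lag_s = (np.argmax(xcorr) - (len(t) - 1)) / fs   # peak lag in seconds (~0.5 s)
    speed = channel_dist / lag_s
    print(f"estimated speed: {speed:.1f} m/s")       # ~20 m/s, i.e. 72 km/h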

25 pages, 11689 KiB  
Article
Estimation of Handheld Ground-Penetrating Radar Antenna Position with Pendulum-Model-Based Extended Kalman Filter
by Piotr Kaniewski and Tomasz Kraszewski
Remote Sens. 2023, 15(3), 741; https://doi.org/10.3390/rs15030741 - 27 Jan 2023
Cited by 2 | Viewed by 1485
Abstract
Landmines and explosive remnants of war are a significant threat in dozens of countries and territories, causing the deaths or injuries of thousands of people every year, even long after military conflicts have ended. Effective technical means of remotely detecting, localizing, imaging, and identifying mines and other buried explosives are still sought and have great potential utility. This paper considers a positioning system used as a supporting tool for a handheld ground-penetrating radar. Accurate knowledge of the radar antenna position during terrain scanning is necessary to properly localize and visualize the shape of buried objects, which helps in their remote classification and makes demining safer. The positioning system proposed in this paper uses ultrawideband radios to measure the distances between stationary beacons and mobile units. The measurements are processed with an extended Kalman filter based on an innovative dynamics model derived from the motion of a pendulum. The simulation results included in the paper show that the proposed pendulum dynamics model ensures better accuracy than other typically used dynamics models. It is also demonstrated that the positioning system can estimate the radar antenna position with an accuracy of a few centimeters, which is required for appropriate imaging of buried objects with ground-penetrating radars.
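
To illustrate the core idea in a deliberately simplified 1-D form (with a directly measured swing angle instead of the paper's UWB range measurements), here is a Python sketch of an extended Kalman filter whose process model is a pendulum, mimicking the swinging of a handheld antenna; the arm length, noise covariances, and swing amplitude are assumed values:

    import numpy as np

    rng = np.random.default_rng(0)
    g, L, dt = 9.81, 1.0, 0.01            # gravity; assumed arm length; time step
    Q = np.diag([1e-6, 1e-4])             # assumed process noise covariance
    R = np.array([[1e-3]])                # assumed angle-measurement noise
    H = np.array([[1.0, 0.0]])            # we observe the swing angle only

    def f(x):
        """Euler-discretized pendulum dynamics; x = [angle, angular rate]."""
        theta, omega = x
        return np.array([theta + omega * dt,
                         omega - (g / L) * np.sin(theta) * dt])

    def F_jac(x):
        """Jacobian of f evaluated at the current estimate."""
        return np.array([[1.0, dt],
                         [-(g / L) * np.cos(x[0]) * dt, 1.0]])

    x, P = np.array([0.25, 0.0]), np.eye(2) * 0.1    # initial guess
    for k in range(500):
        true_theta = 0.3 * np.cos(np.sqrt(g / L) * k * dt)   # small-angle truth
        z = np.array([true_theta + 0.03 * rng.standard_normal()])
        F = F_jac(x)                      # linearize about the last estimate
        x = f(x)                          # predict with the pendulum model
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R               # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
        x = x + K @ (z - H @ x)           # update with the noisy angle
        P = (np.eye(2) - K @ H) @ P

    print(f"final angle estimate: {x[0]:.3f} rad")

The benefit of a physics-based process model is that the predict step already follows the antenna's oscillation, so the filter needs less correction from noisy measurements than a generic constant-velocity model would.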
