Satellite-Terrestrial Integrated Internet of Things

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Networks".

Deadline for manuscript submissions: 15 June 2024 | Viewed by 11987

Special Issue Editors

School of Electronics and Information Engineering, Harbin Institute of Technology, Harbin 150080, China
Interests: cognitive satellite; multiple access techniques; waveform design; IoT
School of Information Science and Technology, Dalian Maritime University, Dalian 116026, China
Interests: space-air-ground networks; UAV communications; MEC; AI-based communications
Research Institute, China Unicom, Beijing 100048, China
Interests: big data; artificial intelligence; satellite network; mobile communication

Special Issue Information

Dear Colleagues,

Satellite communications, machine-to-machine communications, and Internet of Things (IoT) technology are evolving rapidly to meet the demand for anywhere, anytime broadband spectrum access. Integrating next-generation high-throughput satellites into the IoT is viewed as playing a critical role in the envisioned satellite–terrestrial integrated IoT, which can provide global hybrid satellite–terrestrial broadband access in a cost-effective manner. However, when satellite networks face access from massive numbers of IoT terminals, the traditional framework and protocols between satellite systems and terrestrial users must be adaptively redesigned. In the satellite–terrestrial integrated IoT, several key factors must be handled carefully: the long propagation delay of satellite channels, long-distance communication links combined with the constrained geostationary orbit, the information security of open satellite links, and limited on-board data processing capability. It is envisioned that caching networks, edge computing, security protection, and big data can be exploited in the satellite–terrestrial integrated IoT to improve the reliability and timeliness of data transmission. More elastic and efficient resource management schemes must be devised to fit heterogeneous and complex IoT application environments. These attributes call for a deep fusion of satellite communications, IoT technology, data security, and dynamic resource management.

This Special Issue solicits original research and practical contributions that advance technology for the satellite–terrestrial integrated IoT, covering its architecture, technologies, and applications. Surveys and state-of-the-art tutorials are also welcome.

Potential topics may include, but are not limited to:

  • Framework, algorithms and protocol design for satellite–terrestrial integrated IoT;
  • Cognitive radio for satellite–terrestrial integrated IoT;
  • Innovative architecture, infrastructure, techniques and testbeds for satellite–terrestrial integrated IoT;
  • Privacy-preserving data aggregation and communications for satellite–terrestrial integrated IoT;
  • Security-protecting data transmission for satellite–terrestrial integrated IoT;
  • Dynamic spectrum access for satellite–terrestrial integrated IoT;
  • Low-power and energy-efficient resource management for satellite–terrestrial integrated IoT;
  • Interference suppression for massive IoT terminal access;
  • Low-latency and high-reliability communication for satellite–terrestrial integrated IoT;
  • Hardware design and prototyping for satellite–terrestrial integrated IoT;
  • On-satellite big data processing.

Prof. Dr. Min Jia
Prof. Dr. Zhenyu Na
Dr. Xin Liu
Dr. Lexi Xu
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • integrated satellite–terrestrial networks
  • Internet of Things
  • spectral efficiency
  • energy efficiency
  • multiple access

Published Papers (7 papers)


Research

17 pages, 4433 KiB  
Article
VEDAM: Urban Vegetation Extraction Based on Deep Attention Model from High-Resolution Satellite Images
by Bin Yang, Mengci Zhao, Ying Xing, Fuping Zeng and Zhaoyang Sun
Electronics 2023, 12(5), 1215; https://doi.org/10.3390/electronics12051215 - 03 Mar 2023
Viewed by 1175
Abstract
With the rapid development of satellite and Internet of Things (IoT) technology, it has become increasingly convenient to acquire high-resolution satellite images of the ground. Extracting urban vegetation from high-resolution satellite images can provide valuable input for urban-management decision-making. Deep-learning semantic segmentation has become an important method for vegetation extraction; however, because context and spatial information are poorly represented, segmentation is often inaccurate. Thus, Vegetation Extraction based on a Deep Attention Model (VEDAM) is proposed to enhance the representation of context and spatial information when extracting vegetation from satellite images. Specifically, continuous convolutions are used for feature extraction, and atrous convolutions are introduced to capture multi-scale context. The extracted features are then enhanced by a Spatial Attention Module (SAM) and atrous spatial pyramid convolutions. In addition, image-level features obtained by image pooling, which encode global context, further improve overall performance. Experiments were conducted on the real-world Gaofen Image Dataset (GID), and the comparative results show that VEDAM achieves the best mIoU for vegetation semantic segmentation (mIoU = 0.9136).
(This article belongs to the Special Issue Satellite-Terrestrial Integrated Internet of Things)

17 pages, 1194 KiB  
Article
A Knowledge Inference and Sharing-Based Open-Set Device Recognition Approach for Satellite-Terrestrial-Integrated IoT
by Ying Yang and Lidong Zhu
Electronics 2023, 12(5), 1143; https://doi.org/10.3390/electronics12051143 - 27 Feb 2023
Cited by 1 | Viewed by 996
Abstract
The satellite-terrestrial-integrated Internet of Things (IoT) is an inevitable trend in future development, but open satellite links and massive IoT device access bring serious security risks. Most existing recognition models cannot discover and reject malicious IoT devices because they lack decision information about these unauthorized devices during training. To address this dilemma, this paper proposes a knowledge inference and sharing-based open-set recognition (OSR) approach to protect the satellite-terrestrial-integrated IoT. It proceeds in two steps. First, knowledge inference constructs ideal substitutes for unauthorized devices through reasonable inference on the training set, compensating for the model's missing decision information. Second, knowledge sharing inherits existing knowledge and adjusts the model's decision boundaries through model expansion and knowledge distillation, achieving accurate open-set recognition. Experiments on the ORACLE dataset demonstrate that the approach outperforms other state-of-the-art OSR methods in accuracy and running time, delivering excellent performance with only a slight increase in computational complexity.
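The knowledge-sharing step described above relies on knowledge distillation. The sketch below illustrates the generic distillation loss only, not the paper's exact training setup: an expanded (student) model is pushed to match the temperature-softened outputs of the existing (teacher) model, so decision knowledge about authorized devices is inherited. The temperature and logit values are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, temperature=3.0):
    """KL divergence between temperature-softened teacher and student distributions."""
    p = softmax(np.asarray(teacher_logits, dtype=float) / temperature)
    q = softmax(np.asarray(student_logits, dtype=float) / temperature)
    return float(np.sum(p * np.log(p / q)))

# A student close to the teacher incurs a small, non-negative loss.
loss = distillation_loss([4.0, 1.0, 0.5], [3.5, 1.2, 0.8])
```

Minimizing this loss alongside the usual classification loss is what lets the expanded model keep the old decision boundaries while adding capacity for the inferred unauthorized-device substitutes.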
(This article belongs to the Special Issue Satellite-Terrestrial Integrated Internet of Things)

13 pages, 1855 KiB  
Article
A Multi-Branch DQN-Based Transponder Resource Allocation Approach for Satellite Communications
by Wenyu Sun, Weijia Zhang, Ning Ma and Min Jia
Electronics 2023, 12(4), 916; https://doi.org/10.3390/electronics12040916 - 11 Feb 2023
Viewed by 1065
Abstract
In light of the increasing scarcity of frequency spectrum resources for satellite communication systems based on transparent transponders, fast and efficient satellite resource allocation algorithms have become key to improving overall resource occupancy. In this paper, we propose a reinforcement learning-based Multi-Branch Deep Q-Network (MBDQN), which introduces a TL-Branch and an RP-Branch to extract features of the satellite resource pool state and the task state simultaneously, and a Value-Branch to compute the action-value function. On the one hand, MBDQN improves average resource occupancy performance (AOP) by selecting multiple actions, including task selection and resource priority actions. On the other hand, the trained MBDQN is well suited to online deployment and significantly reduces runtime overhead because it requires no iteration in the test phase. Experiments on both non-zero-waste and zero-waste datasets demonstrate that the proposed method outperforms greedy and heuristic methods on the generated task datasets.
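The multi-branch structure described above can be pictured as two feature branches whose outputs are fused and mapped to Q-values. The sketch below is an illustrative assumption, not the authors' architecture: layer sizes, weights, and the action count are made up, and real training would use a deep-learning framework.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, w, b):
    """A single fully connected layer with ReLU activation."""
    return np.maximum(0.0, x @ w + b)

# Hypothetical inputs: the task state and the resource-pool state.
task_state = rng.normal(size=8)
pool_state = rng.normal(size=16)

# TL-Branch and RP-Branch extract features from the two states in parallel.
w_tl, b_tl = rng.normal(size=(8, 4)), np.zeros(4)
w_rp, b_rp = rng.normal(size=(16, 4)), np.zeros(4)
tl_feat = dense(task_state, w_tl, b_tl)
rp_feat = dense(pool_state, w_rp, b_rp)

# Value-Branch maps the fused features to Q-values over joint actions
# (task selection x resource priority); 6 actions here is illustrative.
w_v, b_v = rng.normal(size=(8, 6)), np.zeros(6)
q_values = np.concatenate([tl_feat, rp_feat]) @ w_v + b_v

# At test time a single greedy argmax suffices -- no iteration is needed,
# which is why the trained network has low runtime overhead.
best_action = int(np.argmax(q_values))
```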
(This article belongs to the Special Issue Satellite-Terrestrial Integrated Internet of Things)

14 pages, 914 KiB  
Article
Random Routing Algorithm for Enhancing the Cybersecurity of LEO Satellite Networks
by Ruben Fratty, Yuval Saar, Rajnish Kumar and Shlomi Arnon
Electronics 2023, 12(3), 518; https://doi.org/10.3390/electronics12030518 - 19 Jan 2023
Cited by 1 | Viewed by 2766
Abstract
The recent expansion of low-Earth-orbit (LEO) satellite networks such as Starlink, OneWeb, and Telesat, together with the evolution of communication systems toward B5G and 6G with densely interconnected devices, could create opportunities for various cyber attacks. Because satellite networks offer many crucial services to the public and to governmental organizations, cyberattacks pose severe risks to the communication infrastructure. In this study, we propose a random routing algorithm to prevent distributed denial-of-service (DDoS) attacks on an LEO satellite constellation network. The algorithm builds on the classical routing algorithms k-DG, k-DS, k-SP, and k-LO by selecting one of them according to a weighted probability distribution, introducing randomness that increases the attacker's uncertainty. The study shows that the proposed algorithm improves the average and median cost to the attacker of mounting a DDoS attack while maintaining network functionality. The algorithm is tuned by formulating a Bayesian optimization problem. Beyond the additional uncertainty in routing, it yields an improvement of 1.71% in the average cost and 2.05% in the median cost in a typical scenario. The algorithm makes the network more robust to cyber attacks against LEO satellite networks (LSNs); however, like any other defensive measure, it reduces the network's goodput.
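The core idea of the routing scheme, selecting one of several classical algorithms per request according to a weighted probability distribution, can be sketched in a few lines. The algorithm bodies and the weight values below are illustrative assumptions; in the paper the weights would come from the Bayesian optimization step and each algorithm would compute a real path through the constellation graph.

```python
import random

# Hypothetical stand-ins for the four classical routing algorithms named in
# the abstract; each would return a path through the LEO constellation.
def k_dg(src, dst): return ["k-DG path", src, dst]
def k_ds(src, dst): return ["k-DS path", src, dst]
def k_sp(src, dst): return ["k-SP path", src, dst]
def k_lo(src, dst): return ["k-LO path", src, dst]

ALGORITHMS = [k_dg, k_ds, k_sp, k_lo]
WEIGHTS = [0.4, 0.3, 0.2, 0.1]  # illustrative; tuned via Bayesian optimization

def random_route(src, dst, rng=random):
    """Pick one routing algorithm at random so an attacker cannot predict the path."""
    algo = rng.choices(ALGORITHMS, weights=WEIGHTS, k=1)[0]
    return algo(src, dst)
```

Because the attacker cannot know in advance which algorithm (and hence which path) serves a given flow, saturating the links of any single predicted route becomes more expensive, which is the source of the reported cost increase.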
(This article belongs to the Special Issue Satellite-Terrestrial Integrated Internet of Things)

24 pages, 1837 KiB  
Article
Multi-UAV Clustered NOMA for Covert Communications: Joint Resource Allocation and Trajectory Optimization
by Xiaofei Qin, Xu Wu, Mudi Xiong, Ye Liu and Yue Zhang
Electronics 2022, 11(23), 4056; https://doi.org/10.3390/electronics11234056 - 06 Dec 2022
Cited by 2 | Viewed by 1254
Abstract
Owing to their strong survivability and flexible scheduling, multi-UAV (Unmanned Aerial Vehicle)-assisted communication networks are widely used in civil and military fields. However, the open accessibility of wireless channels brings a substantial risk of privacy disclosure to UAV-based networks. This paper considers a multi-UAV-assisted covert communication system based on Wireless Powered Communication (WPC) and Clustered Non-Orthogonal Multiple Access (C-NOMA), aiming to hide the transmission behavior between UAVs and legitimate ground users (LGUs). Specifically, the UAVs serve as aerial base stations that provide services to LGUs while avoiding detection by a ground warden. To improve covert communication performance, the average uplink covert rate of all clusters in each slot is maximized by jointly optimizing the cluster scheduling variable, subslot allocation, LGU transmit power, and the multi-UAV trajectory, subject to covertness constraints. The original problem is a mixed-integer non-convex problem, which is typically difficult to solve directly. To overcome this challenge, the paper decouples it into four sub-problems and solves them by alternating iterations until the objective function converges. Simulation results show that the proposed scheme effectively improves the average uplink covert rate of all clusters compared with benchmark schemes.
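The decoupling strategy in the abstract follows the standard alternating-optimization pattern: each sub-problem is solved while the other variables are held fixed, and the blocks are cycled until the objective converges. The toy two-variable objective below is an assumption for illustration; the paper's actual sub-problems involve cluster scheduling, subslot allocation, transmit power, and UAV trajectories.

```python
def objective(x, y):
    """A toy smooth objective standing in for the joint covert-rate problem."""
    return (x - 2.0) ** 2 + (y + 1.0) ** 2 + 0.5 * x * y

def solve_x(y):
    # Closed-form minimizer over x with y fixed: 2(x - 2) + 0.5y = 0.
    return 2.0 - 0.25 * y

def solve_y(x):
    # Closed-form minimizer over y with x fixed: 2(y + 1) + 0.5x = 0.
    return -1.0 - 0.25 * x

x, y, prev = 0.0, 0.0, float("inf")
for _ in range(100):  # alternate between sub-problems until convergence
    x = solve_x(y)
    y = solve_y(x)
    cur = objective(x, y)
    if prev - cur < 1e-9:  # objective has stopped decreasing
        break
    prev = cur
```

Each exact block minimization can only decrease the objective, so the iterate sequence converges; for the non-convex joint problem in the paper this yields a stationary point rather than a guaranteed global optimum.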
(This article belongs to the Special Issue Satellite-Terrestrial Integrated Internet of Things)

11 pages, 698 KiB  
Article
DNN Beamforming for LEO Satellite Communication at Sub-THz Bands
by Rajnish Kumar and Shlomi Arnon
Electronics 2022, 11(23), 3937; https://doi.org/10.3390/electronics11233937 - 28 Nov 2022
Cited by 2 | Viewed by 1812
Abstract
The 6G communication system will be designed at sub-THz frequencies owing to increasing demand for data rates, emerging new applications, and advanced communication technologies. These high-performing systems will rely heavily on artificial intelligence (AI) for the efficient and robust design of transceivers. In this work, we propose a deep neural network (DNN) beamformer that replaces the phase shifters of a massive antenna array employed at the ground station for wideband LEO satellite communication at sub-THz bands. We show that the DNN-based signal processing can match the performance of a true-time-delay beamformer as the angle of arrival of the received wideband signal at the ground station changes due to the rapid movement of the LEO satellite. The DNN beamformer can reduce receiver cost and enables an efficient, compact design of massive-array beamforming for wideband LEO satellite applications.
(This article belongs to the Special Issue Satellite-Terrestrial Integrated Internet of Things)

18 pages, 12861 KiB  
Article
A Computation Offloading Strategy in LEO Constellation Edge Cloud Network
by Feihu Dong, Tao Huang, Yasheng Zhang, Chenhua Sun and Chengcheng Li
Electronics 2022, 11(13), 2024; https://doi.org/10.3390/electronics11132024 - 28 Jun 2022
Cited by 4 | Viewed by 1803
Abstract
With the rise of a new generation of low Earth orbit (LEO) satellite constellations and the advancement of the 6G network, the future satellite–terrestrial integrated Internet of Things (IoT) will achieve global coverage through LEO constellations and will extend computing to the edge of the network by deploying edge computing services on them, meeting the demands of massive connectivity and low-latency data processing. The future LEO constellation network will be an edge cloud network combining networking and computing. In this paper, we propose a computation offloading strategy that jointly optimizes energy and computational load in an LEO constellation edge cloud network (hereinafter, LEO-ECN). First, we establish the LEO-ECN system model, in which a user task can be offloaded to a satellite over a multi-hop path. Then, a cost model considering energy consumption and computational load is proposed. Finally, a joint optimization problem that minimizes energy consumption while balancing the LEO-ECN load is formulated as a convex optimization problem. Simulation results demonstrate that, compared with the benchmark strategy, the proposed strategy performs better and improves the computing resource utilization of the LEO-ECN.
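The cost model described above trades off transmission energy (which grows with the number of inter-satellite hops) against the computational load of the candidate satellite. The sketch below is a minimal illustration under assumed numbers and a linear cost form, not the paper's exact formulation, which solves a convex optimization rather than a discrete enumeration.

```python
# Hypothetical candidate satellites: hop count from the user's access satellite
# and current normalized computational load.
satellites = {
    "sat-A": {"hops": 1, "load": 0.9},
    "sat-B": {"hops": 2, "load": 0.3},
    "sat-C": {"hops": 3, "load": 0.1},
}

ENERGY_PER_HOP = 1.0          # assumed transmission energy per inter-satellite hop
W_ENERGY, W_LOAD = 0.5, 0.5   # trade-off weights in the joint objective

def cost(sat):
    """Weighted sum of multi-hop offloading energy and target-satellite load."""
    s = satellites[sat]
    return W_ENERGY * ENERGY_PER_HOP * s["hops"] + W_LOAD * s["load"]

# Offload the task to the satellite minimizing the joint cost.
target = min(satellites, key=cost)
```

Sweeping the weights shifts the balance between saving energy (fewer hops) and balancing load across the constellation, which is the trade-off the joint optimization captures.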
(This article belongs to the Special Issue Satellite-Terrestrial Integrated Internet of Things)
