
Data Science and Machine Learning for Geodetic Earth Observation

A special issue of Remote Sensing (ISSN 2072-4292). This special issue belongs to the section "Earth Observation Data".

Deadline for manuscript submissions: closed (28 February 2023) | Viewed by 24623

Special Issue Editors


Guest Editor
Department of Civil, Environmental and Geomatic Engineering, ETH Zurich, 8093 Zurich, Switzerland
Interests: geodetic data analysis and parameter estimation; GNSS; very long baseline interferometry; machine learning; determination of atmospheric parameters; geodetic reference frames

Guest Editor
GFZ German Research Centre for Geosciences, Department of Geodesy, 14473 Potsdam, Germany
Interests: geodesy; very long baseline interferometry; atmospheric refraction; ray-tracing; geophysical loading; integrated water vapor; numerical weather prediction; combination of space geodetic techniques

Guest Editor
1. NASA Jet Propulsion Laboratory, California Institute of Technology, Pasadena, CA 91011, USA
2. ASTRA, LLC., Louisville, CO 80027, USA
Interests: data science; upper atmosphere; solar–terrestrial coupling; space weather; complexity science; multiscale phenomena; transdisciplinary science; collaboration and team science

Guest Editor
Los Alamos National Laboratory, Los Alamos, NM 87545, USA
Interests: deep learning; data science; InSAR; fault physics; materials science

Guest Editor
National Centre for Geodesy, Indian Institute of Technology Kanpur, Kanpur 208016, India
Interests: geodetic surveying; surface deformation; multitemporal InSAR techniques; GNSS; multisensor geodetic techniques; machine learning for remote sensing and geodetic Earth observation

Special Issue Information

Dear Colleagues,

Observation and monitoring of the Earth system with space geodetic techniques are essential for addressing challenges of scientific and societal importance, including the prevention, mitigation, and monitoring of natural hazards and climate change.

Recently, we have witnessed a dramatic increase in the amount of data from several space geodetic observing techniques. In particular, Global Navigation Satellite Systems (GNSS) and Interferometric Synthetic Aperture Radar (InSAR) have contributed to the expansive collection of geodetic data. “Big data” in geodesy create certain challenges, but also opportunities: computational capabilities have steadily increased over the past few decades, and new mathematical methods have been introduced. Specifically, strategies and methodologies from the fields of data science and machine learning have shown great potential and sparked new developments in geodetic data analysis.

This Special Issue will address recent progress in the application of methods from data science and machine learning to geodetic Earth observation. Special emphasis will be placed on innovative approaches for harnessing geodetic “big data” for scientific purposes using deep learning. In particular, we encourage investigations related to (but not limited to) improved geodetic parameter prediction (e.g., Earth orientation parameters), detection of spatiotemporal patterns and anomalies (in both images and time series, for example, jump detection), automation of geodetic data processing, and the combination of inhomogeneous observational data and geophysical models (including the exploitation of auxiliary information). Furthermore, we specifically invite contributions that address aspects of machine learning that geodesists sometimes view critically, including challenges related to the quantification of uncertainties, the interpretability of results, and the integration of physical information. Studies based on more limited data sets from various space geodetic techniques that aim to solve complex nonlinear problems are welcome as well.

Prof. Dr. Benedikt Soja
Prof. Dr. Mattia Crespi
Dr. Kyriakos Balidakis
Dr. Ryan McGranaghan
Dr. Bertrand Rouet-Leduc
Dr. Ashutosh Tiwari
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com after registering and logging in to the website. Registered authors can use the online submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Geodetic Earth observation
  • GNSS
  • InSAR
  • Gravity satellite missions
  • Machine learning
  • Deep learning
  • Artificial intelligence
  • Data science
  • Big data

Published Papers (8 papers)


Research

17 pages, 2223 KiB  
Article
Estimation of Earth Rotation Parameters and Prediction of Polar Motion Using Hybrid CNN–LSTM Model
by Kehao Yu, Kai Yang, Tonghui Shen, Lihua Li, Haowei Shi and Xu Song
Remote Sens. 2023, 15(2), 427; https://doi.org/10.3390/rs15020427 - 10 Jan 2023
Cited by 2 | Viewed by 2193
Abstract
The Earth rotation parameters (ERPs), including polar motion (PMX and PMY) and universal time (UT1-UTC), play a central role in monitoring the Earth’s rotation and in high-precision navigation and positioning. Variations in ERPs reflect not only the overall state of motion of the Earth, but also the interactions among the atmosphere, ocean, and land across spatial and temporal scales. In this paper, we estimated ERP series from very long baseline interferometry (VLBI) observations between 2011 and 2020. The results show that the average root mean square errors (RMSEs) are 0.187 mas for PMX, 0.205 mas for PMY, and 0.022 ms for UT1-UTC. Furthermore, to explore the high-frequency variations in more detail, we analyzed the spectrum of the polar motion time series using the fast Fourier transform (FFT); the Chandler period was approximately 426 days and the annual period about 360 days. The results also confirm the presence of a weaker retrograde oscillation with an amplitude of about 3.5 mas. This paper proposes a hybrid prediction model that combines a convolutional neural network (CNN) and a long short-term memory (LSTM) neural network: the CNN–LSTM model. Its advantages can be attributed to the CNN’s ability to extract and optimize features of the polar motion series and the LSTM’s ability to make medium- to long-term predictions from historical time series. Compared with Bulletin A, the prediction accuracies of PMX and PMY are improved by 42% and 13%, respectively. Notably, the hybrid CNN–LSTM model can effectively improve the accuracy of medium- and long-term polar motion prediction.
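
To make the architecture concrete, below is a minimal PyTorch sketch of a hybrid CNN–LSTM of the kind described: a 1D convolution extracts features from a window of past polar-motion values and an LSTM produces the forecast. The window length, layer sizes, and forecast horizon are illustrative assumptions, not the configuration used by the authors.

```python
# Minimal PyTorch sketch of a hybrid CNN-LSTM for polar-motion prediction.
# Window length, channel counts, and layer sizes are illustrative assumptions,
# not the configuration used in the paper.
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    def __init__(self, n_features=2, window=360, horizon=10):
        super().__init__()
        # 1D convolution extracts local features from the input window
        self.cnn = nn.Sequential(
            nn.Conv1d(n_features, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # LSTM models the temporal evolution of the extracted features
        self.lstm = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)
        # Dense head maps the last hidden state to the forecast horizon
        self.head = nn.Linear(64, horizon * n_features)
        self.horizon, self.n_features = horizon, n_features

    def forward(self, x):                         # x: (batch, window, n_features)
        z = self.cnn(x.transpose(1, 2))           # -> (batch, 32, window // 2)
        out, _ = self.lstm(z.transpose(1, 2))     # -> (batch, window // 2, 64)
        y = self.head(out[:, -1, :])              # forecast from the last time step
        return y.view(-1, self.horizon, self.n_features)

# Example: predict 10 future (PMX, PMY) epochs from a 360-epoch window
model = CNNLSTM()
dummy = torch.randn(8, 360, 2)                    # batch of 8 synthetic windows
print(model(dummy).shape)                         # torch.Size([8, 10, 2])
```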

34 pages, 3622 KiB  
Article
Ensemble Machine Learning of Random Forest, AdaBoost and XGBoost for Vertical Total Electron Content Forecasting
by Randa Natras, Benedikt Soja and Michael Schmidt
Remote Sens. 2022, 14(15), 3547; https://doi.org/10.3390/rs14153547 - 24 Jul 2022
Cited by 33 | Viewed by 4807
Abstract
Space weather describes varying conditions between the Sun and Earth that can degrade Global Navigation Satellite System (GNSS) operations. These effects should therefore be corrected precisely and in a timely manner for accurate and reliable GNSS applications, which can be achieved by modeling the Vertical Total Electron Content (VTEC) in the Earth’s ionosphere. This study investigates different learning algorithms to approximate nonlinear space weather processes and to forecast VTEC 1 h and 24 h ahead for low-, mid-, and high-latitude ionospheric grid points along the same longitude. VTEC models are developed using a Decision Tree and the ensemble learning algorithms Random Forest, Adaptive Boosting (AdaBoost), and eXtreme Gradient Boosting (XGBoost). Furthermore, the ensemble models are combined into a single meta-model, a Voting Regressor. The models were trained, optimized, and validated with time-series cross-validation. Moreover, the relative importance of the input variables to the VTEC forecast is estimated. The results show that the developed models perform well in both quiet and storm conditions, with multi-tree ensemble learning outperforming the single Decision Tree. In particular, the meta-estimator Voting Regressor mostly provides the lowest RMSE and the highest correlation coefficients, as it averages predictions from different well-performing models. Furthermore, expanding the input dataset with time derivatives, moving averages, and daily differences, as well as transformations such as differencing, enhances the learning of space weather features, especially over a longer forecast horizon.
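
As an illustration of this type of ensemble, the sketch below combines Random Forest, AdaBoost, and XGBoost in a scikit-learn VotingRegressor and evaluates it with time-series cross-validation. The synthetic features, target, and hyperparameters are placeholders rather than the study's actual setup, and the XGBoost regressor requires the xgboost package.

```python
# Sketch of a VTEC-style ensemble: Random Forest, AdaBoost and XGBoost combined
# in a VotingRegressor and evaluated with time-series cross-validation.
# Synthetic data and hyperparameters are placeholders, not the study's setup.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, AdaBoostRegressor, VotingRegressor
from sklearn.model_selection import TimeSeriesSplit, cross_val_score
from xgboost import XGBRegressor  # requires the xgboost package

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 6))   # e.g. solar/geomagnetic indices and time features
y = 10 + 3 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.5, size=2000)  # synthetic "VTEC"

ensemble = VotingRegressor([
    ("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
    ("ada", AdaBoostRegressor(n_estimators=200, random_state=0)),
    ("xgb", XGBRegressor(n_estimators=200, learning_rate=0.1, random_state=0)),
])

# TimeSeriesSplit keeps the temporal order intact during cross-validation
cv = TimeSeriesSplit(n_splits=5)
scores = cross_val_score(ensemble, X, y, cv=cv, scoring="neg_root_mean_squared_error")
print("RMSE per fold:", -scores)
```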

17 pages, 3380 KiB  
Article
Transformer-Based Global Zenith Tropospheric Delay Forecasting Model
by Huan Zhang, Yibin Yao, Chaoqian Xu, Wei Xu and Junbo Shi
Remote Sens. 2022, 14(14), 3335; https://doi.org/10.3390/rs14143335 - 11 Jul 2022
Cited by 4 | Viewed by 1954
Abstract
Zenith tropospheric delay (ZTD) plays an important role in high-precision global navigation satellite system (GNSS) positioning and meteorology. At present, commonly used ZTD forecasting models comprise empirical models, meteorological parameter models, and neural network models. The empirical model can only fit approximately periodic variations, and its accuracy is relatively low. The accuracy of the meteorological parameter model depends heavily on the accuracy of the meteorological parameters. The recurrent neural network (RNN) is suitable for short-term series prediction, but for long series its ZTD prediction accuracy is clearly reduced. Long short-term memory (LSTM) offers superior forecasting accuracy for long ZTD series; however, the LSTM model is complex, cannot be parallelized, and is time-consuming. In this study, we propose a novel ZTD time-series forecasting model utilizing transformer-based machine-learning methods, which are popular in natural language processing (NLP), to forecast global ZTD, with training data provided by the Global Geodetic Observing System (GGOS). The proposed transformer model leverages self-attention mechanisms in its encoder and decoder modules to learn complex patterns and dynamics from long ZTD time series. The numerical results show that the root mean square error (RMSE) of the forecast ZTD was 1.8 cm, while the mean bias, STD, MAE, and R were 0.0, 1.7, 1.3, and 0.95, respectively, which is superior to the LSTM, RNN, convolutional neural network (CNN), and GPT3 series models. We investigated the global distribution of these accuracy indicators, and the results demonstrate that the accuracy over continents is superior to that over maritime areas, and that the accuracy of the transformer ZTD forecasting model at high latitudes is superior to that at low latitudes. In addition to the overall accuracy improvement, the proposed transformer ZTD forecast model also mitigates accuracy variations in space and time, thereby guaranteeing high accuracy globally. This study provides a novel method to estimate the ZTD, which could potentially contribute to precise GNSS positioning and meteorology.
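
For orientation, the following is a minimal encoder-only transformer sketch for univariate ZTD windows in PyTorch. The published model uses a full encoder–decoder configuration of its own, so the window length, model dimension, head count, and forecast horizon here are illustrative assumptions only.

```python
# Minimal encoder-only Transformer sketch for ZTD time-series forecasting.
# Window length, model dimension and head count are illustrative assumptions;
# the published model uses a full encoder-decoder with its own configuration.
import torch
import torch.nn as nn

class ZTDTransformer(nn.Module):
    def __init__(self, window=96, d_model=64, nhead=4, horizon=24):
        super().__init__()
        self.embed = nn.Linear(1, d_model)                        # scalar ZTD -> model dimension
        self.pos = nn.Parameter(torch.zeros(1, window, d_model))  # learned positional encoding
        layer = nn.TransformerEncoderLayer(d_model, nhead, dim_feedforward=128,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, horizon)                   # forecast from the last token

    def forward(self, x):                                         # x: (batch, window, 1)
        z = self.embed(x) + self.pos
        z = self.encoder(z)                                       # self-attention over the window
        return self.head(z[:, -1, :])                             # (batch, horizon)

model = ZTDTransformer()
dummy = torch.randn(4, 96, 1)                                     # 4 synthetic ZTD windows
print(model(dummy).shape)                                         # torch.Size([4, 24])
```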

20 pages, 4879 KiB  
Article
An Improved Independent Parameter Decomposition Method for Gaofen-3 Surveying and Mapping Calibration
by Tao Li, Jun Fan, Yanyang Liu, Ruifeng Lu, Yusheng Hou and Jing Lu
Remote Sens. 2022, 14(13), 3089; https://doi.org/10.3390/rs14133089 - 27 Jun 2022
Cited by 2 | Viewed by 1440
Abstract
The Gaofen-3 (GF-3) satellite can provide digital elevation model (DEM) data from its interferogram outputs. However, the accuracy of these data cannot be ensured without a surveying and mapping (SAM) calibration process, which requires geometric and interferometric calibration technologies. In this paper, we propose an independent parameter decomposition (IPD) method to conduct SAM calibration on GF-3 data and generate high-accuracy DEMs. We resolved the geometric parameters to improve the location accuracy and the interferometric parameters to improve the height accuracy. First, we established a geometric calibration model, analyzed the Range–Doppler (RD) model, and resolved the initial imaging time error as well as the initial slant range error. Then, we established a three-dimensional reconstruction (TDR) model to analyze the height error sources. Finally, the interferometric phase error and baseline vector error were precisely estimated through an interferometric calibration model to ensure the vertical accuracy of the interferometric results. We then used GF-3 interferometric data acquired on the same orbit in a north–south distribution to conduct the calibration experiment. The results show that the plane positioning accuracy was 5.09 m following geometric calibration, that the vertical accuracy of the interferometric results was 4.18 m following interferometric calibration, and that the average absolute elevation accuracy of the derived DEM product was better than 3.09 m, confirming the correctness and effectiveness of the proposed GF-3 IPD calibration method. These results provide a technical basis for SAM calibration using GF-3 interferograms at the 1:50,000 scale in China.
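
The IPD workflow itself is specific to GF-3 processing, but as a generic illustration of an interferometric-calibration step, the sketch below linearizes a simplified InSAR height equation and estimates a phase offset and a perpendicular-baseline correction from ground control points by least squares. All symbols, values, and the functional model are assumed placeholders, not the paper's formulation.

```python
# Generic illustration of an interferometric-calibration step (not the paper's
# IPD model): estimate a phase offset and perpendicular-baseline correction from
# ground control points (GCPs) by linearizing the simplified height equation
#   h = k * (phi + phi_off) / B_perp,  with  k = lambda * r * sin(theta) / (4*pi).
# All numbers below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
lam, r, theta = 0.056, 900e3, np.deg2rad(35)     # wavelength [m], slant range [m], incidence
k = lam * r * np.sin(theta) / (4 * np.pi)

B_true, dphi_true = 150.4, 0.3                   # "true" baseline [m] and phase offset [rad]
B0, dphi0 = 150.0, 0.0                           # nominal values used in processing

h_gcp = rng.uniform(100, 2000, size=30)          # known GCP heights [m]
phi = h_gcp * B_true / k - dphi_true             # simulated unwrapped topographic phase
phi += rng.normal(scale=0.05, size=phi.size)     # phase noise

# Linearized observations:  dh = (k/B0) * d_phi_off + (-k*phi/B0**2) * dB
h_model = k * (phi + dphi0) / B0
resid = h_gcp - h_model
A = np.column_stack([k / B0 * np.ones_like(phi), -k * phi / B0**2])
(dphi_hat, dB_hat), *_ = np.linalg.lstsq(A, resid, rcond=None)
print(f"estimated phase offset: {dphi_hat:.3f} rad, baseline correction: {dB_hat:.3f} m")
```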

15 pages, 35809 KiB  
Article
Modeling of Residual GNSS Station Motions through Meteorological Data in a Machine Learning Approach
by Pia Ruttner, Roland Hohensinn, Stefano D’Aronco, Jan Dirk Wegner and Benedikt Soja
Remote Sens. 2022, 14(1), 17; https://doi.org/10.3390/rs14010017 - 22 Dec 2021
Cited by 3 | Viewed by 3457
Abstract
Long-term Global Navigation Satellite System (GNSS) height residual time series contain signals that are related to environmental influences. A large part of the residuals can be explained by environmental surface loadings, expressed through physical models. This work aims to find a model that connects raw meteorological parameters with the GNSS residuals. The approach is to train a Temporal Convolutional Network (TCN) on 206 GNSS stations in central Europe and then apply the resulting model to 68 test stations in the same area. When comparing the Root Mean Square (RMS) error reduction achieved by the physical models with that achieved by the TCN model, the latter reduction rate is, on average, 0.8% lower. In a second experiment, the TCN is utilized to further reduce the RMS of the time series from which the loading models were already subtracted. This yields an additional 2.7% of RMS reduction on average, resulting in a mean overall RMS reduction of 28.6%. The results suggest that a TCN using meteorological features as input data is able to reconstruct the reductions at almost the same level as the physical models. Trained on the residuals reduced by environmental loadings, the TCN is still able to slightly increase the overall reduction of variations in the GNSS station position time series.
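
A minimal temporal convolutional network of this kind can be sketched with dilated causal 1D convolutions, as below. The depth, channel counts, window length, and meteorological feature set are assumptions for illustration, not the trained model from the study.

```python
# Minimal PyTorch sketch of a Temporal Convolutional Network (TCN) regressing a
# GNSS height residual from a window of meteorological features. Depth, channel
# counts and the feature set are illustrative assumptions.
import torch
import torch.nn as nn

class CausalBlock(nn.Module):
    def __init__(self, c_in, c_out, dilation):
        super().__init__()
        self.pad = (3 - 1) * dilation             # left padding keeps the block causal
        self.conv = nn.Conv1d(c_in, c_out, kernel_size=3, dilation=dilation)
        self.act = nn.ReLU()

    def forward(self, x):                          # x: (batch, channels, time)
        x = nn.functional.pad(x, (self.pad, 0))    # pad only on the past side
        return self.act(self.conv(x))

class TCN(nn.Module):
    def __init__(self, n_features=5, channels=(32, 32, 32)):
        super().__init__()
        layers, c_in = [], n_features
        for i, c_out in enumerate(channels):       # dilations 1, 2, 4, ...
            layers.append(CausalBlock(c_in, c_out, dilation=2 ** i))
            c_in = c_out
        self.net = nn.Sequential(*layers)
        self.head = nn.Linear(c_in, 1)             # residual at the last epoch

    def forward(self, x):                          # x: (batch, time, n_features)
        z = self.net(x.transpose(1, 2))
        return self.head(z[:, :, -1])

model = TCN()
dummy = torch.randn(16, 128, 5)                    # 16 windows of 128 daily epochs
print(model(dummy).shape)                          # torch.Size([16, 1])
```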

21 pages, 13759 KiB  
Article
Artificial Neural Network-Based Microwave Satellite Soil Moisture Reconstruction over the Qinghai–Tibet Plateau, China
by Jie Wang and Duanyang Xu
Remote Sens. 2021, 13(24), 5156; https://doi.org/10.3390/rs13245156 - 19 Dec 2021
Cited by 5 | Viewed by 2455
Abstract
Soil moisture is a key parameter of the land–atmosphere interaction system; however, the scarcity of spatio-temporally continuous, high-quality observation records greatly limits the application of soil moisture to long-term climate change monitoring and prediction. This study therefore selected the Qinghai–Tibet Plateau (QTP) of China as the research region and explored the feasibility of using an Artificial Neural Network (ANN) to reconstruct a soil moisture product from AMSR-2/AMSR-E brightness temperatures and SMAP satellite data by introducing auxiliary variables, specifically considering the sensitivity of the model to different combinations of input variables, the number of neurons in the hidden layer, the sample ratio, and the precipitation threshold. The results showed that the ANN model had the highest accuracy when all variables were used as inputs, with a network containing 12 neurons in one hidden layer, a sample ratio of 80%–10%–10% (training–validation–testing), and a precipitation threshold of 8.75 mm. Furthermore, the reconstructed soil moisture product (named ANN-SM) was validated over other periods by comparison with SMAP (April 2019 to July 2021) for all grid cells and with in situ soil moisture sites (August 2010 to March 2015) of the QTP, achieving good accuracy. In general, the proposed method is capable of rebuilding soil moisture products from different satellite data, and our soil moisture product is promising for studies of long-term global and regional dynamics of the water cycle and climate.
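
The described network structure maps to a very small sketch, shown below, with one hidden layer of 12 neurons and an 80%–10%–10% split. The synthetic predictors stand in for the brightness-temperature and auxiliary inputs and are not the study's data.

```python
# Sketch of the kind of ANN described: one hidden layer with 12 neurons, an
# 80%-10%-10% train/validation/test split, and brightness-temperature plus
# auxiliary predictors. The synthetic features below are placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 8))        # e.g. AMSR brightness temperatures + auxiliaries
y = 0.25 + 0.05 * X[:, 0] - 0.03 * X[:, 1] + rng.normal(scale=0.01, size=5000)

X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.2, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.5,
                                                random_state=0)

ann = MLPRegressor(hidden_layer_sizes=(12,), max_iter=2000, random_state=0)
ann.fit(X_train, y_train)
print("validation RMSE:", mean_squared_error(y_val, ann.predict(X_val)) ** 0.5)
print("test RMSE:      ", mean_squared_error(y_test, ann.predict(X_test)) ** 0.5)
```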

23 pages, 1676 KiB  
Article
Discontinuity Detection in GNSS Station Coordinate Time Series Using Machine Learning
by Laura Crocetti, Matthias Schartner and Benedikt Soja
Remote Sens. 2021, 13(19), 3906; https://doi.org/10.3390/rs13193906 - 29 Sep 2021
Cited by 3 | Viewed by 3138
Abstract
Global navigation satellite systems (GNSS) provide globally distributed station coordinate time series that can be used for a variety of applications, such as the definition of a terrestrial reference frame. A reliable estimation of the coordinate time series trends gives valuable information about station movements during the measured time period. Detecting discontinuities of various origins in such time series is crucial for accurate and robust velocity estimation. At present, there is no fully automated standard method for detecting discontinuities. Instead, discontinuity catalogues are frequently used, which provide information about when a device was changed or an earthquake occurred. However, these catalogues are known to suffer from incompleteness. This study investigates the suitability of fully data-driven machine learning classification algorithms for detecting discontinuities caused by earthquakes in station coordinate time series without the need for external information. Japan was selected as the testing area, and ten different machine learning algorithms were tested. Random Forest achieves the best performance, with an F1 score of 0.77, a recall of 0.78, and a precision of 0.76. Overall, 525 of 565 recorded earthquakes in the test data were correctly classified. It is further highlighted that splitting the time series into chunks of 21 days leads to the best performance. Furthermore, it is beneficial to combine the three (normalized) components of the GNSS solution into one sample, and adding the value range as an additional feature improves the result. Thus, this work demonstrates how machine learning algorithms can be used to detect discontinuities in GNSS time series.
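
A minimal sketch of this data-driven setup is given below: 21-day chunks of the three normalized coordinate components are concatenated into one sample, the value range is appended as an extra feature, and a Random Forest classifies each chunk. The synthetic time series and labels are placeholders, not GNSS data from Japan.

```python
# Sketch of the data-driven setup described: 21-day chunks of the three
# (normalized) coordinate components as one sample, the value range appended as
# an extra feature, and a Random Forest classifying "discontinuity / none".
# The time series and labels below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
CHUNK = 21

def make_chunk(with_jump):
    enu = rng.normal(scale=1.0, size=(CHUNK, 3))            # east/north/up noise [mm]
    if with_jump:
        enu[CHUNK // 2:, :] += rng.uniform(5, 20, size=3)   # simulated coseismic offset
    enu = (enu - enu.mean(0)) / enu.std(0)                  # normalize each component
    features = enu.T.ravel()                                # concatenate the 3 components
    return np.append(features, enu.max() - enu.min())       # add value range feature

labels = rng.integers(0, 2, size=3000)
X = np.stack([make_chunk(bool(l)) for l in labels])

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("F1 score on synthetic test data:", round(f1_score(y_te, clf.predict(X_te)), 2))
```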

19 pages, 26068 KiB  
Article
Inverted Algorithm of Terrestrial Water-Storage Anomalies Based on Machine Learning Combined with Load Model and Its Application in Southwest China
by Yifan Shen, Wei Zheng, Wenjie Yin, Aigong Xu, Huizhong Zhu, Shuai Yang and Kai Su
Remote Sens. 2021, 13(17), 3358; https://doi.org/10.3390/rs13173358 - 25 Aug 2021
Cited by 12 | Viewed by 2378
Abstract
Dense Global Positioning System (GPS) arrays can be used to invert for the terrestrial water-storage anomaly (TWSA) with high accuracy. However, the uneven distribution of GPS stations greatly limits the use of GPS to derive the TWSA. To solve this problem, we grid the GPS array using regression to raise the reliability of the TWSA inversion. First, the study uses a random forest (RF) model to simulate crustal deformation in unobserved grid cells. On this basis, a new Machine-Learning Loading-Inverted Method (MLLIM) is constructed from the traditional GPS-derived method to improve the reliability of the TWSA inversion. Second, this research selects southwest China as the study region; the MLLIM and the traditional GPS inversion method are used to derive the TWSA, and the inverted results are compared with datasets from the Gravity Recovery and Climate Experiment (GRACE) Mascon solution and the Global Land Data Assimilation System (GLDAS) model. The comparison shows that the values of the Pearson Correlation Coefficient (PCC) between the MLLIM and GRACE and GRACE Follow-On (GRACE-FO) are 0.91 and 0.88, respectively, and the values of R-squared (R2) are 0.76 and 0.65, respectively; the values of PCC and R2 between the MLLIM and GLDAS solutions are 0.79 and 0.65. Compared with the traditional GPS inversion, the MLLIM improves PCC and R2 by 8.85% and 7.99% on average, which indicates that the MLLIM can improve the accuracy of the TWSA inversion more than the traditional GPS method. Third, this study applies the MLLIM to invert the TWSA in each province of southwest China and combines it with precipitation to analyze the change of TWSA in each province. The results are as follows: (1) the spatial distributions of TWSA and precipitation coincide, most prominently in southwest Yunnan and southeast Guangxi; (2) comparing the TWSA of the MLLIM with the GRACE and GLDAS solutions in each province shows maximum PCC values as high as 0.86 and 0.94, respectively, which indicates that the MLLIM can be used to invert the TWSA in regions with sparse GPS stations. The TWSA based on the MLLIM can be used to fill the gap between GRACE and GRACE-FO.
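
As an illustration of the gridding step only, the sketch below trains a Random Forest regression on observed station displacements (with position and an auxiliary predictor as features) and predicts displacements at held-out grid nodes. The data are synthetic and the subsequent loading inversion to TWSA is not shown.

```python
# Sketch of the gridding step: a Random Forest regression trained on observed
# GPS vertical displacements (with position and an auxiliary predictor as
# features) and evaluated on held-out "unobserved" grid nodes. The data are
# synthetic and the subsequent loading inversion to TWSA is not shown.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
lon = rng.uniform(97, 110, size=800)        # synthetic station/grid longitudes [deg]
lat = rng.uniform(21, 34, size=800)         # synthetic latitudes [deg]
precip = rng.gamma(2.0, 50.0, size=800)     # auxiliary predictor, e.g. precipitation [mm]
# Synthetic vertical displacement: smooth spatial signal plus hydrological loading
dU = (2 * np.sin(np.deg2rad(lon)) + 1.5 * np.cos(np.deg2rad(lat))
      - 0.01 * precip + rng.normal(scale=0.3, size=800))

X = np.column_stack([lon, lat, precip])
X_obs, X_grid, y_obs, y_grid = train_test_split(X, dU, test_size=0.4, random_state=0)

rf = RandomForestRegressor(n_estimators=400, random_state=0).fit(X_obs, y_obs)
print("R^2 at unobserved nodes:", round(r2_score(y_grid, rf.predict(X_grid)), 2))
```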
