Hydrological Simulation and Forecasting Based on Artificial Intelligence

A special issue of Water (ISSN 2073-4441). This special issue belongs to the section "Hydrology".

Deadline for manuscript submissions: 20 August 2024 | Viewed by 4944

Special Issue Editor


Dr. Guangyuan Kan
Guest Editor
Research Center on Flood & Drought Disaster Reduction of the Ministry of Water Resources, China Institute of Water Resources and Hydropower Research, Beijing, China
Interests: hydrological–hydrodynamic modelling; groundwater modelling; parallel computing; artificial intelligence; remote sensing

Special Issue Information

Dear Colleagues,

With the development of modern artificial intelligence and parallel computing, applications of these technologies to hydrological and hydrodynamic modeling, flood simulation and forecasting, and risk and uncertainty analysis have significantly improved the accuracy, reliability, and computational efficiency of disaster defense. This Special Issue focuses on applications and novel methods in flood simulation, forecasting and modeling, model parameter estimation, risk and uncertainty analysis, and data analysis based on modern artificial intelligence and/or parallel computing technologies. We invite submissions on topics including, but not limited to, the following:

(1) Artificial-intelligence-aided hydrological simulation and forecasting.

(2) Hydrodynamic modelling based on artificial intelligence technologies.

(3) Flood risk analysis and hydrological or hydrodynamic model uncertainty analysis based on intelligent optimization algorithms or other related artificial intelligence techniques.

(4) Model parameter optimization based on intelligent optimization algorithms.

(5) Data analysis based on artificial intelligence or machine learning technologies.

(6) Hydrological/hydrodynamic modeling, risk and uncertainty analysis, parameter optimization, data analysis, etc., accelerated by modern parallel computing technologies such as many-core GPU, multi-core CPU, or large-scale parallel computer clusters.

Dr. Guangyuan Kan
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Water is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • artificial intelligence
  • machine learning
  • parallel computing
  • GPU computing
  • hydrological model
  • hydrodynamic model
  • flood simulation
  • flood forecasting
  • risk and uncertainty analysis
  • data analysis

Published Papers (5 papers)


Research

14 pages, 2803 KiB  
Article
Forecasting the River Water Discharge by Artificial Intelligence Methods
by Alina Bărbulescu and Liu Zhen
Water 2024, 16(9), 1248; https://doi.org/10.3390/w16091248 - 26 Apr 2024
Viewed by 269
Abstract
The management of water resources must be based on accurate models of river discharge in the context of water flow alteration due to anthropic influences and climate change. Therefore, this article addresses the challenge of detecting the best model among three artificial intelligence (AI) techniques—backpropagation neural networks (BPNN), long short-term memory (LSTM), and extreme learning machine (ELM)—for the monthly discharge series of the Buzău River, Romania. The models were built for three periods: January 1955–September 2006 (S1 series), January 1955–December 1983 (S2 series), and January 1984–December 2010 (S series). In terms of mean absolute error (MAE), the best performances were those of ELM on both the training and test sets of S2, with MAE (training) = 5.02 and MAE (test) = 4.01. With respect to mean squared error (MSE), the best was LSTM on the training set of S2 (MSE = 60.07) and ELM on the test set of S2 (MSE = 32.21). In terms of R², the best model was LSTM on S2 (R² (training) = 99.92% and R² (test) = 99.97%). ELM was the fastest, with run times of 0.6996 s, 0.7449 s, and 0.6467 s on S, S1, and S2, respectively.
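
The abstract compares models by MAE, MSE, and R². As a point of reference, the following is a minimal, hypothetical Python sketch (not the authors' code) of how these three metrics are typically computed for an observed versus forecast discharge series; the example values are invented.

```python
# Hypothetical illustration (not the authors' code): computing the evaluation
# metrics reported in the abstract (MAE, MSE, R^2) for a forecast series.
import numpy as np

def mae(obs, sim):
    """Mean absolute error."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.mean(np.abs(obs - sim))

def mse(obs, sim):
    """Mean squared error."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.mean((obs - sim) ** 2)

def r2(obs, sim):
    """Coefficient of determination, expressed in percent."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    ss_res = np.sum((obs - sim) ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    return 100.0 * (1.0 - ss_res / ss_tot)

# Invented monthly discharges (m^3/s) purely for illustration:
observed = [12.4, 9.8, 15.1, 22.7, 18.3, 10.9]
forecast = [11.9, 10.2, 14.6, 21.8, 19.0, 11.5]
print(mae(observed, forecast), mse(observed, forecast), r2(observed, forecast))
```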

28 pages, 2857 KiB  
Article
Improving Forecasting Accuracy of Multi-Scale Groundwater Level Fluctuations Using a Heterogeneous Ensemble of Machine Learning Algorithms
by Dilip Kumar Roy, Tasnia Hossain Munmun, Chitra Rani Paul, Mohamed Panjarul Haque, Nadhir Al-Ansari and Mohamed A. Mattar
Water 2023, 15(20), 3624; https://doi.org/10.3390/w15203624 - 16 Oct 2023
Cited by 3 | Viewed by 1057
Abstract
Accurate groundwater level (GWL) forecasts are crucial for the efficient utilization, strategic long-term planning, and sustainable management of finite groundwater resources. These resources have a substantial impact on decisions related to irrigation planning, crop selection, and water supply. This study evaluates data-driven models using different machine learning algorithms to forecast GWL fluctuations one, two, and three weeks ahead in Bangladesh's Godagari upazila. To address the accuracy limitations inherent in individual forecasting models, a Bayesian model averaging (BMA)-based heterogeneous ensemble of forecasting models was proposed. The dataset encompasses 1807 weekly GWL readings (February 1984 to September 2018) from four wells, divided into training (70%), validation (15%), and testing (15%) subsets. Both the standalone models and the ensembles employed a Minimum Redundancy Maximum Relevance (MRMR) algorithm to select the most influential lag times among candidate GWL lags of up to 15 weeks. Statistical metrics and visual aids were used to evaluate the standalone and ensemble GWL forecasts. The results consistently favor the heterogeneous BMA ensemble, which excels over the standalone models for multi-step-ahead forecasts across time horizons. For instance, at well GT8134017, the BMA approach yielded values such as R (0.93), NRMSE (0.09), MAE (0.50 m), IOA (0.96), NS (0.87), and a-20 index (0.94) for one-week-ahead forecasts. Despite a slight decline in performance with increasing forecast horizon, the evaluation indices confirmed the superior performance of the BMA ensemble. This ensemble also outperformed the standalone models for the other observation wells. Thus, the BMA-based heterogeneous ensemble emerges as a promising strategy to bolster multi-step-ahead GWL forecasts within this area and beyond.
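
As background for the ensemble idea, the sketch below is a simplified, hypothetical illustration of Bayesian-model-averaging-style forecast combination: member weights are taken proportional to each model's Gaussian likelihood on a validation set, and the ensemble forecast is the weighted average of the member predictions. This is an assumption-laden simplification, not the paper's implementation, and all data values are invented.

```python
# Hypothetical sketch (not the paper's implementation): BMA-style combination
# of groundwater-level forecasts from several standalone models.
import numpy as np

def bma_weights(val_obs, val_preds):
    """Normalized weights from Gaussian likelihoods on validation data.

    val_obs   : (n,) observed GWL values
    val_preds : (k, n) predictions of k candidate models
    """
    val_obs = np.asarray(val_obs, float)
    val_preds = np.asarray(val_preds, float)
    sigma2 = np.var(val_obs - val_preds, axis=1) + 1e-12      # per-model error variance
    loglik = -0.5 * np.sum((val_obs - val_preds) ** 2 / sigma2[:, None]
                           + np.log(2 * np.pi * sigma2[:, None]), axis=1)
    w = np.exp(loglik - loglik.max())                          # stabilized softmax
    return w / w.sum()

def bma_forecast(weights, test_preds):
    """Weighted average of the member forecasts."""
    return np.average(np.asarray(test_preds, float), axis=0, weights=weights)

# Toy example with three hypothetical member models (invented values):
val_obs = [5.1, 5.3, 5.0, 4.8]
val_preds = [[5.0, 5.4, 5.1, 4.9],    # model A
             [5.3, 5.0, 4.7, 4.6],    # model B
             [5.1, 5.2, 5.0, 4.9]]    # model C
w = bma_weights(val_obs, val_preds)
print(w, bma_forecast(w, [[5.2, 5.1], [5.0, 4.9], [5.1, 5.0]]))
```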

16 pages, 5505 KiB  
Article
Extraction of Spatiotemporal Distribution Characteristics and Spatiotemporal Reconstruction of Rainfall Data by PCA Algorithm
by Yuanyuan Liu, Yesen Liu, Shu Liu, Hancheng Ren, Peinan Tian and Nana Yang
Water 2023, 15(20), 3596; https://doi.org/10.3390/w15203596 - 14 Oct 2023
Viewed by 834
Abstract
Scientific analyses of urban flood risks are essential for evaluating urban flood insurance and designing drainage projects. Although the current rainfall monitoring system in China has a dense station network and high-precision rainfall data, its time series is short. In contrast, historical rainfall data have a longer sample time series but lower precision. This study introduced a principal component analysis (PCA) algorithm to reconstruct historical rainfall data. Based on the temporal and spatial characteristics of rainfall extracted from high-resolution rainfall data over the past decade, historical (6 h interval) rainfall spatial data were reconstructed into high-resolution (1 h interval) spatial data to satisfy the requirements of urban flood risk analysis. The results showed that the average error between the reconstructed data and measured values was within 15% in the high-value area and within 20% in the low-value area, representing decreases of approximately 65% and 40%, respectively, compared to traditional interpolation data. The reconstructed historical spatial rainfall data conformed to the temporal and spatial distribution characteristics of rainfall, improved the granularity of rainfall spatial data, and enabled the effective and reasonable extraction and summary of the fine temporal and spatial distribution characteristics of rainfall.
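
To make the reconstruction idea concrete, here is a hypothetical sketch (an assumed workflow, not the authors' code) using scikit-learn's PCA: dominant spatial rainfall modes are learned from recent high-resolution fields, and a coarse historical field interpolated to the same locations is projected onto those modes and reconstructed. The array shapes and the synthetic gamma-distributed rainfall are illustrative assumptions.

```python
# Hypothetical sketch: learn dominant spatial rainfall patterns with PCA from
# recent high-resolution fields, then express a coarse historical field in that
# basis to obtain a reconstruction consistent with the learned structure.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Recent high-resolution data: each row is one hourly spatial field over
# n_stations gauge/grid locations (synthetic values here).
n_hours, n_stations = 5000, 120
recent_fields = rng.gamma(shape=0.4, scale=2.0, size=(n_hours, n_stations))

pca = PCA(n_components=10)          # keep the leading spatial modes
pca.fit(recent_fields)

# A coarse historical field (e.g. disaggregated from 6 h totals and
# interpolated to the same n_stations locations), also synthetic here.
historical_field = rng.gamma(shape=0.4, scale=2.0, size=(1, n_stations))

# Project onto the learned modes and reconstruct.
scores = pca.transform(historical_field)
reconstructed = pca.inverse_transform(scores)
print(reconstructed.shape)          # (1, 120)
```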

29 pages, 6523 KiB  
Article
Improving the Performance of Hydrological Model Parameter Uncertainty Analysis Using a Constrained Multi-Objective Intelligent Optimization Algorithm
by Xichen Liu, Guangyuan Kan, Liuqian Ding, Xiaoyan He, Ronghua Liu and Ke Liang
Water 2023, 15(15), 2700; https://doi.org/10.3390/w15152700 - 26 Jul 2023
Viewed by 927
Abstract
In the field of hydrological model parameter uncertainty analysis, sampling methods such as the Differential Evolution Markov Chain (DE-MC) and Shuffled Complex Evolution Metropolis (SCEM-UA) algorithms have been widely applied. However, there are two drawbacks that may adversely affect the uncertainty analysis. The first is that few optimization algorithms consider the physical meaning and reasonable range of the model parameters; traditional sampling algorithms may generate non-physical parameter values and poorly simulated hydrographs when carrying out the uncertainty analysis. The second is that the widely used sampling algorithms commonly involve only a single objective. Such sampling procedures implicitly introduce too strong an “exploitation” property into the sampling process, consequently destroying the diversity of the sampled population, i.e., weakening the “exploration” property. Here, “exploitation” refers to refining good already-existing solutions so that their fitness improves further, while “exploration” denotes searching for new solutions in new regions. With the aim of improving the performance of uncertainty analysis algorithms, this research proposes a constrained multi-objective intelligent optimization algorithm that preserves the physical meaning of the model parameters using the penalty function method and maintains population diversity using a Non-dominated Sorting Genetic Algorithm II (NSGA-II) multi-objective optimization procedure. The representativeness of the parameter population is estimated on the basis of the mean and standard deviation of the Nash–Sutcliffe coefficient, and the diversity is evaluated on the basis of the mean Euclidean distance. The Chengcun watershed is selected as the study area, and uncertainty analysis is carried out. The numerical simulations indicate that the performance of the proposed algorithm is significantly improved, preserving the physical meaning and reasonable range of the model parameters while significantly improving the diversity and reliability of the sampled parameter population.
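
For orientation, the sketch below illustrates, with hypothetical code rather than the paper's, two ingredients named in the abstract: the Nash–Sutcliffe efficiency used to judge simulated hydrographs, and a penalty-function term that degrades the objective when a sampled parameter set leaves its physically reasonable range. The penalty weight and toy data are assumptions.

```python
# Hypothetical sketch (not the paper's code): Nash-Sutcliffe efficiency plus a
# simple bound-violation penalty, combined into a single penalized objective.
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency; 1 indicates a perfect fit."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def bound_penalty(params, lower, upper, weight=10.0):
    """Penalty-function term: zero inside the bounds, growing with the violation."""
    params = np.asarray(params, float)
    violation = np.maximum(lower - params, 0.0) + np.maximum(params - upper, 0.0)
    return weight * np.sum(violation ** 2)

def penalized_objective(obs, sim, params, lower, upper):
    """Objective to minimize: negative NSE plus the bound-violation penalty."""
    return -nash_sutcliffe(obs, sim) + bound_penalty(
        params, np.asarray(lower, float), np.asarray(upper, float))

# Toy example with invented hydrograph values and parameter bounds:
obs = [1.0, 3.2, 7.5, 4.1, 2.0]
sim = [1.1, 3.0, 7.0, 4.4, 2.2]
print(penalized_objective(obs, sim, params=[0.9, 1.4], lower=[0.0, 0.5], upper=[1.0, 1.2]))
```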

25 pages, 8240 KiB  
Article
Research on Rain Pattern Classification Based on Machine Learning: A Case Study in Pi River Basin
by Xiaodi Fu, Guangyuan Kan, Ronghua Liu, Ke Liang, Xiaoyan He and Liuqian Ding
Water 2023, 15(8), 1570; https://doi.org/10.3390/w15081570 - 17 Apr 2023
Cited by 3 | Viewed by 1254
Abstract
To improve the scientific soundness, reliability, and accuracy of flood forecasting, constructing a flood forecasting scheme and carrying out real-time forecasting that accounts for different rain patterns is an effective and practical approach. Rain pattern classification is therefore of great significance in this technical roadmap. With the rapid development of artificial intelligence technologies such as machine learning, it is both possible and necessary to apply these new methods to rain pattern classification. In this research, multiple machine learning methods were adopted to study the time-history distribution characteristics of rainfall and to classify rain patterns from observed rainfall time series. First, hourly rainfall data recorded between 2003 and 2021 at 37 rain gauge stations in the Pi River Basin were collected and classified into rain patterns using the widely recognized dynamic time warping (DTW) algorithm, and these classifications were treated as the benchmark. Four other machine learning methods—Decision Tree (DT), Long Short-Term Memory (LSTM) neural network, Light Gradient Boosting Machine (LightGBM), and Support Vector Machine (SVM)—were then selected to establish classification models, and their performances were compared. By adjusting the sampling size, the influence of different sample sizes on the classification was analyzed. The intercomparison indicated that LightGBM achieved the highest accuracy and the fastest training speed, with an accuracy of 98.95% and an F1 score of 98.58%, and its loss function and accuracy converged after only 20 iterations. LSTM and SVM achieved satisfactory accuracy but relatively low training efficiency, while DT classified quickly but with relatively low accuracy. As the sampling size increased, the classification results became more stable and accurate, and the training efficiency of the four methods also improved.
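
As an illustration of the benchmarking step, the following hypothetical Python sketch (not the study's implementation) computes a basic dynamic time warping (DTW) distance between normalized hyetographs and assigns a rainfall event to the nearest template pattern; the template names and values are invented.

```python
# Hypothetical sketch: classic DTW distance between two 1-D rainfall series,
# used here for a simple nearest-template rain-pattern assignment.
import numpy as np

def dtw_distance(a, b):
    """O(len(a)*len(b)) dynamic-programming DTW distance between two series."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def classify(event, templates):
    """Assign a rainfall event to the template with the smallest DTW distance."""
    return min(templates, key=lambda name: dtw_distance(event, templates[name]))

# Invented hyetograph templates (fractions of total rainfall per time step):
templates = {
    "front-loaded": [0.4, 0.3, 0.2, 0.1],
    "uniform":      [0.25, 0.25, 0.25, 0.25],
    "rear-loaded":  [0.1, 0.2, 0.3, 0.4],
}
print(classify([0.35, 0.35, 0.2, 0.1], templates))   # -> "front-loaded"
```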
