Machine Learning for Time Series Analysis

A special issue of Algorithms (ISSN 1999-4893). This special issue belongs to the section "Evolutionary Algorithms and Machine Learning".

Deadline for manuscript submissions: closed (15 July 2023) | Viewed by 17602

Special Issue Editors

Information Security Research Group, Institute of Computer Science, University of Tartu, 50090 Tartu, Estonia
Interests: AI/ML; cybersecurity; information security; blockchain technology; intelligent vehicles; big data analysis
ReSESNE Labs, Department of Electronics Engineering, Hankuk (Korea) University of Foreign Studies (HUFS), Seoul 02450, Republic of Korea
Interests: AI; IoT; smart city; e-healthcare; blockchain; connected vehicles; wireless communication

Special Issue Information

Dear Colleagues,

We invite you to submit your latest research in the area of time series analysis. Data acquired from a smart environment (equipped with smart devices and sensors) over a uniform period of time is known as time-series data. Each data point is tied to a fixed point in time, and the points are arranged in chronological order, such as temperature over time or acceleration readings per second. To extract meaningful information from time-series data, and to act on that information, statistical analysis is required. Time series learning is a subfield of machine learning whose methods are mathematically designed to operate on sequential data. Time series machine learning can be deployed in a variety of applications concerned with pattern detection, future trends, and prediction based on past data. Machine learning on time series data improves on traditional statistical analysis because of advances in its algorithmic models and in time series forecasting technology. It has tremendous potential for business operations, day-to-day forecasting requirements, and many other pattern prediction tasks.
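As a concrete (and purely illustrative) picture of what the paragraph above describes, the sketch below treats a chronologically ordered list of temperature readings as a time series and extracts a simple trend with a trailing moving average; all names and data here are hypothetical, not taken from the call for papers.

```python
# Illustrative sketch: a time series is an ordered sequence of observations,
# and even a simple moving average already performs a basic trend analysis.

def moving_average(values, window):
    """Trailing moving average over a chronologically ordered series."""
    if window < 1 or window > len(values):
        raise ValueError("window must be between 1 and len(values)")
    averages = []
    running = sum(values[:window])
    averages.append(running / window)
    for i in range(window, len(values)):
        # Slide the window: add the newest point, drop the oldest.
        running += values[i] - values[i - window]
        averages.append(running / window)
    return averages

# Hourly temperature readings (synthetic example data).
temperatures = [15.0, 15.5, 16.2, 17.0, 17.8, 18.1, 18.0, 17.5]
trend = moving_average(temperatures, window=3)
```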

Dr. Madhusudan Singh
Dr. Dhananjay Singh
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Algorithms is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  •  sequential data
  •  irregular data
  •  pattern analysis
  •  future prediction
  •  time series forecasting
  •  trend analysis
  •  classification
  •  regression
  •  unsupervised and semi-supervised learning
  •  hidden Markov model
  •  recurrent neural network
  •  ARIMA (autoregressive integrated moving average)
  •  STD (seasonal trend decomposition)
  •  ARCH (autoregressive conditional heteroscedasticity)
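As a toy illustration of the autoregressive ("AR") building block behind several of the listed keywords (ARIMA, ARCH), the following hypothetical sketch estimates the coefficient of a zero-mean AR(1) model by ordinary least squares; it is a teaching sketch, not a production estimator.

```python
def fit_ar1(series):
    """Least-squares estimate of phi in x[t] ~ phi * x[t-1] (zero-mean AR(1))."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(x * x for x in series[:-1])
    return num / den

# Synthetic noise-free AR(1) data with phi = 0.8; the estimate recovers 0.8.
x = [1.0]
for _ in range(50):
    x.append(0.8 * x[-1])
phi = fit_ar1(x)
next_forecast = phi * x[-1]  # one-step-ahead prediction
```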

Published Papers (9 papers)


Research


15 pages, 3671 KiB  
Article
Elevating Univariate Time Series Forecasting: Innovative SVR-Empowered Nonlinear Autoregressive Neural Networks
by Juan D. Borrero and Jesus Mariscal
Algorithms 2023, 16(9), 423; https://doi.org/10.3390/a16090423 - 02 Sep 2023
Cited by 1 | Viewed by 1012
Abstract
Efforts across diverse domains like economics, energy, and agronomy have focused on developing predictive models for time series data. A spectrum of techniques, spanning from elementary linear models to intricate neural networks and machine learning algorithms, has been explored to achieve accurate forecasts. The hybrid ARIMA-SVR model has garnered attention due to its fusion of a foundational linear model with error correction capabilities. However, its use is limited to stationary time series data, posing a significant challenge. To overcome these limitations and drive progress, we propose the innovative NAR–SVR hybrid method. Unlike its predecessor, this approach breaks free from stationarity and linearity constraints, leading to improved model performance solely through historical data exploitation. This advancement significantly reduces the time and computational resources needed for precise predictions, a critical factor in univariate economic time series forecasting. We apply the NAR–SVR hybrid model in three scenarios: Spanish berry daily yield data from 2018 to 2021, daily COVID-19 cases in three countries during 2020, and the daily Bitcoin price time series from 2015 to 2020. Through extensive comparative analyses with other time series prediction models, our results substantiate that our novel approach consistently outperforms its counterparts. By transcending stationarity and linearity limitations, our hybrid methodology establishes a new paradigm for univariate time series forecasting, revolutionizing the field and enhancing predictive capabilities across various domains as highlighted in this study. Full article
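The general residual-correction idea behind hybrid pipelines such as ARIMA-SVR or the NAR-SVR approach above can be sketched as follows. This is a hypothetical, deliberately trivial stand-in (naive forecast plus mean one-step residual), not the authors' method: it only shows how a second stage corrects the first stage's systematic error.

```python
def base_forecast(series):
    """Stage 1 stand-in: naive forecast (repeat the last observation)."""
    return series[-1]

def residual_correction(series):
    """Stage 2 stand-in: average one-step error the naive forecast has made."""
    residuals = [series[t] - series[t - 1] for t in range(1, len(series))]
    return sum(residuals) / len(residuals)

def hybrid_forecast(series):
    # The hybrid prediction is the base forecast plus the learned correction.
    return base_forecast(series) + residual_correction(series)

# On a steady upward trend, the correction recovers the step the naive
# forecast systematically misses.
series = [10.0, 12.0, 14.0, 16.0]
prediction = hybrid_forecast(series)
```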
(This article belongs to the Special Issue Machine Learning for Time Series Analysis)

22 pages, 640 KiB  
Article
Intelligent Identification of Trend Components in Singular Spectrum Analysis
by Nina Golyandina, Pavel Dudnik and Alex Shlemov
Algorithms 2023, 16(7), 353; https://doi.org/10.3390/a16070353 - 24 Jul 2023
Viewed by 1172
Abstract
Singular spectrum analysis (SSA) is a non-parametric adaptive technique used for time series analysis. It allows solving various problems related to time series without the need to define a model. In this study, we focus on the problem of trend extraction. To extract trends using SSA, a grouping of elementary components is required. However, automating this process is challenging due to the nonparametric nature of SSA. Although there are some known approaches to automated grouping in SSA, they do not work well when the signal components are mixed. In this paper, a novel approach that combines automated identification of trend components with separability improvement is proposed. We also consider a new method called EOSSA for separability improvement, along with other known methods. The automated modifications are numerically compared and applied to real-life time series. The proposed approach demonstrated its advantage in extracting trends when dealing with mixed signal components. The separability-improving method EOSSA proved to be the most accurate when the signal rank is properly detected or slightly exceeded. The automated SSA was very successfully applied to US Unemployment data to separate an annual trend from seasonal effects. The proposed approach has shown its capability to automatically extract trends without the need to determine their parametric form. Full article
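A minimal, hypothetical sketch of the two embedding bookends of basic SSA (not the paper's EOSSA method): the series is embedded into a Hankel trajectory matrix, and components are mapped back to series form by diagonal averaging. The SVD and grouping steps that sit between them, which the paper automates, are omitted here.

```python
def trajectory_matrix(series, window):
    """Hankel matrix whose columns are lagged windows of the series."""
    k = len(series) - window + 1
    return [[series[i + j] for j in range(k)] for i in range(window)]

def diagonal_average(matrix):
    """Average anti-diagonals back into a series (inverse of the embedding)."""
    rows, cols = len(matrix), len(matrix[0])
    n = rows + cols - 1
    sums = [0.0] * n
    counts = [0] * n
    for i in range(rows):
        for j in range(cols):
            sums[i + j] += matrix[i][j]
            counts[i + j] += 1
    return [s / c for s, c in zip(sums, counts)]

# Embedding then diagonal averaging (with no decomposition in between)
# reconstructs the original series exactly.
series = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
X = trajectory_matrix(series, window=3)
recovered = diagonal_average(X)
```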

18 pages, 4776 KiB  
Article
Forgetful Forests: Data Structures for Machine Learning on Streaming Data under Concept Drift
by Zhehu Yuan, Yinqi Sun and Dennis Shasha
Algorithms 2023, 16(6), 278; https://doi.org/10.3390/a16060278 - 31 May 2023
Cited by 1 | Viewed by 1268
Abstract
Database and data structure research can improve machine learning performance in many ways. One way is to design better algorithms on data structures. This paper combines the use of incremental computation as well as sequential and probabilistic filtering to enable “forgetful” tree-based learning algorithms to cope with streaming data that suffers from concept drift. (Concept drift occurs when the functional mapping from input to classification changes over time.) The forgetful algorithms described in this paper achieve high performance while maintaining high-quality predictions on streaming data. Specifically, the algorithms are up to 24 times faster than state-of-the-art incremental algorithms with, at most, a 2% loss of accuracy, or at least twice as fast without any loss of accuracy. This makes such structures suitable for high-volume streaming applications. Full article
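The "forgetting" idea under concept drift can be illustrated with a deliberately toy example (this is not the paper's tree-based algorithm): a majority-class predictor that only remembers the last few labels, so an old concept ages out of its bounded window as a new one arrives.

```python
from collections import Counter, deque

class ForgetfulMajority:
    """Toy streaming learner: predicts the majority label of a bounded window."""

    def __init__(self, capacity):
        # Old labels automatically fall off the left as new ones arrive.
        self.window = deque(maxlen=capacity)

    def update(self, label):
        self.window.append(label)

    def predict(self):
        return Counter(self.window).most_common(1)[0][0]

model = ForgetfulMajority(capacity=5)
for label in ["a"] * 10:   # initial concept: everything is "a"
    model.update(label)
for label in ["b"] * 5:    # concept drift: the window fills with "b"
    model.update(label)
```

Because the window is bounded, five "b" updates are enough to displace the ten earlier "a" observations entirely.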

16 pages, 5558 KiB  
Article
Time-Series Forecasting of Seasonal Data Using Machine Learning Methods
by Vadim Kramar and Vasiliy Alchakov
Algorithms 2023, 16(5), 248; https://doi.org/10.3390/a16050248 - 10 May 2023
Cited by 3 | Viewed by 4675
Abstract
The models for forecasting time series with seasonal variability can be used to build automatic real-time control systems. For example, predicting the water flowing in a wastewater treatment plant can be used to calculate the optimal electricity consumption. The article describes a performance analysis of various machine learning methods (SARIMA, Holt-Winters Exponential Smoothing, ETS, Facebook Prophet, XGBoost, and Long Short-Term Memory) and data-preprocessing algorithms implemented in Python. The general methodology of model building and the requirements of the input data sets are described. All models use actual data from sensors of the monitoring system. The novelty of this work is in an approach that allows using limited history data sets to obtain predictions with reasonable accuracy. The implemented algorithms made it possible to achieve an R-Squared accuracy of more than 0.95. The forecasting calculation time is minimized, which can be used to run the algorithm in real-time control and embedded systems. Full article
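As a hypothetical baseline for the kind of seasonal forecasting the abstract describes (not the authors' code), the sketch below implements the simplest seasonal model, which repeats the value from one full period earlier, and scores it with R-squared, the accuracy measure quoted above.

```python
def seasonal_naive(series, period):
    """Forecast each point as the observation one full period earlier."""
    return [series[t - period] for t in range(period, len(series))]

def r_squared(actual, predicted):
    """R-squared: 1 minus residual variance over total variance."""
    mean = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1.0 - ss_res / ss_tot

# Two repetitions of a perfectly periodic signal: the baseline is exact.
series = [3.0, 7.0, 5.0, 3.0, 7.0, 5.0]
forecast = seasonal_naive(series, period=3)
score = r_squared(series[3:], forecast)
```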

14 pages, 2032 KiB  
Article
Periodicity Intensity Reveals Insights into Time Series Data: Three Use Cases
by Alan F. Smeaton and Feiyan Hu
Algorithms 2023, 16(2), 119; https://doi.org/10.3390/a16020119 - 15 Feb 2023
Cited by 2 | Viewed by 1367
Abstract
Periodic phenomena are oscillating signals found in many naturally occurring time series. A periodogram can be used to measure the intensities of oscillations at different frequencies over an entire time series, but sometimes, we are interested in measuring how periodicity intensity at a specific frequency varies throughout the time series. This can be performed by calculating periodicity intensity within a window, then sliding and recalculating the intensity for the window, giving an indication of how periodicity intensity at a specific frequency changes throughout the series. We illustrate three applications of this, the first of which are the movements of a herd of new-born calves, where we show how intensity in the 24 h periodicity increases and decreases synchronously across the herd. We also show how changes in 24 h periodicity intensity of activities detected from in-home sensors can be indicative of overall wellness. We illustrate this on several weeks of sensor data gathered from each of the homes of 23 older adults. Our third application is the intensity of the 7-day periodicity of hundreds of University students accessing online resources from a virtual learning environment (VLE) and how the regularity of their weekly learning behaviours changes throughout a teaching semester. The paper demonstrates how periodicity intensity reveals insights into time series data not visible using other forms of analysis. Full article
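The sliding-window idea in the abstract can be sketched directly (hypothetical code, not the authors'): evaluate the single DFT component for one chosen period within each window, then slide the window to see how that periodicity's intensity evolves. The window length should be a multiple of the period for a clean reading.

```python
import cmath
import math

def intensity_at_period(window, period):
    """Normalized magnitude of the DFT component with the given period."""
    coeff = sum(x * cmath.exp(-2j * math.pi * t / period)
                for t, x in enumerate(window))
    return abs(coeff) / len(window)

def sliding_intensity(series, window_len, period, step=1):
    """Periodicity intensity per window, sliding through the series."""
    return [intensity_at_period(series[i:i + window_len], period)
            for i in range(0, len(series) - window_len + 1, step)]

# A pure cosine with period 8: every window reports intensity 0.5
# (half the unit amplitude).
series = [math.cos(2 * math.pi * t / 8) for t in range(48)]
intensities = sliding_intensity(series, window_len=16, period=8, step=8)
```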

21 pages, 6658 KiB  
Article
Predictive Quantization and Symbolic Dynamics
by Shlomo Dubnov
Algorithms 2022, 15(12), 484; https://doi.org/10.3390/a15120484 - 19 Dec 2022
Cited by 1 | Viewed by 1337
Abstract
Capturing long-term statistics of signals and time series is important for modeling recurrent phenomena, especially when such recurrences are aperiodic and can be characterized by the approximate repetition of variable-length motifs, such as patterns in human gestures and trends in financial time series or musical melodies. Regressive and auto-regressive models that are common in such problems, both analytically derived and neural network-based, often suffer from limited memory or tend to accumulate errors, making them sensitive during training. Moreover, such models often assume stationary signal statistics, which makes it difficult to deal with switching regimes or conditional signal dynamics. In this paper, we describe a method for time series modeling that is based on adaptive symbolization that maximizes the predictive information of the resulting sequence. Using approximate string-matching methods, the initial vectorized sequence is quantized into a discrete representation with a variable quantization threshold. Finding an optimal signal embedding is formulated in terms of a predictive bottleneck problem that takes into account the trade-off between representation and prediction accuracy. Several downstream applications based on the discrete representation are described in this paper, including an analysis of the symbolic dynamics of recurrence statistics, motif extraction, segmentation, query matching, and the estimation of transfer entropy between parallel signals. Full article
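A hypothetical sketch of the symbolization step only (much simpler than the paper's adaptive, information-maximizing scheme): quantize a real-valued series into discrete symbols against a fixed threshold, then count motifs in the resulting string, the kind of discrete representation on which downstream analyses such as motif extraction operate.

```python
def symbolize(series, threshold):
    """Map each value to 'H' (above threshold) or 'L' (at or below it)."""
    return "".join("H" if x > threshold else "L" for x in series)

def motif_counts(symbols, length):
    """Count occurrences of every motif (substring) of the given length."""
    counts = {}
    for i in range(len(symbols) - length + 1):
        motif = symbols[i:i + length]
        counts[motif] = counts.get(motif, 0) + 1
    return counts

signal = [0.1, 0.9, 0.2, 0.8, 0.1, 0.9]
symbols = symbolize(signal, threshold=0.5)   # alternating low/high pattern
motifs = motif_counts(symbols, length=2)     # length-2 motif statistics
```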

19 pages, 4410 KiB  
Article
Cicada Species Recognition Based on Acoustic Signals
by Wan Teng Tey, Tee Connie, Kan Yeep Choo and Michael Kah Ong Goh
Algorithms 2022, 15(10), 358; https://doi.org/10.3390/a15100358 - 28 Sep 2022
Cited by 2 | Viewed by 1605
Abstract
Traditional methods used to identify and monitor insect species are time-consuming, costly, and fully dependent on the observer’s ability. This paper presents a deep learning-based cicada species recognition system using acoustic signals to classify the cicada species. The sound recordings of cicada species were collected from different online sources and pre-processed using denoising algorithms. An improved Härmä syllable segmentation method is introduced to segment the audio signals into syllables since the syllables play a key role in identifying the cicada species. After that, a visual representation of the audio signal was obtained using a spectrogram, which was fed to a convolutional neural network (CNN) to perform classification. The experimental results validated the robustness of the proposed method by achieving accuracies ranging from 66.67% to 100%. Full article
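The syllable-segmentation stage can be sketched with a simplified, hypothetical energy-threshold stand-in for the improved Härmä method used in the paper: compute short-time energy per frame and report runs of consecutive high-energy frames as candidate syllables.

```python
def frame_energies(signal, frame_len):
    """Short-time energy of consecutive non-overlapping frames."""
    return [sum(x * x for x in signal[i:i + frame_len])
            for i in range(0, len(signal) - frame_len + 1, frame_len)]

def segment_syllables(energies, threshold):
    """Return (start_frame, end_frame_exclusive) runs of high-energy frames."""
    segments, start = [], None
    for i, e in enumerate(energies):
        if e > threshold and start is None:
            start = i                      # a syllable begins
        elif e <= threshold and start is not None:
            segments.append((start, i))    # a syllable ends
            start = None
    if start is not None:
        segments.append((start, len(energies)))
    return segments

# Quiet-loud-quiet-loud toy signal, framed into 4-sample chunks.
signal = [0.0] * 4 + [1.0] * 8 + [0.0] * 8 + [1.0] * 4
energies = frame_energies(signal, frame_len=4)
syllables = segment_syllables(energies, threshold=1.0)
```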

19 pages, 723 KiB  
Article
An Optimized Data Analysis on a Real-Time Application of PEM Fuel Cell Design by Using Machine Learning Algorithms
by Arun Saco, P. Shanmuga Sundari, Karthikeyan J and Anand Paul
Algorithms 2022, 15(10), 346; https://doi.org/10.3390/a15100346 - 25 Sep 2022
Cited by 13 | Viewed by 2089
Abstract
In recent years, machine learning algorithms have been applied in many real-time applications. Crises in the energy sector are among the primary challenges experienced today by all countries across the globe, regardless of their economic status. There is a huge demand to acquire and produce environmentally friendly renewable energy and to distribute and utilize it efficiently because of its huge production cost. PEMFCs are known for their energy efficiency and comparatively low cost, and can serve as an alternative energy source. The efficiency of these PEMFCs can still be enhanced with the help of advanced technologies like machine learning and artificial intelligence, as they provide an optimal solution for exploring the hidden knowledge in the generated data. The proposed model attempts to compare several design techniques with varied humidity levels. To enhance the performance of the PEMFC, various humidification processes were considered during the experimental study. Humidification reduces the heat produced during energy generation and increases the performance of the PEM fuel cell. Humidity levels of 100%, 50%, and 10% were tested with the machine learning models. The SVMR, LR, and KNN algorithms were evaluated using the RMSE value as the evaluation parameter. The observed results show that SVMR has an RMSE of 0.0046, LR has an RMSE of 0.0034, and KNN has an RMSE of 0.004. The analysis shows that the LR model provides better accuracy than the other models and enhances the PEMFC performance. Full article
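The model-comparison step in the abstract reduces to computing RMSE per regressor and picking the minimum. A hypothetical sketch with made-up model names and predictions (not the paper's data):

```python
import math

def rmse(actual, predicted):
    """Root mean squared error between two equal-length sequences."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted))
                     / len(actual))

actual = [1.0, 2.0, 3.0, 4.0]
predictions = {
    "model_a": [1.1, 2.1, 2.9, 4.1],  # small errors everywhere
    "model_b": [1.5, 2.5, 3.5, 4.5],  # constant bias of 0.5
}
scores = {name: rmse(actual, pred) for name, pred in predictions.items()}
best = min(scores, key=scores.get)  # lowest RMSE wins, as in the study
```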

Other


21 pages, 2743 KiB  
Concept Paper
A Model Architecture for Public Transport Networks Using a Combination of a Recurrent Neural Network Encoder Library and an Attention Mechanism
by Thilo Reich, David Hulbert and Marcin Budka
Algorithms 2022, 15(9), 328; https://doi.org/10.3390/a15090328 - 14 Sep 2022
Cited by 1 | Viewed by 1304
Abstract
This study presents a working concept of a model architecture allowing to leverage the state of an entire transport network to make estimated arrival time (ETA) and next-step location predictions. To this end, a combination of an attention mechanism with a dynamically changing [...] Read more.
This study presents a working concept of a model architecture that leverages the state of an entire transport network to make estimated arrival time (ETA) and next-step location predictions. To this end, an attention mechanism is combined with a dynamically changing library of recurrent neural network (RNN)-based encoders: the attention mechanism incorporates the states of other vehicles in the network, with gated recurrent units (GRUs) specific to each bus line encoding their current state. By muting specific parts of the imputed information, their impact on prediction accuracy can be estimated on a subset of the available data. The results of the experimental investigation show that the full model, with access to all the network data, performed better in some scenarios. However, a model limited to vehicles of the same line ahead of the target was the best-performing model, suggesting that incorporating additional data can have a negative impact on prediction accuracy if it does not add any useful information. This could be caused by poor data quality but also by a lack of interaction between the included lines and the target line. The technical aspects of this study are challenging and resulted in a very inefficient training procedure. We highlight several areas where improvements to our presented method are required to make it a viable alternative to current methods. The findings in this study should be considered a possible and promising avenue for further research into this novel architecture. As such, it is a stepping stone for future research to improve public transport predictions if network operators provide high-quality datasets. Full article
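The core attention operation the architecture relies on can be sketched in isolation (a generic, hypothetical dot-product attention over plain list vectors, not the authors' encoder library): score each encoder state against a query, softmax the scores, and blend the states by those weights.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(query, states):
    """Weighted sum of states, weighted by softmax of dot-product scores."""
    weights = softmax([dot(query, s) for s in states])
    dim = len(states[0])
    return [sum(w * s[i] for w, s in zip(weights, states))
            for i in range(dim)]

# Three encoder states (e.g., other vehicles on the network); the query
# aligns with the first state, so the context vector is dominated by it.
states = [[1.0, 0.0], [0.0, 1.0], [0.0, -1.0]]
context = attend([5.0, 0.0], states)
```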
