Symbolic Entropy Analysis and Its Applications II

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (30 August 2021) | Viewed by 19832

Special Issue Editor

Special Issue Information

Dear Colleagues,

Symbolic data analysis has received a great deal of attention over the last few years and has been applied to many research areas, including astrophysics and geophysics, biology and medicine, fluid flow, chemistry, mechanical systems, artificial intelligence, communication systems, and, recently, data mining and big data. A fundamental step in this methodology is the quantization of the original data into a corresponding sequence of symbols. The resulting time series is then considered a transformed version of the original data, making it possible to highlight its temporal information. Indeed, it has been proven that this symbolization procedure can notably improve the signal-to-noise ratio in some noisy time series. Moreover, symbolic data analysis also makes communication and numerical computation more efficient and effective compared with the processing of continuous-valued time series.
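As a minimal illustration of this quantization step (the equal-width binning scheme and the alphabet size are assumptions of this sketch, not a prescription from any particular paper), a continuous-valued series can be mapped to a small alphabet of symbols as follows:

```python
def symbolize(series, n_symbols=4):
    """Quantize a real-valued series into symbols 0..n_symbols-1
    using equal-width bins over the observed range."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / n_symbols or 1.0  # guard against a constant series
    # Clamp the maximum value into the last bin.
    return [min(int((x - lo) / width), n_symbols - 1) for x in series]

print(symbolize([0.1, 0.9, 0.4, 0.7, 0.2], n_symbols=2))  # -> [0, 1, 0, 1, 0]
```

All of the entropy measures collected in this issue operate on symbol sequences of this kind, whether the symbols come from amplitude bins, ordinal patterns, or slope classes.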

However, the symbolization of a time series always involves information loss, and, hence, this process deserves special attention. This challenge, along with other problems associated with symbolic entropy analysis, was addressed in the first volume of the Special Issue on “Symbolic Entropy Analysis and Its Applications”. Given its success, this second volume aims to compile key current research on novel symbolization approaches, as well as applications of this kind of analysis to extract novel information from different types of time series. Hence, manuscripts dealing with these topics are welcome.

Volume I: https://www.mdpi.com/journal/entropy/special_issues/symbolic_entropy_analysis

Prof. Dr. Raúl Alcaraz Martínez
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Symbolic data analysis
  • Symbolization approaches
  • Symbolic entropy
  • Transfer entropy
  • Permutation entropy
  • Lempel–Ziv complexity


Published Papers (7 papers)


Research

28 pages, 4482 KiB  
Article
Entropy Profiling: A Reduced-Parametric Measure of Kolmogorov-Sinai Entropy from Short-Term HRV Signal
by Chandan Karmakar, Radhagayathri Udhayakumar and Marimuthu Palaniswami
Entropy 2020, 22(12), 1396; https://doi.org/10.3390/e22121396 - 10 Dec 2020
Cited by 9 | Viewed by 2344
Abstract
Entropy profiling is a recently introduced approach that reduces parametric dependence in traditional Kolmogorov-Sinai (KS) entropy measurement algorithms. The choice of the threshold parameter r for vector distances in traditional entropy computations is crucial in determining the accuracy of the signal irregularity information retrieved by these methods. In addition to making parametric choices completely data-driven, entropy profiling generates a complete profile of entropy information rather than the single entropy estimate produced by traditional algorithms. The benefits of using “profiling” instead of “estimation” are: (a) precursory methods such as approximate and sample entropy, which were previously limited in handling short-term signals (fewer than 1000 samples), become capable of doing so; (b) the entropy measure can capture complexity information from short- and long-term signals without multi-scaling; and (c) this new approach facilitates enhanced information retrieval from short-term HRV signals. The novel concept of entropy profiling equips traditional algorithms to overcome existing limitations and broadens their applicability in the field of short-term signal analysis. In this work, we present a review of KS-entropy methods and their limitations in the context of short-term heart rate variability analysis and elucidate the benefits of using entropy profiling as an alternative.
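For reference, the classical sample entropy estimator that profiling builds on can be sketched as follows (a minimal, unoptimized illustration; the template-counting convention and parameter values are simplifying assumptions, not the authors' implementation):

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Sketch of sample entropy: -ln(A/B), where B counts pairs of matching
    templates of length m and A counts pairs of length m+1, with a match
    defined by a Chebyshev distance no greater than the tolerance r."""
    def match_count(length):
        templates = [x[i:i + length] for i in range(len(x) - length + 1)]
        pairs = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    pairs += 1
        return pairs

    b, a = match_count(m), match_count(m + 1)
    return float("inf") if a == 0 or b == 0 else -math.log(a / b)
```

A perfectly regular alternating series yields a value close to zero, while irregular series push the estimate higher; entropy profiling replaces the single fixed r above with a data-driven sweep over candidate thresholds.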
(This article belongs to the Special Issue Symbolic Entropy Analysis and Its Applications II)

23 pages, 5058 KiB  
Article
Development of Automated Sleep Stage Classification System Using Multivariate Projection-Based Fixed Boundary Empirical Wavelet Transform and Entropy Features Extracted from Multichannel EEG Signals
by Rajesh Kumar Tripathy, Samit Kumar Ghosh, Pranjali Gajbhiye and U. Rajendra Acharya
Entropy 2020, 22(10), 1141; https://doi.org/10.3390/e22101141 - 09 Oct 2020
Cited by 32 | Viewed by 3199
Abstract
The categorization of sleep stages helps to diagnose different sleep-related ailments. In this paper, an entropy-based information-theoretic approach is introduced for the automated categorization of sleep stages using multi-channel electroencephalogram (EEG) signals. This approach comprises three stages. First, the decomposition of multi-channel EEG signals into sub-band signals or modes is performed using a novel multivariate projection-based fixed boundary empirical wavelet transform (MPFBEWT) filter bank. Second, entropy features such as bubble and dispersion entropies are computed from the modes of multi-channel EEG signals. Third, a hybrid learning classifier based on class-specific residuals using sparse representation and distances from nearest neighbors is used to categorize sleep stages automatically using entropy-based features computed from MPFBEWT domain modes of multi-channel EEG signals. The proposed approach is evaluated using the multi-channel EEG signals obtained from the cyclic alternating pattern (CAP) sleep database. Our results reveal that the proposed sleep staging approach obtained accuracies of 91.77%, 88.14%, 80.13%, and 73.88% for the automated categorization of wake vs. sleep, wake vs. rapid eye movement (REM) vs. Non-REM, wake vs. light sleep vs. deep sleep vs. REM sleep, and wake vs. S1-sleep vs. S2-sleep vs. S3-sleep vs. REM sleep schemes, respectively. The developed method obtained the highest overall accuracy compared to state-of-the-art approaches and is ready to be tested with more subjects before clinical application.

23 pages, 1100 KiB  
Article
Entropy Monotonicity and Superstable Cycles for the Quadratic Family Revisited
by José M. Amigó and Ángel Giménez
Entropy 2020, 22(10), 1136; https://doi.org/10.3390/e22101136 - 07 Oct 2020
Viewed by 1444
Abstract
The main result of this paper is a proof using real analysis of the monotonicity of the topological entropy for the family of quadratic maps, sometimes called Milnor’s Monotonicity Conjecture. In contrast, the existing proofs rely in one way or another on complex analysis. Our proof is based on tools and algorithms previously developed by the authors and collaborators to compute the topological entropy of multimodal maps. Specifically, we use the number of transverse intersections of the map iterations with the so-called critical line. The approach is technically simple and geometrical. The same approach is also used to briefly revisit the superstable cycles of the quadratic maps, since both topics are closely related.
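The intersection-counting idea can be illustrated numerically. For the full logistic map (used here as a convenient stand-in for the quadratic family; the parametrization, grid resolution, and iteration depth are assumptions of this sketch), the number of transverse crossings of the n-th iterate with the critical line grows exponentially at the topological entropy rate:

```python
import math

def entropy_from_crossings(mu=4.0, n=10, grid=200_000):
    """Count sign changes of f^n(x) - 1/2 for f(x) = mu*x*(1-x) on a fine
    grid of [0, 1]; (1/n) * log(#crossings) then approximates the
    topological entropy of the map."""
    def iterate(x):
        for _ in range(n):
            x = mu * x * (1 - x)
        return x

    vals = [iterate(i / grid) - 0.5 for i in range(grid + 1)]
    crossings = sum(1 for a, b in zip(vals, vals[1:]) if a * b < 0)
    return math.log(crossings) / n

print(entropy_from_crossings())  # close to log 2 ~ 0.6931 for mu = 4
```

For mu = 4 the n-th iterate has 2^n monotone laps, each crossing the critical line once, so the estimate recovers log 2; the paper's contribution is an exact real-analytic treatment of how such counts vary with the parameter, not this numerical shortcut.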

15 pages, 642 KiB  
Article
Fever Time Series Analysis Using Slope Entropy. Application to Early Unobtrusive Differential Diagnosis
by David Cuesta-Frau, Pradeepa H. Dakappa, Chakrapani Mahabala and Arjun R. Gupta
Entropy 2020, 22(9), 1034; https://doi.org/10.3390/e22091034 - 15 Sep 2020
Cited by 20 | Viewed by 2741
Abstract
Fever is a readily measurable physiological response that has been used in medicine for centuries. However, the information provided has been greatly limited by a plain thresholding approach, overlooking the additional information carried by temporal variations and by temperature values below that threshold, which are also representative of the subject's status. In this paper, we propose to utilize continuous body temperature time series of patients who developed a fever, in order to apply a method capable of diagnosing the specific underlying fever cause only by means of a pattern relative-frequency analysis. This analysis was based on a recently proposed measure, Slope Entropy, applied to a variety of records coming from dengue and malaria patients, among other fever diseases. After an input parameter customization, a classification analysis of malaria and dengue records took place, quantified by the Matthews Correlation Coefficient. This classification yielded a high accuracy, with more than 90% of the records correctly labelled in some cases, demonstrating the feasibility of the proposed approach. This approach, after further study, or combined with additional measures such as Sample Entropy, is very promising as an early diagnosis tool based solely on body temperature temporal patterns, which is of great interest in the current COVID-19 pandemic scenario.
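In outline, Slope Entropy encodes each consecutive difference of the series into one of five symbols using two thresholds and then measures the Shannon entropy of the resulting symbol patterns. A simplified sketch follows (the threshold values and the omission of the normalization used in the original measure are assumptions of this illustration, not the authors' exact formulation):

```python
import math
from collections import Counter

def slope_entropy(x, m=3, gamma=1.0, delta=0.001):
    """Simplified slope entropy: map each consecutive difference to a symbol
    in {-2, -1, 0, 1, 2} via thresholds delta < gamma, then compute the
    Shannon entropy of the length-(m-1) symbol pattern frequencies."""
    def symbol(d):
        if d > gamma:
            return 2      # steep rise
        if d > delta:
            return 1      # gentle rise
        if d >= -delta:
            return 0      # flat
        if d >= -gamma:
            return -1     # gentle fall
        return -2         # steep fall

    symbols = [symbol(x[i + 1] - x[i]) for i in range(len(x) - 1)]
    patterns = [tuple(symbols[i:i + m - 1]) for i in range(len(symbols) - m + 2)]
    freq = Counter(patterns)
    n = len(patterns)
    return -sum((c / n) * math.log(c / n) for c in freq.values())
```

A flat temperature trace collapses to a single pattern and yields zero entropy, while richer slope dynamics raise the value, which is the property exploited for the differential diagnosis above.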

19 pages, 1382 KiB  
Article
Permutation Entropy as a Measure of Information Gain/Loss in the Different Symbolic Descriptions of Financial Data
by Jan Kozak, Krzysztof Kania and Przemysław Juszczuk
Entropy 2020, 22(3), 330; https://doi.org/10.3390/e22030330 - 13 Mar 2020
Cited by 8 | Viewed by 3301
Abstract
Financial markets offer a large number of trading opportunities. However, over-complicated systems are very difficult for decision-makers to use effectively. The volatility and noise present in the markets create a need to simplify the market picture presented to decision-makers. Symbolic representation fits this concept and greatly reduces data complexity; at the same time, however, some information from the market is lost. Our motivation is to answer the question: what is the impact of introducing different data representations on the overall amount of information available to the decision-maker? We concentrate on the possibility of using entropy as a measure of information gain/loss for financial data, and as the basic form, we adopt permutation entropy, with later modifications. We investigate different symbolic representations and compare them with the classical data representation in terms of entropy. Real-world data covering a time span of 10 years are used in the experiments. The results and their statistical verification show that extending the symbolic description of the time series does not affect the permutation entropy values.
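Permutation entropy itself is a standard construction: each window of `order` consecutive samples is reduced to its ordinal pattern, and the Shannon entropy of the pattern distribution is reported. A minimal sketch (the series in the test is illustrative; the paper's later modifications are not reproduced here):

```python
import math
from collections import Counter

def permutation_entropy(x, order=3):
    """Bandt-Pompe permutation entropy: Shannon entropy (in nats) of the
    distribution of ordinal patterns of length `order`."""
    # Each window is replaced by the index permutation that sorts it.
    patterns = [
        tuple(sorted(range(order), key=lambda k: x[i + k]))
        for i in range(len(x) - order + 1)
    ]
    freq = Counter(patterns)
    n = len(patterns)
    return -sum((c / n) * math.log(c / n) for c in freq.values())
```

A monotone series produces a single ordinal pattern and hence zero entropy, whereas a noisy price series spreads mass across many patterns, which is what makes the measure usable as an information gain/loss gauge for the symbolic representations compared in the paper.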

26 pages, 770 KiB  
Article
Augmentation of Dispersion Entropy for Handling Missing and Outlier Samples in Physiological Signal Monitoring
by Evangelos Kafantaris, Ian Piper, Tsz-Yan Milly Lo and Javier Escudero
Entropy 2020, 22(3), 319; https://doi.org/10.3390/e22030319 - 11 Mar 2020
Cited by 7 | Viewed by 2933
Abstract
Entropy quantification algorithms are becoming a prominent tool for the physiological monitoring of individuals through the effective measurement of irregularity in biological signals. However, to ensure their effective adoption in monitoring applications, the performance of these algorithms needs to be robust when analysing time series containing missing and outlier samples, which are a common occurrence in physiological monitoring setups such as wearable devices and intensive care units. This paper focuses on augmenting Dispersion Entropy (DisEn) by introducing novel variations of the algorithm for improved performance in such applications. The original algorithm and its variations are tested under different experimental setups that are replicated across heart rate interval, electroencephalogram, and respiratory impedance time series. Our results indicate that the algorithmic variations of DisEn achieve considerable improvements in performance, while our analysis confirms that, in consensus with previous research, outlier samples can have a major impact on the performance of entropy quantification algorithms. Consequently, the presented variations can aid the implementation of DisEn in physiological monitoring applications by mitigating the disruptive effect of missing and outlier samples.
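In outline, DisEn maps each sample to one of c classes through the normal cumulative distribution function of the standardized series and then takes the Shannon entropy of the length-m class-pattern frequencies. A simplified sketch (it omits the embedding delay, uses illustrative parameter values, and is not the authors' augmented variant):

```python
import math
import statistics
from collections import Counter

def dispersion_entropy(x, m=2, c=3):
    """Simplified dispersion entropy: map samples to classes 1..c via the
    normal CDF of the standardized series, then compute the Shannon entropy
    of the length-m class pattern frequencies."""
    mu = statistics.fmean(x)
    sd = statistics.pstdev(x) or 1.0  # guard against a constant series
    # Normal CDF of each standardized sample, then mapped to classes 1..c.
    ncdf = [0.5 * (1 + math.erf((v - mu) / (sd * math.sqrt(2)))) for v in x]
    classes = [min(int(c * y) + 1, c) for y in ncdf]
    patterns = [tuple(classes[i:i + m]) for i in range(len(classes) - m + 1)]
    freq = Counter(patterns)
    n = len(patterns)
    return -sum((k / n) * math.log(k / n) for k in freq.values())
```

Because every sample contributes to a pattern, a single missing or outlier sample corrupts all m patterns that contain it, which is exactly the sensitivity the paper's variations are designed to mitigate.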

14 pages, 994 KiB  
Article
Optimized Dimensionality Reduction Methods for Interval-Valued Variables and Their Application to Facial Recognition
by Jorge Arce Garro and Oldemar Rodríguez Rojas
Entropy 2019, 21(10), 1016; https://doi.org/10.3390/e21101016 - 19 Oct 2019
Cited by 2 | Viewed by 2736
Abstract
The center method, first proposed in Rev. Stat. Appl. 1997 by Cazes et al. and Stat. Anal. Data Mining 2011 by Douzal-Chouakria et al., extends the well-known Principal Component Analysis (PCA) method to particular types of symbolic objects that are characterized by multivalued interval-type variables. In contrast to classical data, symbolic data have internal variation. The authors who originally proposed the center method used the center of a hyper-rectangle in R^m as the base point for PCA, followed by the projection of all vertices of the hyper-rectangles as supplementary elements. Since these publications, the center point of the hyper-rectangle has typically been assumed to be the best point for the initial PCA. However, in this paper, we show that this is not always the case if the aim is to maximize the variance of the projections or to minimize the squared distance between the vertices and their respective projections. Instead, we propose an optimization algorithm that maximizes the variance of the projections (or minimizes the squared distance between the vertices and their respective projections) and finds the optimal point for the initial PCA. The vertices of the hyper-rectangles are then projected as supplementary elements onto this optimal point, which we call the “Best Point” for projection. For this purpose, we propose four new algorithms and two new theorems. The proposed methods and algorithms are illustrated using a data set comprising measurements of facial characteristics from a study on facial recognition patterns for use in surveillance. The performance of our approach is compared with that of another procedure in the literature, and the results show that our symbolic analyses provide more accurate information. Our approach can be regarded as an optimization method, as it maximizes the explained variance or minimizes the squared distance between projections and the original points. In addition, the symbolic analyses generate more informative conclusions compared with the classical analysis, in which classical surrogates replace intervals. All the methods proposed in this paper can be executed in the RSDA package developed in R.
