
Shannon Information and Kolmogorov Complexity

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Complexity".

Deadline for manuscript submissions: closed (28 February 2021) | Viewed by 26743

Special Issue Editor


Guest Editor
Santa Fe Institute, 1399 Hyde Park Road, Santa Fe, NM 87501, USA
Interests: complexity science; nonlinear phenomena; stochastic calculus; Kolmogorov complexity; complexity measures

Special Issue Information

Dear Colleagues,

In 1948, C.E. Shannon introduced the concept of Shannon information in his foundational paper “A Mathematical Theory of Communication”. In the 1960s, Solomonoff, Kolmogorov, and Chaitin independently gave rise to the concept of Kolmogorov complexity in their seminal papers “A Preliminary Report on a General Theory of Inductive Inference”, “Three Approaches to the Quantitative Definition of Information”, and “On the Simplicity and Speed of Programs for Computing Infinite Sets of Natural Numbers”, respectively.

While Shannon was mainly interested in the minimum expected number of bits needed to transmit a message from a random source through an error-free channel, the Kolmogorov complexity of a sequence of data, by contrast, measures the length of the shortest computer program that reproduces the sequence and halts.

Even though these two measures look similar and share similar functional properties, they arguably address different, complementary problems: the information content of a given communication process and the complexity of an object.
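The contrast can be made concrete with a small sketch (an illustrative assumption, not part of this Special Issue): empirical Shannon entropy is computed from symbol frequencies alone, while Kolmogorov complexity is uncomputable and in practice only upper-bounded, e.g., by a general-purpose compressor such as zlib.

```python
import math
import zlib

def shannon_entropy_bits(data: bytes) -> float:
    """Empirical Shannon entropy of the symbol distribution, in bits per symbol."""
    n = len(data)
    counts = {}
    for b in data:
        counts[b] = counts.get(b, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def complexity_upper_bound(data: bytes) -> int:
    """Compressed length in bytes: a crude upper bound on Kolmogorov
    complexity, since the decompressor plus the compressed string form a
    program that reproduces the data and halts."""
    return len(zlib.compress(data, 9))

# A highly regular string has maximal per-symbol entropy (two symbols,
# equally frequent) yet low complexity: a short program ("print 'ab'
# 512 times") reproduces it.  A string covering many symbol values
# resists compression more.
regular = b"ab" * 512
diverse = bytes(range(256)) * 4
```

The example illustrates why the two notions are complementary: `regular` scores one full bit of entropy per symbol, yet compresses to a few dozen bytes.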

In this Special Issue, we are interested in original research discussing the relationship between these two measures and their applications to physical and related systems. We welcome cross-disciplinary contributions focusing on the understanding of complex systems.

Prof. Miguel A. Fuentes
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Kolmogorov complexity
  • Shannon information
  • Complexity science
  • Nonlinear phenomena
  • Information theory

Published Papers (8 papers)


Research


19 pages, 2752 KiB  
Article
A Modified Multivariable Complexity Measure Algorithm and Its Application for Identifying Mental Arithmetic Task
by Dizhen Ma, Shaobo He and Kehui Sun
Entropy 2021, 23(8), 931; https://doi.org/10.3390/e23080931 - 22 Jul 2021
Cited by 4 | Viewed by 1702
Abstract
Properly measuring the complexity of time series is an important issue. Permutation entropy (PE) is widely used as an effective complexity measurement algorithm, but it is not suitable for describing the complexity of multi-dimensional data. In this paper, in order to better measure the complexity of multi-dimensional time series, we propose a modified multivariable PE (MMPE) algorithm with principal component analysis (PCA) dimensionality reduction, which is a new multi-dimensional time series complexity measurement algorithm. The analysis results of different chaotic systems verify that MMPE is effective. Moreover, we applied it to the complexity analysis of EEG data, showing that complexity during a mental arithmetic task is higher than in the state before the task. In addition, we also discuss the necessity of the PCA dimensionality reduction.
(This article belongs to the Special Issue Shannon Information and Kolmogorov Complexity)
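For readers unfamiliar with the baseline method the paper modifies, the following is a minimal sketch of standard univariate permutation entropy (Bandt–Pompe ordinal patterns); the MMPE algorithm itself, with its PCA step, is the authors' contribution and is not reproduced here.

```python
import math

def permutation_entropy(series, m=3):
    """Standard univariate permutation entropy, normalized to [0, 1].
    Counts ordinal patterns of embedding dimension m (Bandt & Pompe)."""
    n = len(series) - m + 1
    counts = {}
    for i in range(n):
        window = series[i:i + m]
        # ordinal pattern: the indices that would sort the window
        pattern = tuple(sorted(range(m), key=window.__getitem__))
        counts[pattern] = counts.get(pattern, 0) + 1
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(math.factorial(m))  # normalize by log(m!)

# A monotone series exhibits a single ordinal pattern, hence zero PE;
# an irregular series uses many patterns and scores higher.
```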

26 pages, 1177 KiB  
Article
Dynamics and Complexity of Computrons
by Murat Erkurt
Entropy 2020, 22(2), 150; https://doi.org/10.3390/e22020150 - 27 Jan 2020
Viewed by 2151
Abstract
We investigate the chaoticity and complexity of a binary general network automaton of finite size with external input, which we call a computron. As a generalization of cellular automata, computrons can have non-uniform cell rules, non-regular cell connectivity, and an external input. We show that any finite-state machine can be represented as a computron and develop two novel set-theoretic concepts: (i) diversity space as a metric space that captures similarity of configurations on a given graph and (ii) basin complexity as a measure of complexity of partitions of the diversity space. We use these concepts to quantify the chaoticity of computrons’ dynamics and the complexity of their basins of attraction. The theory is then extended to probabilistic machines, where we define fuzzy basin partitioning of recurrent classes and introduce the concept of ergodic decomposition. A case study of a 1D cyclic computron is provided in both deterministic and probabilistic versions.
(This article belongs to the Special Issue Shannon Information and Kolmogorov Complexity)

26 pages, 2607 KiB  
Article
Statistical Complexity Analysis of Turing Machine tapes with Fixed Algorithmic Complexity Using the Best-Order Markov Model
by Jorge M. Silva, Eduardo Pinho, Sérgio Matos and Diogo Pratas
Entropy 2020, 22(1), 105; https://doi.org/10.3390/e22010105 - 16 Jan 2020
Cited by 3 | Viewed by 3636
Abstract
Sources that generate symbolic sequences of an algorithmic nature may differ in statistical complexity because they create structures that follow algorithmic schemes, rather than generating symbols from a probabilistic function assuming independence. In the case of Turing machines, this means that machines with the same algorithmic complexity can create tapes with different statistical complexity. In this paper, we use a compression-based approach to measure the global and local statistical complexity of specific Turing machine tapes with the same number of states and alphabet. Both measures are estimated using the best-order Markov model. For the global measure, we use the Normalized Compression (NC), while, for the local measures, we define and use normal and dynamic complexity profiles to quantify and localize regions of lower and higher statistical complexity. We assessed the validity of our methodology on synthetic and real genomic data, showing that it is tolerant to increasing rates of edits and block permutations. Regarding the analysis of the tapes, we localize patterns of higher statistical complexity in two regions, for different numbers of machine states. We show that these patterns are generated by a decrease of the tape’s amplitude, given the setting of small rule cycles. Additionally, we performed a comparison with a measure that uses both algorithmic and statistical approaches (BDM) for analysis of the tapes. Naturally, BDM is efficient given the algorithmic nature of the tapes. However, for a higher number of states, BDM is progressively approximated by our methodology. Finally, we provide a simple algorithm to increase the statistical complexity of a Turing machine tape while retaining the same algorithmic complexity. We supply a publicly available implementation of the algorithm in C++ under the GPLv3 license. All results can be reproduced in full with scripts provided at the repository.
(This article belongs to the Special Issue Shannon Information and Kolmogorov Complexity)

10 pages, 725 KiB  
Article
Fuzzy Kolmogorov Complexity Based on a Classical Description
by Songsong Dai
Entropy 2020, 22(1), 66; https://doi.org/10.3390/e22010066 - 04 Jan 2020
Cited by 2 | Viewed by 1975
Abstract
In this paper, we give a definition for fuzzy Kolmogorov complexity. In the classical setting, the Kolmogorov complexity of a single finite string is the length of the shortest program that produces this string. We define the fuzzy Kolmogorov complexity as the minimum classical description length of a finite-valued fuzzy language through a universal finite-valued fuzzy Turing machine that produces the desired fuzzy language. The classical Kolmogorov complexity is extended to the fuzzy domain retaining classical descriptions. We show that our definition is robust, that is to say, the complexity of a finite-valued fuzzy language does not depend on the underlying finite-valued fuzzy Turing machine.
(This article belongs to the Special Issue Shannon Information and Kolmogorov Complexity)
18 pages, 356 KiB  
Article
A Survey on Using Kolmogorov Complexity in Cybersecurity
by João S. Resende, Rolando Martins and Luís Antunes
Entropy 2019, 21(12), 1196; https://doi.org/10.3390/e21121196 - 05 Dec 2019
Cited by 8 | Viewed by 3939
Abstract
Security and privacy concerns are challenging the way users interact with devices. The number of devices connected to a home or enterprise network increases every day. Nowadays, the security of information systems is relevant as user information is constantly being shared and moving in the cloud; however, there are still many problems, such as unsecured web interfaces, weak authentication, insecure networks, and lack of encryption, that make services insecure. The software implementations that are currently deployed in companies should have updates and control, as cybersecurity threats increasingly appear over time. There is already some research towards solutions and methods to predict new attacks or classify variants of previously known attacks, for example, via (algorithmic) information theory. This survey gathers the relevant applications of this topic (also known as Kolmogorov complexity) in the security and privacy domains. Kolmogorov-based approaches are resource-focused and do not require specific knowledge of the topic under analysis. We define a taxonomy of existing work to classify the different application areas and open up new research questions.
(This article belongs to the Special Issue Shannon Information and Kolmogorov Complexity)
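A representative Kolmogorov-based tool in this space is the Normalized Compression Distance of Cilibrasi and Vitányi, which needs no domain-specific features: NCD(x, y) = (C(xy) − min(C(x), C(y))) / max(C(x), C(y)). A minimal sketch follows, with zlib standing in for the compressor C (an assumption; surveys in this area consider several compressors).

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance: near 0 for closely related
    inputs, near 1 for unrelated ones.  C is approximated by zlib."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Demo inputs: a repetitive HTTP-like trace versus unrelated byte data.
sample_a = b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n" * 40
sample_b = bytes(range(256)) * 8
```

In security settings this allows, e.g., clustering malware samples or network traces by pairwise NCD without parsing their formats.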

14 pages, 2190 KiB  
Article
Magnetotelluric Signal-Noise Separation Using IE-LZC and MP
by Xian Zhang, Diquan Li, Jin Li, Yong Li, Jialin Wang, Shanshan Liu and Zhimin Xu
Entropy 2019, 21(12), 1190; https://doi.org/10.3390/e21121190 - 04 Dec 2019
Cited by 2 | Viewed by 2742
Abstract
Eliminating noise signals of the magnetotelluric (MT) method is bound to improve the quality of MT data. However, existing de-noising methods are designed for use on whole MT data sets, causing the loss of low-frequency information and severe mutation of the apparent resistivity–phase curve in low-frequency bands. In this paper, we used information entropy (IE), Lempel–Ziv complexity (LZC), and matching pursuit (MP) to distinguish and suppress MT noise signals. Firstly, we extracted IE and LZC characteristic parameters from each segment of the MT signal in the time series. Then, the characteristic parameters were input into fuzzy c-means (FCM) clustering to automatically distinguish between signal and noise. Next, the MP de-noising algorithm was used independently to eliminate the MT signal segments identified as interference. Finally, the identified useful signal segments were combined with the de-noised data segments to reconstruct the signal. The proposed method was validated through clustering analysis based on signal samples collected at the Qinghai test site and at measured sites, where the results were compared to those obtained using the remote reference method and independent use of the MP method. The findings show that strong interference is purposefully removed, and the apparent resistivity–phase curve is continuous and stable. Moreover, the processed data can accurately reflect the geoelectrical information and improve the level of geological interpretation.
(This article belongs to the Special Issue Shannon Information and Kolmogorov Complexity)

21 pages, 2794 KiB  
Article
Kolmogorov Complexity of Coronary Sinus Atrial Electrograms Before Ablation Predicts Termination of Atrial Fibrillation After Pulmonary Vein Isolation
by Katarzyna Stępień, Pawel Kuklik, Jan J. Żebrowski, Prashanthan Sanders, Paweł Derejko and Piotr Podziemski
Entropy 2019, 21(10), 970; https://doi.org/10.3390/e21100970 - 04 Oct 2019
Cited by 1 | Viewed by 2718
Abstract
Atrial fibrillation (AF) is related to a very complex local electrical activity reflected in the rich morphology of intracardiac electrograms. The link between electrogram complexity and efficacy of the catheter ablation is unclear. We test the hypothesis that the Kolmogorov complexity of a single atrial bipolar electrogram recorded during AF within the coronary sinus (CS) at the beginning of the catheter ablation may predict AF termination directly after pulmonary vein isolation (PVI). The study population consisted of 26 patients for whom 30 s baseline electrograms were recorded. In all cases PVI was performed. If AF persisted after PVI, ablation was extended beyond PVs. Kolmogorov complexity estimated by Lempel–Ziv complexity and the block decomposition method was calculated and compared with other measures: Shannon entropy, AF cycle length, dominant frequency, regularity, organization index, electrogram fractionation, sample entropy and wave morphology similarity index. A 5 s window length was chosen as optimal in calculations. There was a significant difference in Kolmogorov complexity between patients with AF termination directly after PVI compared to patients undergoing additional ablation (p < 0.01). No such difference was seen for remaining complexity parameters. Kolmogorov complexity of CS electrograms measured at baseline before PVI can predict self-termination of AF directly after PVI.
(This article belongs to the Special Issue Shannon Information and Kolmogorov Complexity)
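One of the estimators named in the abstract, Lempel–Ziv complexity, counts the phrases produced by the exhaustive-history parsing of a symbol sequence. A minimal sketch (illustrative only; the paper combines it with the block decomposition method and applies it to binarized electrograms):

```python
def lz_complexity(s: str) -> int:
    """Lempel–Ziv (1976) complexity: the number of phrases in the
    exhaustive-history parsing of s.  A phrase ends as soon as the
    current substring has not appeared earlier in the sequence."""
    i, phrases, n = 0, 0, len(s)
    while i < n:
        l = 1
        # grow the phrase while s[i:i+l] already occurs before its end
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        phrases += 1
        i += l
    return phrases
```

The classic example "0001101001000101" parses into 0 | 001 | 10 | 100 | 1000 | 101, i.e., six phrases; a regular sequence of the same length yields far fewer.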

Review


28 pages, 1538 KiB  
Review
A Review of Methods for Estimating Algorithmic Complexity: Options, Challenges, and New Directions
by Hector Zenil
Entropy 2020, 22(6), 612; https://doi.org/10.3390/e22060612 - 30 May 2020
Cited by 17 | Viewed by 6739
Abstract
Some established and also novel techniques in the field of applications of algorithmic (Kolmogorov) complexity currently co-exist for the first time and are reviewed here, ranging from dominant ones, such as statistical lossless compression, to newer approaches that advance, complement, and also pose new challenges and may exhibit their own limitations. Evidence suggesting that these different methods complement each other for different regimes is presented, and despite their many challenges, some of these methods can be better motivated by and better grounded in the principles of algorithmic information theory. It is explained how different approaches to algorithmic complexity can explore the relaxation of different necessary and sufficient conditions in their pursuit of numerical applicability, with some of these approaches entailing greater risks than others in exchange for greater relevance. We conclude with a discussion of possible directions that may or should be taken into consideration to advance the field and encourage methodological innovation, but, more importantly, to contribute to scientific discovery. This paper also serves as a rebuttal of claims made in a previously published mini-review by another author, and offers an alternative account.
(This article belongs to the Special Issue Shannon Information and Kolmogorov Complexity)
