
Applications of Entropy in Causality Analysis

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Multidisciplinary Applications".

Deadline for manuscript submissions: closed (30 June 2022) | Viewed by 12900

Special Issue Editors

Dr. Fan Yang
Department of Automation, Tsinghua University, Beijing 100084, China
Interests: prognosis and health management; smart alarm monitoring; intelligent fault diagnosis

Dr. Wenkai Hu
School of Automation, China University of Geosciences, Wuhan 430074, China
Interests: advanced alarm monitoring; process data analytics; data mining for complex industrial processes

Special Issue Information

Dear Colleagues, 

In the analysis of large-scale systems, causality has become an important concept for describing the relationships between phenomena, events, or elements. In abnormal situations in particular, a small local fault can propagate through material or information paths across a wide range of the system, leading to a global fault, a significant failure, or even an accident. It is therefore vital to capture causal relationships in addition to the correlations or associations that are easily obtained from time series data. Based on this causality information, one can trace the origin of a fault and predict its consequences, which is beneficial for abnormal situation analysis. Causality can be described and captured through statistical analysis. For linear relationships, Granger causality is widely used. For other, more complex cases, information-theoretic methods have shown their advantages. Entropy-based techniques, such as transfer entropy, direct transfer entropy, transfer zero-entropy, and their variants or extensions, have been developed and have proven effective in many applications.
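Since transfer entropy is central to the techniques above, a minimal sketch may help fix the idea: for symbolic series with history length 1, the plug-in estimate of T(X→Y) compares the predictability of y(t+1) with and without x(t). The series, seed, and history length below are illustrative assumptions only, not part of any method in this issue.

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(x, y):
    """Plug-in estimate of T(X->Y) in bits for symbolic series, history length 1:
    sum over (y1, y0, x0) of p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ]."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    singles = Counter(y[:-1])                       # y_t
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]        # p(y1 | y0, x0)
        p_cond_hist = pairs_yy[(y1, y0)] / singles[y0]  # p(y1 | y0)
        te += p_joint * log2(p_cond_full / p_cond_hist)
    return te

random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]                       # y is a one-step delayed copy of x
print(transfer_entropy(x, y))          # close to 1 bit: x drives y
print(transfer_entropy(y, x))          # close to 0 bits: no influence back
```

Because y copies x with a one-step delay, nearly one bit per step flows from x to y, while the reverse direction shows only the small positive bias typical of plug-in estimators.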

Hence, this Special Issue, entitled “Applications of Entropy in Causality Analysis”, welcomes theoretical or application submissions reporting original research on the development and application of entropy-based techniques to quantify, characterize, or model causality through time series. We are also happy to receive reviews and commentaries aligned with the vision of this Special Issue. Specifically, this Special Issue will accept unpublished original papers and comprehensive reviews focused on (but not restricted to) the following research areas:

  • Entropy-based approaches for causality analysis
  • Data-driven methods for causality analysis
  • Process knowledge or model-based connectivity and causality analysis
  • Parametric or non-parametric models for cause–effect relations
  • Causality inference for root cause analysis
  • Applications of causality analysis in (but not limited to) the manufacturing industry, information technology, biological sciences, and social sciences

Dr. Fan Yang
Dr. Wenkai Hu
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com after registering and logging in to the website. Once registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for the submission of manuscripts are available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • transfer entropy
  • direct transfer entropy
  • transfer zero-entropy
  • partial transfer entropy
  • phase transfer entropy
  • mutual information
  • causality analysis
  • abnormal situation analysis
  • big data analytics

Published Papers (6 papers)


Research

23 pages, 470 KiB  
Article
Entropy-Based Discovery of Summary Causal Graphs in Time Series
by Charles K. Assaad, Emilie Devijver and Eric Gaussier
Entropy 2022, 24(8), 1156; https://doi.org/10.3390/e24081156 - 19 Aug 2022
Cited by 4 | Viewed by 2342
Abstract
This study addresses the problem of learning a summary causal graph on time series with potentially different sampling rates. To do so, we first propose a new causal temporal mutual information measure for time series. We then show how this measure relates to an entropy reduction principle that can be seen as a special case of the probability raising principle. We finally combine these two ingredients in PC-like and FCI-like algorithms to construct the summary causal graph. These algorithms are evaluated on several datasets, which demonstrates both their efficacy and efficiency.
(This article belongs to the Special Issue Applications of Entropy in Causality Analysis)

12 pages, 2247 KiB  
Article
Causality Analysis for COVID-19 among Countries Using Effective Transfer Entropy
by Baki Ünal
Entropy 2022, 24(8), 1115; https://doi.org/10.3390/e24081115 - 13 Aug 2022
Cited by 3 | Viewed by 1345
Abstract
In this study, causalities of COVID-19 across a group of seventy countries are analyzed with effective transfer entropy. To reveal the causalities, a weighted directed network is constructed. In this network, the weights of the links reveal the strength of the causality, which is obtained by calculating effective transfer entropies. Transfer entropy has some advantages over other causality evaluation methods: first, it can quantify the strength of the causality, and second, it can detect nonlinear causal relationships. After its construction, the causality network is analyzed with well-known network analysis methods such as eigenvector centrality, PageRank, and community detection. The eigenvector centrality and PageRank metrics reveal the importance and centrality of each node country in the network. In community detection, node countries in the network are divided into groups such that the countries in each group are much more densely connected.
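As a sketch of the network-analysis step described in this abstract, PageRank can be computed by power iteration on a weighted directed adjacency matrix whose entries play the role of effective transfer entropies. The three-node weight matrix below is an invented toy, not data from the paper.

```python
import numpy as np

def pagerank(W, d=0.85, tol=1e-10):
    """Power-iteration PageRank on a weighted directed adjacency matrix W,
    where W[i, j] is the causal strength (e.g. effective transfer entropy)
    from node i to node j."""
    n = W.shape[0]
    out = W.sum(axis=1, keepdims=True)
    out[out == 0] = 1.0                      # dangling nodes: avoid division by zero
    P = W / out                              # row-normalized transition matrix
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1 - d) / n + d * (r @ P)
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

# Toy 3-node "country" network: strong causal flow 0 -> 2 and 1 -> 2.
W = np.array([[0.0, 0.1, 0.9],
              [0.1, 0.0, 0.9],
              [0.0, 0.0, 0.0]])
r = pagerank(W)
print(r.argmax())  # node 2, the common target of strong links, ranks highest
```

Node 2, which receives strong causal links from both other nodes, ends up with the highest score, matching the intuition that central "sink" countries in the causality network are the most affected.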

19 pages, 1272 KiB  
Article
Estimating the Individual Treatment Effect on Survival Time Based on Prior Knowledge and Counterfactual Prediction
by Yijie Zhao, Hao Zhou, Jin Gu and Hao Ye
Entropy 2022, 24(7), 975; https://doi.org/10.3390/e24070975 - 14 Jul 2022
Viewed by 1452
Abstract
The estimation of the Individual Treatment Effect (ITE) on survival time is an important research topic in clinics-based causal inference. Various representation learning methods have been proposed to deal with its three key problems, i.e., reducing selection bias, handling censored survival data, and avoiding balancing non-confounders. However, none of them consider all three problems in a single method. In this study, by combining the Counterfactual Survival Analysis (CSA) model and Dragonnet from the literature, we first propose a CSA–Dragonnet to deal with the three problems simultaneously. Moreover, we found that conclusions from traditional Randomized Controlled Trials (RCTs) or Retrospective Cohort Studies (RCSs) can offer valuable bound information to the counterfactual learning of ITE, which has never been used by existing ITE estimation methods. Hence, we further propose a CSA–Dragonnet with Embedded Prior Knowledge (CDNEPK) by formulating a unified expression of the prior knowledge given by RCTs or RCSs, inserting counterfactual prediction nets into CSA–Dragonnet and defining loss items based on the bounds for the ITE extracted from prior knowledge. Semi-synthetic data experiments showed that CDNEPK has superior performance. Real-world experiments indicated that CDNEPK can offer meaningful treatment advice.

18 pages, 1868 KiB  
Article
Detection of Cause-Effect Relations Based on Information Granulation and Transfer Entropy
by Xiangxiang Zhang, Wenkai Hu and Fan Yang
Entropy 2022, 24(2), 212; https://doi.org/10.3390/e24020212 - 28 Jan 2022
Cited by 10 | Viewed by 2597
Abstract
Causality inference is a process to infer Cause-Effect relations between variables in, typically, complex systems, and it is commonly used for root cause analysis in large-scale process industries. Transfer entropy (TE), as a non-parametric causality inference method, is effective at detecting Cause-Effect relations in both linear and nonlinear processes. However, a major drawback of transfer entropy lies in its high computational complexity, which hinders its real application, especially in systems with high requirements for real-time estimation. Motivated by this problem, this study proposes an improved method for causality inference based on transfer entropy and information granulation. The calculation of transfer entropy is improved with a new framework that integrates information granulation as a critical preceding step; moreover, a window-length determination method is proposed based on delay estimation, so as to conduct appropriate data compression using information granulation. The effectiveness of the proposed method is demonstrated by both a numerical example and an industrial case based on a two-tank simulation model. As shown by the results, the proposed method can reduce the computational complexity significantly while retaining a strong capability for accurate causality detection.
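Information granulation, used in this abstract as a compression step preceding the transfer entropy calculation, can be illustrated in its simplest form as non-overlapping window averaging. The fixed window length here is an illustrative stand-in for the paper's delay-estimation-based choice.

```python
import numpy as np

def granulate(x, w):
    """Compress a series by replacing each non-overlapping window of length w
    with its mean (a simple information granule). Trailing samples that do not
    fill a whole window are dropped."""
    n = (len(x) // w) * w
    return x[:n].reshape(-1, w).mean(axis=1)

x = np.arange(12, dtype=float)
print(granulate(x, 4))  # -> [1.5 5.5 9.5]
```

The granulated series is w times shorter, so any subsequent pairwise transfer entropy estimation runs on far fewer samples, which is the source of the computational savings the abstract describes.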

19 pages, 2777 KiB  
Article
An Improved Calculation Formula of the Extended Entropic Chaos Degree and Its Application to Two-Dimensional Chaotic Maps
by Kei Inoue
Entropy 2021, 23(11), 1511; https://doi.org/10.3390/e23111511 - 14 Nov 2021
Cited by 3 | Viewed by 1639
Abstract
The Lyapunov exponent is primarily used to quantify the chaos of a dynamical system. However, it is difficult to compute the Lyapunov exponent of a dynamical system from a time series. The entropic chaos degree is a criterion for quantifying chaos in dynamical systems through information dynamics that is directly computable for any time series; however, it yields higher values than the Lyapunov exponent for any chaotic map. Therefore, the improved entropic chaos degree for a one-dimensional chaotic map under typical chaotic conditions was introduced to reduce the difference between the Lyapunov exponent and the entropic chaos degree. Moreover, the improved entropic chaos degree was extended to multidimensional chaotic maps. Recently, the author showed that the extended entropic chaos degree takes the same value as the total sum of the Lyapunov exponents under typical chaotic conditions. However, that result assumed that certain quantities, especially the number of mapped points, were infinite, whereas in actual numerical computations these quantities are finite. This study proposes an improved calculation formula for the extended entropic chaos degree to obtain appropriate numerical results for two-dimensional chaotic maps.
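For reference, the Lyapunov exponent this abstract compares against is easy to compute when the map itself is known: average log|f'(x)| along an orbit. The one-dimensional logistic map below is a standard textbook case, not one of the two-dimensional maps studied in the paper; at r = 4 its exponent is ln 2 ≈ 0.693.

```python
from math import log

def lyapunov_logistic(r=4.0, x0=0.3, n=100_000, burn=1_000):
    """Numerical Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the orbit average of log|f'(x)| = log|r*(1 - 2x)|."""
    x = x0
    for _ in range(burn):          # discard transient
        x = r * x * (1 - x)
    s = 0.0
    for _ in range(n):
        s += log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return s / n

print(lyapunov_logistic())  # close to ln 2 ~ 0.693 for the fully chaotic map
```

This direct formula needs the map's derivative, which is exactly what is unavailable when only a time series is observed; that gap is what entropic-chaos-degree methods aim to fill.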

15 pages, 759 KiB  
Article
Measuring Causal Invariance Formally
by Pierrick Bourrat
Entropy 2021, 23(6), 690; https://doi.org/10.3390/e23060690 - 30 May 2021
Cited by 3 | Viewed by 1963
Abstract
Invariance is one of several dimensions of causal relationships within the interventionist account. The more invariant a relationship between two variables, the more the relationship should be considered paradigmatically causal. In this paper, I propose two formal measures to estimate invariance, illustrated by a simple example. I then discuss the notion of invariance for causal relationships between non-nominal (i.e., ordinal and quantitative) variables, for which information theory, and hence the formalism proposed here, is not well suited. Finally, I propose how invariance could be qualified for such variables.
