Synergy and Redundancy Measures: Theory and Applications to Characterize Complex Systems and Shape Neural Network Representations

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (20 April 2024) | Viewed by 4341

Special Issue Editor


Dr. Daniel Chicharro
Guest Editor
Department of Computer Science, School of Science & Technology, City, University of London, London EC1V 0HB, UK
Interests: data analysis; causal inference; dimensionality reduction; neuroscience; sensitivity analysis; structure learning; information decomposition; information bottleneck

Special Issue Information

Dear Colleagues,

An important aspect of how sources of information are distributed across a set of variables concerns whether different variables provide redundant, unique, or synergistic information when combined with other variables. Intuitively, variables share redundant information if each variable individually carries the same information as the others. Information carried by a certain variable is unique if it is not carried by any other variable or combination of variables, and a group of variables carries synergistic information if some information arises only when they are combined.
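To make these intuitions concrete, the sketch below (a minimal illustration, not part of the call; the toy distributions and function names are chosen for this example) contrasts a purely redundant and a purely synergistic configuration using only classical mutual information, computed with NumPy:

```python
# Minimal sketch illustrating redundancy vs. synergy with classical
# mutual information only; all names and distributions are illustrative.
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array (zero entries ignored)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_info(p, x_axes, y_axes):
    """I(X; Y) in bits from a full joint table p; x_axes/y_axes index the variables."""
    keep = tuple(sorted(x_axes + y_axes))
    drop = tuple(set(range(p.ndim)) - set(keep))
    pxy = p.sum(axis=drop) if drop else p
    px = pxy.sum(axis=tuple(i for i, a in enumerate(keep) if a in y_axes))
    py = pxy.sum(axis=tuple(i for i, a in enumerate(keep) if a in x_axes))
    return entropy(px) + entropy(py) - entropy(pxy)

# Joint distributions p[x1, x2, t] over two binary sources and a binary target.
# Redundancy: a single fair coin copied into both sources and the target.
p_red = np.zeros((2, 2, 2))
p_red[0, 0, 0] = p_red[1, 1, 1] = 0.5

# Synergy: independent fair coins X1, X2 and target T = X1 XOR X2.
p_syn = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        p_syn[x1, x2, x1 ^ x2] = 0.25

for name, p in [("redundant (copy)", p_red), ("synergistic (XOR)", p_syn)]:
    i1 = mutual_info(p, [0], [2])      # I(X1; T)
    i2 = mutual_info(p, [1], [2])      # I(X2; T)
    i12 = mutual_info(p, [0, 1], [2])  # I(X1, X2; T)
    print(f"{name}: I(X1;T)={i1:.2f}  I(X2;T)={i2:.2f}  I(X1,X2;T)={i12:.2f}")
```

In the copy case each source alone already provides the full bit about the target, whereas in the XOR case neither source is individually informative and the bit emerges only from the pair. Partial information decomposition measures, a central topic of this Issue, aim to quantify such structure even in mixed cases where redundancy, unique information, and synergy coexist.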

Recent advances have contributed toward building an information-theoretic framework to determine the distribution and nature of information extractable from multivariate data sets. Measures of redundant, unique, or synergistic information characterize dependencies between the parts of a multivariate system and can help to understand its function and mechanisms. Furthermore, these measures are useful for analyzing how information is distributed across layers in neural networks, or can be used as cost functions to shape the structure of the data representations learned by the networks.

This Special Issue welcomes contributions on advances in both the theoretical formulation and the applications of information-theoretic measures of synergy and redundancy. Topics of interest include:

  • Advances in a multivariate formulation of redundancy measures or in the comparison of alternative proposals, addressing their distinctive power to capture relevant structures in both synthetic and experimental data sets;
  • Applications to understand interactions in real complex systems;
  • Advances in the estimation of information-theoretic quantities from high-dimensional data sets;
  • Applications for feature selection and sensitivity analysis;
  • Analysis of the distribution and nature of information across layers in neural networks;
  • Design of deep learning models to obtain robust or disentangled data representations.

Dr. Daniel Chicharro
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • mutual information
  • synergy
  • redundancy
  • unique information
  • neural networks
  • disentanglement
  • feature extraction
  • representation learning
  • partial information decomposition

Published Papers (4 papers)

Research

24 pages, 790 KiB  
Article
A Measure of Synergy Based on Union Information
by André F. C. Gomes and Mário A. T. Figueiredo
Entropy 2024, 26(3), 271; https://doi.org/10.3390/e26030271 - 19 Mar 2024
Viewed by 735
Abstract
The partial information decomposition (PID) framework is concerned with decomposing the information that a set of (two or more) random variables (the sources) has about another variable (the target) into three types of information: unique, redundant, and synergistic. Classical information theory alone does not provide a unique way to decompose information in this manner and additional assumptions have to be made. One often overlooked way to achieve this decomposition is using a so-called measure of union information—which quantifies the information that is present in at least one of the sources—from which a synergy measure stems. In this paper, we introduce a new measure of union information based on adopting a communication channel perspective, compare it with existing measures, and study some of its properties. We also include a comprehensive critical review of characterizations of union information and synergy measures that have been proposed in the literature.
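(As a hedged aside, not a statement of this paper's specific construction: in much of the PID literature, a union-information measure I_∪, which quantifies the information available from at least one of the sources, induces a synergy measure as the remainder of the joint mutual information,

S(T; X_1, …, X_n) = I(T; X_1, …, X_n) − I_∪(T; X_1, …, X_n),

so that synergy is precisely the information accessible only from the sources taken jointly.)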

11 pages, 353 KiB  
Article
Conditioning in Tropical Probability Theory
by Rostislav Matveev and Jacobus W. Portegies
Entropy 2023, 25(12), 1641; https://doi.org/10.3390/e25121641 - 09 Dec 2023
Cited by 1 | Viewed by 748
Abstract
We define a natural operation of conditioning of tropical diagrams of probability spaces and show that it is Lipschitz continuous with respect to the asymptotic entropy distance.
17 pages, 470 KiB  
Article
Arrow Contraction and Expansion in Tropical Diagrams
by Rostislav Matveev and Jacobus W. Portegies
Entropy 2023, 25(12), 1637; https://doi.org/10.3390/e25121637 - 08 Dec 2023
Viewed by 583
Abstract
Arrow contraction applied to a tropical diagram of probability spaces is a modification of the diagram, replacing one of the morphisms with an isomorphism while preserving other parts of the diagram. It is related to the rate regions introduced by Ahlswede and Körner. In a companion article, we use arrow contraction to derive information about the shape of the entropic cone. Arrow expansion is the inverse operation to the arrow contraction.

27 pages, 491 KiB  
Article
Decomposing and Tracing Mutual Information by Quantifying Reachable Decision Regions
by Tobias Mages and Christian Rohner
Entropy 2023, 25(7), 1014; https://doi.org/10.3390/e25071014 - 30 Jun 2023
Viewed by 1155
Abstract
The idea of a partial information decomposition (PID) gained significant attention for attributing the components of mutual information from multiple variables about a target to being unique, redundant/shared or synergetic. Since the original measure for this analysis was criticized, several alternatives have been proposed, but they have failed to satisfy the desired axioms or an inclusion–exclusion principle, or have resulted in negative partial information components. For constructing a measure, we interpret the achievable type I/II error pairs for predicting each state of a target variable (reachable decision regions) as notions of pointwise uncertainty. For this representation of uncertainty, we construct a distributive lattice with mutual information as consistent valuation and obtain an algebra for the constructed measure. The resulting definition satisfies the original axioms and an inclusion–exclusion principle, and provides a non-negative decomposition for an arbitrary number of variables. We demonstrate practical applications of this approach by tracing the flow of information through Markov chains. This can be used to model and analyze the flow of information in communication networks or data processing systems.