# Understanding Interdependency Through Complex Information Sharing


## Abstract


## 1. Introduction

## 2. Preliminaries and the State of the Art

#### 2.1. Negentropy and Total Correlation

#### 2.2. Internal and External Decompositions

**Figure 1.** Layers of internal and external entropies that decompose the dual total correlation (DTC) and the TC. Each $\Delta H^{(j)}$ shows how much information is contained in the $j$-marginals, while each $\Delta H_{(j)}$ measures the information shared between exactly $j$ variables.

#### 2.3. Inclusion-Exclusion Decompositions

| Name | Formula |
|---|---|
| Total correlation | $\text{TC}=\sum_{j} H(X_j)-H(\mathbf{X})$ |
| Dual total correlation | $\text{DTC}=H(\mathbf{X})-\sum_{j} H(X_j \mid \mathbf{X}_j^c)$ |
| Co-information | $I(X_1;X_2;X_3)=I(X_1;X_2)-I(X_1;X_2 \mid X_3)$ |
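As a concrete illustration of the quantities in the table above, the following sketch computes TC, DTC, and the co-information for a toy distribution in which $X_3 = X_1 \oplus X_2$ with uniform, independent inputs. The helper names are ours, not from the paper.

```python
import math

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome_tuple: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(pmf, axes):
    """Marginalize a tuple-keyed joint pmf onto the given coordinate axes."""
    out = {}
    for x, p in pmf.items():
        key = tuple(x[i] for i in axes)
        out[key] = out.get(key, 0.0) + p
    return out

# Toy joint distribution: X3 = X1 XOR X2, with X1, X2 uniform and independent.
joint = {(a, b, a ^ b): 0.25 for a in (0, 1) for b in (0, 1)}
n = 3

H_joint = entropy(joint)
H_marg = [entropy(marginal(joint, [i])) for i in range(n)]

# TC = sum_j H(X_j) - H(X)
TC = sum(H_marg) - H_joint

# DTC = H(X) - sum_j H(X_j | X_j^c), using H(X_j | X_j^c) = H(X) - H(X_j^c).
DTC = H_joint - sum(
    H_joint - entropy(marginal(joint, [k for k in range(n) if k != j]))
    for j in range(n)
)

# For three variables, the co-information equals TC - DTC.
co_info = TC - DTC
print(TC, DTC, co_info)  # → 1.0 2.0 -1.0
```

The negative co-information of the XOR distribution is the classic signature of synergy, which the decomposition developed below makes precise.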

**Figure 2.** An approach based on the inclusion-exclusion principle decomposes the total entropy of three variables $H(X,Y,Z)$ into seven signed areas.

#### 2.4. Synergistic Information

## 3. A Non-Negative Joint Entropy Decomposition

#### 3.1. Predictability Axioms

**Definition**

- (1) Non-negativity: $\mathcal{R}(X_1 X_2 \to Y),\ \mathcal{U}(X_1 \to Y \mid X_2) \ge 0$.
- (2) $I(X_1;Y)=\mathcal{R}(X_1 X_2 \to Y)+\mathcal{U}(X_1 \to Y \mid X_2)$.
- (3) $I(X_1 X_2;Y)\ge \mathcal{R}(X_1 X_2 \to Y)+\mathcal{U}(X_1 \to Y \mid X_2)+\mathcal{U}(X_2 \to Y \mid X_1)$.
- (4) Weak symmetry I: $\mathcal{R}(X_1 X_2 \to Y)=\mathcal{R}(X_2 X_1 \to Y)$.
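Axioms (1)–(4) constrain $\mathcal{R}$ and $\mathcal{U}$ without pinning them down uniquely. As one concrete candidate consistent with all four — the minimal-mutual-information (MMI) choice $\mathcal{R}=\min\{I(X_1;Y),\,I(X_2;Y)\}$, which appears elsewhere in the literature — the sketch below computes the resulting decomposition for $Y=\text{AND}(X_1,X_2)$. The function names are illustrative, not from the paper.

```python
import math

def entropy(pmf):
    """Shannon entropy in bits of a tuple-keyed pmf."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(pmf, axes):
    out = {}
    for x, p in pmf.items():
        key = tuple(x[i] for i in axes)
        out[key] = out.get(key, 0.0) + p
    return out

def mutual_info(pmf, a, b):
    """I(X_a; X_b) between two groups of coordinates, in bits."""
    return (entropy(marginal(pmf, a)) + entropy(marginal(pmf, b))
            - entropy(marginal(pmf, a + b)))

# Layout (x1, x2, y) with Y = AND(X1, X2) and uniform, independent inputs.
pmf = {(a, b, a & b): 0.25 for a in (0, 1) for b in (0, 1)}

I1 = mutual_info(pmf, [0], [2])       # I(X1; Y)
I2 = mutual_info(pmf, [1], [2])       # I(X2; Y)
I12 = mutual_info(pmf, [0, 1], [2])   # I(X1 X2; Y)

R = min(I1, I2)                       # MMI redundant predictability
U1, U2 = I1 - R, I2 - R               # unique predictabilities, via axiom (2)
slack = I12 - (R + U1 + U2)           # non-negative by axiom (3): the synergy
```

For the AND gate both unique terms vanish and the slack in axiom (3) equals half a bit, which is the synergistic predictability under this particular choice of $\mathcal{R}$.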

**Lemma 1.**

**Corollary 1.**

**Proof.**

#### 3.2. Shared, Private and Synergistic Information

**Definition**

- (5) Weak symmetry II: $I_{\text{priv}}(X_1;X_2 \mid X_3)=I_{\text{priv}}(X_2;X_1 \mid X_3)$.

**Lemma 2.**

- (a) Strong symmetry: $I_\cap(X_1;X_2;X_3)$ and $I_{\text{S}}(X_1;X_2;X_3)$ are symmetric in their three arguments.
- (b) Bounds: these quantities satisfy the following inequalities:

$$\begin{aligned}
\min\{I(X_1;X_2),\,I(X_2;X_3),\,I(X_3;X_1)\} &\ge I_\cap(X_1;X_2;X_3) \ge \left[I(X_1;X_2;X_3)\right]^+,\\
\min\{I(X_1;X_3),\,I(X_1;X_3 \mid X_2)\} &\ge I_{\text{priv}}(X_1;X_3 \mid X_2) \ge 0,\\
\min\{I(X_1;X_2 \mid X_3),\,I(X_2;X_3 \mid X_1),\,I(X_3;X_1 \mid X_2)\} &\ge I_{\text{S}}(X_1;X_2;X_3) \ge \left[-I(X_1;X_2;X_3)\right]^+.
\end{aligned}$$

| Directed Measures | Symmetrical Measures |
|---|---|
| Redundant predictability $\mathcal{R}(X_1 X_2 \to X_3)$ | Shared information $I_\cap(X_1;X_2;X_3)$ |
| Unique predictability $\mathcal{U}(X_1 \to X_2 \mid X_3)$ | Private information $I_{\text{priv}}(X_1;X_2 \mid X_3)$ |
| Synergistic predictability | Synergistic information $I_{\text{S}}(X_1;X_2;X_3)$ |

#### 3.3. Further Properties of the Symmetrical Decomposition

**Example**

**Lemma 3.**

**Corollary 2.**

**Proof.**

**Theorem 1.**

**Proof.**

#### 3.4. Decomposition for the Joint Entropy of Three Variables

## 4. Pairwise Independent Variables

#### 4.1. Uniqueness of the Entropy Decomposition

**Corollary 3.**

**Proof.**

#### 4.2. Functions of Independent Arguments

**Lemma 4.**

**Proof.**

**Corollary 4.**

The `XOR` logic gate generates the largest amount of synergistic information possible for the case of binary inputs.
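This maximal synergy can be checked numerically: for the XOR distribution every pairwise mutual information vanishes while every conditional mutual information equals one bit, so the bounds of Lemma 2(b) pin the synergistic information to exactly 1 bit. A minimal sketch (helper names ours):

```python
import math
from itertools import combinations

def entropy(pmf):
    """Shannon entropy in bits of a tuple-keyed pmf."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(pmf, axes):
    out = {}
    for x, p in pmf.items():
        key = tuple(x[i] for i in axes)
        out[key] = out.get(key, 0.0) + p
    return out

# X3 = X1 XOR X2, with X1, X2 uniform and independent.
pmf = {(a, b, a ^ b): 0.25 for a in (0, 1) for b in (0, 1)}

# Co-information via inclusion-exclusion over the entropy diagram.
co_info = (sum(entropy(marginal(pmf, [i])) for i in range(3))
           - sum(entropy(marginal(pmf, [i, j]))
                 for i, j in combinations(range(3), 2))
           + entropy(pmf))

def cond_mi(pmf, a, b, c):
    """I(X_a; X_b | X_c) = H(a,c) + H(b,c) - H(c) - H(a,b,c)."""
    return (entropy(marginal(pmf, [a, c])) + entropy(marginal(pmf, [b, c]))
            - entropy(marginal(pmf, [c])) - entropy(pmf))

# All pairwise MIs are 0, all conditional MIs are 1 bit, co_info is -1 bit;
# Lemma 2(b) then gives 1 >= I_S >= [-co_info]^+ = 1, hence I_S = 1 bit.
```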

## 5. Discrete Pairwise Maximum Entropy Distributions and Markov Chains

#### 5.1. Synergy Minimization

**Theorem 2.**

**Proof.**

**Corollary 5.**

**Proof.**

#### 5.2. Markov Chains

**Corollary 6.**

**Proof.**

| Markov Chains | Pairwise Independent Variables |
|---|---|
| Conditional pairwise independence | Pairwise independence |
| $I(X_1;X_3 \mid X_2)=0$ | $I(X_1;X_2)=0$ |
| No $I_{\text{priv}}$ between $X_1$ and $X_3$ | No $I_{\text{priv}}$ between $X_1$ and $X_2$ |
| No synergistic information | No shared information |
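The left column of the table can be verified numerically for a simple chain $X_1 \to X_2 \to X_3$ built from two binary symmetric channels. The construction below is our illustration, not an example from the paper.

```python
import math

def entropy(pmf):
    """Shannon entropy in bits of a tuple-keyed pmf."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(pmf, axes):
    out = {}
    for x, p in pmf.items():
        key = tuple(x[i] for i in axes)
        out[key] = out.get(key, 0.0) + p
    return out

# Markov chain X1 -> X2 -> X3: uniform X1, each link a binary
# symmetric channel that flips its input with probability eps.
eps = 0.1
pmf = {}
for x1 in (0, 1):
    for x2 in (0, 1):
        for x3 in (0, 1):
            p = 0.5
            p *= (1 - eps) if x2 == x1 else eps
            p *= (1 - eps) if x3 == x2 else eps
            pmf[(x1, x2, x3)] = p

# Conditional pairwise independence: I(X1; X3 | X2) = 0.
cond_mi = (entropy(marginal(pmf, [0, 1])) + entropy(marginal(pmf, [1, 2]))
           - entropy(marginal(pmf, [1])) - entropy(pmf))

# The end variables remain unconditionally dependent: I(X1; X3) > 0.
end_mi = (entropy(marginal(pmf, [0])) + entropy(marginal(pmf, [2]))
          - entropy(marginal(pmf, [0, 2])))
```

The vanishing conditional mutual information is what forces the synergistic information of a Markov chain to zero, mirroring how pairwise independence forces the shared information to zero in the right column.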

## 6. Entropy Decomposition for the Gaussian Case

#### 6.1. Understanding the Synergistic Information Between Gaussians

#### 6.2. Understanding the Shared Information

#### 6.3. Shared, Private and Synergistic Information for Gaussian Variables

**Lemma 5.**
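For jointly Gaussian variables, the information measures of Section 2 admit closed forms in terms of covariance determinants, since all $(2\pi e)$ factors cancel. The sketch below uses our own parameterization (three unit-variance Gaussians with a common correlation coefficient $\rho$, valid for $-1/2 < \rho < 1$) to compute TC and DTC in nats; the sign of $\text{TC}-\text{DTC}$ (the co-information) then indicates whether shared or synergistic information dominates.

```python
import math

def gaussian_tc_dtc(rho):
    """TC and DTC (in nats) for three unit-variance jointly Gaussian
    variables with common correlation coefficient rho in (-1/2, 1)."""
    det3 = (1 - rho) ** 2 * (1 + 2 * rho)  # det of the 3x3 equicorrelation matrix
    det2 = 1 - rho ** 2                    # det of any 2x2 principal minor
    tc = -0.5 * math.log(det3)             # sum_j H(X_j) - H(X)
    # DTC = sum_j H(X_j^c) - 2 H(X); the (2*pi*e) factors cancel.
    dtc = 1.5 * math.log(det2) - math.log(det3)
    return tc, dtc
```

For $\rho = 0.5$ one finds $\text{TC} > \text{DTC}$ (positive co-information: shared information dominates), while for $\rho = -0.45$ the order reverses (negative co-information: synergy dominates), matching the intuition that negatively correlated Gaussians are synergistic.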

## 7. Applications to Network Information Theory

#### 7.1. Slepian–Wolf Coding

#### 7.2. Multiple Access Channel

**Figure 3.** Capacity region of the multiple access channel, which represents the possible data rates that two transmitters can use for transferring information to one receiver.
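For the standard two-user Gaussian multiple access channel with transmit powers $P_1$, $P_2$ and noise power $N$, the pentagon of Figure 3 is cut out by three constraints, which can be sketched as follows (variable names ours):

```python
import math

def c(snr):
    """Gaussian capacity 0.5 * log2(1 + SNR), in bits per channel use."""
    return 0.5 * math.log2(1 + snr)

def gaussian_mac_constraints(P1, P2, N):
    """Constraints cutting out the two-user Gaussian MAC capacity region:
    R1 <= C(P1/N), R2 <= C(P2/N), R1 + R2 <= C((P1 + P2)/N)."""
    return c(P1 / N), c(P2 / N), c((P1 + P2) / N)

r1_max, r2_max, rsum_max = gaussian_mac_constraints(1.0, 1.0, 1.0)
# The sum-rate constraint binds (C(1) + C(1) > C(2)), so the region is a
# pentagon rather than a rectangle: the users must share the sum rate.
```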

#### 7.3. Degraded Wiretap Channel

**Figure 4.** The rate of secure information transfer, $C_{\mathrm{sec}}$, is the portion of the mutual information that can be used while providing perfect confidentiality with respect to the eavesdropper.
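In the degraded Gaussian case, the secrecy rate of Figure 4 reduces to the gap between the two point-to-point capacities. A minimal sketch under that standard model (names ours):

```python
import math

def c(snr):
    """Gaussian capacity 0.5 * log2(1 + SNR), in bits per channel use."""
    return 0.5 * math.log2(1 + snr)

def secrecy_capacity(P, N_main, N_eve):
    """C_sec = C(P/N_main) - C(P/N_eve) for the degraded Gaussian wiretap
    channel, where the eavesdropper sees more noise (N_eve >= N_main)."""
    return c(P / N_main) - c(P / N_eve)

# With equal noise the secrecy capacity vanishes; it grows as the
# eavesdropper's channel degrades, but never exceeds the main-channel capacity.
```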

#### 7.4. Gaussian Broadcast Channel

## 8. Conclusions

## Acknowledgments

## Author Contributions

## Conflicts of Interest

## Appendix

## A. Proof of Lemma 1

**Proof.**

## B. Proof of the Consistency of Axiom (3)

## C. Proof of Lemma 2

**Proof.**

## D. Useful Facts about Gaussians

## E. Proof of Lemma 5

**Proof.**


© 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Rosas, F.; Ntranos, V.; Ellison, C.J.; Pollin, S.; Verhelst, M.
Understanding Interdependency Through Complex Information Sharing. *Entropy* **2016**, *18*, 38.
https://doi.org/10.3390/e18020038
