# Mechanism Integrated Information


## Abstract


## 1. Introduction

## 2. Axioms and Postulates

## 3. Theory

#### 3.1. Mechanism Integrated Information

#### 3.1.1. Existence

#### 3.1.2. Intrinsicality

#### 3.1.3. Information

#### 3.1.4. Integration

#### 3.1.5. Exclusion

#### 3.2. Disintegrating Partitions

#### 3.3. Intrinsic Difference (ID)

**Theorem 1.**

## 4. Methods and Results

#### 4.1. Intrinsic Information

#### 4.2. Integrated Information

#### 4.3. Maximal Integrated Information

## 5. Discussion

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## Appendix A. Cause and Effect Repertoires

#### Appendix A.1. Causal Marginalization

#### Appendix A.2. Partitioned Repertoires

## Appendix B. Full Statement and Proof of Theorem 1

**Property I**: Causality. Let $(P^{n},Q^{n})\in\Delta^{n}$. The difference $D(P^{n},Q^{n})$ is defined as a function $D:\Delta^{n}\to\mathbb{R}$ such that
$$D(P^{n},Q^{n})=0\iff P^{n}=Q^{n}.$$

**Property II**: Intrinsicality. Let $(P^{l},Q^{l})\in\Delta^{l}$ and $(P^{m},Q^{m})\in\Delta^{m}$. Then

(a) expansion: $D(V^{l}\otimes V^{m},\,P^{l}\otimes Q^{m})=D(V^{l},P^{l})+D(V^{m},Q^{m})$,

(b) dilution: $D(P^{l}\otimes U^{m},\,Q^{l}\otimes U^{m})=\dfrac{D(P^{l},Q^{l})+D(U^{m},U^{m})}{m}$,

where $P^{l}\otimes Q^{m}=(p_{1}q_{1},\dots,p_{1}q_{m},\dots,p_{l}q_{1},\dots,p_{l}q_{m})\in\Gamma^{lm}$ and, by Property I, $D(U^{m},U^{m})=0$.

**Property III**: Specificity. The difference must be state-specific, meaning there exists $f:K\to\mathbb{R}$ such that for all $(P^{n},Q^{n})\in\Delta^{n}$ we have $D(P^{n},Q^{n})=f(p_{\alpha},q_{\alpha})$, where $\alpha\in\{1,\dots,n\}$, $p_{\alpha}\in P^{n}$ and $q_{\alpha}\in Q^{n}$. More precisely, we define
$$D(P^{n},Q^{n}):=\max_{\alpha}\left\{\,|f(p_{\alpha},q_{\alpha})|\,\right\},$$
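Properties I and III can be checked numerically for the pointwise function derived later in this appendix. The sketch below is illustrative only: it assumes natural logarithms, $k=1$, the convention $f(0,q)=0$, and hypothetical example distributions.

```python
import math

def f(p, q):
    """Pointwise difference f(p, q) = p * log(p / q), with f(0, q) = 0 by convention."""
    if p == 0.0:
        return 0.0
    return p * math.log(p / q)

def D(P, Q):
    """State-specific difference (Property III): D(P, Q) = max_alpha |f(p_alpha, q_alpha)|."""
    return max(abs(f(p, q)) for p, q in zip(P, Q))

P = (0.7, 0.3)
Q = (0.4, 0.6)
print(D(P, P))  # 0.0 -- Property I: the difference vanishes when the distributions coincide
print(D(P, Q))  # max(|0.7*log(0.7/0.4)|, |0.3*log(0.3/0.6)|) ≈ 0.392
```

The maximum operation is what makes the measure state-specific: the whole difference is carried by a single state of the distribution.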

**Lemma A1.** If f and g are real analytic functions on an open interval $U\subseteq\mathbb{R}$ and if there is a sequence of distinct points $\{x_{n}\}_{n}\subset U$ with $x_{0}=\lim_{n\to\infty}x_{n}\in U$ such that $f(x_{n})=g(x_{n})$ for all n, then $f=g$ on U.

**Corollary A1.** If f and g are analytic functions on an open interval U and if there is an open interval $W\subseteq U$ such that $f=g$ on W, then $f=g$ on U.

**Lemma A2.**

**Lemma A3.**

**Proof.**

**Lemma A4.**

**Proof.**

**Theorem A1.**

**Proof of Theorem A1.**

**Step 1.** First, we show that, under four assumptions, f satisfies Properties I, II and III iff $f(p,q)=kp\log\left(\frac{p}{q}\right)$, $k\in\mathbb{R}\setminus\{0\}$.
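Both parts of Property II can be verified numerically for $f(p,q)=p\log(p/q)$ with $k=1$. The sketch below assumes, as the surrounding appendix suggests, that $V^{n}$ denotes a point-mass (fully selective) distribution and $U^{m}$ the uniform distribution; the example distributions are hypothetical.

```python
import math
from itertools import product

def f(p, q):
    return 0.0 if p == 0.0 else p * math.log(p / q)

def D(P, Q):
    return max(abs(f(p, q)) for p, q in zip(P, Q))

def tensor(P, Q):
    # Product distribution (p1*q1, ..., p1*qm, ..., pl*q1, ..., pl*qm)
    return tuple(p * q for p, q in product(P, Q))

P2, Q2 = (0.7, 0.3), (0.4, 0.6)
Q3 = (0.5, 0.25, 0.25)
V2, V3 = (1.0, 0.0), (1.0, 0.0, 0.0)   # point masses (assumed meaning of V^n)
U3 = (1/3, 1/3, 1/3)                   # uniform distribution U^m

# Expansion (II.a): D(V^l ⊗ V^m, P^l ⊗ Q^m) = D(V^l, P^l) + D(V^m, Q^m)
lhs = D(tensor(V2, V3), tensor(P2, Q3))
rhs = D(V2, P2) + D(V3, Q3)
print(abs(lhs - rhs) < 1e-9)  # True

# Dilution (II.b): D(P^l ⊗ U^m, Q^l ⊗ U^m) = D(P^l, Q^l) / m, since D(U^m, U^m) = 0
lhs = D(tensor(P2, U3), tensor(Q2, U3))
print(abs(lhs - D(P2, Q2) / 3) < 1e-9)  # True
```

Expansion holds because the point mass concentrates all probability on one product state, so the logarithms add; dilution holds because every product entry is scaled by $1/m$, which factors out of the maximum.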

**Step 2.** Next, we show that if any of these assumptions is violated, then no suitable f exists.

**Verification of Step 1.** We apply Property II.a with $P^{2}=(p,1-p)$, $Q^{2}=(q,1-q)$ for some $p,q\in\mathrm{J}$ where $p\ne q$. We then have

**Verification of Step 2.** Up to this point we have shown that Equation (A10) not only defines a function which satisfies Properties I, II and III, but is also the only function which satisfies them for $l=m=2$, given the following assumptions:

- AS1: $\exists\,p^{\prime},q^{\prime}\in\mathrm{J}$ such that $|f(1,p^{\prime}q^{\prime})|$ is a strict maximum in Equation (A11),
- AS2: $\exists\,q_{1}\in\mathrm{J}$ such that $|f(1,q_{1})|>|f(0,1-q_{1})|$ in Equation (A14),
- AS3: $\left|f\left(0,\frac{1-r}{m}\right)\right|$ is never a strict maximum in Equation (A15),
- AS4: $\sup\{\overline{\mathcal{P}}\}<\frac{1}{2}$.

## Appendix C. Comparison between ID and EMD

**Figure A1.** Comparison between the earth mover's distance (EMD) and the ID. Using the same system S as in Figure 4a, we find the cause purview with maximum integrated information for the mechanism $M=\{A,B\}$ in state $m=\{\uparrow,\uparrow\}$, which is larger when using the EMD measure (**a**) than when using the ID measure (**b**). The integrated information is also larger under the EMD measure than under the ID measure.
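The two measures compared in Figure A1 behave quite differently on the same pair of repertoires. The sketch below is a simplified stand-in: the paper's EMD operates on repertoires over purview states with a Hamming ground distance, whereas this toy version uses four ordered states with unit ground distance (where the EMD reduces to the L1 distance between cumulative distributions); the distributions are hypothetical, not those of Figure A1.

```python
import math
from itertools import accumulate

def intrinsic_difference(P, Q):
    """ID(P, Q) = max_s |p_s * log(p_s / q_s)|, with the term taken as 0 when p_s = 0."""
    return max(abs(p * math.log(p / q)) if p > 0 else 0.0 for p, q in zip(P, Q))

def emd_1d(P, Q):
    """EMD for distributions over ordered states with unit ground distance:
    the L1 distance between the two cumulative distributions."""
    return sum(abs(cp - cq) for cp, cq in zip(accumulate(P), accumulate(Q)))

P = (0.70, 0.10, 0.10, 0.10)   # repertoire concentrated on one state
Q = (0.25, 0.25, 0.25, 0.25)   # unconstrained (uniform) repertoire
print(emd_1d(P, Q))                 # ≈ 0.90
print(intrinsic_difference(P, Q))   # 0.7 * log(0.7/0.25) ≈ 0.721
```

The EMD aggregates mass moved across all states, while the ID is determined entirely by the single most informative state, which is why the two can rank purviews differently.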

## References


**Figure 1.** Theory. (**a**) System S with four random variables. (**b**) Example of a mechanism $M=\{A,C\}$ in state $m=\{\uparrow,\uparrow\}$ constraining a cause purview $Z=\{B\}$ and an effect purview $Z=\{B,D\}$. Dashed lines show the partitions. The bar plots show the probability distributions, that is, the cause repertoire (left) and the effect repertoire (right). The black bars show the probabilities when the mechanism is constraining the purview, and the white bars show the probabilities after partitioning the mechanism.

**Figure 2.** Intrinsicality. (**a**) Activation functions without bias ($\eta=0$) and different levels of constraint ($\tau=0$, $\tau=1$ and $\tau=10$). (**b**) System S analyzed in this figure. The remaining panels show, on top, the causal graph of the mechanism $M=\{A\}$ at state $m=\{1\}$ constraining different output purviews and, on the bottom, the probability distributions of the purviews (effect repertoires). The black bars show the probabilities when the mechanism is constraining the purview, and the white bars show the unconstrained probabilities after the complete partition $\psi^{0}$. The "*" indicates the state selected by the maximum operation in the intrinsic difference (ID) function. (**c**) The mechanism fully constrains the unit B in the purview $Z=\{B\}$ ($\tau_{B}=0$), resulting in state $z=\{\uparrow\}$ and defining the intrinsic information of the mechanism as $\phi(m,Z,\psi^{0})=\mathrm{ID}(\pi_{e}(B\mid M{=}\uparrow)\,\|\,\pi_{e}^{\psi^{0}}(B\mid M{=}\uparrow))=\pi_{e}(B{=}\uparrow\mid A{=}\uparrow)\cdot|\log(\pi_{e}(B{=}\uparrow\mid A{=}\uparrow)/\pi_{e}^{\psi^{0}}(B{=}\uparrow\mid M{=}\uparrow))|=1\cdot 0.69=0.69$. (**d**) After adding a slightly undetermined unit ($\tau_{C}=1$) to the purview ($Z=\{B,C\}$), the intrinsic information increases to $1.11$. The new maximum state ($z=\{\uparrow,\uparrow\}$) now has much higher informativeness ($|\log(\pi_{e}(BC{=}\uparrow\uparrow\mid A{=}\uparrow)/\pi_{e}^{\psi^{0}}(BC{=}\uparrow\uparrow\mid A{=}\uparrow))|=1.26$) but only slightly lower selectivity ($\pi_{e}(BC{=}\uparrow\uparrow\mid A{=}\uparrow)=0.89$), resulting in expansion. (**e**) When, instead of C, we add the very undetermined unit D to the purview ($\tau_{D}=10$), the new purview ($Z=\{B,D\}$) has a new maximum state ($z=\{\uparrow,\uparrow\}$) with only marginally higher informativeness ($|\log(\pi_{e}(BD{=}\uparrow\uparrow\mid A{=}\uparrow)/\pi_{e}^{\psi^{0}}(BD{=}\uparrow\uparrow\mid A{=}\uparrow))|=0.79$) and very low selectivity ($\pi_{e}(BD{=}\uparrow\uparrow\mid A{=}\uparrow)=0.55$), resulting in dilution.
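The caption's arithmetic can be reproduced directly, since intrinsic information is selectivity times informativeness. The sketch below assumes natural logarithms and that the complete partition leaves the binary unit B at chance ($\pi^{0}=0.5$), an assumption consistent with a fully constrained binary unit in panel (c).

```python
import math

def intrinsic_information(pi_constrained, pi_partitioned):
    """phi for the selected purview state = selectivity * informativeness
    = pi * |log(pi / pi0)|."""
    return pi_constrained * abs(math.log(pi_constrained / pi_partitioned))

# Panel (c): the mechanism fully constrains B (pi = 1); the complete partition
# leaves B at chance (pi0 = 0.5).
print(round(intrinsic_information(1.0, 0.5), 2))   # 0.69, as in the caption

# Panel (d): selectivity 0.89 and informativeness 1.26 (both taken from the caption)
print(round(0.89 * 1.26, 2))   # ≈ 1.12, matching the caption's 1.11 up to rounding
```

The same product explains dilution in panel (e): the small gain in informativeness (0.79 vs. 0.69) cannot compensate for the collapse in selectivity (0.55 vs. 1).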

**Figure 3.** Information. (**a**) System S analyzed in this figure. All units have $\tau=1$ and $\eta=-3$ (partially deterministic AND gates). The remaining panels show, on the left, the time-unfolded graph of the mechanism $M=\{A,B,C,D\}$ constraining different output purviews and, on the right, the probability distribution of the purview $Z=\{A,B,C\}$ (effect repertoires). The black bars show the probabilities when the mechanism is constraining the purview, and the white bars show the unconstrained probabilities after the complete partition. The "*" indicates the state selected by the maximum operation in the ID function. (**b**) The mechanism at state $m=\{\downarrow,\downarrow,\downarrow,\downarrow\}$. The purview state $z=\{\downarrow,\downarrow,\downarrow\}$ is not only the most constrained by the mechanism (high informativeness) but also very dense (high selectivity). As a result, its intrinsic information is higher than that of all other states in the purview, and it defines the intrinsic information of the mechanism as 0.27. (**c**) If we change the mechanism state to $m=\{\downarrow,\uparrow,\uparrow,\uparrow\}$, the probability of observing the purview state $z=\{\downarrow,\downarrow,\downarrow\}$ becomes smaller than chance. However, this probability is still very different from chance and therefore strongly constrained by the mechanism (high informativeness). At the same time, the state remains very dense, meaning it has a much higher probability than all other states (high selectivity). Together, informativeness and selectivity define the intrinsic information of the state, which is higher than that of all other states in the purview, defining the intrinsic information of the mechanism as 0.08.

**Figure 4.** Integration. (**a**) System S analyzed in this figure and in Figure 5. All units have $\tau=1$ and $\eta=0$ (partially deterministic MAJORITY gates). The remaining panels show, on the top, the time-unfolded graph of different mechanisms constraining different output purviews and, on the bottom, the probability distributions (effect repertoires). The black bars show the probabilities when the mechanism is constraining the purview, and the white bars show the partitioned probabilities. The "*" indicates the state selected by the maximum operation in the ID function. (**b**) The mechanism $M=\{A,E\}$ in state $m=\{\uparrow,\downarrow\}$ constraining the purview $Z=\{A,E\}$. While the complete partition has nonzero intrinsic information, the mechanism is clearly not integrated, as revealed by the MIP $\psi^{*}=\{(\{A\},\{A\}),(\{E\},\{E\})\}$, resulting in zero integrated information. (**c**) The mechanism $M=\{A,B\}$ in state $m=\{\uparrow,\uparrow\}$ constraining the purview $Z=\{A,B\}$. The partition $\psi^{*}=\{(\{A\},\{A,B\}),(\{B\},\{\varnothing\})\}$ has less intrinsic information than any other partition, i.e., it is the MIP of this mechanism, and it defines the integrated information as 0.36. (**d**) The mechanism $M=\{A,B,D\}$ in state $m=\{\uparrow,\uparrow,\downarrow\}$ constraining the purview $Z=\{E,F\}$. The tri-partition $\psi^{*}=\{(\{A\},\{\varnothing\}),(\{\varnothing\},\{F\}),(\{B,D\},\{E\})\}$ is the MIP, and it shows that the mechanism is not integrated, i.e., the mechanism has zero integrated information.
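The MIP search in panels (b–d) is a minimization over candidate partitions. The toy sketch below enumerates only bipartitions of the mechanism units; the full theory also partitions purviews and allows tri-partitions, as in panel (d). The `toy_phi` values and the `phi_of` callback are hypothetical stand-ins for the repertoire-based computation, not the paper's numbers.

```python
from itertools import combinations

def bipartitions(units):
    """Yield each way to split the units into two non-empty, non-overlapping parts,
    counting each split once."""
    units = tuple(units)
    n = len(units)
    for r in range(1, n // 2 + 1):
        for part in combinations(units, r):
            rest = tuple(u for u in units if u not in part)
            if r == n - r and part > rest:
                continue  # equal halves: skip the mirror-image split
            yield part, rest

def minimum_information_partition(units, phi_of):
    """MIP = the partition across which the mechanism makes the least difference."""
    return min(bipartitions(units), key=phi_of)

# Hypothetical integrated-information values for each bipartition:
toy_phi = {(("A",), ("B", "C")): 0.4,
           (("B",), ("A", "C")): 0.1,
           (("C",), ("A", "B")): 0.7}
print(minimum_information_partition(("A", "B", "C"), lambda p: toy_phi[p]))
# (('B',), ('A', 'C'))
```

If the minimum φ over all partitions is zero, as in panels (b) and (d), the mechanism is reducible and has no integrated information.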

**Figure 5.** Exclusion. Causal graphs of different mechanisms constraining different purviews. The system S used in these examples is the same as in Figure 4a. Each row shows the mechanism M constraining different purviews Z. (**a**) The mechanism $M=\{A\}$ at state $m=\{\uparrow\}$. The bottom row shows the purview $Z\subseteq S$ with maximum integrated effect information; its MIP is the complete partition. (**b**) The mechanism $M=\{A,B,C,D\}$ at state $m=\{\uparrow,\uparrow,\uparrow,\downarrow\}$. The bottom row shows the purview $Z\subseteq S$ with maximum integrated effect information; its MIP is $\psi^{*}=\{(\{A,B,C\},\{A,B,C\}),(\{D\},\{\varnothing\})\}$.

**Figure 6.** Integrated cause information. (**a**) Causal graph of the mechanism $M=\{A,B,C,D\}$ at state $m=\{\uparrow,\uparrow,\uparrow,\downarrow\}$ constraining the purview $Z=\{A,B,F\}$, which has the maximum integrated information of all $Z\subseteq S$ and defines the mechanism integrated information. (**b**) The black bars show the probabilities when the mechanism is constraining the purview (cause repertoire), and the white bars show the probabilities after the partition (partitioned cause repertoire). The "*" indicates the state selected by the maximum operation in the ID function and defines $Z_{c}^{*}$.


© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Barbosa, L.S.; Marshall, W.; Albantakis, L.; Tononi, G.
Mechanism Integrated Information. *Entropy* **2021**, *23*, 362.
https://doi.org/10.3390/e23030362
