# Entropy: From Thermodynamics to Information Processing


## Abstract


## 1. Introduction

## 2. Historical Background

#### 2.1. Clausius Entropy

#### 2.2. Boltzmann–Gibbs Entropy

#### 2.3. Shannon Entropy

“My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.’”

#### 2.4. Partial Information Decomposition

#### 2.5. Algorithmic Information Theory

#### 2.6. Algorithmic Information Dynamics

## 3. Equivalence of Entropy in Thermodynamics and Information Theory

#### 3.1. Unity Analysis

#### 3.2. Underlying Probability

#### 3.3. Shannon Entropy and Thermodynamics

“… if we conceive of a being whose faculties are so sharpened that he can follow every molecule in its course, such a being, whose attributes are as essentially finite as our own, would be able to do what is impossible to us. For we have seen that molecules in a vessel full of air at uniform temperature are moving with velocities by no means uniform, though the mean velocity of any great number of them, arbitrarily selected, is almost exactly uniform. Now let us suppose that such a vessel is divided into two portions, A and B, by a division in which there is a small hole, and that a being, who can see the individual molecules, opens and closes this hole, so as to allow only the swifter molecules to pass from A to B, and only the slower molecules to pass from B to A. He will thus, without expenditure of work, raise the temperature of B and lower that of A, in contradiction to the second law of thermodynamics.”

#### 3.4. Information Theoretical Proof that Boltzmann–Gibbs Entropy is the Same as Clausius’s

#### 3.5. Using Kullback–Leibler Divergence to Obtain an Analogue of the Second Law of Thermodynamics

## 4. Conclusions

## Author Contributions

## Funding

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## References


**Figure 1.** Boltzmann’s entropy formula derivation: since the total entropy S is the sum of its parts while the total number of microstates W is the product of its parts, the only function $S\left(W\right)$ relating these variables is a logarithm, $S = k \log W$.
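The additivity argument in the caption can be checked numerically. A minimal sketch, assuming two independent subsystems with arbitrary illustrative microstate counts `W1` and `W2`:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K (CODATA exact value)

def boltzmann_entropy(W: float) -> float:
    """Entropy of a macrostate with W microstates: S = k_B * ln(W)."""
    return K_B * math.log(W)

# For independent subsystems the microstate counts multiply ...
W1, W2 = 1e6, 1e9
S_combined = boltzmann_entropy(W1 * W2)

# ... while the entropies add; only a logarithm satisfies both at once.
S_sum = boltzmann_entropy(W1) + boltzmann_entropy(W2)

assert math.isclose(S_combined, S_sum, rel_tol=1e-12)
```

Any base change or constant prefactor preserves this property, which is why the choice of $k$ (and of nats vs. bits) is a matter of units, not of substance.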

**Figure 2.** Maxwell’s demon: a being who knows the velocity of every particle in the box and, using an opening in the dividing wall, can selectively let particles pass, separating those with high energy from those with low energy without performing work, thus violating the second law of thermodynamics. The demon has to forget the past states of the system but, according to Landauer’s principle, this process generates heat (at least $kT\ln 2$ J per bit erased) and entropy.
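The Landauer bound quoted in the caption is straightforward to evaluate. A minimal sketch, assuming an illustrative room temperature of 300 K:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0           # room temperature in kelvin (illustrative choice)

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln 2 joules.
landauer_bound = K_B * T * math.log(2)

print(f"Minimum heat per erased bit at {T:.0f} K: {landauer_bound:.3e} J")
# ≈ 2.871e-21 J
```

The smallness of this number explains why the cost was negligible for decades of computing hardware, yet it is a hard floor: it is what reconciles the demon’s sorting with the second law.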

**Figure 3.** The process of extracting work from a system, as conceived by Szilard: in (**a**), there is a single molecule of a fluid inside a box with energy Q. If one knows in which half of the box the molecule is (i.e., holds a single bit of information about its position), a piston can be inserted, halving the box (**b**), and from the fluid expansion (**c**,**d**) work $W=Q$ can be extracted from the system while it returns to its initial state.
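The work extracted by Szilard’s engine can be checked by integrating the single-molecule equation of state $P(v) = kT/v$ over the isothermal expansion from $V/2$ to $V$. A minimal sketch, with temperature and volume as arbitrary illustrative values; the extracted work comes out to $kT\ln 2$, exactly the Landauer cost of erasing the one bit of position information used:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0           # temperature in kelvin (illustrative)
V = 1.0             # box volume in arbitrary units; only the ratio matters

# Analytic result: W = ∫_{V/2}^{V} (k_B * T / v) dv = k_B * T * ln 2
w_exact = K_B * T * math.log(2)

# Numerical check of the same integral with a simple midpoint rule.
n = 100_000
dv = (V / 2) / n
w_numeric = sum(K_B * T / (V / 2 + (i + 0.5) * dv) * dv for i in range(n))

assert math.isclose(w_exact, w_numeric, rel_tol=1e-6)
```

Because the gain ($kT\ln 2$ of work) exactly matches the erasure cost ($kT\ln 2$ of heat), no net violation of the second law survives a full engine cycle.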

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Natal, J.; Ávila, I.; Tsukahara, V.B.; Pinheiro, M.; Maciel, C.D.
Entropy: From Thermodynamics to Information Processing. *Entropy* **2021**, *23*, 1340.
https://doi.org/10.3390/e23101340

**AMA Style**

Natal J, Ávila I, Tsukahara VB, Pinheiro M, Maciel CD.
Entropy: From Thermodynamics to Information Processing. *Entropy*. 2021; 23(10):1340.
https://doi.org/10.3390/e23101340

**Chicago/Turabian Style**

Natal, Jordão, Ivonete Ávila, Victor Batista Tsukahara, Marcelo Pinheiro, and Carlos Dias Maciel.
2021. "Entropy: From Thermodynamics to Information Processing" *Entropy* 23, no. 10: 1340.
https://doi.org/10.3390/e23101340