# On the Logic of a Prior Based Statistical Mechanics of Polydisperse Systems: The Case of Binary Mixtures

## Abstract


## 1. Introduction

## 2. Textbook Treatment of Binary Mixtures

## 3. Heuristic Generalisation

- The heuristic approach used in this section starts by directly generalising Equation (3) into Equation (6). Mathematically, it cannot start from Equation (1), as this would, in general, involve a fractional number of phase-space integrals over particles of type 1 or type 2, respectively. The current approach therefore cannot serve as a basis for a mathematically rigorous composition-probability-based statistical mechanics of mixtures.
- Equation (6) uses the Euler Gamma function to replace the more traditionally accepted $n!$ terms in the denominator, so as to account for $Np$ not being an integer. While this may be mathematically acceptable, it is not justified within statistical mechanics itself; indeed, as the first point already stressed, it can hardly be, since it would entail a fractional number of phase-space integrals.
- Problems arise with $Np$ as well. What does it mean to switch from ${N}_{1}$ (an integer) to $Np$ (not necessarily an integer) in the canonical ensemble? If $p$ is the probability for a particle to be of type 1, then the particle type is a Bernoulli random variable $\mathbf{t}\in\{1,2\}$ such that $p(\mathbf{t}=1)=p$ and $p(\mathbf{t}=2)=1-p$. Suppose we now model a mixture as a collection ${\left\{{\mathbf{t}}_{i}\right\}}_{i=1,\dots ,N}$ of $N$ independent such random variables. This leads us to define the random variable ${\mathbf{N}}_{1}\equiv {\sum}_{i=1}^{N}(2-{\mathbf{t}}_{i})$, corresponding to the number of particles of type 1 in the system. If we denote the composition average by $\langle \langle \cdot \rangle \rangle$, then $\langle \langle {\mathbf{N}}_{1}\rangle \rangle =Np$. Thus, by substituting ${N}_{1}$ with $Np$ in Equation (3), one effectively replaces an integer number of particles by a real, positive expectation value. Conceptually, this poses a problem, at least in principle, as any given individual mixture will only ever have an integer number of particles, usually close to, but different from, $Np$.
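The composition model in the last point is easy to check numerically. Below is a minimal sketch (illustrative Python, not code from the paper) confirming that the expectation of $\mathbf{N}_1$ under the binomial composition distribution is $Np$:

```python
import math

def binomial_pmf(N, p, k):
    """B_{N,p}(k): probability of finding k type-1 particles among N."""
    return math.comb(N, k) * p**k * (1 - p)**(N - k)

N, p = 100, 0.3

# Composition average <<N_1>> = sum_k k B_{N,p}(k).
mean_N1 = sum(k * binomial_pmf(N, p, k) for k in range(N + 1))

# <<N_1>> = N p = 30.0: a real expectation value, even though every
# individual realisation of N_1 is an integer.
print(mean_N1)
```

Any single draw of the $N$ Bernoulli variables yields an integer $N_1$; only the average over composition realisations equals $Np$.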

## 4. Prior Based Statistical Mechanics of a Binary Mixture

- The large $N$ limit leading to Equation (8) in the heuristic derivation (Section 3) requires both $Np$ and $N(1-p)$ to be sufficiently large for the Stirling approximation to hold, which can demand a very large $N$ if $p$ is either very small or very close to unity. By contrast, Equation (14) converges rather quickly with $N$ to the asymptotic form of Equation (8), especially when $p$ is either very small or very close to unity. This can be explained by remarking that, if $1-p$ is very small and $N$ finite, any realisation will most likely contain very few, if any, particles of type 2. As a consequence, the realisation entropy will be closer to zero in magnitude, as there is less uncertainty in the composition realisations. Therefore, although the heuristic derivation leads to the asymptotic form of Equation (8), it incorrectly predicts (given that Equation (14) has stronger mathematical and conceptual foundations) the compositions for which the asymptotic regime is reached the fastest.
- For finite $N$, the realisation entropy term in Equation (14) actually decreases the entropy estimate given by the first four terms on the r.h.s. of Equation (14), as it contributes negatively to the entropy of the system. This can be understood by adopting a “surprise” interpretation of entropy. The last two terms of Equation (14) can then be interpreted as the average surprise of having ${N}_{1}$ type 1 particles in the mixture. On the one hand, the $Ns(p)$ contribution to the entropy stems from estimating the surprise for a given realisation ${N}_{1}$ as $-{N}_{1}\ln p-(N-{N}_{1})\ln (1-p)$. This would be exact if one were either to assign a particular order to the particles or to repeat single-particle experiments $N$ times, with the identity of the particle in each try drawn from the underlying probability distribution of the Bernoulli variable $\mathbf{t}$, and add up the observed individual surprises [22]. On the other hand, the probability of having ${N}_{1}$ particles of type 1 among $N$ is ${B}_{N,p}({N}_{1})$, and the corresponding surprise is $-\ln {B}_{N,p}({N}_{1})$. The difference between the two gives the relative surprise $-\ln \frac{{p}^{{N}_{1}}{(1-p)}^{N-{N}_{1}}}{{B}_{N,p}({N}_{1})}$, which captures the contribution to the entropy owing to composition.
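The non-negativity of the average relative surprise, $Ns(p)-H(B_{N,p})$, can be verified directly. A small sketch (illustrative Python, names not from the paper), using a small $p$ where convergence matters most:

```python
import math

def binom_pmf(N, p, k):
    return math.comb(N, k) * p**k * (1 - p)**(N - k)

def binom_entropy(N, p):
    """Shannon entropy H(B_{N,p}) in nats."""
    return -sum(q * math.log(q)
                for k in range(N + 1)
                if (q := binom_pmf(N, p, k)) > 0)

N, p = 50, 0.1
Ns = N * (-p * math.log(p) - (1 - p) * math.log(1 - p))  # N s(p)
H = binom_entropy(N, p)

# The average relative surprise N s(p) - H(B_{N,p}) is >= 0, so the
# realisation term indeed lowers the naive N s(p) entropy estimate.
print(Ns, H, Ns - H)
```

The gap $Ns(p)-H(B_{N,p})$ is exactly the composition-averaged relative surprise discussed above.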

## 5. Identifying the Gibbs Mixing Entropy

**Partitioning**: The first term on the r.h.s. of Equation (21) corresponds to the entropy change owing to the removal of the partition between the initial compartments or, said differently, results from the more numerous partitioning possibilities offered when the whole volume $V$ can be explored instead of $V/2$. Indeed, when each particle can freely roam in the box of volume $V$, imagining a virtual division splitting the box into two equal halves leads each particle to be on either one side or the other of the virtual dividing wall; there are then ${2}^{N}$ possibilities. When the dividing wall switches from virtual to real, and $N/2$ particles have to be on each side, the number of ways of realising this partitioning is $N!/{((N/2)!)}^{2}$. The corresponding entropy change is therefore larger than zero for any finite $N$.

**Composition distance**: The second term on the r.h.s. of Equation (21) has already been discussed in Section 4 and corresponds to a contribution to the entropy variation stemming from how different the compositions of mixtures A and B are when characterised by underlying probability distributions for particle identities. Since ${D}_{JS}({p}_{A}|{p}_{B})$ is the square of a metric, it vanishes if and only if the a priori compositions are identical.

**Composition realisation**: The third and last term on the r.h.s. of Equation (21) corresponds to the entropy variation owing to realisation considerations. It is worth noting that when the compositions are identical, i.e., ${p}_{\mathrm{A}}={p}_{\mathrm{B}}$, then ${P}_{N,{p}_{\mathrm{A}},{p}_{\mathrm{B}}}={B}_{N,{p}_{\mathrm{A}}}$, but $H({B}_{N,{p}_{\mathrm{A}}},{B}_{N,{p}_{\mathrm{A}}})\le 2H({B}_{\frac{N}{2},{p}_{\mathrm{A}}},{B}_{\frac{N}{2},{p}_{\mathrm{A}}})$ because of the submodular character of Shannon’s entropy function [34]. As a result, the entropy variation due to composition realisations is larger than or equal to zero upon mixing, even when the substances have the same underlying composition.
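Both properties invoked above can be spot-checked numerically. The sketch below (illustrative Python; it uses the plain Shannon entropy, to which $H(P,P)$ reduces) verifies that the Jensen–Shannon divergence vanishes for identical Bernoulli compositions and that $H(B_{N,p})\le 2H(B_{N/2,p})$:

```python
import math

def binom_pmf(N, p, k):
    return math.comb(N, k) * p**k * (1 - p)**(N - k)

def entropy(dist):
    """Shannon entropy (nats) of a discrete distribution."""
    return -sum(q * math.log(q) for q in dist if q > 0)

def js_divergence(pA, pB):
    """Jensen-Shannon divergence between two Bernoulli identity distributions."""
    a, b = [pA, 1 - pA], [pB, 1 - pB]
    m = [(x + y) / 2 for x, y in zip(a, b)]
    kl = lambda u, v: sum(x * math.log(x / y) for x, y in zip(u, v) if x > 0)
    return 0.5 * kl(a, m) + 0.5 * kl(b, m)

# D_JS vanishes when the a priori compositions coincide.
d_same = js_divergence(0.3, 0.3)

# Submodularity-type inequality: H(B_{N,p}) <= 2 H(B_{N/2,p}).
N, p = 40, 0.3
H_full = entropy([binom_pmf(N, p, k) for k in range(N + 1)])
H_half = entropy([binom_pmf(N // 2, p, k) for k in range(N // 2 + 1)])
print(d_same, H_full, 2 * H_half)
```

The inequality has slack of roughly $\ln 2$ per the Gaussian approximation of the binomial entropies, consistent with a strictly positive realisation contribution at finite $N$.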

## 6. Discussion

## 7. Conclusions

## Funding

## Conflicts of Interest

## Appendix A. Limiting Behaviour of H(P_{N,pA,pB},B_{N,pC})

## Appendix B. Deriving the Entropy Change Averaged over Compositions

- If $f({N}_{1}^{\mathrm{A}},{N}_{1}^{\mathrm{B}})=f({N}_{1}^{\mathrm{A}/\mathrm{B}})$, i.e., it depends on only one of the two variables, then$$\langle \langle f({N}_{1}^{\mathrm{A}},{N}_{1}^{\mathrm{B}})\rangle \rangle =\sum _{{N}_{1}^{\mathrm{A}/\mathrm{B}}=0}^{N/2}{B}_{\frac{N}{2},{p}_{\mathrm{A}/\mathrm{B}}}({N}_{1}^{\mathrm{A}/\mathrm{B}})\,f({N}_{1}^{\mathrm{A}/\mathrm{B}}).$$
- If $f({N}_{1}^{\mathrm{A}},{N}_{1}^{\mathrm{B}})=g({N}_{1}^{\mathrm{A}}+{N}_{1}^{\mathrm{B}})$, where $g(x)$ is a function of a single variable, then$$\begin{aligned}\langle \langle f({N}_{1}^{\mathrm{A}},{N}_{1}^{\mathrm{B}})\rangle \rangle &=\sum _{{N}_{1}^{\mathrm{A}}=0}^{N/2}\sum _{{N}_{1}^{\mathrm{B}}=0}^{N/2}{B}_{\frac{N}{2},{p}_{\mathrm{A}}}({N}_{1}^{\mathrm{A}}){B}_{\frac{N}{2},{p}_{\mathrm{B}}}({N}_{1}^{\mathrm{B}})\,g({N}_{1}^{\mathrm{A}}+{N}_{1}^{\mathrm{B}})\\&=\sum _{{N}_{1}^{\mathrm{C}}=0}^{N}\sum _{{N}_{1}^{\mathrm{A}}=0}^{{N}_{1}^{\mathrm{C}}}{B}_{\frac{N}{2},{p}_{\mathrm{A}}}({N}_{1}^{\mathrm{A}}){B}_{\frac{N}{2},{p}_{\mathrm{B}}}({N}_{1}^{\mathrm{C}}-{N}_{1}^{\mathrm{A}})\,g({N}_{1}^{\mathrm{C}})\\&=\sum _{{N}_{1}^{\mathrm{C}}=0}^{N}{P}_{N,{p}_{\mathrm{A}},{p}_{\mathrm{B}}}({N}_{1}^{\mathrm{C}})\,g({N}_{1}^{\mathrm{C}}).\end{aligned}$$
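The change of variables above amounts to a discrete convolution of the two half-box binomials, which can be checked numerically; a minimal sketch (illustrative Python, with $g(x)=x^2$ as an arbitrary test function):

```python
import math

def binom_pmf(N, p, k):
    return math.comb(N, k) * p**k * (1 - p)**(N - k)

N, pA, pB = 20, 0.2, 0.7
half = N // 2

# P_{N,pA,pB}(n) = sum_k B_{N/2,pA}(k) B_{N/2,pB}(n - k): the distribution
# of N_1^C = N_1^A + N_1^B as a convolution of the two half-box binomials.
P = [sum(binom_pmf(half, pA, k) * binom_pmf(half, pB, n - k)
         for k in range(max(0, n - half), min(n, half) + 1))
     for n in range(N + 1)]

def g(x):  # arbitrary single-variable test function
    return x * x

# Double sum over independent half-box compositions...
lhs = sum(binom_pmf(half, pA, a) * binom_pmf(half, pB, b) * g(a + b)
          for a in range(half + 1) for b in range(half + 1))
# ...equals the single sum against the convolved distribution.
rhs = sum(P[n] * g(n) for n in range(N + 1))
print(lhs, rhs)
```

Since both binomial factors vanish outside $[0, N/2]$, restricting the inner sum to the overlap of supports reproduces the reindexed double sum exactly.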

## References

1. Darrigol, O. The Gibbs paradox: Early history and solutions. *Entropy* **2018**, 20, 443.
2. Gibbs, J.W. On the Equilibrium of Heterogeneous Substances. *Conn. Acad. Sci.* **1876**, 3, 108–248.
3. Gibbs, J.W. *Elementary Principles in Statistical Mechanics*; Ox Bow Press: Woodbridge, CT, USA, 1981.
4. Glazer, M.; Wark, J. *Statistical Mechanics: A Survival Guide*, 1st ed.; Oxford University Press: Oxford, UK, 2001.
5. Balescu, R. *Equilibrium and Nonequilibrium Statistical Mechanics*, 1st ed.; John Wiley & Sons: Hoboken, NJ, USA, 1975.
6. Lemons, D.S. *A Student’s Guide to Entropy*; Cambridge University Press: Cambridge, UK, 2013.
7. Penrose, O. *Foundations of Statistical Mechanics: A Deductive Treatment*; Dover Publications: Mineola, NY, USA, 2005.
8. Balian, R. *From Microphysics to Macrophysics: Methods and Applications of Statistical Physics*; Springer: Berlin, Germany, 2003.
9. Huang, K. *Statistical Mechanics*, 2nd ed.; John Wiley & Sons: Hoboken, NJ, USA, 1987.
10. Wannier, G.H. *Statistical Physics*; Dover: Mineola, NY, USA, 1966.
11. Ben-Naim, A. *Entropy and the Second Law*; World Scientific: Singapore, 2012.
12. Allahverdyan, A.; Nieuwenhuizen, T.M. Explanation of the Gibbs paradox within the framework of quantum thermodynamics. *Phys. Rev. E* **2006**, 73, 066119.
13. Unnikrishnan, C.S. The Gibbs paradox and the physical criteria for indistinguishability of identical particles. *Int. J. Quantum Inf.* **2016**, 14, 1640037.
14. Rosen, R. The Gibbs’ Paradox and the Distinguishability of Physical Systems. *Philos. Sci.* **1964**, 31, 232–236.
15. Van Kampen, N.G. The Gibbs’ paradox. In *Essays in Theoretical Physics: In Honor of Dirk ter Haar*; Parry, W., Ed.; Pergamon Press: Oxford, UK, 1984.
16. Jaynes, E.T. The Gibbs’ paradox. In *Maximum Entropy and Bayesian Methods*; Smith, C., Erickson, G., Neudorfer, P., Eds.; Kluwer Academic: Dordrecht, The Netherlands, 1992; pp. 1–22.
17. Swendsen, R.H. Statistical mechanics of colloids and Boltzmann’s definition of the entropy. *Am. J. Phys.* **2006**, 74, 187–190.
18. Enders, P. Is Classical Statistical Mechanics Self-Consistent? *Prog. Phys.* **2007**, 3, 85.
19. Frenkel, D. Why colloidal systems can be described by statistical mechanics: Some not very original comments on the Gibbs’ paradox. *Mol. Phys.* **2014**, 112, 2325–2329.
20. Peters, H. Demonstration and resolution of the Gibbs paradox of the first kind. *Eur. J. Phys.* **2014**, 35, 015023.
21. Cates, M.E.; Manoharan, V.N. Testing the Foundations of Classical Entropy: Colloid Experiments. *arXiv* **2015**, arXiv:1507.04030.
22. Paillusson, F. Gibbs’ paradox according to Gibbs and slightly beyond. *Mol. Phys.* **2018**, 116, 3196–3213.
23. Saunders, S. The Gibbs Paradox. *Entropy* **2018**, 20, 552.
24. Dieks, D. The Gibbs Paradox and particle individuality. *Entropy* **2018**, 20, 466.
25. van Lith, V. The Gibbs Paradox: Lessons from thermodynamics. *Entropy* **2018**, 20, 328.
26. Sollich, P. Predicting phase equilibria in polydisperse systems. *J. Phys. Condens. Matter* **2002**, 14, 79–117.
27. Jacobs, W.; Frenkel, D. Predicting phase behaviour in multicomponent mixtures. *J. Chem. Phys.* **2013**, 139, 024108.
28. Swendsen, R.H. Probability, Entropy, and Gibbs’ Paradox(es). *Entropy* **2018**, 20.
29. Paillusson, F.; Pagonabarraga, I. On the role of composition entropies in the statistical mechanics of polydisperse systems. *J. Stat. Mech.* **2014**, 2014, P10038.
30. Cheraghchi, M. Expressions for the entropy of basic discrete distributions. *IEEE Trans. Inf. Theory* **2019**.
31. Endres, D.M.; Schindelin, J.E. A new metric for probability distributions. *IEEE Trans. Inf. Theory* **2003**, 49, 1858–1860.
32. Knessl, C. Integral representations and asymptotic expansions for Shannon and Renyi entropies. *Appl. Math. Lett.* **1998**, 11, 69–74.
33. Sollich, P.; Warren, P.B.; Cates, M.E. Moment Free Energies for Polydisperse Systems. In *Advances in Chemical Physics*; John Wiley & Sons: Hoboken, NJ, USA, 2007; pp. 265–336.
34. Enders, P. Equality and identity and (in)distinguishability in classical and quantum mechanics from the point of view of Newton’s notion of state. In Proceedings of the 2008 IEEE Information Theory Workshop (ITW ’08), Porto, Portugal, 5–9 May 2008; pp. 303–307.
35. Szilard, L. On the decrease of entropy in a thermodynamic system by the intervention of intelligent beings. *Z. Phys.* **1929**, 53, 840–856.
36. Sandford, C.; Seeto, D.; Grosberg, A.Y. Active sorting of particles as an illustration of the Gibbs mixing paradox. *arXiv* **2017**, arXiv:1705.05537v2.
37. Blavatska, V. Equivalence of quenched and annealed averaging in models of disordered polymers. *J. Phys. Condens. Matter* **2013**, 25, 505101.
38. Jensen, J.L.W.V. Sur les fonctions convexes et les inégalités entre les valeurs moyennes. *Acta Math.* **1906**, 30, 175–193.

**Figure 1.** Schematic representation of a mixing scenario between two different substances. The top left panel uses a particle representation with type 1 particles as dark orange squares and type 2 particles as blue disks. We see that the compositions of substances A and B are different, as further illustrated by their underlying probability distributions in the top right panel. Upon mixing (bottom panels), they form a new composition C that is a priori different from those of A and B.

© 2019 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Paillusson, F.
On the Logic of a Prior Based Statistical Mechanics of Polydisperse Systems: The Case of Binary Mixtures. *Entropy* **2019**, *21*, 599.
https://doi.org/10.3390/e21060599
