Meeting Report

Symmetry, Probability, Entropy: Synopsis of the Lecture at MAXENT 2014 †

Institut des Hautes Études Scientifiques, 35, Route de Chartres, F-91440 Bures-sur-Yvette, France
† This paper was presented at MaxEnt 2014, Amboise, France, 21–26 September 2014.
Entropy 2015, 17(3), 1273-1277;
Submission received: 2 February 2015 / Revised: 9 March 2015 / Accepted: 10 March 2015 / Published: 13 March 2015
(This article belongs to the Special Issue Information, Entropy and Their Geometric Structures)


In this discussion, we indicate possibilities for (homological and non-homological) linearization of basic notions of probability theory, and also for replacing the real numbers as values of probabilities by objects of suitable combinatorial categories.

The success of probability theory decisively, albeit often invisibly, depends on the symmetries of the systems this theory applies to. For instance:
  • The symmetry group of a single round of gambling with three dice has order 288 = 6 × 6 × 8: it is a semidirect product of the permutation group S3 of order 6 and the symmetry group of the 3d cube, which is, in turn, a semidirect product of S3 and {±1}³.
  • The Bernoulli spaces (p, 1 − p), 0 < p < 1, of (■, ♦)-sequences indexed by the integers ℤ = {…, −2, −1, 0, 1, 2, …} are acted upon by a semidirect product of the infinite permutation group S∞ = Sℤ and the (compact) group {±1}ℤ, with the role of the latter being essential even for p ≠ 1/2, where the probability measure is not preserved.
  • The system of identical point-particles in the Euclidean 3-space ℝ³ that are indexed by a countable set I is acted upon by the isometry group of ℝ³ times the infinite permutation group S∞ = S_I.
  • Buffon’s probabilistic needle formula for π ≈ 3.141592653589793⋯ relies on the invariance of the Haar measure on the circle.
  • What happens if the symmetry is enhanced, e.g., from the permutation group S∞ = S_I to the group GL_F(∞) of linear transformations of the vector space F^(I) (formally) spanned by the symbols [i], i ∈ I, regarded as (linearly independent) vectors over a field F?
  • What could you do if your system is inherently heterogeneous, such as a folding polypeptide chain or a natural language?
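As a sanity check on the first example, a short brute-force enumeration (my own sketch, not part of the lecture) recovers the order 288: the symmetry group of the 3d cube consists of the 48 signed permutation matrices, and the extra factor of 6 comes from the S3 permuting the three dice.

```python
from itertools import product

# Brute-force the symmetry group of the 3d cube: 3x3 matrices with entries
# in {-1, 0, 1} having exactly one nonzero entry (+1 or -1) in every row
# and column, i.e., signed permutation matrices. This group is the
# semidirect product S3 ⋉ {±1}^3 described in the text.
def is_signed_permutation(m):
    rows_ok = all(sum(abs(x) for x in row) == 1 for row in m)
    cols_ok = all(sum(abs(m[i][j]) for i in range(3)) == 1 for j in range(3))
    return rows_ok and cols_ok

all_rows = list(product((-1, 0, 1), repeat=3))
cube = [m for m in product(all_rows, repeat=3) if is_signed_permutation(m)]

assert len(cube) == 48        # |S3 ⋉ {±1}^3| = 6 * 8
assert 6 * len(cube) == 288   # times the S3 permuting the three dice
```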
Hilbertisation/unitarisation/quantization of set categories brought about the development of several magnificent non-commutative probability theories, e.g., those under the headings of von Neumann algebras, von Neumann entropy [1,2], and free probabilities [3].
By comparison, the achievements of the non-unitary linearisation of probability theory are modest—just a few amusing observations.
Example 1. Linearized Loomis-Whitney-Shannon-Shearer Submultiplicativity Inequality [4,5]
Let Φ = Φ(x1, x2, x3, x4) be a 4-linear function (form) over some field (where the variables xi run over some vector spaces Xi). Then the ranks of the following four bilinear forms, Φ(x1; x2, x3, x4), Φ(x1, x2; x3, x4), Φ(x1, x3; x2, x4) and Φ(x1, x4; x2, x3), satisfy
(rank[1,234])² ≤ rank[12,34] · rank[13,24] · rank[14,23].
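This can be probed numerically. The following sketch, my own and not from [4,5], draws random 4-tensors over the two-element field GF(2) and checks the inequality; flat_rank(T, n, I) computes the rank of the bilinear form that groups the axes in I against the remaining axes.

```python
import random
from itertools import product

def rank_gf2(rows):
    """Rank over GF(2) of a matrix whose rows are given as integer bitmasks."""
    pivots = {}                      # leading-bit position -> reduced row
    for r in rows:
        while r:
            lead = r.bit_length() - 1
            if lead not in pivots:
                pivots[lead] = r
                break
            r ^= pivots[lead]        # eliminate the leading bit
    return len(pivots)

def flat_rank(T, n, row_axes):
    """Rank of the bilinear form grouping `row_axes` against the rest."""
    col_axes = [a for a in range(4) if a not in row_axes]
    rows = []
    for ri in product(range(n), repeat=len(row_axes)):
        mask = 0
        for j, ci in enumerate(product(range(n), repeat=len(col_axes))):
            idx = [0] * 4
            for a, v in zip(row_axes, ri):
                idx[a] = v
            for a, v in zip(col_axes, ci):
                idx[a] = v
            mask |= T[tuple(idx)] << j
        rows.append(mask)
    return rank_gf2(rows)

n = 3
for _ in range(20):                  # random 4-linear forms over GF(2)
    T = {i: random.randint(0, 1) for i in product(range(n), repeat=4)}
    r1  = flat_rank(T, n, [0])
    r12 = flat_rank(T, n, [0, 1])
    r13 = flat_rank(T, n, [0, 2])
    r14 = flat_rank(T, n, [0, 3])
    assert r1 * r1 <= r12 * r13 * r14
```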
Example 2. Homology Measures [6].
Homologies H(X) = ⊕i Hi(X) of topological spaces X and natural subgroups in H are graded Abelian groups: their ranks are properly represented not by individual numbers ri, but by Poincaré polynomials PX(t) = ∑iri · ti.
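For example, since Poincaré polynomials multiply under Cartesian products (the Künneth formula over a field), the polynomial of the N-torus T^N = (S¹)^N is (1 + t)^N, so the individual ranks r_i are binomial coefficients. A minimal pure-Python sketch of this rank bookkeeping (the helper poly_mul is mine, not from the text):

```python
import math

def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists (low degree first)."""
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

circle = [1, 1]                      # P_{S^1}(t) = 1 + t
N = 5
torus = [1]
for _ in range(N):
    torus = poly_mul(torus, circle)  # Künneth: P_{X×Y} = P_X · P_Y

# The coefficients r_i of P_{T^N}(t) = (1 + t)^N are C(N, i) = rank H_i(T^N)
assert torus == [math.comb(N, i) for i in range(N + 1)]
```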
The polynomial-valued set function U ↦ P_U, U ⊂ X, has some measure/entropy-like properties that become more pronounced for the ideal-valued function that assigns the kernels
Ker_{X\U} ⊂ H*(X; A)
of the inclusion/restriction cohomology homomorphisms for the complements X \ U ⊂ X of subsets U ⊂ X,
U ↦ μ*(U) =def Ker_{X\U} =def Ker[H*(X; A) → H*(X \ U; A)],
for some Abelian (cohomology coefficient) group A.
The basic properties of this μ* (stated slightly differently in topology textbooks) have an attractive measure-theoretic flavour. Namely,
μ*(U) is additive for the sum-of-subsets in the group H*(X; A) and, if A is a commutative ring, then μ* is super-multiplicative for the ⌣-product of ideals:
μ*(U1 ∪ U2) = μ*(U1) + μ*(U2)
for disjoint open subsets U1 and U2 in X, and
μ*(U1 ∪ U2) ⊇ μ*(U1) ⌣ μ*(U2)
for all open U1, U2 ⊂ X.
Next, given a linear subspace Θ ⊂ H*(X; A), let
μΘ(U) = Θ ∩ Ker_{X\U}
and, assuming A is (the additive group of) a field, denote the rank of μΘ(U) over this field by |μΘ(U)| = |μΘ(U)|_A.
Linearized Matsumoto-Tokushige Separation Inequality in the N-torus.
Let U1, U2 ⊂ T^N be non-intersecting (closed or open) subsets and let
Θ1 = H^{n1}(T^N; A) and Θ2 = H^{n2}(T^N; A)
for ni ≤ N/2, i = 1, 2, and some field A. Then
|μΘ1(U1)| · |μΘ2(U2)| ≤ c · |Θ1| · |Θ2|
for c = n1·n2/N² and where, observe, |Θi| = |H^{ni}(T^N; A)| = C(N, ni).
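For orientation, here is the elementary arithmetic behind the right-hand side of the inequality, with toy parameters of my own choosing (N = 6, n1 = 2, n2 = 3):

```python
from math import comb

N, n1, n2 = 6, 2, 3                  # toy choice with n_i <= N/2
theta1 = comb(N, n1)                 # |Θ1| = rank H^2(T^6; A) = C(6, 2) = 15
theta2 = comb(N, n2)                 # |Θ2| = rank H^3(T^6; A) = C(6, 3) = 20
c = n1 * n2 / N**2                   # c = 6/36 = 1/6
bound = c * theta1 * theta2          # upper bound for |μΘ1(U1)| * |μΘ2(U2)|
print(theta1, theta2, bound)         # 15 20 50.0
```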
If we think of the torus T^N as a physical system of N uncoupled linear oscillators, then the “measures” μ*(U) and/or μΘ(U) may be interpreted as “the numbers of persistent degrees of freedom” of this system that are observable from U.
A probabilistic/entropic interpretation of homology, in a sense “dual” to the “homological interpretation of entropy-like invariants” by Bennequin [7] and by Drummond-Cole et al. [8,9], is also possible for “coupled systems” [10]. Particularly attractive among these are systems of moving disjoint balls in space, whose configuration spaces support rich homology structures induced from the classifying spaces of (subgroups of) infinite symmetric groups S∞ = S_I [11], which is expanded/corrected in [12].
A mathematical study of “loose structures”, such as those found in biology and linguistics, needs generalisations that would allow the use of relaxed, rather than enhanced, symmetries.
For instance, just to warm up, one may start by elaborating on the category-theoretic definition of entropy suggested in “In a Search for a Structure, Part 1: On Entropy” [13], where the entropy of a finite probability space P = {pi}, pi > 0, ∑i pi = 1, comes as the class [P]Gro of P in the Grothendieck group Gro(P) of the topological category P of finite probability spaces P and probability/measure-preserving maps P → Q, with a properly defined topological structure on P.
Since the group Gro(P) is isomorphic to the multiplicative group of positive real numbers [13] (this is a reformulation of the Bernoulli law of large numbers), the Grothendieck class [P]Gro can be identified with exp ent(P).
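A minimal numerical sketch of this identification, assuming the usual Shannon entropy ent(P) = −∑ pi log pi: exp ent turns Cartesian products of probability spaces into products of positive reals, as multiplication in Gro(P) requires. The helper names below are mine.

```python
from math import log, exp, isclose

def ent(p):
    """Shannon entropy of a finite probability space given as a list of p_i > 0."""
    return -sum(x * log(x) for x in p)

def product_space(p, q):
    """Product probability space: independent pair, probabilities p_i * q_j."""
    return [a * b for a in p for b in q]

P = [0.5, 0.25, 0.25]
Q = [0.7, 0.3]

# The Grothendieck class [P]_Gro = exp ent(P) is multiplicative under
# products of probability spaces, matching multiplication in (0, ∞):
assert isclose(exp(ent(product_space(P, Q))), exp(ent(P)) * exp(ent(Q)))
```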
In general, such a Grothendieck-style entropy would not be a number-valued function of any kind, but (not quite) a functor from an elaborate combinatorial (not quite) category, e.g., one comprised of fragments of a natural language with some (not always composable) “morphisms/arrows” between them, to some “simple category”, e.g., the category of weighted trees.
The so-modified probability/entropy theory is badly needed for designing algorithms that would model what we call (ego)learning, described in “Ergostructures, Ergologic and the Universal Learning Problem” [14] and in “Understanding Languages and Making Dictionaries” [15] (in preparation), but I have not progressed much in pursuing this direction yet.


Acknowledgments

I want to thank Frédéric Barbaresco for his interest in the subject matter of this paper and for inviting me to the MaxEnt’14 meeting in Amboise, France, and the anonymous referees for their friendly comments and suggestions.

Conflicts of Interest

The author declares no conflict of interest.

References and Notes

  1. Parthasarathy, K.R. An Introduction to Quantum Stochastic Calculus; Modern Birkhäuser Classics; Springer: Basel, Switzerland, 1992. [Google Scholar]
  2. Meyer, P.-A. Quantum Probability for Probabilists, 2nd ed.; Lecture Notes in Mathematics; Springer: Berlin/Heidelberg, Germany, 1995. [Google Scholar]
  3. Nica, A.; Speicher, R.; Voiculescu, D. Free Probability Theory. Available online (accessed on 10 March 2015).
  4. Gromov, M. Entropy and isoperimetry for linear and non-linear group actions. Groups Geom. Dyn. 2008, 2, 499–593. Available online (accessed on 10 March 2015).
  5. Gromov, M. Six Lectures on Probability, Symmetry, Linearity. Available online (accessed on 10 March 2015).
  6. Gromov, M. Singularities, expanders and topology of maps. Part 2: From combinatorics to topology via algebraic isoperimetry. Geom. Funct. Anal. 2010, 20, 416–526. Available online (accessed on 10 March 2015).
  7. Bennequin, D. Homological interpretation of entropy-like invariants. Proceedings of 34th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Amboise, France, 21–26 September 2014.
  8. Drummond-Cole, G.C.; Park, J.-S.; Terilla, J. Homotopy Probability Theory I. J. Homotopy Relat. Struct. 2013. [Google Scholar] [CrossRef]
  9. Drummond-Cole, G.C.; Park, J.-S.; Terilla, J. Homotopy Probability Theory II. J. Homotopy Relat. Struct. 2013. [Google Scholar] [CrossRef]
  10. Bertelson, M.; Gromov, M. Dynamical Morse Entropy. In Modern Dynamical Systems and Applications; Springer: Berlin, Germany, 2010. Available online (accessed on 10 March 2015).
  11. Gromov, M. Number of Questions. pp. 82–102, Section 4. Available online (accessed on 10 March 2015).
  12. Gromov, M. Morse Spectra, Homology Measures and Parametric Packing Problems. 2015; in preparation. [Google Scholar]
  13. Gromov, M. In a Search for a Structure, Part 1: On Entropy. Available online (accessed on 10 March 2015).
  14. Gromov, M. Ergostructures, Ergologic and the Universal Learning Problem: Chapters 1, 2. Available online (accessed on 10 March 2015).
  15. Gromov, M. Understanding Languages and Making Dictionaries. 2015; in preparation. [Google Scholar]

Gromov, M. Symmetry, Probability, Entropy: Synopsis of the Lecture at MAXENT 2014. Entropy 2015, 17, 1273-1277.