Opinion

Senses along Which the Entropy Sq Is Unique

by Constantino Tsallis 1,2,3

1 Centro Brasileiro de Pesquisas Físicas and National Institute of Science and Technology of Complex Systems, Rua Xavier Sigaud 150, Rio de Janeiro 22290-180, RJ, Brazil
2 Santa Fe Institute, 1399 Hyde Park Road, Santa Fe, NM 87501, USA
3 Complexity Science Hub Vienna, Josefstädter Strasse 39, 1080 Vienna, Austria
Entropy 2023, 25(5), 743; https://doi.org/10.3390/e25050743
Submission received: 13 April 2023 / Revised: 25 April 2023 / Accepted: 26 April 2023 / Published: 1 May 2023
(This article belongs to the Special Issue The Statistical Foundations of Entropy II)

Abstract

The Boltzmann–Gibbs–von Neumann–Shannon additive entropy $S_{BG} = -k\sum_i p_i \ln p_i$, as well as its continuous and quantum counterparts, constitutes the grounding concept on which BG statistical mechanics is constructed. This magnificent theory has produced, and will most probably keep producing, successes in vast classes of classical and quantum systems. However, recent decades have seen a proliferation of natural, artificial and social complex systems which defy its bases and make it inapplicable. This paradigmatic theory was generalized in 1988 into what is currently referred to as nonextensive statistical mechanics, grounded on the nonadditive entropy $S_q = k\,\frac{1-\sum_i p_i^q}{q-1}$, as well as on its corresponding continuous and quantum counterparts. In the literature there nowadays exist over fifty mathematically well-defined entropic functionals. $S_q$ plays a special role among them: indeed, it constitutes the pillar of a great variety of theoretical, experimental, observational and computational validations in the area of complexity—plectics, as Murray Gell-Mann used to call it. A question then naturally emerges, namely: in what senses is the entropy $S_q$ unique? The present effort is dedicated to a—surely nonexhaustive—mathematical answer to this basic question.

1. Introduction

Boltzmann–Gibbs (BG) statistical mechanics can arguably be considered one of the pillars of contemporary theoretical physics, together with Maxwell electromagnetism, Newtonian and quantum mechanics, and Einstein's special and general relativity. Consistently, the concepts of entropy and energy provide the basis of classical thermodynamics [1]. The BG theory is grounded on the well-known BG entropy [2,3,4,5,6], which is additive. The 1988 proposal [7] of nonadditive entropies as a basis to generalize the traditional BG theory led to what is currently referred to as nonextensive statistical mechanics. Let us briefly review here the basic issues.
BG statistical mechanics is constructed upon the following Boltzmann–Gibbs–von Neumann–Shannon entropic functional:
$$S_{BG} = -k \sum_{i=1}^{W} p_i \ln p_i \qquad \left(\sum_{i=1}^{W} p_i = 1\right), \tag{1}$$
where $k$ is a conventional positive constant chosen once and for all (typically $k = k_B$ in physics, and $k = 1$ in computational sciences). Its maximal value occurs for equal probabilities, i.e., $p_i = 1/W,\ \forall i$, and is given by
$$S_{BG} = k \ln W, \tag{2}$$
carved on the tombstone of Ludwig Boltzmann in Vienna. This relation constitutes an ingenious connection between the macroscopic and the microscopic descriptions of real systems. The entropy (1) is additive [8]. Indeed, if A and B are two probabilistically independent systems [i.e., $p_{ij}^{A+B} = p_i^A\, p_j^B,\ \forall (i,j)$], we straightforwardly verify that
$$S_{BG}(A+B) = S_{BG}(A) + S_{BG}(B). \tag{3}$$
Further, for a system in thermodynamical equilibrium with a thermostat at temperature T, the distribution which optimizes S B G is given by the celebrated BG weight
$$p_i = \frac{e^{-\beta E_i}}{\sum_{j=1}^{W} e^{-\beta E_j}}\,, \tag{4}$$
where $\beta = 1/kT$ and $\{E_i\}$ are the possible energies of the system.
A generalization of this theory was proposed in 1988 [7] on the basis of the entropic functional
$$S_q = k\, \frac{1 - \sum_{i=1}^{W} p_i^q}{q-1} \qquad (q \in \mathbb{R};\ S_1 = S_{BG}). \tag{5}$$
This functional can also be written as
$$S_q = k \sum_{i=1}^{W} p_i \ln_q \frac{1}{p_i} = -k \sum_{i=1}^{W} p_i^q \ln_q p_i = -k \sum_{i=1}^{W} p_i \ln_{2-q} p_i\,, \tag{6}$$
where the q-logarithmic function is defined as
$$\ln_q z \equiv \frac{z^{1-q} - 1}{1-q} \qquad (\ln_1 z = \ln z). \tag{7}$$
The extremal value of $S_q$ is given by the generalization of Equation (2), namely
$$S_q = k\, \frac{W^{1-q} - 1}{1-q} \equiv k \ln_q W. \tag{8}$$
This value corresponds to a maximum for $q > 0$, to a minimum for $q < 0$, and to the constant $k(W-1)$ for $q = 0$.
Equation (3) is generalized as follows:
$$\frac{S_q(A+B)}{k} = \frac{S_q(A)}{k} + \frac{S_q(B)}{k} + (1-q)\, \frac{S_q(A)}{k}\, \frac{S_q(B)}{k}\,, \tag{9}$$
hence
$$S_q(A+B) = S_q(A) + S_q(B) + \frac{1-q}{k}\, S_q(A)\, S_q(B), \tag{10}$$
which recovers Equation (3) in the $(1-q)/k \to 0$ limit.
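As a quick numerical illustration, pseudo-additivity can be checked directly for a product joint distribution; the following is a minimal sketch in Python, assuming $k=1$ and arbitrarily chosen distributions for A and B:

```python
# Sketch: numerical check of pseudo-additivity, Eq. (10), for independent
# systems A and B; k and the distributions below are illustrative choices.
import numpy as np

def tsallis_entropy(p, q, k=1.0):
    """S_q of Eq. (5); the q -> 1 limit recovers -k sum p ln p."""
    p = np.asarray(p, dtype=float)
    if abs(q - 1.0) < 1e-12:
        return -k * np.sum(p * np.log(p))
    return k * (1.0 - np.sum(p**q)) / (q - 1.0)

q, k = 0.7, 1.0
pA = np.array([0.2, 0.3, 0.5])
pB = np.array([0.6, 0.4])
pAB = np.outer(pA, pB).ravel()      # p_ij^{A+B} = p_i^A p_j^B

SA, SB = tsallis_entropy(pA, q, k), tsallis_entropy(pB, q, k)
lhs = tsallis_entropy(pAB, q, k)
rhs = SA + SB + (1.0 - q) / k * SA * SB
print(abs(lhs - rhs) < 1e-12)       # True: Eq. (10) holds exactly
```

The identity is exact (not approximate), since $\sum_{ij}(p_i^A p_j^B)^q = \sum_i (p_i^A)^q \sum_j (p_j^B)^q$.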
Equation (4) is generalized into
$$p_i = \frac{e_q^{-\beta_q (E_i - \mu_q)}}{\sum_{j=1}^{W} e_q^{-\beta_q (E_j - \mu_q)}}\,, \tag{11}$$
where $\mu_q$ plays the role of a chemical potential, and $e_q^x$ is the inverse function of $\ln_q x$, i.e.,
$$e_q^x \equiv [1 + (1-q)x]_+^{\frac{1}{1-q}}\,, \tag{12}$$
$[\cdots]_+$ being equal to $[\cdots]$ if $[\cdots] > 0$ and zero otherwise; it satisfies $e_q^x\, e_{2-q}^{-x} = 1$.
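These two deformed functions are simple to implement; the following sketch (Python, with illustrative values of $q$, $x$ and $z$) verifies both the inverse-pair property and the duality $e_q^x\, e_{2-q}^{-x} = 1$:

```python
# Sketch: the q-logarithm of Eq. (7) and the q-exponential of Eq. (12),
# with two of their basic properties; q, x, z below are illustrative.
import numpy as np

def ln_q(z, q):
    """q-logarithm: (z**(1-q) - 1)/(1-q); ln_1 z = ln z."""
    if abs(q - 1.0) < 1e-12:
        return np.log(z)
    return (z**(1.0 - q) - 1.0) / (1.0 - q)

def e_q(x, q):
    """q-exponential: [1 + (1-q) x]_+ ** (1/(1-q)); e_1^x = exp(x)."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    cutoff = max(1.0 + (1.0 - q) * x, 0.0)   # the [...]_+ prescription
    return cutoff ** (1.0 / (1.0 - q))

q, x, z = 0.5, 0.8, 2.3
print(np.isclose(e_q(ln_q(z, q), q), z))              # e_q inverts ln_q
print(np.isclose(e_q(x, q) * e_q(-x, 2.0 - q), 1.0))  # e_q^x e_{2-q}^{-x} = 1
```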
Details related to this q-generalized statistical mechanics, currently referred to as nonextensive statistical mechanics, are available in [9,10,11,12], and strong experimental validations are presented in [13,14,15], to cite but a few; a full bibliography is available at [16]. In connection with Equation (11), see also [17], where it is shown that, through a Moebius group, one can find a Casimir invariant, which allows one to define an observable inverse temperature $\beta_q$. Let us also mention at this point that deformed exponential and logarithmic functions that extend the q-exponential and q-logarithmic ones [18] are already available in the literature [19,20,21].

2. On Uniqueness

Since the introduction of $S_{BG}$ in the 19th century, very many (nearly fifty) entropic functionals have emerged in the literature for a variety of informational, cybernetic, physical and mathematical reasons: see, for instance, Figure 1. For a meticulous listing of existing entropic functionals discussed from a chronological and logical perspective, see [22]; for further historical remarks, see Section 3.2.1 of [12].
$S_q$ and, naturally, its particular case $S_{BG}$ play some sort of special role within this ever-lengthening list, in the sense that they repeatedly appear, either directly or indirectly through their consequences, in a plethora of natural, artificial and social systems, especially those in which nonlinear dynamics is involved. The present effort is dedicated to reviewing in what senses $S_q$ is nowadays known to be unique.
Before focusing on this task, let us mention that, among the many existing extensions of $S_{BG}$, only one is presently known to be additive, all the others being nonadditive. This exception is the Rényi entropic functional [24], defined as follows:
$$S_q^R = k\, \frac{\ln \sum_{i=1}^{W} p_i^q}{1-q} \qquad (q \in \mathbb{R};\ S_1^R = S_{BG}). \tag{13}$$
The nonadditive $S_q$ and the additive $S_q^R$ are related through the following monotonically increasing function [7]:
$$S_q^R = k\, \frac{\ln\left[1 + (1-q)\, S_q / k\right]}{1-q} \qquad (\forall q). \tag{14}$$
It immediately follows that the extremization of $S_q^R$ and of $S_q$ under the same set of constraints yields the same optimizing distribution. For instance, if, for a specific class of systems, a value of $q < 1$ exists such that $S_q(N)$ is extensive, i.e., $S_q(N) \propto N$ ($N \to \infty$), then $S_q^R(N) \propto \ln N$. Such a nonlinear asymptotic behavior makes $S_q^R$ violate thermodynamical entropic extensivity, which in turn violates the mathematical Legendre structure upon which classical thermodynamics is based. For different purposes, however, the Rényi entropic functional exhibits some interesting mathematical properties (see [47] and references therein). Let us finally mention that the functional relationship (14) plays a central role in recently q-generalized mathematical objects [48].
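Relation (14) is straightforward to verify numerically; the following sketch (Python, $k = 1$, an arbitrary four-state distribution) compares the directly computed Rényi entropy with its expression through $S_q$:

```python
# Sketch: check of Eq. (14) relating the Renyi and Tsallis entropies (k = 1);
# the distribution p and the q values are illustrative.
import numpy as np

def tsallis(p, q):
    return (1.0 - np.sum(np.asarray(p)**q)) / (q - 1.0)   # Eq. (5)

def renyi(p, q):
    return np.log(np.sum(np.asarray(p)**q)) / (1.0 - q)   # Eq. (13)

p = [0.1, 0.2, 0.3, 0.4]
for q in [0.3, 0.8, 1.5, 3.0]:
    via_tsallis = np.log(1.0 + (1.0 - q) * tsallis(p, q)) / (1.0 - q)  # Eq. (14)
    print(np.isclose(renyi(p, q), via_tsallis))   # True for every q
```

The agreement is exact because $1 + (1-q) S_q / k = \sum_i p_i^q$, by Eq. (5).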

2.1. Santos 1997 Theorem

Shannon formulated in 1948 a definitively relevant theorem [49,50], which we summarize here.
Let us assume that an entropic form $S(\{p_i\})$ satisfies the following properties:
(i) $S(\{p_i\})$ is a continuous function of $\{p_i\}$;
(ii) $S(p_i = 1/W,\ \forall i)$ monotonically increases with the total number of possibilities $W$;
(iii) $S(A+B) = S(A) + S(B)$ if $p_{ij}^{A+B} = p_i^A p_j^B\ \forall (i,j)$, where $S(A+B) \equiv S(\{p_{ij}^{A+B}\})$, $S(A) \equiv S(\{p_i^A\})$ with $p_i^A \equiv \sum_{j=1}^{W_B} p_{ij}^{A+B}$, and $S(B) \equiv S(\{p_j^B\})$ with $p_j^B \equiv \sum_{i=1}^{W_A} p_{ij}^{A+B}$;
(iv) $S(\{p_i\}) = S(p_L, p_M) + p_L\, S(\{p_i/p_L\}) + p_M\, S(\{p_i/p_M\})$, where $p_L \equiv \sum_{L\ \text{terms}} p_i$, $p_M \equiv \sum_{M\ \text{terms}} p_i$, $L + M = W$, and $p_L + p_M = 1$.
Then, and only then [49,50], $S(\{p_i\})$ is given by Equation (1).
It is therefore very clear in what sense the functional (1) is unique, namely that the axiomatic set (i)–(iv) is mathematically equivalent to the functional (1). This neatly differs from the definitively wrong, and yet not rare, statement that form (1) is the unique physically admissible entropic functional. Axiom (iv) is sometimes referred to as the grouping property. Let us also mention that some authors prefer the notation $S(A \times B)$ instead of $S(A+B)$ in order to emphasize the fact that the phase space of the total system is the tensor product of the phase spaces of the subsystems A and B.
In 1997, the Santos theorem [51] generalized that of Shannon as follows:
Let us assume that an entropic form $S(\{p_i\})$ satisfies the following properties:
(i) $S(\{p_i\})$ is a continuous function of $\{p_i\}$;
(ii) $S(p_i = 1/W,\ \forall i)$ monotonically increases with the total number of possibilities $W$;
(iii) $\frac{S(A+B)}{k} = \frac{S(A)}{k} + \frac{S(B)}{k} + (1-q)\, \frac{S(A)}{k}\, \frac{S(B)}{k}$ if $p_{ij}^{A+B} = p_i^A p_j^B\ \forall (i,j)$, with $k > 0$;
(iv) $S(\{p_i\}) = S(p_L, p_M) + p_L^q\, S(\{p_i/p_L\}) + p_M^q\, S(\{p_i/p_M\})$, where $p_L \equiv \sum_{L\ \text{terms}} p_i$, $p_M \equiv \sum_{M\ \text{terms}} p_i$, $L + M = W$, and $p_L + p_M = 1$.
Then, and only then [51], $S(\{p_i\})$ is given by Equation (5).
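Santos's fourth axiom, the q-generalized grouping property, is easy to check for $S_q$ itself; the sketch below (Python, $k = 1$, an arbitrary five-state distribution split into two groups) confirms that $S_q$ satisfies it exactly:

```python
# Sketch: numerical check of Santos's axiom (iv) for S_q (k = 1);
# the distribution and the grouping below are illustrative.
import numpy as np

def tsallis(p, q):
    return (1.0 - np.sum(np.asarray(p)**q)) / (q - 1.0)   # Eq. (5)

q = 0.6
p = np.array([0.05, 0.15, 0.30, 0.10, 0.40])
group_L, group_M = p[:2], p[2:]          # L + M = W terms
pL, pM = group_L.sum(), group_M.sum()    # pL + pM = 1

lhs = tsallis(p, q)
rhs = (tsallis(np.array([pL, pM]), q)
       + pL**q * tsallis(group_L / pL, q)
       + pM**q * tsallis(group_M / pM, q))
print(abs(lhs - rhs) < 1e-12)            # True
```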

2.2. The 1997 Connection to Weak Chaos in the Logistic Map

The first connection between the entropy $S_q$ and nonlinear dynamical systems, namely the logistic map, was established in 1997 [52]. This connection was analytically complemented one year later [53]. Since then, a vast literature has been dedicated to this connection, which we summarize in what follows.
The logistic map is a paradigmatic one-dimensional dissipative nonlinear dynamical system. It is defined as follows:
$$x_{t+1} = 1 - a x_t^2 \qquad (t = 0, 1, 2, \ldots;\ a \in [0,2];\ x_t \in [-1,1]). \tag{23}$$
For $a = 2$, the system is strongly chaotic, the sensitivity to the initial conditions is given by $\xi \equiv \lim_{\Delta x_0 \to 0} \frac{\Delta x_t}{\Delta x_0} = e^{\lambda_1 t}$, the Lyapunov exponent $\lambda_1$ being equal to $\ln 2 \simeq 0.69$, and its entropy production per unit time is given by the Pesin identity (see details in [54] and references therein)
$$K_{BG} \equiv \lim_{t \to \infty} \frac{S_{BG}(t)}{t} = \lambda_1\,, \tag{24}$$
where the subindex 1 will become clear here below.
At the Feigenbaum point $a_c = 1.40115518909205\ldots$, the system is weakly chaotic, the Lyapunov exponent $\lambda_1$ vanishes, the sensitivity to the initial conditions is given by $\xi = e_q^{\lambda_q t}$, the q-generalized Lyapunov coefficient $\lambda_q$ being described in [55], and its q-generalized entropy production per unit time is given by the Pesin-like identity (see details in [54] and references therein)
$$K_q \equiv \lim_{t \to \infty} \frac{S_q(t)}{t} = \lambda_q\,, \tag{25}$$
where $q = 0.24448770134128\ldots$
These remarkable results by no means prove, on rigorous mathematical grounds, the uniqueness of $S_q$ in what concerns such connections with, say, generic dissipative one-dimensional nonlinear dynamical systems. For example, the Kaniadakis entropy also implies a finite slope $\lim_{t \to \infty} [S_\kappa^K(t)/t]$; this is in fact not surprising, since the Kaniadakis entropy is a linear combination of $S_q$'s. However, to the best of our knowledge, no other entropic functional but $S_q$ has been shown to lead to a basic relation such as (25).
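The strongly chaotic case $a = 2$ is easy to probe numerically. The sketch below (Python; the initial condition and iteration counts are illustrative) estimates $\lambda_1$ as the time average of $\ln|dx_{t+1}/dx_t| = \ln(2a|x_t|)$, which should converge to $\ln 2$:

```python
# Sketch: numerical estimate of the Lyapunov exponent of the logistic map,
# Eq. (23), at a = 2; the expected value is lambda_1 = ln 2 ~= 0.693.
import numpy as np

a, x = 2.0, 0.3
for _ in range(100):                 # discard a short transient
    x = 1.0 - a * x**2

n, acc = 50_000, 0.0
for _ in range(n):
    acc += np.log(2.0 * a * abs(x))  # |dx_{t+1}/dx_t| = 2 a |x_t|
    x = 1.0 - a * x**2
lyap = acc / n
print(abs(lyap - np.log(2.0)) < 0.05)
```

The statistical fluctuation of this ergodic average decays like $1/\sqrt{n}$, so the generous tolerance above is comfortably met.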

2.3. Connection with Jackson Derivative

We follow here along the lines of [12]. One century ago, the mathematician Jackson generalized [56,57] the concept of the derivative of a generic function $f(x)$. He introduced his differential operator $D_q$ as follows:
$$D_q f(x) \equiv \frac{f(qx) - f(x)}{qx - x}\,. \tag{26}$$
We immediately verify that $D_1 f(x) = df(x)/dx$. For $q \neq 1$, this operator replaces the usual (infinitesimal) translation operation on the abscissa $x$ of the function $f(x)$ by a dilatation operation.
Abe noticed in 1997 a remarkable property [34] which uniquely yields S q . In the same way that we can easily verify that
$$S_{BG} = -\frac{d}{dx} \sum_{i=1}^{W} p_i^x \,\Big|_{x=1}\,, \tag{27}$$
we can verify that, $\forall q$,
$$S_q = -D_q \sum_{i=1}^{W} p_i^x \,\Big|_{x=1}\,. \tag{28}$$
This is an interesting property, where the usual infinitesimal translational operation is replaced by a finite operation, namely, in this case, by the one which is basic for scale invariance. This fact is in some sense consistent with the definition of the entropy $S_q$, which was inspired [7] by multifractal geometry.
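Since $D_q$ is a finite difference, Abe's property is trivial to verify numerically; the sketch below (Python, an illustrative three-state distribution, $k = 1$) evaluates $-D_q \sum_i p_i^x$ at $x = 1$ and compares it with Eq. (5):

```python
# Sketch: Abe's observation that the Jackson derivative of sum_i p_i**x
# at x = 1 yields S_q, Eq. (28); the distribution is illustrative (k = 1).
import numpy as np

p = np.array([0.1, 0.25, 0.65])

def g(x):                     # g(x) = sum_i p_i**x, with g(1) = 1
    return np.sum(p**x)

def jackson(f, x, q):         # D_q f(x) = (f(qx) - f(x)) / (qx - x), Eq. (26)
    return (f(q * x) - f(x)) / (q * x - x)

for q in [0.5, 2.0, 3.7]:
    s_q = (1.0 - np.sum(p**q)) / (q - 1.0)        # Eq. (5)
    print(np.isclose(-jackson(g, 1.0, q), s_q))   # True
```

Indeed, at $x = 1$ one has $-D_q g(1) = -(g(q) - 1)/(q - 1) = (1 - \sum_i p_i^q)/(q-1)$, which is exactly $S_q/k$.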

2.4. Abe 2000 Theorem

In 1953, the Khinchin uniqueness theorem [58] further reformulated that of Shannon in a very elegant manner:
Let us assume that an entropic form $S(\{p_i\})$ satisfies the following properties:
(i) $S(\{p_i\})$ is a continuous function of $\{p_i\}$;
(ii) $S(p_i = 1/W,\ \forall i)$ monotonically increases with the total number of possibilities $W$;
(iii) $S(p_1, p_2, \ldots, p_W, 0) = S(p_1, p_2, \ldots, p_W)$;
(iv) $S(A+B) = S(A) + S(B|A)$, where $S(A+B) \equiv S(\{p_{ij}^{A+B}\})$, $S(A) \equiv S(\{p_i^A\})$ with $p_i^A \equiv \sum_{j=1}^{W_B} p_{ij}^{A+B}$, and the conditional entropy $S(B|A) \equiv \sum_{i=1}^{W_A} p_i^A\, S(\{p_{ij}^{A+B}/p_i^A\})$.
Then, and only then [59], $S(\{p_i\})$ is given by Equation (1).
It follows then that the Shannon and the Khinchin sets of axioms are mathematically equivalent.
In 2000, the Abe theorem [60] generalized that of Khinchin as follows:
Let us assume that an entropic form $S(\{p_i\})$ satisfies the following properties:
(i) $S(\{p_i\})$ is a continuous function of $\{p_i\}$;
(ii) $S(p_i = 1/W,\ \forall i)$ monotonically increases with the total number of possibilities $W$;
(iii) $S(p_1, p_2, \ldots, p_W, 0) = S(p_1, p_2, \ldots, p_W)$;
(iv) $\frac{S(A+B)}{k} = \frac{S(A)}{k} + \frac{S(B|A)}{k} + (1-q)\, \frac{S(A)}{k}\, \frac{S(B|A)}{k}$, where $S(A+B) \equiv S(\{p_{ij}^{A+B}\})$, $S(A) \equiv S(\{\sum_{j=1}^{W_B} p_{ij}^{A+B}\})$, and the conditional entropy $S(B|A) \equiv \frac{\sum_{i=1}^{W_A} (p_i^A)^q\, S(\{p_{ij}^{A+B}/p_i^A\})}{\sum_{i=1}^{W_A} (p_i^A)^q}$ $(k > 0)$.
Then, and only then [60], $S(\{p_i\})$ is given by Equation (5).
The possibility of the existence of such a theorem through an appropriate generalization of Khinchin's fourth axiom had already been considered by Plastino and Plastino [61,62]. Abe established [60] the precise form of this generalized fourth axiom, and proved the theorem.
Notice that, interestingly enough, what enters in the definition of the conditional entropy is the escort distribution, and not the original one. Notice also that the expansibility axiom (iii) only holds for $q > 0$. Therefore, the expression (5) for $q < 0$ can only be defined for strictly positive values of $\{p_i\}$, and it is to be understood as an analytical extension of the $q > 0$ case.
Let us finally emphasize that both the Santos axioms and the Abe axioms are necessary and sufficient conditions for the emergence of $S_q$. Consequently, those two sets of axioms are mathematically equivalent.
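Abe's composition rule (iv), with its escort-weighted conditional entropy, in fact holds for $S_q$ even for correlated joint distributions, which can be probed directly; a sketch (Python, $k = 1$, an arbitrary correlated $3 \times 2$ joint distribution):

```python
# Sketch: Abe's axiom (iv) checked for S_q (k = 1) on a *correlated*
# joint distribution; the numbers below are illustrative.
import numpy as np

def tsallis(p, q):
    return (1.0 - np.sum(np.asarray(p)**q)) / (q - 1.0)   # Eq. (5)

q = 1.8
pAB = np.array([[0.10, 0.25],     # joint p_ij (rows: A states, cols: B states)
                [0.30, 0.05],
                [0.05, 0.25]])
pA = pAB.sum(axis=1)              # marginal of A

# Escort-weighted conditional entropy S(B|A) of Abe's axiom (iv)
w = pA**q
S_cond = np.sum(w * np.array([tsallis(pAB[i] / pA[i], q)
                              for i in range(len(pA))])) / np.sum(w)

SA = tsallis(pA, q)
lhs = tsallis(pAB.ravel(), q)
rhs = SA + S_cond + (1.0 - q) * SA * S_cond
print(abs(lhs - rhs) < 1e-12)     # True
```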
The axiomatic justification of diverse entropic functionals has in fact received great attention in both the recent and the not-so-recent literature. Let us summarize here, along lines close to those presented by Jizba and Korbel [63], the present status of this interesting path of research. Three main consistent lines of analysis exist, namely the generalized Shannon–Khinchin axioms of the type of [51,60], the Shore and Johnson axioms [64,65,66], and the Uffink class of entropies [67]. All three lead to the same set of admissible entropies, which includes $S_q$ (and also, in some formulations, monotonic functions of $S_q$). The particular line related to the Shore–Johnson axioms deserves special attention because it has been the object of a neat controversy, which is focused on in Section 2.10 hereafter.

2.5. Beck-Cohen 2003 Superstatistics

An interesting physical interpretation of nonextensive statistics was preliminarily advanced in the early 2000s by Wilk and Wlodarczyk [68] and by Beck [69]. This interpretation was beautifully generalized and formalized, in 2003, in what is nowadays known as the Beck–Cohen superstatistics [70]. This phenomenological theory generalizes nonextensive statistics in the sense that its generic state distribution contains the q-exponential one as a particular case.
Beck and Cohen [70,71,72,73] start from the standard BG exponential factor but with β being itself a random variable (whence the name “superstatistics”) due to possible spatial and/or temporal fluctuations. They define
$$P(E) = \int_0^\infty d\beta\, f(\beta)\, e^{-\beta E}\,, \tag{37}$$
where $f(\beta)$ is a normalized distribution, such that $P(E)$ is normalizable under the same conditions as the Boltzmann factor $e^{-\beta E}$ itself. They also define
$$q_{BC} \equiv \frac{\langle \beta^2 \rangle}{\langle \beta \rangle^2} = \frac{\int_0^\infty d\beta\, \beta^2 f(\beta)}{\left[\int_0^\infty d\beta\, \beta f(\beta)\right]^2}\,, \tag{38}$$
where the subindex BC stands for Beck–Cohen. Unless $f(\beta)$ is deduced from first principles, this theory is a phenomenological one.
If $f(\beta) = \delta(\beta - \langle\beta\rangle)$, we recover the BG weight
$$P(E) = e^{-\langle\beta\rangle E}\,, \tag{39}$$
and $q_{BC} = 1$.
If $f(\beta)$ is the $\chi^2$-distribution with $n$ degrees of freedom (a particular case of the Gamma distribution), i.e.,
$$f(\beta) = \frac{n}{2 \langle\beta\rangle\, \Gamma\!\left(\frac{n}{2}\right)} \left(\frac{n \beta}{2 \langle\beta\rangle}\right)^{n/2 - 1} \exp\!\left(-\frac{n \beta}{2 \langle\beta\rangle}\right) \qquad (n = 1, 2, 3, \ldots)\,, \tag{40}$$
we obtain
$$P(E) = e_q^{-\langle\beta\rangle E}\,, \tag{41}$$
with $q_{BC} = q = \frac{n+2}{n}\ (>1)$.
In addition to the so-called $\chi^2$-superstatistics described above, we have the so-called inverse $\chi^2$-superstatistics, where it is $1/\beta$, instead of $\beta$, that follows the $\chi^2$ distribution. Finally, a third class is sometimes focused on in the literature; it is referred to as log-normal superstatistics, and corresponds to the case where $\beta$ follows a log-normal distribution. These three classes are sometimes referred to as universality classes because they are all connected to Gaussians, which, in the Central Limit Theorem sense, are attractors in the space of distributions.
Several other examples of $f(\beta)$ are discussed in [70], where an important result is eventually established, namely that all narrowly peaked distributions $f(\beta)$ yield, at first nontrivial leading order, q-statistics with $q = q_{BC}$. As we know, the q-exponential distribution emerges naturally from extremizing the entropic functional $S_q$. Let us, however, emphasize that this argument does not prove a uniqueness sense for $S_q$. It nevertheless points towards some special role being played by this entropic functional. Further issues along this line have been studied in [74,75,76,77,78].
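The $\chi^2$-superstatistics result can be reproduced numerically; the sketch below (Python, with illustrative values for $n$, $\langle\beta\rangle$ and $E$) integrates Eq. (37) with the Gamma distribution (40) by the trapezoid rule and compares the outcome with the q-exponential of Eq. (41):

```python
# Sketch: Beck-Cohen chi^2-superstatistics; numerical integration of Eq. (37)
# versus the q-exponential prediction, with illustrative parameter values.
import math
import numpy as np

n, beta_mean, E = 3, 1.0, 2.0
q = (n + 2) / n                       # q_BC = (n + 2)/n

beta = np.linspace(0.0, 60.0, 200_001)
# Gamma (chi^2 with n degrees of freedom) distribution with mean beta_mean
f = (n / (2.0 * beta_mean * math.gamma(n / 2.0))) \
    * (n * beta / (2.0 * beta_mean))**(n / 2.0 - 1.0) \
    * np.exp(-n * beta / (2.0 * beta_mean))

y = f * np.exp(-beta * E)
P = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(beta))          # trapezoid rule

P_q = (1.0 + (q - 1.0) * beta_mean * E)**(1.0 / (1.0 - q))  # e_q^{-<beta>E}
print(abs(P - P_q) < 1e-4)            # True: the integral is a q-exponential
```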

2.6. Lattice-Boltzmann Models for Fluids

In the present Subsection we closely follow [12]. The incompressible Navier–Stokes equation has been considered, by Boghosian et al. in 2003 [79], on a discretized $D$-dimensional Bravais lattice of coordination number $b$. It is further assumed that there is a single value for the particle mass, and also for the speed. The basic requirement for the lattice-Boltzmann model is to be Galilean-invariant (i.e., invariant under change of inertial reference frame), like the Navier–Stokes equation itself. It has been proved [79] that an H-theorem is satisfied for a trace-form entropy (i.e., of the form $S(\{p_i\}) = \sum_{i=1}^{W} f(p_i)$) only if it has the form of $S_q$ with
$$q = 1 - \frac{2}{D}\,. \tag{42}$$
Therefore $q < 1$ in all cases ($q > 0$ if $D > 2$, $q < 0$ if $D < 2$, and $q = 0$ for $D = 2$), and $q$ approaches unity from below in the $D \to \infty$ limit. This interesting result has been generalized by allowing multiple masses and multiple speeds. Galilean invariance once again mandates [80] an entropy of the form of $S_q$, with a unique value of $q$ determined by a transcendental equation involving the dimension and symmetry properties of the Bravais lattice as well as the multiple values of the masses and of the speeds. Of course, Equation (42) is recovered in the particular case of a single mass and a single speed. Summarizing, under quite general mathematical hypotheses (including the entropic functional being of the trace form), the natural imposition of Galilean invariance on lattice-Boltzmann models for fluids mandates the use of $S_q$.

2.7. Topsoe 2005 Factorizability in Game Theory

Topsoe proposed [81,82] an abstract zero-sum game between two players, namely "Nature", aiming at high complexity, and "the physicist", aiming at low complexity. We describe here a simple illustration of the game's ingredients; full mathematical details are available in [81,82]. For simplicity, the set of possibilities (the alphabet) is here assumed to be discrete and finite, $W$ being the number of possibilities. The probability set associated with "Nature" is $P \equiv \{p_i\}$ (with $\sum_{i=1}^{W} p_i = 1$), and that associated with "the physicist" is $Q \equiv \{q_i\}$ (with $\sum_{i=1}^{W} q_i = 1$). The focus is then put on the triple $(\Phi, S, D)$, where the complexity $\Phi$, the entropy $S$, and the divergence $D$ are respectively given by
$$\Phi(P\|Q) = \sum_{i=1}^{W} \left[ q_i\, f\!\left(\frac{p_i}{q_i}\right) - f(p_i) \right], \tag{43}$$
$$S(P) = -\sum_{i=1}^{W} f(p_i)\,, \tag{44}$$
and
$$D(P\|Q) = \sum_{i=1}^{W} q_i\, f\!\left(\frac{p_i}{q_i}\right), \tag{45}$$
where the generator $f(x)$ is a real-valued analytic and strictly convex function on $[0,1]$ such that $f(0) = f(1) = 0$ and $f'(1) = 1$ (normalization condition).
The Topsoe 2005 theorem [82] states: A complexity function of the form (43) factorizes if and only if it is related to the Tsallis entropic function.

2.8. Amari-Ohara-Matsuzoe 2012 Conformally Invariant Geometry

An information-geometrical approach [83] leads to an abstract uniqueness property that we briefly summarize here. The generalized logarithm defined in Equation (7) is nowadays placed within a more general frame [19,20,21], referred to as the $\chi$-logarithm and defined as follows:
$$\ln_\chi z \equiv \int_1^z \frac{dt}{\chi(t)}\,, \tag{46}$$
where $\chi(t)$ is a generic function satisfying simple properties, such as being concave and monotonically increasing; we consistently define the inverse function $e_\chi^z \equiv \ln_\chi^{-1}(z)$. We straightforwardly verify that $\chi(t) = t^q$ yields $\ln_\chi z = \ln_q z$ and $e_\chi^z = e_q^z$. For $\chi(t) = t$ we refer to the exponential family, and for generic $\chi(t)$ we refer to the deformed exponential family; naturally, the deformed exponential family includes the exponential one as a particular instance. Many useful concepts, such as generalized entropy, divergence and escort probability distribution, are associated with each admissible choice of $\chi(t)$. In the space of the probability distributions of a vector random variable $x = (x_1, \ldots, x_n)$, two different types of geometrical structures can be defined from an information-geometrical perspective, namely the invariant and the flat ones (see details in [83]). The q-exponential family is the unique class, in the extended class of positive measures, which simultaneously has the invariant and flat geometries. Furthermore, the q-family is the unique class of flat geometry that is conformally connected to the invariant geometry.

2.9. Enciso–Tempesta 2017 Theorem

In this Section we follow along the lines of [12].
A dimensionless entropic form $S(\{p_i\})$ (i.e., one expressed in appropriate conventional units, e.g., in units of $k$) is said to be composable [84,85] (see also [1,10,86,87]) if the entropy $S(A+B)/k$ corresponding to a system composed of two probabilistically independent subsystems A and B can be expressed in the form
$$\frac{S(A+B)}{k} = F\!\left(\frac{S(A)}{k}, \frac{S(B)}{k}; \{\eta\}\right), \tag{47}$$
where $F(x,y;\{\eta\})$ is a smooth function of $(x,y)$ which depends on a (typically small) set of universal indices $\{\eta\}$ defined in such a way that $F(x,y;\{0\}) = x + y$ (additivity), and which satisfies $F(x,0;\{\eta\}) = x$ (null-composability), $F(x,y;\{\eta\}) = F(y,x;\{\eta\})$ (symmetry), and $F(x,F(y,z;\{\eta\});\{\eta\}) = F(F(x,y;\{\eta\}),z;\{\eta\})$ (associativity). For thermodynamical systems, this associativity appears to be consistent with the 0th Principle of Thermodynamics.
In other words, the whole concept of composability is constructed upon the requirement that the entropy of ( A + B ) does not depend on the microscopic configurations of A and of B. Equivalently, we are able to macroscopically calculate the entropy of the composed system without any need of entering into the knowledge of the microscopic states of the subsystems. This property appears to be a natural one for an entropic form if we desire to use it as a basis for a statistical mechanics which would naturally connect to thermodynamics.
The entropy $S_{BG}$ is composable since it satisfies Equation (3); in other words, we have $F_{BG}(x,y) = x + y$. $S_{BG}$ being nonparametric, no index appears in $F_{BG}$. Further, the Rényi entropy $S_q^R$ is composable since it satisfies $F(x,y) = x + y$ for all values of $q$. The entropy $S_q$ also is composable, since it satisfies (9).
Let us also mention that a linear combination of composable entropies is not necessarily composable. Such is the case of the Kaniadakis entropy $S_\kappa^K$ [35,88,89]: in spite of being a linear combination of entropies $S_q$, it is not composable for all values of $\kappa$.
Let us now focus on another relevant property, namely whether an entropic functional is trace-form. By definition, an entropy $S(\{p_i\})$ is said to be trace-form if it can be written as $S(\{p_i\}) = \sum_{i=1}^{W} f(p_i)$, where $f(z)$ is a generic analytic function in the interval $z \in (0,1)$. The entropies $S_q$, $S_\kappa^K$ and many others are trace-form, in contrast with $S_q^R$, which is not.
In 2017, Enciso and Tempesta proved [23] that $S_q$ is the unique entropic functional which is simultaneously composable and trace-form. See Figure 1.

2.10. The Shore–Johnson–Axioms Controversy (2005–2019)

To the best of our knowledge, the analysis of $S_q$ and its associated thermostatistics in connection with the Shore–Johnson axioms for statistical inference [64,65,66] was initiated in 2005 [90,91].
In 2013, Pressé et al. [92,93,94] started to insist at length that the Shore–Johnson axioms exclude entropies such as $S_q$. Their arguments were boldly rebutted in [95] (the actual title of [95], Conceptual inadequacy of the Shore and Johnson axioms for wide classes of complex systems, constitutes a sort of ambiguous shortcut; it should rather have been Conceptual inadequacy of the Pressé et al. interpretation of the Shore and Johnson axioms for wide classes of complex systems), where, among other points, it was explicitly written that generic probabilities $\{u_i\}$ and $\{v_j\}$ satisfy, for $q \neq 1$,
$$\frac{S_q(\{u_i \otimes_{2-q} v_j\})}{k} = -\sum_{ij} (u_i \otimes_{2-q} v_j)\, \ln_{2-q}(u_i \otimes_{2-q} v_j) = -\sum_{ij} (u_i \otimes_{2-q} v_j)\left(\ln_{2-q} u_i + \ln_{2-q} v_j\right) \neq -\sum_{ij} u_i v_j \left(\ln_{2-q} u_i + \ln_{2-q} v_j\right) = -\sum_{i=1}^{W} u_i \ln_{2-q} u_i - \sum_{j=1}^{W} v_j \ln_{2-q} v_j = \frac{S_q(\{u_i\})}{k} + \frac{S_q(\{v_j\})}{k}\,, \tag{48}$$
where $\otimes_{2-q}$ stands for the q-product, defined through $\ln_{2-q}(x \otimes_{2-q} y) = \ln_{2-q} x + \ln_{2-q} y$.
In their arguments, Pressé et al. [92,93,94] definitively violate the imperative inequality present in the middle of this mathematical chain. They ignore it not only in [92,93,94] but also in their reply [96] to [95]. In fact, the chain (48) is an interesting and nontrivial consequence of this class of correlations (strangely enough, referred to in [92] as "spurious correlations") between probabilistic events. The fallacies contained in [92,93,94,96] have been meticulously discussed and rebutted in [63,97]. This fact seemingly closes that longstanding controversy, and we are allowed to believe that it is now irreversibly established that $S_q$ is admissible within the Shore–Johnson axioms. However, it remains nowadays somewhat unclear whether entropic functionals differing from $S_q$ (for instance, $S_{q,\delta}$ or $S_\kappa^K$) are admissible as well within those important axioms. Therefore, with respect to the uniqueness issue, the problem presently appears to be open. For example, it is unknown whether non-trace-form and/or non-composable entropies can satisfy those axioms (or other possible statistical-consistency axioms) as well.
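The inequality in (48) hinges on the distinction between the ordinary product and the q-product $x \otimes_q y \equiv [x^{1-q} + y^{1-q} - 1]_+^{1/(1-q)}$ of the nonextensive literature, which additivizes $\ln_q$. A sketch (Python, with illustrative values of $u$, $v$ and $q$) makes both points explicit:

```python
# Sketch: the q-product, which additivizes ln_q, versus the ordinary product;
# the values of u, v and q below are illustrative.
import numpy as np

def ln_q(z, q):
    return (z**(1.0 - q) - 1.0) / (1.0 - q)       # Eq. (7)

def q_product(x, y, q):
    base = x**(1.0 - q) + y**(1.0 - q) - 1.0      # [.]_+ prescription below
    return max(base, 0.0)**(1.0 / (1.0 - q))

q = 0.4
qp = 2.0 - q                                      # the index used in chain (48)
u, v = 0.35, 0.60

# ln_{2-q} is additive under the (2-q)-product ...
print(np.isclose(ln_q(q_product(u, v, qp), qp), ln_q(u, qp) + ln_q(v, qp)))
# ... but the (2-q)-product differs from the ordinary product u*v,
# which is the source of the inequality in the middle of chain (48):
print(abs(q_product(u, v, qp) - u * v) > 1e-6)    # True
```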

2.11. Plastino-Tsallis-Wedemann-Haubold 2022

One more sense along which $S_q$ is unique has been advanced recently [98].
The BG entropy is defined as the mean value of $\ln \frac{1}{p_i}$. Moreover, under the usual linear constraints for the normalization and the energy mean value, it is optimized by the BG exponential factor, which is precisely the inverse function of the logarithmic function. Similarly, the entropy $S_q$ is defined as the mean value of $\ln_q \frac{1}{p_i}$. Moreover, under the usual linear constraints for the normalization and the energy mean value, it is optimized by a $\tilde{q}$-exponential factor, which is precisely the inverse function of the $\tilde{q}$-logarithmic function with the dual index $\tilde{q} = 2 - q$. One may ask how general such a structure is for trace-form entropic functionals. This is the question that was analyzed in [98].
The most general trace-form entropy $S_G$ can always be written as follows:
$$S_G(\{p_i\}) = k \sum_{i=1}^{W} p_i \ln_G \frac{1}{p_i}\,, \tag{49}$$
where the generalized logarithm $\ln_G(z)$ ($z$ being a real positive number) must be a monotonically increasing concave function for $z > 0$, and also satisfy $\ln_G(1) = 0$, among some other simple requirements [98]. The optimization of $S_G(\{p_i\})$ under the usual linear constraints yields a distribution given by the generalized exponential $e_{\tilde{G}}(z) \equiv \ln_{\tilde{G}}^{-1}(z)$, where $\tilde{G}$ denotes the dual function, duality being possibly defined in various manners. The simplest such manner is as follows:
$$\ln_{\tilde{G}}(z) = -\ln_G\!\left(\frac{1}{z}\right). \tag{50}$$
It is proved in [98] that the most general entropic functional (49) with duality given by (50) is precisely $S_q$.
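For the q-case, the duality (50) can be checked in one line: with $\ln_G = \ln_q$, the dual is exactly $\ln_{2-q}$. A sketch (Python, with illustrative values of $q$ and $z$):

```python
# Sketch: for ln_G = ln_q, the duality of Eq. (50) selects the index 2 - q,
# i.e., -ln_q(1/z) = ln_{2-q}(z); the values of q and z are illustrative.
import numpy as np

def ln_q(z, q):
    return (z**(1.0 - q) - 1.0) / (1.0 - q)   # Eq. (7)

q = 0.3
for z in [0.2, 1.0, 3.5]:
    print(np.isclose(-ln_q(1.0 / z, q), ln_q(z, 2.0 - q)))   # True, True, True
```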

2.12. Plastino-Plastino 2023 Connection with the Micro-Canonical Ensemble

Among trace-form entropic measures, the nonadditive entropies $S_q$ exhibit a special link with the micro-canonical ensemble [99,100]. Systems described by the micro-canonical ensemble usually have parts described by q-exponentials, that is, parts described by probability distributions optimizing the entropies $S_q$. This happens when the number of micro-states of the rest of the system having energy less than or equal to a given energy $E_0$ grows as a power of $E_0$. Here, "the rest of the system" does not necessarily refer to a subsystem: in the case of classical Hamiltonian systems, it can refer to a subset of the system's canonical variables. For example, in classical non-relativistic scenarios, the kinetic energy of a system of $N$ particles (interacting or not) is a homogeneous quadratic function of the momenta, implying that the volume in momentum space grows in the above-mentioned power-law fashion, the exponent depending on the number $N$ of particles. Because of this power-law behavior, the marginal probability density for the configuration variables is a q-exponential of the total potential energy. The link between the micro-canonical ensemble and the $S_q$-canonical distributions is, in a sense, unique. Up to now, $S_q$ appears to be the only trace-form entropy whose entropy-optimizing distributions have been related to the micro-canonical treatment of concrete and physically relevant families of systems.
The unique character of this connection is particularly transparent in the classical non-relativistic regime: within that regime we can say that, to the extent that Nature prefers quadratic kinetic energies, it also prefers q-exponentials. In this regard, it is significant that q-exponentials are clearly discernible in an 1879 paper by Maxwell [101,102] (see Equation (41) of [102]), which is one of the first papers ever discussing what we nowadays call the micro-canonical ensemble. Let us emphasize that Maxwell arrived at the q-exponentials without explicitly optimizing any entropic functional at all, just by assuming equal probabilities in the occupancy of the phase space corresponding to a given total energy. To the best of our knowledge, probability distributions optimizing other non-logarithmic trace-form entropies discussed in the current research literature are not present in those pioneering works.
Before concluding this Subsection, let us mention that the above uniqueness might not be unrelated to the concept of thermostat universal independence introduced by Biro, Barnafoldi and Van in 2015 [103].

3. Closely Related Issues

3.1. The Values of the Entropic Indices Might Depend on the Class of States of the System

To illustrate the claim in the title of the present Subsection, let us focus on a specific example. The (1+1)-dimensional first-neighbor-interacting Ising ferromagnet in the presence of an external transverse field has, in its thermodynamical limit, a zero-temperature second-order critical point, i.e., a quantum critical point. At this point, the total entropy of the system vanishes since it is in a pure state. Consider now an L-sized block (with L ≫ 1) of this infinitely-sized system: it is in a mixed state, and its nonvanishing BG entropy S_BG(L) is given by [104,105]
$$S_{BG}(L) \simeq k\,\frac{c}{3}\,\ln L \qquad (51)$$
where c > 0 is the central charge within the corresponding conformal field theory; for example, c = 1 / 2 for the Ising ferromagnet, and c = 1 for the XY ferromagnet. We are unaware of a rigorous proof determining the subdominant term, but strong indications [106] suggest the following behavior:
$$S_{BG}(L) \simeq k\left[\frac{c}{3}\,\ln L + \ln b\right] = k\,\ln\!\left(b\,L^{c/3}\right) \qquad (52)$$
where b > 1 is a constant. With the definition W_eff ≡ b L^{c/3} (“eff” stands for effective), we have S_BG(L) ≃ k ln W_eff(L). Consequently, if the system were in an equal-probability state, we could interpret W_eff as the total number of possibilities. Then we would have, for L ≫ 1,
$$S_q(L) \simeq k\,\ln_q W_{eff}(L) = k\,\ln_q\!\left(b\,L^{c/3}\right) = k\,\frac{\left[b\,L^{c/3}\right]^{1-q} - 1}{1-q} \propto L^{c(1-q)/3} \quad (q<1) \qquad (53)$$
Thermodynamic extensivity of the entropy (generically required by the Legendre structure of thermodynamics [40]) would then imply S_{q_ent}(L) ∝ k L with
$$q_{ent} = 1 - \frac{3}{c} \qquad (c \neq 0) \qquad (54)$$
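This extensivity is easy to verify numerically. The sketch below (with illustrative values b = 2 and c = 1/2 of our own choosing, giving q_ent = −5) checks that ln_q W_eff grows linearly with L at q = q_ent, whereas the BG entropy ln W_eff grows only logarithmically:

```python
import math

def ln_q(x, q):
    """q-logarithm: (x**(1-q) - 1)/(1-q); reduces to ln(x) for q = 1."""
    return math.log(x) if q == 1.0 else (x ** (1.0 - q) - 1.0) / (1.0 - q)

b, c = 2.0, 0.5                 # illustrative constants (b > 1, central charge c)
q_ent = 1.0 - 3.0 / c           # here q_ent = -5

def W_eff(L):
    """Effective number of states of an L-sized block: b * L**(c/3)."""
    return b * L ** (c / 3.0)

L = 1.0e6
# For q = q_ent, S_q = k ln_q(W_eff) grows linearly with L: doubling L
# (asymptotically) doubles the entropy, i.e., the entropy is extensive ...
ratio_q = ln_q(W_eff(2 * L), q_ent) / ln_q(W_eff(L), q_ent)
assert abs(ratio_q - 2.0) < 1e-6
# ... whereas S_BG = k ln(W_eff) grows only logarithmically with L,
# far from the factor 2 that extensivity would require:
ratio_BG = math.log(W_eff(2 * L)) / math.log(W_eff(L))
assert ratio_BG < 1.05
```

Indeed, with 1 − q_ent = 3/c, one has ln_q W_eff = (c/3)(b^{3/c} L − 1), exactly linear in L up to an additive constant.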
Let us emphasize that this result was obtained under the assumption of equal probabilities. It happens, though, that this assumption is wrong at the quantum critical point on which we are focusing [106]. The correct result for this system is instead given by [105]
$$q_{ent} = \frac{\sqrt{9 + c^2} - 3}{c} \qquad (55)$$
which definitively differs from 1 − 3/c. Interestingly enough, however, the correct expression (55) asymptotically reproduces the wrong expression (54) in the c → ∞ limit, i.e., relation (55) implies q_ent → 1 − 3/c as c → ∞.
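The comparison between (54) and (55) can be made concrete in a few lines (a sketch; the "equal-probability" versus "correct" labels follow the discussion above):

```python
import math

def q_ent_equal_prob(c):
    """Equation (54): result under the (wrong) equal-probability assumption."""
    return 1.0 - 3.0 / c

def q_ent_correct(c):
    """Equation (55): correct result for the quantum critical point [105]."""
    return (math.sqrt(9.0 + c * c) - 3.0) / c

# At the Ising central charge c = 1/2, the two expressions differ drastically:
assert abs(q_ent_correct(0.5) - 0.0828) < 1e-3   # ~0.083
assert q_ent_equal_prob(0.5) == -5.0
# ... yet they converge for c -> infinity, the difference decaying as 9/(2 c**2):
for c in (1.0e2, 1.0e4):
    assert abs(q_ent_correct(c) - q_ent_equal_prob(c)) < 5.0 / c ** 2
```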
It is plausible to think that, perhaps quite generically, not only for S_q but for other entropic functionals as well (S_{q,δ} [40] among others), the a priori assumption of equal probabilities for specific systems yields, at the relevant stationary state, the correct asymptotic behavior when approaching the BG limit (i.e., (q, δ) → (1, 1) for S_{q,δ}, for instance).

3.2. Entropic Functional vs. Entropy of a System

To be admissible, an entropic functional S({p_i}) must be one and the same for all possible states, i.e., for all possible sets of probabilities {p_i} of a generic system. Such is, of course, the case for all the entropic functionals that we have discussed up to now.
The Barrow proposal for entropy [107], denoted S_Δ^B here, is not an entropic functional, but rather a proposed expression for the entropy of black-hole or cosmological horizons under the assumption of a rough external surface. Indeed, it is usually written as follows:
$$S_\Delta^B \propto A^{1 + \Delta/2} \qquad (56)$$
or, equivalently, S_Δ^B ∝ L^{2+Δ}, where A ∝ L², L being the characteristic linear dimension of the system. For Δ = 0, S_0^B ∝ L² recovers the usual Bekenstein–Hawking black-hole behavior; for Δ = 1, S_1^B ∝ L³ recovers the thermodynamically admissible entropy of a d = 3 system; for 0 < Δ < 1 and also for Δ > 1, the Barrow entropy S_Δ^B is intended to correspond to a fractal-like black-hole surface. Summarizing, the Barrow entropy is extensive, i.e., thermodynamically admissible, only for Δ = 1. Indeed, for Δ < 1 (Δ > 1), S_Δ^B is subextensive (superextensive), thus violating the Legendre structure of thermodynamics.
On the other hand, let us focus on the entropic functional S q , δ [40], defined as follows:
$$S_{q,\delta} = k \sum_{i=1}^{W} p_i \left(\ln_q \frac{1}{p_i}\right)^{\!\delta} \qquad (q \in \mathbb{R},\ \delta \in \mathbb{R}) \qquad (57)$$
which recovers, as particular instances, S_{q,1} = S_q and S_{1,δ} = S_δ ≡ k Σ_{i=1}^{W} p_i (ln 1/p_i)^δ. For equal probabilities, Equation (57) yields
$$\frac{S_{1,\delta}}{k} = (\ln W)^\delta \qquad (58)$$
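A direct implementation of the functional (57) (a sketch with k = 1; the function names are ours) confirms these particular instances and the equal-probability value:

```python
import math

def ln_q(x, q):
    """q-logarithm: (x**(1-q) - 1)/(1-q); reduces to ln(x) for q = 1."""
    return math.log(x) if q == 1.0 else (x ** (1.0 - q) - 1.0) / (1.0 - q)

def S_q_delta(p, q, delta):
    """Entropic functional S_{q,delta} = sum_i p_i [ln_q(1/p_i)]**delta (k = 1)."""
    return sum(pi * ln_q(1.0 / pi, q) ** delta for pi in p)

p = [0.5, 0.25, 0.25]
# delta = 1 recovers S_q; e.g., for q = 2, S_2 = 1 - sum_i p_i**2:
assert abs(S_q_delta(p, 2.0, 1.0) - (1.0 - sum(pi ** 2 for pi in p))) < 1e-12
# q = delta = 1 recovers the BG (Shannon) entropy:
assert abs(S_q_delta(p, 1.0, 1.0) + sum(pi * math.log(pi) for pi in p)) < 1e-12
# Equal probabilities p_i = 1/W yield (ln_q W)**delta:
W, q, delta = 16, 0.8, 1.5
assert abs(S_q_delta([1.0 / W] * W, q, delta) - ln_q(W, q) ** delta) < 1e-12
```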
The Bekenstein–Hawking result S_BG/k ≃ ln W ∝ A leads to
$$\frac{S_{1,\delta}}{k} \propto A^\delta \qquad (59)$$
This expression can be identified with (56) through:
$$\delta = 1 + \frac{\Delta}{2} \qquad (60)$$
This identity has produced in the literature some unfortunate confusion between the entropic functional S_δ [10] and the so-called Barrow entropy [107]. Let us emphasize, at this stage, that S_Δ^B is by no means analogous to the entropic functional S_δ({p_i}) [40]. Indeed, the latter is an entropic functional applicable a priori to any system in any state, whereas the former has been specifically proposed for black holes in their thermal equilibrium state.
Recent observational data concerning dark-energy physics have been interpreted [108] as being consistent with S_δ with δ ≃ 1.565. This value differs from the δ = 3/2 advanced in [40] under the threefold hypothesis that (i) the system is a d = 3 one whose surface is a d = 2 one, (ii) the entropy S_δ is extensive for a d-dimensional system, and (iii) probabilities are equal, which yields δ = d/(d − 1). If the value of δ slightly different from 3/2 is taken for granted, then one, two or even all three of these hypotheses could be inadequate. If we recall that, for a somewhat similar quantum system, the correct value of q differs from its value assuming equal probabilities (see Section 3.1), it cannot be excluded that δ = 1.565 differing from δ = 3/2 is caused by the failure of hypothesis (iii), and not necessarily by the failure of the other two hypotheses, which assume that the thermodynamically extensive entropy is based on the δ-entropic functional with the special value δ = d/(d − 1).
An alternative explanation could of course be the failure of hypothesis (i), meaning that the system is a fractal-like one with, say, δ = d_f^B/d_f^S, where d_f^B and d_f^S are respectively the bulk and surface dimensionalities, not necessarily given by (d_f^B, d_f^S) = (d, d − 1). For example, if we impose d_f/(d_f − 1) = 1.565 with (d_f^B, d_f^S) = (d_f, d_f − 1), we obtain d_f ≃ 2.77; if we instead impose, say, 3/d_f^S = 1.565, we obtain d_f^S ≃ 1.92. Clearly, at this stage, the discrepancy between 1.565 and 3/2 remains an open and surely intriguing question. (Equation (57) yields, for equal probabilities, S_{q,δ}/k = (ln_q W)^δ = [(W^{1−q} − 1)/(1 − q)]^δ ≃ W^{(1−q)δ}/(1 − q)^δ ∝ W^{(1−q)δ} for W → ∞. If this behavior is correct for a given system, then the value of the exponent (1 − q)δ is to be preserved. Therefore, if a given wrong hypothesis (such as, say, equal probabilities) makes (1 − q_wrong) larger than (1 − q_correct), as proved in Section 3.1, this implies, assuming a fixed value (1 − q_correct) δ_correct = (1 − q_wrong) δ_wrong, that δ_wrong is smaller than δ_correct, which is precisely the inequality sense of 3/2 as compared to 1.565!)
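The numerical values quoted above can be re-derived in a few lines (a sketch; 1.565 is the observational value reported in [108], and the function name is ours):

```python
# Barrow exponent vs delta: the identification delta = 1 + Delta/2 maps the
# thermodynamically admissible Barrow case Delta = 1 onto delta = 3/2:
def delta_from_Delta(Delta):
    return 1.0 + Delta / 2.0

assert delta_from_Delta(1.0) == 1.5

delta_obs = 1.565                  # value inferred from dark-energy data [108]

# Fractal bulk/surface hypothesis with (d_f^B, d_f^S) = (d_f, d_f - 1):
# d_f/(d_f - 1) = delta_obs  =>  d_f = delta_obs/(delta_obs - 1)
d_f = delta_obs / (delta_obs - 1.0)
assert abs(d_f - 2.77) < 0.01      # d_f ~ 2.77, as quoted in the text

# Alternative hypothesis: 3/d_f^S = delta_obs  =>  d_f^S = 3/delta_obs
d_fS = 3.0 / delta_obs
assert abs(d_fS - 1.92) < 0.01     # d_f^S ~ 1.92
```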

4. Summary

A plethora of entropic functionals (close to fifty) and their associated optimizing distributions are available today in the literature [16]. Among those, S_q is by far the most frequently validated in natural, artificial and social complex systems up to now. Then, seeking a deeper understanding, a natural question emerges: in what senses is S_q unique? In the present review, we have listed, basically in chronological order, many such senses. In some cases (Section 2.1, Section 2.3, Section 2.4, Section 2.6, Section 2.7, Section 2.8, Section 2.9, Section 2.11 and Section 2.12), the uniqueness of S_q is established on rigorous grounds. In others (Section 2.2 and Section 2.5), it is conjectured on the basis of partial analytical arguments and/or strong numerical indications.
A related controversy has been briefly reviewed in Section 2.10.
Finally, in Section 3.1 and Section 3.2, we have illustrated that (i) the values of the entropic indices (for example, q for S_q and δ for S_δ) might depend on the class of states of the system (for example, assuming either equal or unequal probabilities, basically corresponding respectively to either a microcanonical or a canonical ensemble), and also that (ii) an entropic functional must be clearly distinguished from the same or a different entropic functional applied to a specific system in specific states. Both are frequently referred to in the literature as “entropy”, but their mathematical roles are markedly distinct. Further senses along which S_q, or other entropic functionals, might be unique are certainly welcome.

Funding

This research received no external funding.

Acknowledgments

I am deeply indebted to A.R. Plastino as he graciously contributed Section 2.12 within the present effort, thus enlightening the recent paper [100] by himself and his father A. Plastino. I am also very grateful to P. Jizba and J. Korbel for fruitful remarks about Section 2.10. Partial financial support by CNPq and FAPERJ (Brazilian agencies) is acknowledged as well.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Tsallis, C. Entropy. Encyclopedia 2022, 2, 264–300. [Google Scholar] [CrossRef]
  2. Boltzmann, L. Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen [Further Studies on Thermal Equilibrium Between Gas Molecules]. Wien. Ber. 1872, 66, 275. [Google Scholar]
  3. Boltzmann, L. Über die Beziehung eines allgemeinen mechanischen Satzes zum zweiten Hauptsatze der Wärmetheorie. Sitzungsberichte K. Akademie Wiss. Wien Math.-Naturwissenschaften 1877, 75, 67–73. [Google Scholar]
  4. Gibbs, J.W. Elementary Principles in Statistical Mechanics—Developed with Especial Reference to the Rational Foundation of Thermodynamics; C. Scribner’s Sons: New York, NY, USA, 1902. [Google Scholar]
  5. Gibbs, J.W. The collected works. In Thermodynamics; Yale University Press: New Haven, CT, USA, 1948; Volume 1. [Google Scholar]
  6. Gibbs, J.W. Elementary Principles in Statistical Mechanics; OX Bow Press: Woodbridge, CT, USA, 1981. [Google Scholar]
  7. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487. [Google Scholar] [CrossRef]
  8. Penrose, O. Foundations of Statistical Mechanics: A Deductive Treatment; Pergamon: Oxford, UK, 1970; p. 167. [Google Scholar]
  9. Gell-Mann, M.; Tsallis, C. (Eds.) Nonextensive Entropy—Interdisciplinary Applications; Oxford University Press: New York, NY, USA, 2004. [Google Scholar]
  10. Tsallis, C. Nonextensive Statistical Mechanics—Approaching a Complex World, 1st ed.; Springer: New York, NY, USA, 2009. [Google Scholar]
  11. Umarov, S.; Tsallis, C. Mathematical Foundations of Nonextensive Statistical Mechanics; World Scientific: Singapore, 2022. [Google Scholar]
  12. Tsallis, C. Nonextensive Statistical Mechanics—Approaching a Complex World, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 2023. [Google Scholar]
  13. Wong, C.Y.; Wilk, G.; Cirto, L.J.L.; Tsallis, C. From QCD-based hard-scattering to nonextensive statistical mechanical descriptions of transverse momentum spectra in high-energy pp and p p ¯ collisions. Phys. Rev. D 2015, 91, 114027. [Google Scholar] [CrossRef]
  14. Combe, G.; Richefeu, V.; Stasiak, M.; Atman, A.P.F. Experimental validation of nonextensive scaling law in confined granular media. Phys. Rev. Lett. 2015, 115, 238301. [Google Scholar] [CrossRef]
  15. Wild, R.; Notzold, M.; Simpson, M.; Tran, T.D.; Wester, R. Tunnelling measured in a very slow ion-molecule reaction. Nature 2023, 615, 425. [Google Scholar] [CrossRef]
  16. Regularly Updated Bibliography. Available online: http://tsallis.cat.cbpf.br/biblio.htm (accessed on 25 April 2023).
  17. Jizba, P.; Korbel, J.; Zatloukal, V. Tsallis thermostatics as a statistical physics of random chains. Phys. Rev. E 2017, 95, 022103. [Google Scholar] [CrossRef]
  18. Tsallis, C. What are the numbers that experiments provide? Quim. Nova 1994, 17, 468. [Google Scholar]
  19. Naudts, J. Estimators, escort probabilities, and phi-exponential families in statistical physics. J. Ineq. Pure Appl. Math. 2004, 5, 102. [Google Scholar]
  20. Naudts, J. Generalized exponential families and associated entropy functions. Entropy 2008, 10, 131–149. [Google Scholar] [CrossRef]
  21. Naudts, J. Generalized Thermostatistics; Springer: London, UK, 2011. [Google Scholar]
  22. Ribeiro, M.; Henriques, T.; Castro, L.; Souto, A.; Antunes, L.; Costa-Santos, C.; Teixeira, A. The entropy universe. Entropy 2021, 23, 222. [Google Scholar] [CrossRef] [PubMed]
  23. Enciso, A.; Tempesta, P. Uniqueness and characterization theorems for generalized entropies. J. Stat. Mech. 2017, 2017, 123101. [Google Scholar] [CrossRef]
  24. Renyi, A. On measures of information and entropy. In Proceedings of the Fourth Berkeley Symposium, Davis, CA, USA, 20–30 July 1960; University of California Press: Berkeley, CA, USA; Los Angeles, CA, USA, 1961; Volume 4, p. 547. [Google Scholar]
  25. Tempesta, P. Formal groups and Z-entropies. Proc. R. Soc. A 2016, 472, 20160143. [Google Scholar] [CrossRef]
  26. Jensen, H.J.; Pazuki, R.H.; Pruessner, G.; Tempesta, P. Statistical mechanics of exploding phase spaces: Ontic open systems. J. Phys. A Math. Theor. 2018, 51, 375002. [Google Scholar] [CrossRef]
  27. Sharma, B.D.; Mittal, D.P. New non-additive measures of entropy for discrete probability distributions. J. Math. Sci. 1975, 10, 28. [Google Scholar]
  28. Landsberg, P.T.; Vedral, V. Distributions and channel capacities in generalized statistical mechanics. Phys. Lett. A 1998, 247, 211. [Google Scholar] [CrossRef]
  29. Landsberg, P.T. Entropies galore! Braz. J. Phys. 1999, 29, 46–49. [Google Scholar] [CrossRef]
  30. Rajagopal, A.K.; Abe, S. Implications of form invariance to the structure of nonextensive entropies. Phys. Rev. Lett. 1999, 83, 1711. [Google Scholar] [CrossRef]
  31. Arimoto, S. Information-theoretical considerations on estimation problems. Inf. Control 1971, 19, 181–194. [Google Scholar] [CrossRef]
  32. Curado, E.M.F.; Tempesta, P.; Tsallis, C. A new entropy based on a group-theoretical structure. Ann. Phys. 2016, 366, 22–31. [Google Scholar] [CrossRef]
  33. Borges, E.P.; Roditi, I. A family of non-extensive entropies. Phys. Lett. A 1998, 246, 399–402. [Google Scholar] [CrossRef]
  34. Abe, S. A note on the q-deformation theoretic aspect of the generalized entropies in nonextensive physics. Phys. Lett. A 1997, 224, 326–330. [Google Scholar] [CrossRef]
  35. Kaniadakis, G. Non linear kinetics underlying generalized statistics. Physica A 2001, 296, 405. [Google Scholar] [CrossRef]
  36. Kaniadakis, G.; Lissia, M.; Scarfone, A.M. Deformed logarithms and entropies. Physica A 2004, 340, 41. [Google Scholar] [CrossRef]
  37. Anteneodo, C.; Plastino, A.R. Maximum entropy approach to stretched exponential probability distributions. J. Phys. A 1999, 32, 1089. [Google Scholar] [CrossRef]
  38. Hanel, R.; Thurner, S. A comprehensive classification of complex statistical systems and an axiomatic derivation of their entropy and distribution functions. Europhys. Lett. 2011, 93, 20006. [Google Scholar] [CrossRef]
  39. Hanel, R.; Thurner, S. When do generalised entropies apply? How phase space volume determines entropy. Europhys. Lett. 2011, 96, 50003. [Google Scholar] [CrossRef]
  40. Tsallis, C.; Cirto, L.J.L. Black hole thermodynamical entropy. Eur. Phys. J. C 2013, 73, 2487. [Google Scholar] [CrossRef]
  41. Schwammle, V.; Tsallis, C. Two-parameter generalization of the logarithm and exponential functions and Boltzmann-Gibbs-Shannon entropy. J. Math. Phys. 2007, 48, 113301. [Google Scholar] [CrossRef]
  42. Tempesta, P. Beyond the Shannon-Khinchin formulation: The composability axiom and the universal-group entropy. Ann. Phys. 2016, 365, 180. [Google Scholar] [CrossRef]
  43. Curado, E.M.F. General aspects of the thermodynamical formalism. Braz. J. Phys. 1999, 29, 36–45. [Google Scholar] [CrossRef]
  44. Curado, E.M.F.; Nobre, F.D. On the stability of analytic entropic forms. Physica A 2004, 335, 94. [Google Scholar] [CrossRef]
  45. Curado, E.M.F. (Centro Brasileiro de Pesquisas Físicas, Rua Xavier Sigaud 150, Rio de Janeiro 22290-180, Brazil). Private communication, 2020. [Google Scholar]
  46. Tsekouras, G.A.; Tsallis, C. Generalized entropy arising from a distribution of q-indices. Phys. Rev. E 2005, 71, 046144. [Google Scholar] [CrossRef] [PubMed]
  47. Jacquet, P.; Szpankowski, W. Entropy computations via analytic depoissonization. IEEE Trans. Inf. Theory 1999, 45, 1072. [Google Scholar] [CrossRef]
  48. Borges, E.P.; da Costa, B.G. Deformed mathematical objects stemming from the q-logarithm function. Axioms 2022, 11, 138. [Google Scholar] [CrossRef]
  49. Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656. [Google Scholar] [CrossRef]
  50. Shannon, C.E. The Mathematical Theory of Communication; University of Illinois Press: Urbana, IL, USA, 1949. [Google Scholar]
  51. Santos, R.J.V. Generalization of Shannon’ s theorem for Tsallis entropy. J. Math. Phys. 1997, 38, 4104. [Google Scholar] [CrossRef]
  52. Tsallis, C.; Plastino, A.R.; Zheng, W.-M. Power-law sensitivity to initial conditions—New entropic representation. Chaos Solitons Fractals 1997, 8, 885–891. [Google Scholar] [CrossRef]
  53. Lyra, M.L.; Tsallis, C. Nonextensivity and multifractality in low-dimensional dissipative systems. Phys. Rev. Lett. 1998, 80, 53. [Google Scholar] [CrossRef]
  54. Tsallis, C.; Borges, E.P. Time evolution of nonadditive entropies: The logistic map. Chaos Solitons Fractals 2023, in press. [Google Scholar] [CrossRef]
  55. Baldovin, F.; Robledo, A. Nonextensive Pesin identity: Exact renormalization group analytical results for the dynamics at the edge of chaos of the logistic map. Phys. Rev. E 2004, 69, 045202. [Google Scholar] [CrossRef] [PubMed]
  56. Jackson, F. The Messenger of Mathematics; Forgotten Books: London, UK, 1909. [Google Scholar]
  57. Jackson, F. On Q-Definite Integrals. Q. J. Pure Appl. Math. 1910, 41, 193. [Google Scholar]
  58. Khinchin, A.I. Mathematical Foundations of Information Theory, Uspekhi Matem. Nauk 1953, 8, 3. [Google Scholar]
  59. Khinchin, A.I. Mathematical Foundations of Information Theory; Silverman, R.A.; Friedman, M.D., Translators; Dover: New York, NY, USA, 1957. [Google Scholar]
  60. Abe, S. Axioms and uniqueness theorem for Tsallis entropy. Phys. Lett. A 2000, 271, 74. [Google Scholar] [CrossRef]
  61. Plastino, A.R.; Plastino, A. Generalized entropies. In Condensed Matter Theories; Ludeña, E.V., Vashista, P., Bishop, R.F., Eds.; Nova Science Publishers: New York, NY, USA, 1996; Volume 11, p. 327. [Google Scholar]
  62. Plastino, A.R.; Plastino, A. Tsallis entropy and Jaynes’ information theory formalism. Braz. J. Phys. 1999, 29, 50. [Google Scholar] [CrossRef]
  63. Jizba, P.; Korbel, J. When Shannon and Khinchin meet Shore and Johnson: Equivalence of information theory and statistical inference axiomatics. Phys. Rev. E 2020, 101, 042126. [Google Scholar] [CrossRef]
  64. Shore, J.E.; Johnson, R.W. Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Trans. Inf. Theory 1980, 26, 26–37. [Google Scholar] [CrossRef]
  65. Shore, J.E.; Johnson, R.W. Properties of cross-entropy minimization. IEEE Trans. Inf. Theory 1981, 27, 472–482. [Google Scholar] [CrossRef]
  66. Shore, J.E.; Johnson, R.W. Comments on and correction to “Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy” (Jan 80 26–37). IEEE Trans. Inf. Theory 1983, 29, 942–943. [Google Scholar]
  67. Uffink, J. Can the maximum entropy principle be explained as a consistency requirement? Hist. Phil. Mod. Phys. 1995, 26, 223. [Google Scholar] [CrossRef]
  68. Wilk, G.; Wlodarczyk, Z. Interpretation of the nonextensivity parameter q in some applications of Tsallis statistics and Lévy distributions. Phys. Rev. Lett. 2000, 84, 2770. [Google Scholar] [CrossRef] [PubMed]
  69. Beck, C. Dynamical foundations of nonextensive statistical mechanics. Phys. Rev. Lett. 2001, 87, 180601. [Google Scholar] [CrossRef]
  70. Beck, C.; Cohen, E.G.D. Superstatistics. Physica A 2003, 322, 267. [Google Scholar] [CrossRef]
  71. Beck, C. Superstatistics: Theory and Applications. In Nonadditive Entropy and Nonextensive Statistical Mechanics; Sugiyama, M., Ed.; Continuum Mechanics and Thermodynamics; Springer: Heidelberg, Germany, 2004; Volume 16, p. 293. [Google Scholar]
  72. Cohen, E.G.D. Superstatistics. Physica D 2004, 193, 35. [Google Scholar] [CrossRef]
  73. Cohen, E.G.D. Boltzmann and Einstein: Statistics and Dynamics—An Unsolved Problem. Pramana 2005, 64, 635–643. [Google Scholar] [CrossRef]
  74. Tsallis, C.; Souza, A.M.C. Constructing a statistical mechanics for Beck-Cohen superstatistics. Phys. Rev. E 2003, 67, 026106. [Google Scholar] [CrossRef]
  75. Souza, A.M.C.; Tsallis, C. Stability of the entropy for superstatistics. Phys. Lett. A 2003, 319, 273. [Google Scholar] [CrossRef]
  76. Souza, A.M.C.; Tsallis, C. Stability analysis of the entropy for superstatistics. Physica A 2004, 342, 132. [Google Scholar] [CrossRef]
  77. Hanel, R.; Thurner, S.; Gell-Mann, M. Generalized entropies and the transformation group of superstatistics. Proc. Natl. Acad. Sci. USA 2011, 110, 3539108. [Google Scholar] [CrossRef]
  78. Hanel, R.; Thurner, S.; Gell-Mann, M. Generalized entropies and logarithms and their duality relations. Proc. Natl. Acad. Sci. USA 2012, 109, 19151–19154. [Google Scholar] [CrossRef] [PubMed]
  79. Boghosian, B.M.; Love, P.J.; Coveney, P.V.; Karlin, I.V.; Succi, S.; Yepez, J. Galilean-invariant lattice-Boltzmann models with H-theorem. Phys. Rev. E 2003, 68, 025103. [Google Scholar] [CrossRef] [PubMed]
  80. Boghosian, B.M.; Love, P.; Yepez, J.; Coveney, P.V. Galilean-invariant multi-speed entropic lattice Boltzmann models. Physica D 2004, 193, 169. [Google Scholar] [CrossRef]
  81. Topsoe, F. Entropy and equilibrium via games of complexity. Physica A 2004, 340, 11–31. [Google Scholar] [CrossRef]
  82. Topsoe, F. Factorization and escorting in the game-theoretical approach to non-extensive entropy measures. Physica A 2006, 365, 91–95. [Google Scholar] [CrossRef]
  83. Amari, S.; Ohara, A.; Matsuzoe, H. Geometry of deformed exponential families: Invariant, dually-flat and conformal geometries. Physica A 2012, 391, 4308–4319. [Google Scholar] [CrossRef]
  84. Tsallis, C. Nonextensive statistics: Theoretical, experimental and computational evidences and connections. Braz. J. Phys. 1999, 29, 1–35. [Google Scholar] [CrossRef]
  85. Tsallis, C. Talk at the IMS Winter School. In Nonextensive Generalization of Boltzmann-Gibbs Statistical Mechanics and Its Applications; Institute for Molecular Science: Okazaki, Japan, 1999. [Google Scholar]
  86. Tsallis, C. Nonextensive Statistical Mechanics and Thermodynamics: Historical Background and Present Status. In Nonextensive Statistical Mechanics and Its Applications; Abe, S., Okamoto, Y., Eds.; Series Lecture Notes in Physics; Springer: Heidelberg, Germany, 2001. [Google Scholar]
  87. Hotta, M.; Joichi, I. Composability and generalized entropy. Phys. Lett. A 1999, 262, 302. [Google Scholar] [CrossRef]
  88. Kaniadakis, G. Statistical mechanics in the context of special relativity. Phys. Rev. E 2002, 66, 056125. [Google Scholar] [CrossRef]
  89. Kaniadakis, G. Statistical mechanics in the context of special relativity. II. Phys. Rev. E 2005, 72, 036108. [Google Scholar] [CrossRef] [PubMed]
  90. Abe, S.; Bagci, G.B. Necessity of q-expectation value in nonextensive statistical mechanics. Phys. Rev. E 2005, 71, 016139. [Google Scholar] [CrossRef] [PubMed]
  91. Abe, S. Generalized molecular chaos hypothesis and H-theorem: Problem of constraints and amendment of nonextensive statistical mechanics. Phys. Rev. E 2009, 79, 041116. [Google Scholar] [CrossRef]
  92. Presse, S.; Ghosh, K.; Lee, J.; Dill, K.A. Nonadditive entropies yield probability distributions with biases not warranted by the data. Phys. Rev. Lett. 2013, 111, 180604. [Google Scholar] [CrossRef]
  93. Presse, S.; Ghosh, K.; Lee, J.; Dill, K.A. Principles of maximum entropy and maximum caliber in statistical physics. Rev. Mod. Phys. 2013, 85, 1115–1141. [Google Scholar] [CrossRef]
  94. Presse, S. Nonadditive entropy maximization is inconsistent with Bayesian updating. Phys. Rev. E 2014, 90, 052149. [Google Scholar] [CrossRef] [PubMed]
  95. Tsallis, C. Conceptual inadequacy of the Shore and Johnson axioms for wide classes of complex systems. Entropy 2015, 17, 2853–2861. [Google Scholar] [CrossRef]
  96. Presse, S.; Ghosh, K.; Lee, J.; Dill, K.A. Reply to C. Tsallis’ “Conceptual inadequacy of the Shore and Johnson axioms for wide classes of complex systems”. Entropy 2015, 17, 5043–5046. [Google Scholar] [CrossRef]
  97. Jizba, P.; Korbel, J. Maximum entropy principle in statistical inference: Case for non-Shannonian entropies. Phys. Rev. Lett. 2019, 122, 120601. [Google Scholar] [CrossRef]
  98. Plastino, A.R.; Tsallis, C.; Wedemann, R.S.; Haubold, H.J. Entropy optimization, generalized logarithms, and duality relations. Entropy 2022, 24, 1723. [Google Scholar] [CrossRef]
  99. Plastino, A.; Plastino, A.R. From Gibbs microcanonical ensemble to Tsallis generalized canonical distribution. Phys. Lett. A 1994, 193, 140–143. [Google Scholar] [CrossRef]
  100. Plastino, A.R.; Plastino, A. Brief review on the connection between the micro-canonical ensemble and the Sq-canonical probability distribution. Entropy 2023, 25, 591. [Google Scholar] [CrossRef]
  101. Niven, W.D. (Ed.) The Scientific Papers of James Clerk Maxwell; Cambridge University Press: Cambridge, UK, 1890; Volume II. [Google Scholar] [CrossRef]
  102. Maxwell, J.C. On Boltzmann’s theorem on the average distribution of energy in a system of material points. Trans. Camb. Philos. Soc. 1879, XII, 547–570. [Google Scholar]
  103. Biro, T.S.; Barnafoldi, G.G.; Van, P. New entropy formula with fluctuating reservoir. Physica A 2015, 417, 215–220. [Google Scholar] [CrossRef]
  104. Calabrese, P.; Cardy, J. Entanglement entropy and quantum field theory. J. Stat. Mech. 2004, 2004, P06002. [Google Scholar] [CrossRef]
  105. Caruso, F.; Tsallis, C. Nonadditive entropy reconciles the area law in quantum systems with classical thermodynamics. Phys. Rev. E 2008, 78, 021102. [Google Scholar] [CrossRef] [PubMed]
  106. Souza, A.M.C.; Rapcan, P.; Tsallis, C. Area-law-like systems with entangled states can preserve ergodicity. Eur. Phys. J. Spec. Top. 2020, 229, 759–772. [Google Scholar] [CrossRef]
  107. Barrow, J.D. The area of a rough black hole. Phys. Lett. B 2020, 808, 135643. [Google Scholar] [CrossRef] [PubMed]
  108. Jizba, P.; Lambiase, G. Tsallis cosmology and its applications in dark matter physics with focus on IceCube high-energy neutrino data. Eur. Phys. J. C 2022, 82, 1123. [Google Scholar] [CrossRef]
Figure 1. It has been proven [23] that S q is the unique entropic form which simultaneously is trace-form, composable, and recovers S B G as a particular instance. S q (hence S B G ), the Renyi entropy S q R [24], the Tempesta ( a , b , α ) -entropy S a , b , α T (Equation (9.1) in [25]), the Jensen–Pazuki–Pruessner–Tempesta entropy S γ , α J P P T [26] and many others belong to the class of group entropies and are therefore composable. To facilitate the identification, we are here using the following notations: Sharma–Mittal entropy S q , r S M [27], Landsberg-Vedral-Rajagopal-Abe entropy S q L V R A [28,29,30], Tsallis–Mendes–Plastino entropy S q T M P , Arimoto entropy S q A r [31], Curado–Tempesta–Tsallis entropy S a , b , r C T T [32], Borges–Roditi entropy S q , q B R [33], Abe entropy S q A b [34], Kaniadakis entropy S κ K [35], Kaniadakis–Lissia–Scarfone entropy S κ , r K L S [36], Anteneodo–Plastino entropy S η A P [37], Hanel–Thurner entropy S c , d H T [38,39], S q , δ [40], Schwammle–Tsallis entropy S q , q S T [41], the Tempesta ( α , β , q ) -entropy S α , β , q T [42], the Curado b-entropy S b C [43,44], the Curado λ -entropy S λ C [45] (see [12]), and the exponential c-entropy S c E (see [10,46]). The entropic form S λ C is one among the rare cases which do not include S B G and is neither trace-form nor composable. From [1,12].

