
Entropy, Fluctuation Magnified and Internal Interactions

Department of Physics, Yunnan University, Kunming, 650091, China
Entropy 2005, 7(3), 190-198;
Submission received: 22 November 2004 / Accepted: 20 May 2005 / Published: 28 August 2005


Since fluctuations can be magnified by internal interactions under certain conditions, the equal probability of microstates does not always hold, and the entropy should be defined as S(t) = −k Σr Pr(t) ln Pr(t). From this definition, or from S = k ln Ω for an internal condensed process, a possible decrease of entropy is calculated. Internal interactions, which make statistical independence inapplicable, can cause the entropy of an isolated system to decrease. This possibility is investigated for attractive processes, internal energy, system entropy, nonlinear interactions, etc. An isolated system may thus form a self-organized structure.


The usual development of the second law of thermodynamics is based on open systems, for example, the theory of dissipative structures [1]. Weiss et al. [2] discussed extended thermodynamics and proposed a new principle for stationary thermodynamic processes: the maximum of the local entropy production becomes minimal in the process [3]. Fort and Llebot [4] proved that the classical entropy does not increase monotonically for an isolated fluid, and argued that the generalized entropy of extended irreversible thermodynamics is more suitable for such a fluid.
The basis of thermodynamics is statistics, in which a basic principle is statistical independence: the state of one subsystem does not affect the probabilities of the states of the other subsystems, because different subsystems may be regarded as weakly interacting [5]. This implies that interactions among the subsystems are not considered. But if complex internal mechanisms and interactions cannot be neglected, a state with smaller entropy (for example, a self-organized structure) may be able to appear. In this case, statistics and the second law of thermodynamics possibly differ [6]. For instance, the entropy of an isolated fluid whose evolution depends on viscous pressure and internal friction does not increase monotonically [4].


The second law of thermodynamics is a probabilistic law. The transition probability from the chaotic motion of molecules to the regular motion of a macroscopic body is very small. But this result may not hold if interactions exist within a system. According to the Boltzmann and Einstein fluctuation theories, all possible microscopic states of a system are equally probable in thermodynamic equilibrium, and the entropy finally tends to a maximum value. It is known from statistical mechanics that fluctuations of the entropy may occur [7], and fluctuations always make the entropy decrease [5,8]. Further, under certain conditions fluctuations can be magnified [7,9] by internal interactions, etc.
It is well known that the entropy of a system can be expressed through the probability Pr of finding the system in state r as [10,11]
S = −k Σr Pr ln Pr .  (1)
The probability of a particular state r is
Pr = 1/W(Er) .  (2)
If the probabilities of all states are equal in equilibrium, the sum in Eq. (1) yields
S = k ln W ,  (3)
which is the Boltzmann-Planck equation; here S > 0. But if these probabilities are not always equal, only Eq. (1) is appropriate. When the probabilities change with time, the entropy also changes with time and should be defined as
S(t) = −k Σr Pr(t) ln Pr(t) .  (4)
From this we derive
dS(t)/dt = −k Σr [ln Pr(t) + 1] dPr(t)/dt ,  (5)
i.e.,
dS = −k Σr (ln Pr + 1) dPr .  (6)
It is known that 1 > Pr > 0. For a state whose probability increases, dPr > 0, and if Pr > 1/e, then 0 < 1 + ln Pr < 1, so
dSr = −k (1 + ln Pr) dPr < 0 .  (7)
This implies that the entropy term Sr decreases when the probability of the state r increases, which is consistent with disorder decreasing as determinability increases. In this case Sr is additive.
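As a quick numerical check of Eq. (7) (a sketch, not from the paper, taking k = 1), a finite difference of the single-state term Sr = −k Pr ln Pr confirms that it decreases as Pr grows in the range 1/e < Pr < 1:

```python
import math

def entropy_term(p, k=1.0):
    """Contribution S_r = -k * P_r * ln(P_r) of a single state r to Eq. (1)."""
    return -k * p * math.log(p)

# Finite-difference check: for 1/e < P_r < 1 the derivative
# dS_r/dP_r = -k (1 + ln P_r) is negative, so raising P_r lowers S_r.
p, dp = 0.5, 1e-6
dS = entropy_term(p + dp) - entropy_term(p)
assert dS < 0  # probability grows -> this state's entropy term shrinks
```

The finite difference agrees with the analytic slope −k(1 + ln Pr) to first order in dp.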
Further, we discuss the change of the total entropy. Assume that the initial probabilities of all r states are equal, Pr = 1/r (Σr Pr = 1). According to Eq. (1), the total initial entropy is Si = k ln r. If there are complex internal mechanisms in the system (for example, self-interactions), fluctuations will occur and be magnified, and the probability of one of these states will increase to
Pf = 1/n > Pr = 1/r  (n < r).  (8)
But the probabilities of the other r − 1 states remain equal:
Pr' = (1 − 1/n)/(r − 1) = (n − 1)/[n(r − 1)] .  (9)
Therefore, the entropy of the final state is
Sf = −k [(1/n) ln(1/n) + (r − 1) Pr' ln Pr'] .  (10)
Summing over the equal probabilities of these r − 1 states gives
Sf = k [(1/n) ln n + ((n − 1)/n) ln(n(r − 1)/(n − 1))] ,
so the change of the total entropy is
dS = Sf − Si = k [ln(n/r) + ((n − 1)/n) ln((r − 1)/(n − 1))] .  (11)
By numerical calculation with r = 50, Eq. (11) gives, for example, dS = −1.273k for n = 2 and dS = −0.298k for n = 5; dS < 0 for every n < r.
Let n/r = x < 1 and y = (r − n)/(r − 1) = n(1 − x)/(n − x) < 1, so that (n − 1)/(r − 1) = 1 − y. Then Eq. (11) can be expanded in a convergent Taylor series,
dS = k [ln x − ((n − 1)/n) ln(1 − y)] = k [ln x + ((n − 1)/n)(y + y²/2 + y³/3 + …)] .  (12)
The above results show quantitatively that the entropy decreases as the fluctuation is magnified and a state becomes progressively fixed. For n = 1, i.e., when a single state is finally determined by the magnified fluctuation, Sf = 0 and dS = −Si = −3.9120k (for r = 50). So the entropy necessarily decreases.
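The decrease can be checked numerically. The following Python sketch (an illustration with k = 1, not part of the original paper) evaluates dS = Sf − Si from Eqs. (10)-(11) for r = 50 and several n:

```python
import math

def dS(r, n, k=1.0):
    """Entropy change dS = S_f - S_i when one state's probability grows
    from 1/r to 1/n and the other r-1 states share the rest equally."""
    S_i = k * math.log(r)
    if n == 1:
        S_f = 0.0  # a single state is fully determined: S_f = 0
    else:
        p_big = 1.0 / n
        p_rest = (1.0 - p_big) / (r - 1)   # = (n-1)/(n(r-1)), Eq. (9)
        S_f = -k * (p_big * math.log(p_big)
                    + (r - 1) * p_rest * math.log(p_rest))
    return S_f - S_i

for n in (1, 2, 5, 10, 25):
    assert dS(50, n) < 0   # entropy decreases for every n < r
print(round(dS(50, 1), 4))  # -3.912, i.e. -k ln 50
```

For n = 1 the result reproduces dS = −k ln 50 = −3.9120k stated above.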


According to
S = k ln Ω ,  (13)
in an isolated system of n particles, each in a different energy state, Ω1 = n!. Assume that an internal attractive interaction exists in the system, so that the n particles condense into m clusters. If these are still in different energy states, then Ω2 = m!. Therefore, in this process
S2 − S1 = dS = k ln(Ω2/Ω1) = k ln(m!/n!) .  (14)
So long as m < n in the condensed process, the entropy decreases, dS < 0; conversely, for m > n in a dispersed process, the entropy increases, dS > 0. This holds regardless of how much energy each cluster contains. The fewer the clusters in an isolated system, the smaller the degree of irregularity and the entropy. This is consistent with the entropy decrease from the gaseous state to the liquid and solid states. Moreover, according to Eq. (1), so long as the n particles are equally probable, Pr = 1/n and S1 = k ln n. If these particles cluster into m equally probable clusters, then
S2 = k ln m ,  dS = k ln(m/n) ,  (15)
and the conclusion is the same. We have discussed the possibility of a decrease of entropy, its mechanism, and some examples in [6]. Here a possible direction of development is investigated from the definition of entropy.
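The two routes to Eq. (14) and Eq. (15) can be compared numerically; this short sketch (illustrative values, with k = 1) uses ln(m!) = lgamma(m + 1):

```python
import math

def dS_cluster(n, m, k=1.0):
    """dS = k ln(Omega2/Omega1) = k ln(m!/n!), Eq. (14): n particles in
    distinct energy states condense into m clusters in distinct states."""
    return k * (math.lgamma(m + 1) - math.lgamma(n + 1))

assert dS_cluster(10, 3) < 0   # condensation (m < n): entropy drops
assert dS_cluster(3, 10) > 0   # dispersion (m > n): entropy grows

# The equal-probability route, dS = k ln(m/n) of Eq. (15), has the same sign.
assert math.copysign(1, dS_cluster(10, 3)) == math.copysign(1, math.log(3 / 10))
```

Both expressions are negative precisely when m < n, as the text states.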
The energy of a system is [12]
E = Σs εs − Σss' Wss' + Σss' Uss' ,  (16)
where εs is the additive part of the particle energy in state s (in most cases εs and E are kinetic energies), and Wss' and Uss' are the absolute values of the attraction and repulsion energies of particles in states s and s', respectively.
According to the basic equation of thermodynamics, i.e., the Euler equation [8],
S = (U + PV − μN)/T .  (17)
For a process at constant temperature, a simple result of Eq. (17) is [5]
dS = (dU + P dV)/T ,  (18)
where U is the internal energy of the body. When internal interactions exist among different subsystems of an isolated system, the internal energy and the entropy are no longer additive extensive quantities; they depend on the structure of the system. For example, the entropy of coherent light is not an additive quantity. In such cases, statistical independence and equal probability in thermodynamic equilibrium are unavailable. The additivity of entropy is postulated in statistical physics [5], but interactions among subsystems are thereby neglected. As Riedi pointed out [10], a strongly interacting system must be treated as a whole: the total energy U cannot be broken up into individual particle energies, since the potential energy of a given molecule depends upon the positions of all the other molecules. Only when the potential-energy term is zero is the total energy of the system separable into a sum of single-particle energies.
In this case, the total entropy should be extended to
dS = dSa + dSi ,  (19)
where dSa is the additive part of the entropy and dSi is the interacting part. Eq. (19) is similar to the well-known formula
dS = diS + deS  (20)
in the theory of dissipative structures proposed by Prigogine. The two formulae apply to internal and external interactions, respectively.
If only the first term on the right of Eq. (17) is considered,
S = U/T = (E + Ui)/T ,  (21)
dS = dU/T − U dT/T² .  (22)
Further, we discuss concretely a particular case with attractive interactions in a system. The attractive force F = −A/r² may be gravitational or electromagnetic. The corresponding potential energy is
Ui = −A/r .  (23)
In an attractive process the distance and the energy both decrease:
dr < 0 ,  dUi = (A/r²) dr < 0 .  (24)
According to Eq. (22): 1. If the temperature T is unchanged, dS = dU/T < 0, and the entropy decreases. 2. The total energy of an isolated system is conserved, dU = 0; the potential energy then transforms into kinetic energy, the temperature rises, dT > 0, and
dS = −U dT/T² < 0 .  (25)
3. For a constant-temperature process, since the total energy is conserved, dU = 0, while the volume decreases in an attractive process, dV < 0; so by Eq. (18) the entropy decreases, dS < 0. All three cases show that the entropy decreases. In a word, the entropy decreases in an attractive process.
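Case 1 (constant T) can be sketched numerically. The values of A, T, and the distances below are hypothetical, chosen only to illustrate the signs:

```python
# Hypothetical numbers for the constant-temperature case:
# U_i = -A/r (Eq. (23)), so pulling the particles closer (dr < 0)
# lowers the potential energy, and dS = dU/T < 0 at fixed T.
A, T = 1.0, 300.0
r1, r2 = 2.0, 1.0            # distance shrinks in the attractive process
dU = (-A / r2) - (-A / r1)   # = -A/2 < 0
dS_const_T = dU / T          # entropy change at constant temperature
assert dU < 0 and dS_const_T < 0
```

Any dr < 0 gives the same signs, since dUi = (A/r²) dr by Eq. (24).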


Our conclusions are consistent with systems theory and with nonlinear theory. In a system composed of two subsystems that are not independent, subadditivity states that
S(ρ) ≤ S(ρ1) + S(ρ2) ,  (26)
where ρ = λ1ρ1 + λ2ρ2 [13]. This shows that the entropy decreases with internal interaction. This conclusion not only agrees with the conditional entropy on ρ1 and ρ2, but is also consistent with systems theory, in which the whole may not equal the sum of its parts.
Weinberg proposed a generalized theory of nonrelativistic nonlinear quantum mechanics as a framework for the development and analysis of experimental tests of the linearity of quantum mechanics [14]. The nonlinear quantum theory is a notable development [15,16]. However, Peres proved that nonlinear variants of the Schrödinger equation violate the second law of thermodynamics [17]. We believe that nonlinear development of various theories is a necessary direction. The above contradiction implies that the second law of thermodynamics seems to exclude nonlinearity, which must include certain interactions.
More generally, for systems with nonlinear interactions, computer experiments show that the coupling together of complex systems often increases rather than decreases the degree of order in the composite system [18]. This corresponds to an order parameter appearing for a lower-symmetry state in simple systems.
In more general situations, when internal interactions exist in an isolated system, if one mechanism produces a process (e.g., repulsive force, falling temperature, diamagnetism) that increases the entropy, the reverse mechanism will produce a process (e.g., attractive force, rising temperature, paramagnetism) that decreases the entropy. For example, by the Maxwell relation
(∂S/∂P)T = −(∂V/∂T)P ,  (27)
S2 > S1 when (∂V2/∂T2) < (∂V1/∂T1); conversely, S2 < S1 when (∂V2/∂T2) > (∂V1/∂T1).
For an ideal gas,
Si = S0 + cV ln T + νR ln V .  (28)
When the temperature or the volume of an isolated system decreases, for example when an attractive force acts, when a star forms from a nebula, or when heat is released in a chemical reaction, the entropy of such processes should decrease.
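This behaviour of Eq. (28) is easy to check; the following sketch uses illustrative values for cV, ν, R, and S0 (not taken from the paper):

```python
import math

def S_ideal(T, V, cV=1.5, nu=1.0, R=8.314, S0=0.0):
    """Ideal-gas entropy S = S0 + cV ln T + nu R ln V, as in Eq. (28).
    cV, nu, R, S0 are placeholder illustrative values."""
    return S0 + cV * math.log(T) + nu * R * math.log(V)

# Cooling or compressing lowers this entropy expression.
assert S_ideal(200.0, 1.0) < S_ideal(300.0, 1.0)   # dT < 0 -> dS < 0
assert S_ideal(300.0, 0.5) < S_ideal(300.0, 1.0)   # dV < 0 -> dS < 0
```

Both monotonicities follow directly from the positive coefficients of the logarithms.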
In the theory of the phase transition of hadronic matter expounded by Weinberg [19], the Lagrangian density is
L = −(1/2) ∂μφ ∂^μφ − P(φ) ,  (29)
where P(φ) corresponds to the potential energy. The corresponding entropy density is directly proportional to −θ. If the various quantities are all positive and θ2 > θ1 in the attraction case, then S2 < S1, and the entropy decreases.


In biological self-organizing processes, some isolated systems may tend spontaneously toward ordered states. Ashby pointed out [20] that ammonia and hydrogen chloride are two gases, yet they combine to form a solid; and that the roughly 20 types of amino acid in a germ gather together and possess a new reproductive property. It is the usual viewpoint that a solid is more ordered than a gas, so the entropy of the solid should be smaller than that of the gas; a germ should likewise be more ordered than amino acids. Prigogine and Stengers [9] discussed a case: when the surroundings of Dictyostelium discoideum become deficient in nutrition, the solitary cells spontaneously unite to form a big cluster. In this case the cells and the nutrient liquid may be regarded as an isolated system. Jantsch [21] pointed out that when different types of sponge cells and water are mixed into a uniform suspension, they settle after a few hours and then separate into the different types automatically.
More interestingly, when a small hydra is cut into single cells, these cells spontaneously evolve: they first form some cell clusters, then some malformations, and finally become a normal hydra.
In chemistry, the Belousov-Zhabotinsky reaction shows spontaneous periodic changes, at least for a certain time. In the microscopic region, the Pauli exclusion principle may maintain an ordered state spontaneously.
In fact, an auto-control mechanism in an isolated system may produce a degree of order. If it needs no input energy, at least in a given time interval, the auto-control acts like a type of Maxwell demon, which is just a type of internal interaction. The demon may be a permeable membrane. In an isolated system, it is possible for a catalyst and other substances to mix and produce a new ordered substance with smaller entropy. Ordering is the formation of structure through self-organization from a disordered state.
In a word, thermodynamics and its second law are based on certain prerequisites, such as statistical independence; only then is the entropy increase principle extended to arbitrary cases. We think the applicability of the principle should be re-examined when there are interactions among the subsystems of an isolated system: 1. The generalized second law of thermodynamics may not always be applicable. 2. The entropy increase principle in a nonequilibrium process may not always hold. 3. Whether every intermediate stage from beginning to end involves entropy increase should be examined. The relation between time and entropy shows rise and fall; namely, the entropy of such a system can increase or decrease in different time intervals. The possible mechanism behind these conclusions is fluctuation and self-interaction, from which self-organization may form a lower-entropy state.
Perhaps the second law of thermodynamics should be developed for systems with complex internal relations. Haken has pointed out [7] that in thermodynamics the entropy of closed systems never decreases, but the proof of this theorem is left to statistical mechanics and, to be quite frank, in spite of many efforts the problem is not completely solved. When internal interactions exist among subsystems, statistical independence and equal probability are unavailable. If fluctuations are magnified and the order parameter reaches a threshold value, a phase transition will occur. In this case the entropy of an isolated system may decrease, at least within a certain time, and a self-organized structure with smaller entropy will be formed.


  1. Prigogine, I. From Being to Becoming; W.H. Freeman: San Francisco, 1980.
  2. Weiss, W. Continuous Shock Structure in Extended Thermodynamics. Phys. Rev. E 1995, 52(6), R5760–R5763.
  3. Struchtrup, H.; Weiss, W. Maximum of the Local Entropy Production Becomes Minimal in Stationary Processes. Phys. Rev. Lett. 1998, 80(23), 5048–5051.
  4. Fort, J.; Llebot, J.E. Behaviour of Entropy in Relaxational Hydrodynamics. Phys. Lett. A 1995, 205(4), 281–286.
  5. Landau, L.D.; Lifshitz, E.M. Statistical Physics; Pergamon Press, 1980.
  6. Chang, Yi-Fang. Possible Decrease of Entropy due to Internal Interactions in Isolated Systems. In Entropy, Information and Intersecting Science; Yu, C.Z., Ed.; Yunnan Univ. Press, 1994; pp. 53–60; Apeiron 1997, 4(4), 97–99.
  7. Haken, H. Synergetics: An Introduction; Springer-Verlag, 1977.
  8. Reichl, L.E. A Modern Course in Statistical Physics; Univ. of Texas Press, 1980.
  9. Prigogine, I.; Stengers, I. Order Out of Chaos; Bantam Books: New York, 1984.
  10. Riedi, P.C. Thermal Physics, 2nd ed.; Oxford Science Publications, 1988.
  11. Schafer, H.; Mark, A.E.; van Gunsteren, W.F. Absolute Entropies from Molecular Dynamics Simulation Trajectories. J. Chem. Phys. 2000, 113(18), 7809–7817.
  12. Lev, B.I.; Zhugaevych, A.Ya. Statistical Description of Model Systems of Interacting Particles and Phase Transitions Accompanied by Cluster Formation. Phys. Rev. E 1998, 57(6), 6460–6469.
  13. Wehrl, A. General Properties of Entropy. Rev. Mod. Phys. 1978, 50(2), 221–260.
  14. Weinberg, S. Precision Tests of Quantum Mechanics. Phys. Rev. Lett. 1989, 62(5), 485–488.
  15. Chang, Yi-Fang. New Research of Particle Physics and Relativity; Yunnan Science and Technology Press, 1989; pp. 41–66.
  16. Chang, Yi-Fang. Test of Pauli's Exclusion Principle in Particle Physics, Astrophysics and Other Fields. Hadronic J. 1999, 22(3), 257–268.
  17. Peres, A. Nonlinear Variants of Schrödinger's Equation Violate the Second Law of Thermodynamics. Phys. Rev. Lett. 1989, 63(10), 1114.
  18. Shaw, H.R. The Periodic Structure of the Natural Record, and Nonlinear Dynamics. EOS 1987, 68(50), 1651–1666.
  19. Weinberg, S. Gauge and Global Symmetries at High Temperature. Phys. Rev. D 1974, 9(12), 3357–3378.
  20. Ashby, W.R. An Introduction to Cybernetics; Chapman & Hall, 1956.
  21. Jantsch, E. The Self-Organizing Universe; Pergamon, 1979.
