Review

On the Thermal Capacity of Solids

Institute of Physical Chemistry and Electrochemistry, Leibniz University Hannover, Callinstraße 3A, D-30167 Hannover, Germany
Entropy 2022, 24(4), 479; https://doi.org/10.3390/e24040479
Submission received: 26 February 2022 / Revised: 23 March 2022 / Accepted: 24 March 2022 / Published: 29 March 2022
(This article belongs to the Special Issue Nature of Entropy and Its Direct Metrology)

Abstract

The term thermal capacity appears to suggest a storable thermal quantity. However, this claim is not redeemed when thermal capacity is projected onto “heat”, which, like all energy forms, exists only in transit and is not a part of internal energy. The storable thermal quantity is entropy, and entropy capacity is a well-defined physical coefficient which has the advantage of being a susceptibility. The inverse of the entropy capacity relates the response of the system (change of temperature) to a stimulus (change of entropy), just as the fluid level responds to a change in the amount of fluid contained in a vessel. Frequently, entropy capacity has been used implicitly, which is clarified in examples of the low-temperature analysis of phononic and electronic contributions to the thermal capacity of solids. Generally, entropy capacity is used in the estimation of the entropy of a solid. Implicitly, the thermoelectric figure of merit refers to entropy capacity. The advantage of the explicit use of entropy capacity comes with a descriptive fundamental understanding of the thermal behaviour of solids, which is made clear by the examples of the Debye model of phonons in solids, the latest thermochemical modelling of carbon allotropes (diamond and graphite) and, not least, caloric materials. An electrocaloric cycle of barium titanate close to its paraelectric–ferroelectric phase transition is analysed by means of entropy capacity. Entropy capacity is a key to an intuitive understanding of thermal processes.

Graphical Abstract

1. Introduction

1.1. Energy and Entropy

In the traditional approach of thermodynamics, which identifies “heat” as thermal energy [1], “heat capacity” is an inadequate term. The word capacity conveys the notion that the quantity “heat” is contained in the receiving vessel; however, it has been pointed out by several authors that “heat” cannot be stored in a system (e.g., a solid). In his paragraph on the concept of “heat”, Zemansky [2] (p. 76) wrote that “‘heat’ is internal energy in transit” and “it would be incorrect to refer to the ‘heat’ in a body”. Similarly, Callen [3] (p. 112) wrote that “‘heat’ refers to a mode of energy flux rather than to an attribute of a state of a thermodynamic system”. Strunk [4] directly explicated that “‘heat’ is the strange thing that is flowing only, but disappears upon arrival in any system”. Falk and Ruppel [5] (p. 92) emphasised that “‘heat’ is not a part of internal energy, but an energy form”. The fact that “heat” is not in a system but only occurs when energy is exchanged, like all energy forms [6], is one of the most crucial points in thermodynamics, which cannot be stated often enough [7].
In principle, a “heat” current is coupled to an entropy current [5] (p. 92). As accentuated by Falk et al. [8], “...one must focus on the substance-like quantities accompanying the flow of energy if one wants to get a suitable description of energy transfer.” It is helpful to consider entropy as an energy carrier [8]. The amount of energy carried by entropy is “heat”. It only makes sense to speak about “heat” (thermal energy) when entropy flows. Interestingly, Callen [3] (p. 32) wrote that “a quasi-static flux of ‘heat’ into a system is associated with an increase of entropy of that system” [9]. Because there is no such thing as stored thermal energy, but there is stored entropy, it is reasonable to consider the entropy capacity of a solid. Although little known, entropy capacity is a well-defined coefficient that carries the real meaning of a capacity; it can be used in that manner with great advantage to the understanding of thermal processes and has been addressed by several authors.

1.2. Outline

The aim of this work was to provide access to sources of the dispersed knowledge on entropy capacity and to illustrate the usefulness of this concept. After an overview of the existing literature on this topic, basic relationships are reviewed. Then, the part of Wiberg’s textbook [10] related to entropy capacity is recapitulated, which gives a vivid picture of entropy capacity in general and of carbon allotropes in particular. A bridge is built between fundamental considerations and current fields of application by presenting the example of caloric materials and thermoelectrics. The phonon-related entropy capacity of solids is discussed with respect to the Debye model by examples given in Debye’s classical work [11]. In addition, the concept of entropy capacity is expanded to the electronic contribution using the model of the free electron gas at low temperature. Entropy capacity is often implicitly used when separating electronic and phononic contributions from the “heat capacity”. The discussion turns to persistent confusion due to the disruptive development of thermodynamics; a probable resolution is to leave dead metaphors behind.

2. Materials and Methods

Experiments to illustrate the analogy with a fluid vessel were performed using red wine—specifically Monopoles Nicola Napoléon Bordeaux Superior 2018 (Nicola Napoléon CIE & S.A.R.L, Saint-Émilion, Gironde, France, packager code EMB 33394)—and a glass of the type Schott Zwiesel Whisky Nose 120 (Zwiesel Kristallglas AG, Zwiesel, Germany). Video recording and photographing were performed using a Sony DSC-RX100 Mark 3 digital camera (Sony Corporation, Tokyo, Japan). Items were placed on a portable shooting table (Calumet Photographic Inc., Chicago, IL, USA) and the scene was illuminated using two Nanlite Lumipad 25 (Guangdong Naguang Photo & Video Systems Co., Ltd., Shantou City, Guangdong, China). Video editing was performed using HitFilm Express 14 (FXhome, Norwich, Norfolk, UK). The music in the videos is “Cute” from Bensound.com. Photo editing was performed using Image J, version 1.53o (Wayne Rasband, US National Institutes of Health, Bethesda, MD, USA).
Calculations of the graphs were performed using Python embedded into OriginPro, Version 2022 (OriginLab Corporation, Northampton, MA, USA). Graphs were set and analysed in OriginPro. Composite figures were arranged in PowerPoint in Microsoft Office Professional Plus 2016 (Redmond, WA, USA) and exported in portable document format (PDF).

3. Entropy Capacity

Lunn [12] (p. 2) stated that “at constant volume, the capacity of an ideal gas for change of thermal energy is constant but its capacity for change of entropy varies inversely as the absolute temperature”, which refers to the Dulong–Petit relation of the ideal gas.
Falk [13,14] recalled that entropy capacity (at the time known as heat capacity) was introduced by Joseph Black (1728–1799), who refined the term heat (caloric) that has been around for centuries. Entropy is a resurrection of the caloric [13,15,16,17] and endows entropy capacity with the real meaning of capacity. Falk made clear that entropy capacity is a susceptibility, i.e., a second derivative of a Massieu–Gibbs function with respect to intensive variable(s), and must be positive within the stability boundaries of the system. The inverse of the entropy capacity, which has been called heatability by Herrmann and Hauptmann [18] (p. 28ff), relates the response of the system (change of temperature) to a stimulus (change of entropy contained). The metrology of entropy capacity is addressed and so is the fact that at least two different entropy capacities need to be considered for a gas, e.g., at constant volume and at constant pressure, because the entropy contained in gas depends not only on temperature, but also noticeably on pressure. Falk and Ruppel [5] (p. 297f) addressed these aspects in a condensed form.
Strunk [19] (pp. 57f, 331) mentioned that (specific) entropy capacities are susceptibilities and can be obtained either by differentiating the entropy of the system with respect to the temperature or by dividing the “heat capacity” by the absolute temperature. He used the latter relationship to formulate susceptibility matrices for simple phases, which are symmetric because of the Maxwell relations, and each comprises an entropy capacity on its diagonal, respectively, [20] (pp. 7, 11). Constraints explicitly considered with respect to entropy capacities are the intensive variables pressure and chemical potential being constant.
Mareš et al. [21] criticised the inconvenient choice of the conceptual basis of thermodynamics created in the 19th century and identified entropy with the caloric (heat) of older theories. Entropy capacity [22] is presented by differentiating the entropy of the system with respect to temperature.
Job [23,24] treated several aspects of entropy capacity and explicitly addressed the Debye model of solids (p. 114f). The change from parabolic dependence (Debye model) to hyperbolic dependence (Dulong–Petit relationship) on temperature is interpreted as a phase transition, which is characterised by a maximum entropy capacity at approximately a quarter of the Debye temperature. In their undergraduate textbook, Job and Rüffler [25] dealt with molar and specific (per mass) entropy capacities in analogy to matter capacity and buffer capacity with demonstrative examples. The accompanying workbook [26] comprises several exercises on entropy capacity and provides the corresponding detailed solutions with helpful comments.
Fuchs [27,28,29,30] has developed the most extensive view on entropy capacity to date, which uses analogies to gravitation, hydraulics, electricity, and mechanics (i.e., mass as momentum capacitance). With respect to heating, he showed that entropy capacity relates the rate of change of temperature to the rate of change of the entropy. He mentioned that the direct measurement of entropy capacity is not simple and addressed the difficulties involved. He presented a temperature–entropy capacity diagram for the ideal gas and tables for the entropy capacity of some substances. A formulaic expression for the entropy capacity of phonons, according to the low-temperature approximation of the Debye model, is given. The entropy capacity of black-body radiation is treated. Considering the entropy capacity at constant magnetisation of a paramagnetic substance leads to a vivid and simple interpretation of magnetocalorics. Thoroughly analysed examples as well as detailed explanations and exercises with solutions [28] are provided. Fuchs et al. [31,32] put entropy capacity into the context of the historical development of the caloric theory and linked its values under different constraints (i.e., constant volume or constant pressure) via the adiabatic coefficient.
In his student textbook, Wiberg [10] vividly demonstrated that chemical substances are capacities for entropy. The abstract terms entropy and reaction entropy are substantiated as capacity factors for thermal energy, analogous to charge in electricity and the amount of fluid (water) in hydraulics. The intensity factor is then the absolute temperature, analogous to the electrical potential in electricity and the height of the fluid level in hydraulics. Wiberg [10] (p. 140) wrote: “When ‘heat’ is supplied to a chemical substance, its entropy content is increased. In the same way [that] the fluid level (i.e., the height of the amount of water) is raised while filling a water vessel with water, the entropy level (i.e., the temperature of the respective chemical substance) is raised while filling an entropy vessel (e.g., a gas or a liquid [or a solid]) with entropy. In both cases, the increase in height is dependent on the shape of the vessel [33].” The shape of the entropy vessel is given by the entropy capacity of the chemical substance. In the example of the allotropic phase transition from graphite to diamond or vice versa, which he discussed within a general concept of chemical reactions, Wiberg very clearly showed the consequence of the changed shape of the entropy vessel, which can cause the emission or absorption of entropy. Analogous to the traditional adjectives exothermic and endothermic, the adjectives exotropic and endotropic are suggested, which allow reactions with entropy being released (negative reaction entropy) to be distinguished from reactions with entropy being absorbed (positive reaction entropy), owing to the increased or decreased entropy capacity of the products compared to the educts. Wiberg gave very detailed figures for the amount of entropy stored in graphite and diamond at different temperatures.

4. Entropy Capacity versus “Heat Capacity”

The entropy capacity K relates the change in entropy S to the change in absolute temperature T:
$$ K := \frac{\partial S}{\partial T} \tag{1} $$
regardless of the constraints [13] of constant volume V and a constant number of particles N:
$$ K_{V,N} = \left( \frac{\partial S}{\partial T} \right)_{V,N} \tag{2} $$
or constant pressure p and a constant number of particles N:
$$ K_{p,N} = \left( \frac{\partial S}{\partial T} \right)_{p,N} \tag{3} $$
If the number of particles is implicitly kept constant, these quantities can be denoted as the entropy capacity at constant volume K V or the entropy capacity at constant pressure K p .
The “heat capacity” C V is related to the entropy capacity K V at constant volume by Equation (4), which refers to the change in internal energy E and thus transferred energy. Following the approach of Fuchs [29], it is semantically more appropriate to call C V the temperature coefficient of energy, which reflects its real meaning:
$$ C_V = T \cdot K_V = T \cdot \left( \frac{\partial S}{\partial T} \right)_V = \left( \frac{\partial E}{\partial T} \right)_V \tag{4} $$
The “heat capacity” C p is related to the entropy capacity K p at constant pressure by Equation (5), which refers to the change in enthalpy H and thus the transferred enthalpy. Following the approach of Fuchs [29], it is semantically more appropriate to call C p the temperature coefficient of enthalpy, which reflects its real meaning:
$$ C_p = T \cdot K_p = T \cdot \left( \frac{\partial S}{\partial T} \right)_p = \left( \frac{\partial H}{\partial T} \right)_p \tag{5} $$
The fact that C V and C p refer to the exchange of different quantities was discussed by Falk [13] (p. 188). Falk stated that the term “heat capacity” is linguistically and conceptually a trap.
Zemansky [2] (p. 306) stated that the expression derived from the partition function in statistical thermodynamics for entropy is simpler than the expression derived for the internal energy. Interestingly, when Callen [3] (p. 353f) treated the Debye model of solids, he did not derive the contribution of the phonons to the internal energy but to the molar entropy. As such, he implicitly used the entropy capacity to deduce the “heat capacity” as was performed herein in Equations (4) and (5). If temperature is known in addition to the values of one or the other, C V and K V or C p and K p are easily convertible. Values of entropy capacity for some substances are given in [25,29].
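The interconversion implied by Equations (4) and (5) can be sketched in a few lines of code. The following is a minimal illustration, not part of the original work; the tabulated temperatures and values are hypothetical placeholders, chosen only to show that C and K differ by a factor of T:

```python
# Sketch of Equations (4)/(5): "heat capacity" C and entropy capacity K
# are interconvertible via C = T * K when the temperature is known.
# All tabulated values below are hypothetical.

def entropy_capacity(temperatures, c_values):
    """Return K = C / T for each tabulated point (J K^-2 mol^-1)."""
    return [c / t for t, c in zip(temperatures, c_values)]

def temperature_coefficient(temperatures, k_values):
    """Return C = T * K for each tabulated point (J K^-1 mol^-1)."""
    return [t * k for t, k in zip(temperatures, k_values)]

T = [300.0, 600.0, 900.0]      # K (hypothetical grid)
C_p = [8.5, 16.9, 20.8]        # J K^-1 mol^-1 (hypothetical values)

K_p = entropy_capacity(T, C_p)
# The round trip recovers the original C_p values.
C_back = temperature_coefficient(T, K_p)
```

The round trip makes the point of Section 4 concrete: given the temperature, neither quantity carries more information than the other; only their meanings differ.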

5. Analogy: Storage of a Fluid in a Vessel

Wiberg [10] drew an analogy between the capacity of chemical substances to store entropy and the hydraulic capacity of a vessel to store a fluid. The latter is illustrated in Figure 1. The capacity of a glass to store a fluid depends upon its shape, which may change with the fluid level. In Figure 1, the fluid level is successively raised by equal height differences of 25 mm each, but the respective amount of fluid is quite different in each step because the vessel is wider or narrower at different heights.
Of course, the chemical substances are containers for entropy with permeable walls. Due to entropy permeation through nonadiabatic walls, at a certain rate, the entropy level in the container will drop when the temperature (entropy level) in the surroundings decreases, and the entropy level will rise when the temperature in the surroundings increases. For solids, equilibration takes a long time [34]. The situation with chemical substances in general is even more intricate because the flow of entropy under nonisothermal conditions is associated with the production of additional entropy [23,25,29]. Nevertheless, for storing entropy in chemical substances, the analogy is very instructive.

6. Entropy Capacity of Diamond and Graphite

The shape of the entropy vessel in Figure 2a corresponds to graphite and that in Figure 2b to diamond under isobaric conditions. The hatched area in Figure 2a marks an infinitesimal amount of entropy dS = K_p · dT, which is linked to an infinitesimal temperature interval dT by the entropy capacity K_p. The wider the vessel is, the more entropy dS must be filled in to increase the entropy level (i.e., the temperature) by dT. The entropy S contained in the vessel at a certain entropy level (i.e., temperature T) can easily be estimated according to Equation (6):
$$ S(T) = \int_0^T K_p \, \mathrm{d}T \tag{6} $$
Interestingly, the entropy is commonly estimated by such integrals with the integrand being C_p/T, which implicitly refers to the isobaric entropy capacity (as can be seen in Equation (5)). Using the entropy capacity explicitly comes with the benefit of clarity. The entropy stored in equivalent temperature intervals of 300 K each can be estimated in the same way and is added to these intervals in Figure 2a for graphite and in Figure 2b for diamond. From Figure 2c, diamond is obviously the narrower entropy vessel compared to graphite. The amount by which less entropy is stored in diamond than in graphite in each 300 K interval is added to the graph. The accumulated entropies of the present paper are in qualitative agreement with Wiberg’s book [10] (see Table A2), except that Wiberg equated, perhaps for didactic reasons, the entropy capacities of both carbon allotropes from 600 K upward.
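Equation (6) amounts to a simple numerical integration of the entropy capacity over temperature. As an illustrative sketch (using the classical Dulong–Petit form K̂ = 3R/T as a stand-in integrand rather than the actual graphite or diamond data of this work), the entropy added over a temperature interval can be computed as follows; in this special case, the closed form 3R·ln(T₂/T₁) serves as a check:

```python
import math

R = 8.314462618  # universal gas constant, J K^-1 mol^-1

def dulong_petit_K(T):
    """Molar entropy capacity in the classical Dulong-Petit limit: 3R/T."""
    return 3.0 * R / T

def stored_entropy(K, T1, T2, n=10000):
    """Numerically integrate S = int_T1^T2 K(T) dT (trapezoidal rule)."""
    h = (T2 - T1) / n
    s = 0.5 * (K(T1) + K(T2))
    for i in range(1, n):
        s += K(T1 + i * h)
    return s * h

# Entropy added between 300 K and 600 K; closed form is 3R ln(2).
S_interval = stored_entropy(dulong_petit_K, 300.0, 600.0)
```

The same routine, fed with tabulated K̂_p(T) data instead of the Dulong–Petit stand-in, reproduces the interval entropies annotated in Figure 2.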
The graphs of T as a function of K_p in Figure 2 show the inverse of the entropy capacity, i.e., ∂T/∂S, which has been called heatability by Herrmann and Hauptmann [18] (p. 28ff).
Usually, the diagrams are plotted as in Figure 3, with the temperature on the axis of ordinates and the entropy capacity on the axis of abscissae, which, however, does not change anything regarding their vivid meaning [10]. In Figure 3, in addition to the molar isobaric entropy capacity K̂_p of graphite and diamond, the molar isochoric entropy capacity K̂_V of the Dulong–Petit relationship (hyperbolic) is plotted. The respective Ĉ_p and Ĉ_V curves are given in Figure A1 but lack vivid meaning. Above 1800 K, K̂_p of graphite converges to the classical Dulong–Petit relationship, which here is 3·R·T⁻¹ [36] (p. 427), and eventually exceeds it, while K̂_p for diamond remains below. Here, R is the universal gas constant.
Graphite is an extreme example that cannot be described by a Debye model with a single Debye temperature. Its phonon density of states is highly complex owing to its strong anisotropy, with weak van der Waals forces between the basal planes and strong covalent bonds within them [37]. Even though diamond does not have such anisotropy, the Debye model oversimplifies the phonon dispersion and gives an inadequate prediction of the entropy capacity or the temperature coefficient of enthalpy [37]. In a heuristic approach, some researchers have introduced a temperature-dependent Debye temperature [38], which allows the Debye model to be maintained over varying temperature ranges [39] (p. 105ff) but thwarts the intention of predicting the temperature dependence of the isobaric entropy capacity or the temperature coefficient of enthalpy over a wide temperature range using a single parameter [36] (p. 459f). As a consequence, quite different values of the Debye temperature have been reported for diamond, depending upon whether the Debye model was fitted to low-, mid- or high-temperature empirical data. Examples are given in Figure A2.
Moreover, the difference between K_p or C_p (empirical) and K_V or C_V (model) can be significant at high temperatures, exceeding 10% [40]. The temperature dependences of the thermal expansion, the isothermal compressibility, and the molar volume must be considered to match the models to empirical data at higher temperatures. Due to incomplete data at high temperature, the combination of the aforementioned parameters into the Grüneisen parameter is often considered. With the assumption of a temperature-independent Grüneisen parameter, which implies that the Debye temperature depends only on the molar volume, the ratio K_p/K_V or C_p/C_V can be estimated [39] (p. 102ff). The current approach to providing model data for calculation of phase diagrams (Calphad) databases [37,41], however, is multiparameter fitting to empirical data. The model proposed by Bigdeli et al. [37] relies on multiple Einstein temperatures and gives reasonable estimates over a wide temperature range but deviates from empirical data above 3000 K (graphite) or 1000 K (diamond). Recently, a reliable description of C_p of the carbon allotropes diamond and graphite from 0.1 K to the melting point has been given by Vassiliev and Taldrik [35] using a Debye–Maier–Kelley hybrid model; it is used in this work, with the parameters listed in Table A1, to describe the isobaric entropy capacity.

7. Reaction Entropy

Wiberg [10] considered the allotropic phase transition of diamond to graphite as the special case of a chemical reaction. If the transformation of 1 mol diamond to 1 mol graphite is considered, integrating the difference between the entropy capacities of the product (i.e., graphite) and reagent (i.e., diamond) in Figure 3 leads to the molar reaction entropy Δ S ^ according to Equation (7):
$$ \Delta \hat{S}_{\text{diamond} \to \text{graphite}} = \int_0^T \left[ \hat{K}_{p,\text{graphite}}(T) - \hat{K}_{p,\text{diamond}}(T) \right] \mathrm{d}T \tag{7} $$
The molar reaction entropy according to Equation (7) is plotted in Figure 4 versus temperature.
The reaction entropy of this work is in qualitative agreement with Wiberg’s book [10] (see Table A2), except that Wiberg equated, perhaps for didactic reasons, the entropy capacity of both carbon allotropes from 600 K upward. Therefore, in Wiberg’s diagram, reaction entropy is constant from 600 K upward, but in Figure 4, the reaction entropy further increases at a decreasing rate. The reaction entropy at integral multiples of 300 K is indicated by horizontal dashed lines and corresponds to subsequent sums of the values in Figure 3.
In an inert atmosphere (absence of oxygen), diamond can be heated to approximately 2000 K. However, its surface is covered by a thin layer of graphite [35,42]. If the transformation is considered to proceed at constant temperature, the reaction entropy is isothermally absorbed. Otherwise, the temperature of the resulting graphite would temporarily drop until the reaction entropy balances the entropy level. Following the term endothermic, Wiberg [10] termed such a process, with positive reaction entropy, endotropic. The reverse reaction, whereby graphite is transformed into diamond, has been reported to occur at high temperatures (ca. 1573 K to 3573 K), achieved by flash heating, with high pressure (ca. 15 GPa) applied to keep the graphite at a strictly constant volume [43,44]. Under this high pressure, due to the rigidly fixed volume, the entropy capacity is expected to be smaller than the entropy capacity given in Figure 2a, which implicitly refers to a constant ambient pressure (ca. 100 kPa). The transformation of graphite into diamond has a negative reaction entropy and can thus be classified as an exotropic reaction. The classification is given as follows:
  • Endotropic reaction, Δ S > 0 : entropy is isothermally absorbed by the chemical substance(s) from the environment (Ref. [10], p. 155, Ref. [25], p. 231ff).
  • Exotropic reaction, Δ S < 0 : entropy is isothermally ejected from the chemical substance(s) to the environment (Ref. [10], p. 155, Ref. [25], p. 231ff).
Illustrative examples of the isothermal squeezing out or soaking up of entropy (sponge model) are given in [23,25,45]. If the reaction entropy cannot be exchanged with the environment at a sufficiently fast rate, however, the temperature of the chemical substance(s) temporarily changes. In the extreme case, the temperature change is adiabatic. This is discussed in the example of an electrocaloric material in Section 8.
Wiberg [10] integrated a graph analogous to Figure 4 to produce a vivid picture of the Gibbs–Helmholtz equation and deduced graphs of Gibbs free energy and enthalpy versus temperature for the transformation of diamond into graphite. Wiberg clearly deduced that endotropic reactions are possible at decreasing temperature only if the entropy to be absorbed ( Δ S > 0 ) is sufficiently large to compensate for the Gibbs free energy (e.g., transformation, evaporation, dissociation). In contrast, highly endothermic reactions with small reaction entropy only occur at very high temperatures. Exotropic reactions ( Δ S < 0 ) with large reaction entropy require highly exothermic conditions to occur at high temperature, while weakly exothermic reactions with small reaction entropy to be absorbed are only possible at low temperature (e.g., condensation, association). Thus, knowledge of reaction entropy as a function of temperature for the system of products and the system of reagents is important to obtain a vivid picture of possible reactions, and reaction entropy is closely linked to the entropy capacity of these systems.
In general, the molar reaction entropy Δ S ^ can be estimated according to Equation (8) by integrating the difference of molar entropy capacities of products K ^ p , i and reagents K ^ p , j weighted by respective stoichiometric coefficients ν i and ν j :
$$ \Delta \hat{S}(T) = \int_0^T \left[ \sum_{i \in \text{products}} \nu_i \cdot \hat{K}_{p,i}(T) - \sum_{j \in \text{reagents}} \nu_j \cdot \hat{K}_{p,j}(T) \right] \mathrm{d}T \tag{8} $$
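The stoichiometrically weighted integral of Equation (8) can be sketched numerically. The following is a minimal illustration with hypothetical Debye-like capacities of the form K = a·T² (the coefficients are invented for the example and are not the graphite/diamond data of this work), for which the closed-form result provides a check:

```python
def reaction_entropy(products, reagents, T, n=20000):
    """Sketch of Eq. (8): integrate the stoichiometrically weighted
    difference of product and reagent entropy capacities from 0 to T.

    products/reagents: lists of (nu, K) pairs, K a callable of temperature.
    Uses the trapezoidal rule.
    """
    h = T / n

    def integrand(t):
        return (sum(nu * K(t) for nu, K in products)
                - sum(nu * K(t) for nu, K in reagents))

    s = 0.5 * (integrand(0.0) + integrand(T))
    for i in range(1, n):
        s += integrand(i * h)
    return s * h

# Hypothetical low-temperature (Debye-like) capacities K = a * T^2:
def K_product_like(t):
    return 3.0e-6 * t**2   # J K^-2 mol^-1 (invented coefficient)

def K_reagent_like(t):
    return 1.0e-6 * t**2   # J K^-2 mol^-1 (invented coefficient)

dS = reaction_entropy([(1.0, K_product_like)], [(1.0, K_reagent_like)], 300.0)
# Closed form: (3e-6 - 1e-6) * 300^3 / 3 = 18 J K^-1 mol^-1
```

Replacing the invented capacities with tabulated K̂_p(T) curves for graphite and diamond would reproduce Equation (7) as the special case of a one-to-one allotropic transformation.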

8. Caloric Materials

So-called caloric materials often exhibit polymorphic phase transitions, which cause the absorption or release of entropy due to changed entropy capacity, and are triggered by magnetic stress (magnetocaloric [46]), mechanical stress (elastocaloric [47]), electrical field (electrocaloric [48,49]) or hydrostatic pressure (barocaloric [45]).
Using the example of magnetocaloric materials, Fuchs [29] (p. 234) discussed the coupling of magnetic and thermal processes. The flow of entropy from the environment into the material or out of the material into the environment depends on the latent entropy (with respect to magnetisation) and the entropy capacity K M at constant magnetisation M:
$$ K_M = \left( \frac{\partial S}{\partial T} \right)_M \tag{9} $$
directly leading to the latent entropy with respect to magnetisation (the extensive magnetic quantity). Latent entropy is related to the isothermal change of entropy [25] (p. 85), and it coincides with what has been called reaction entropy in the context of Equations (7) and (8). “The term latent denotes the property of entropy not to affect the temperature of the system during phase change [29] (p. 191f).” The common understanding of entropy is that it changes the temperature of a system. When it does not, it is termed latent entropy, in contrast to sensible entropy. Latent entropy (i.e., latent reaction entropy) gives an illustrative view of the isothermal absorption of entropy when the magnetisation is lowered. Upon lowering the magnetisation, the entropy vessel becomes wider and can store more entropy at a given temperature. Recall that the relationship of stored entropy to the entropy capacity is given by Equation (6).
When considering the adiabatic demagnetisation of a paramagnetic substance, which is used to reach ultralow temperature, the entropy capacity K H at a constant magnetic field H (the intensive magnetic quantity) is used:
$$ K_H = \left( \frac{\partial S}{\partial T} \right)_H \tag{10} $$
These views can easily be extended to other members of the family of caloric materials, with the appropriate entropy capacity to be identified. Given the multitude of possible thermal cycles, countless entropy capacities may be considered. Figure 5 provides some examples with either intensive or extensive fixed quantities. The respective symbols are explained in Table 1.
Giant electrocaloric effects have been reported for single-crystal BaTiO 3 [50]. In refs. [48,51], the theoretical electrical entropy versus temperature diagram for BaTiO 3 is discussed for different strengths of the applied electrical field. By differentiating these curves with respect to temperature, the electrical entropy capacity at constant electrical field K_ℰ can be obtained. However, preference is usually given to the empirical data on the specific temperature coefficient of energy at constant electrical field C̃_ℰ (see Figure A3b), which were reported by Bai et al. [52]. These data were used to deduce the specific entropy capacity at constant electrical field K̃_ℰ for zero field and ℰ = 10 kV·cm⁻¹ (see Figure A3b).
The electrocaloric cycle given by Scott [48] is adapted to barium titanate and analysed in Figure 6 with respect to entropy capacity. The cycle starts at zero field at a temperature of 412 K, slightly above the paraelectric–ferroelectric phase transition. With the electrical field applied in an adiabatic process, BaTiO 3 becomes a narrower entropy vessel, which can store the initial entropy only with the entropy level (i.e., temperature) increased by Δ T = 0.8 K to 412.8 K. Then, the reaction entropy Δ S̃ = 1.45 J·K⁻¹·kg⁻¹ is ejected and the temperature decreases to 412 K again. In another adiabatic process, the electrical field is decreased to zero again, which makes BaTiO 3 a wider entropy vessel, and its temperature decreases to 411.2 K. Then, entropy of an amount equal to the reaction entropy is absorbed, the temperature rises to 412 K, and the cycle is closed. The process is driven by polarisation energy and leads to the pumping of thermal energy. Note that arrows related to energy forms have different thicknesses. The reaction entropy Δ S̃ is ejected at a higher temperature than that at which it is absorbed, which makes the associated thermal energy ejected in the warm leg of the cycle larger than the thermal energy absorbed in the cold leg of the cycle. The absorption or ejection of polarisation energy or thermal energy changes the internal energy, but none of the energy forms are part of the internal energy.
The figures given here for entropy and temperature are not superbly accurate because the graphs given in [52] were sampled in steps of only 0.5 K and interpolated to 0.1 K steps using an Akima spline fit. Bai et al. [52] reported a specific reaction entropy of Δ S̃ = 1.9 J·K⁻¹·kg⁻¹ (Δ T = 1.6 K) at 412 K. The values in [52] were estimated from empirical C̃(T, ℰ) (see Equation (A6)) and T values using the relationship equivalent to Equation (11), but without explicitly mentioning entropy capacity:
$$ \Delta \tilde{S} = \int_0^T \left[ \tilde{K}(T, \mathcal{E}) - \tilde{K}(T, 0) \right] \mathrm{d}T \tag{11} $$
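Equation (11) is again a simple numerical integration of a capacity difference. The sketch below is purely illustrative: the two capacity curves are invented constants (the field narrows the "entropy vessel" by a fixed offset), not the BaTiO 3 data of Bai et al. [52], so that the expected result follows by inspection:

```python
def reaction_entropy_field(K_field, K_zero, T, n=10000):
    """Sketch of Eq. (11): integrate the difference between the specific
    entropy capacities with and without an applied field from 0 to T."""
    h = T / n

    def f(t):
        return K_field(t) - K_zero(t)

    s = 0.5 * (f(0.0) + f(T))
    for i in range(1, n):
        s += f(i * h)
    return s * h

# Hypothetical capacities (invented numbers, for illustration only):
def K_zero(t):
    return 2.0e-3            # J K^-2 kg^-1, zero-field capacity

def K_field(t):
    return 2.0e-3 - 1.0e-5   # J K^-2 kg^-1, narrower vessel under field

dS = reaction_entropy_field(K_field, K_zero, 400.0)
# Expected: -1.0e-5 * 400 = -4.0e-3 J K^-1 kg^-1 (entropy ejected)
```

A negative Δ S̃ corresponds to the exotropic step of the cycle in Figure 6, in which entropy is ejected while the field is on.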
In Figure 6 and the discussion given above, irreversibility is omitted for clarity. For the treatment of generated reaction entropy in addition to latent reaction entropy, the reader is referred to [25] (p. 241ff).

9. Thermoelectrics and Thermal Conductivity

The thermoelectric figure of merit f = z T can be expressed as
$$ f = \frac{\text{power factor}}{\Lambda} = \frac{\text{power factor}}{\lambda} \cdot T = zT \tag{12} $$
with the open-circuited specific thermal conductivity expressed either as the entropy conductivity Λ or the “heat” conductivity λ, which are related via the absolute temperature T according to λ = T · Λ [53,54]. An established method to measure the thermal conductivity is based on a light flash analyser, which estimates the thermal diffusivity D th [34]. When the density ρ and the specific isobaric “heat capacity” C̃ p or the specific isobaric entropy capacity K̃ p are also known, the “heat” conductivity:
\lambda = D_{\mathrm{th}} \cdot \rho \cdot \tilde{C}_p
or the entropy conductivity:
\Lambda = D_{\mathrm{th}} \cdot \rho \cdot \tilde{K}_p
can be obtained. The thermal diffusivity D_th can be regarded as the diffusion coefficient of “heat” as well as the diffusion coefficient of entropy [19]. Because the temperature T explicitly appears in the right part of Equation (12), the entropy conductivity is implicitly used (left part of Equation (12)), which in turn implicitly refers to the entropy capacity according to Equation (14). Notes on Fourier’s original work support the view that considerations of thermal conductivity should centre on a storable quantity [31].
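A minimal sketch of Equations (13) and (14), together with the relationship λ = T·Λ and the figure of merit of Equation (12); all material values are invented for illustration:

```python
# Sketch: "heat" conductivity (Equation (13)) and entropy conductivity
# (Equation (14)) from a light-flash measurement, and the figure of merit
# of Equation (12).  The numbers are illustrative, not measured values.
def heat_conductivity(D_th, rho, c_p):
    """lambda = D_th * rho * C~_p, in W m^-1 K^-1."""
    return D_th * rho * c_p

def entropy_conductivity(D_th, rho, k_p):
    """Lambda = D_th * rho * K~_p, in W m^-1 K^-2."""
    return D_th * rho * k_p

D_th = 1.0e-6   # thermal diffusivity, m^2 s^-1
rho = 5000.0    # density, kg m^-3
c_p = 400.0     # specific isobaric "heat capacity", J kg^-1 K^-1
T = 300.0       # absolute temperature, K

lam = heat_conductivity(D_th, rho, c_p)
Lam = entropy_conductivity(D_th, rho, c_p / T)  # K~_p = C~_p / T
assert abs(lam - T * Lam) < 1e-9                # lambda = T * Lambda

power_factor = 1.0e-3                           # W m^-1 K^-2, illustrative
f = power_factor / Lam                          # identical to power_factor*T/lam
print(lam, Lam, f)
```

The assertion makes the point of Equation (12) explicit: dividing the power factor by Λ or by λ/T yields the same dimensionless figure of merit.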

10. Phononic Contributions to Entropy Capacity: Debye Model

According to Equations (A4) and (A3), the molar isochoric entropy capacity of the phonon gas with Debye temperature Θ D is given by Equation (15):
\hat{K}_V = 9 \cdot R \cdot \frac{T^2}{\Theta_D^3} \cdot \int_0^{\Theta_D / T} \frac{x^4 \cdot e^x}{\left( e^x - 1 \right)^2} \, \mathrm{d}x
In Figure 7a, the molar isochoric entropy capacity according to Equation (15) is plotted as a function of temperature for five different Debye temperatures. The examples were taken from Debye’s original work [11] and correspond to extrapolations for lead ( Θ_D = 95 K ), silver ( Θ_D = 215 K ), copper ( Θ_D = 309 K ), aluminium ( Θ_D = 396 K ) and diamond ( Θ_D = 1830 K; a more accurate description is given in Figure A2). The parabolic low-temperature course according to the phonon-related part of Equation (17) is also shown for each Debye temperature. The corresponding graphs of the molar temperature coefficient of energy Ĉ_V versus temperature, which follow from combining Equations (4) and (15), are plotted in Figure 7b.
Remember that the purpose of considering the “heat capacity” C_V is to estimate the amount of entropy stored [3] (p. 32, 353f). This amount can easily be estimated by visually integrating the graphs in Figure 7a: the entropy stored in 1 mol lead at 300 K is many times the entropy stored in 1 mol diamond at the same temperature, which is not obvious from Figure 7b.
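For readers who wish to reproduce Figure 7a, Equation (15) can be evaluated with elementary quadrature. The sketch below checks the high-temperature Dulong–Petit limit K̂_V → 3R/T; the helper names are illustrative:

```python
import math

# Sketch of Equation (15): molar isochoric entropy capacity of the Debye
# phonon gas, evaluated with a plain trapezoidal quadrature (no external
# libraries needed; helper names are illustrative).
R = 8.314462618  # J mol^-1 K^-1

def debye_integrand(x):
    """x^4 e^x / (e^x - 1)^2, with its removable singularity at x = 0."""
    if x < 1e-8:
        return x * x  # series limit for x -> 0
    return x ** 4 * math.exp(x) / math.expm1(x) ** 2

def entropy_capacity_debye(T, theta_D, n=2000):
    """K^_V(T) in J mol^-1 K^-2 for a given Debye temperature."""
    y = theta_D / T
    h = y / n
    s = 0.5 * (debye_integrand(0.0) + debye_integrand(y))
    s += sum(debye_integrand(i * h) for i in range(1, n))
    return 9.0 * R * T ** 2 / theta_D ** 3 * s * h

# Far above the Debye temperature, K^_V approaches the Dulong-Petit value 3R/T:
print(entropy_capacity_debye(3000.0, 95.0), 3.0 * R / 3000.0)
```

At 300 K, the same function confirms that lead (Θ_D = 95 K) stores entropy at a far higher rate than diamond (Θ_D = 1830 K), in line with Figure 7a.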

11. Phononic and Electronic Contributions to Entropy Capacity

In the low-temperature limit of the Debye model, the electronic and phononic contributions to the molar temperature coefficient of energy C ^ V are traditionally considered according to Equation (16) [36]:
\hat{C}_V = \gamma \cdot T + \beta \cdot T^3 ,
Combining Equation (16) with Equation (4), the electronic and phononic contributions to the molar entropy capacity K ^ V in the low-temperature limit of the Debye model are obtained:
\hat{K}_V = \gamma + \beta \cdot T^2 ,
Here, γ is the molar isochoric entropy capacity of the electron gas, often called the Sommerfeld coefficient [36] (p. 47):
\gamma = \frac{\pi^2}{3} \cdot R \cdot k_B \cdot D(E_F) ,
Here, k_B is Boltzmann’s constant and D(E_F) is the electronic density of states at the Fermi energy E_F.
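As a worked illustration of Equation (18): for a free-electron metal with one conduction electron per atom, the density of states per atom at the Fermi energy is D(E_F) = 3/(2E_F). Assuming a textbook-level Fermi energy of about 7.0 eV for copper (an assumption, not a value from this article), the Sommerfeld coefficient comes out near 0.5 mJ·mol⁻¹·K⁻²:

```python
import math

# Sketch of Equation (18) for a free-electron metal with one conduction
# electron per atom, where the density of states per atom at the Fermi
# energy is D(E_F) = 3/(2 E_F).  E_F = 7.0 eV is an assumed, textbook-level
# value for copper, not a value taken from this article.
R = 8.314462618        # J mol^-1 K^-1
k_B = 1.380649e-23     # J K^-1
eV = 1.602176634e-19   # J

def sommerfeld_gamma(E_F, electrons_per_atom=1.0):
    D_EF = 3.0 * electrons_per_atom / (2.0 * E_F)  # states per atom per J
    return (math.pi ** 2 / 3.0) * R * k_B * D_EF   # J mol^-1 K^-2

gamma_Cu = sommerfeld_gamma(7.0 * eV)
print(gamma_Cu)  # of order 5e-4 J mol^-1 K^-2, i.e. ~0.5 mJ mol^-1 K^-2
```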
The coefficient β of the phononic contribution to the entropy capacity is as follows [36] (p. 459) and allows us to estimate the Debye temperature Θ_D:
\beta = \frac{12}{5} \cdot \pi^4 \cdot \frac{R}{\Theta_D^3}
Equation (17) gives a physical meaning to the so-called Sommerfeld coefficient γ , which is the electronic contribution to the entropy capacity. Obviously, the electronic contribution to the entropy capacity is independent of temperature in the low-temperature approximation of the Sommerfeld–Drude model.
To retrieve the coefficients γ and β in Equation (16), Ĉ_V/T is often plotted as a function of T² without reference to the entropy capacity. However, as mentioned previously, it is indeed the molar entropy capacity K̂_V that is plotted versus T². Examples of K̂_V regression lines to empirical K̂_p data according to Equation (17) are shown in Figure 8 for gold, silver, copper and aluminium in the nonsuperconducting state. Note that K̂_V (model) and K̂_p (empirical) are considered to coincide at very low temperatures.
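The regression procedure can be sketched as follows; the K̂ data are synthetic, generated from assumed values γ = 7 × 10⁻⁴ J·mol⁻¹·K⁻² and Θ_D = 343 K, not the empirical data of [55,56]:

```python
import math

# Sketch: retrieving gamma and beta from a least-squares line
# K = gamma + beta * T^2 (Equation (17)), then the Debye temperature from
# Equation (19).  The K data below are synthetic, generated from assumed
# values, not the measurements of [55,56].
R = 8.314462618  # J mol^-1 K^-1

def fit_gamma_beta(T_vals, K_vals):
    """Ordinary least squares of K against T^2."""
    x = [t * t for t in T_vals]
    n = len(x)
    mx = sum(x) / n
    my = sum(K_vals) / n
    beta = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, K_vals))
            / sum((xi - mx) ** 2 for xi in x))
    gamma = my - beta * mx
    return gamma, beta

def debye_temperature(beta):
    """Invert Equation (19): Theta_D = (12 pi^4 R / (5 beta))^(1/3)."""
    return (12.0 * math.pi ** 4 * R / (5.0 * beta)) ** (1.0 / 3.0)

# Assumed ground truth: gamma = 7e-4 J mol^-1 K^-2, Theta_D = 343 K
beta_true = 12.0 * math.pi ** 4 * R / (5.0 * 343.0 ** 3)
T = [1.0 + 0.25 * i for i in range(13)]    # 1 ... 4 K
K = [7.0e-4 + beta_true * t * t for t in T]
gamma, beta = fit_gamma_beta(T, K)
print(gamma, debye_temperature(beta))      # recovers ~7e-4 and ~343 K
```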
It is obvious from Table 2 that the values for the Debye temperature in [55,56] (Figure 8) differ from those in [11] (Figure 7). The former are higher because they were obtained by fitting to empirical data measured at lower temperatures than the latter.

12. Discussion

12.1. Thermal Capacity

The storable thermal quantity is not the energy form “heat”, but the fluid-like quantity entropy. Thus, it is reasonable to associate the term thermal capacity with entropy capacity, which has been widely used implicitly. Its explicit use comes with the advantage of a descriptive fundamental understanding of thermal processes. Entropy capacity is a susceptibility and its inverse relates the response of the solid (change of temperature) to a stimulus (change of entropy contained).

12.2. Units of Entropy and Entropy Capacity

To emphasise that entropy is a countable quantity in its own right, Wiberg [10] introduced for its unit the special name Clausius ( 1 Clausius = 1 cal·K⁻¹ = 4.1868 J·K⁻¹ ). The use of the special name Carnot for the unit of entropy ( 1 Carnot = 1 Ct = 1 J·K⁻¹ [25,57]), which goes back to a proposal by Callendar [15], has also been suggested. With respect to Figure 2a, it is then possible to state that 1 mol graphite contains an amount of entropy of 5.66 Ct at 300 K and 5.66 Ct + 8.95 Ct = 14.61 Ct at 600 K, which sounds better than saying 5.66 J·K⁻¹ or 14.61 J·K⁻¹. Following this approach, the entropy capacity of 1 mol graphite at 600 K can be expressed as 0.029 Ct·K⁻¹, i.e., Carnot per Kelvin, which sounds better than 0.029 J·K⁻².
It is a curious irony that a quantity that is central not only to thermal processes but involved in all dissipative (i.e., irreversible) physical processes has not yet received a special name in the International System of Units (SI).

12.3. Confusion and Resolution

Zemansky [2] (p. 76) summarised that the “idea” of heat (in older theories) as a form of energy was put forward in 1839 by Séguin and in 1842 by Mayer. Experiments by Joule during the period from 1840 to 1849 convinced the world. In 1847, von Helmholtz wrote a paper in which he applied Joule’s ideas to the sciences of physical chemistry and physiology. Fuchs [27] (p. 295) put forth the question “What did early experiments on heat prove?”. The short answer is that these “measurements were too crude” and “did not substantially add to the progress of thermodynamics”. Identifying heat with energy or an energy form was guided by prejudice rather than by a logical chain of reasoning.
“Heat capacity” has become a dead metaphor due to semantic shifts in the meaning of caloric (heat) during the development of thermodynamics from 1830 to 1850 [17,31,58]. Further dead metaphors are “heat storage”, “heat storage density”, “thermal energy storage density”, “heat reservoir” and “heat sink”, all of which generate mental images that are inconsistent with thermodynamics. More examples are given in [31].
Semantic and conceptual impositions of the traditional mechanical theory of “heat” can be avoided if instead of thermal energy, entropy is seen as a resurrection of Carnot’s caloric (heat). This view follows notes by Ostwald (Ref. [59] p. 77, Ref. [17] p. 10, [60]) and others [15,16,27,58,61]. In the first edition of his famous book, Fuchs [27] (p. 289ff) clearly outlined the misconceptions of the traditional mechanical theory of “heat”, but reconciled it with the caloric theory of heat by identifying corresponding terms and definitions of both approaches.
To overcome the dichotomy between theory and clarity, several authors have suggested the correction of the semantics in thermodynamics [17,23,27,29,31,58]. The traditional “heat” should be substituted by thermal energy and entropy substituted by heat. The quantity entropy, which is mostly considered difficult, would become such a simple thing that “any school boy [and school girl] can understand” [15] and that “can be learned intuitively” [62].

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/e24040479/s1, Video S1: Filling glass vessel with fluid; Video S2: Squeezing and relaxing a fluid-filled vessel; Computer Code S1: Python code to calculate entropy capacity and “heat capacity”.

Funding

The publication of this article was funded by the Open Access Fund of Leibniz University Hannover.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The author declares no conflict of interest.

Symbols

The following symbols are used in this manuscript:
C ˜ specific temperature coefficient of energy (at a constant electrical field)
(specific “heat capacity” at a constant electrical field)
C V temperature coefficient of energy (at constant volume)
(“heat capacity” at constant volume)
C ^ V molar temperature coefficient of energy
(molar “heat capacity” at constant volume)
C V , N temperature coefficient of energy at a constant number of particles
(“heat capacity” at constant volume and at a constant number of particles)
C p temperature coefficient of enthalpy
(“heat capacity” at constant pressure)
C ^ p molar temperature coefficient of enthalpy
(molar “heat capacity” at constant pressure)
C ˜ p specific temperature coefficient of enthalpy
(specific isobaric “heat capacity”)
C p , N temperature coefficient of enthalpy at a constant number of particles
(“heat capacity” at constant pressure and at a constant number of particles)
D th thermal diffusivity (diffusion coefficient of “heat”, diffusion coefficient of entropy)
D E F electronic density of states at the Fermi energy
E energy
E F Fermi energy
ℰ electrical field
f dimensionless thermoelectric figure of merit (see also z T)
H enthalpy
k B Boltzmann’s constant
K entropy capacity at constant electrical field
K ˜ specific entropy capacity at constant electrical field
K H entropy capacity at constant magnetic field
K M entropy capacity at constant magnetisation
K p entropy capacity at constant pressure (isobaric entropy capacity)
K ^ p molar isobaric entropy capacity
K ^ p , i molar isobaric entropy capacity of substance i
K ^ p , j molar isobaric entropy capacity of substance j
K p , N entropy capacity at constant pressure and at a constant number of particles
K ˜ p specific isobaric entropy capacity
K P entropy capacity at constant (electrical) polarisation
K V entropy capacity at constant volume (isochoric entropy capacity)
K V , N entropy capacity at constant volume and at a constant number of particles
K ^ V molar isochoric entropy capacity
K σ entropy capacity at constant stress
K ε entropy capacity at constant strain
N amount of substance (number of particles), given in mol
p pressure
P (electrical) polarisation
R universal gas constant
S entropy
T absolute temperature
V volume
x integration variable in Debye model
z T dimensionless thermoelectric figure of merit (see also f)
γ molar isochoric entropy capacity of the electron gas (Sommerfeld coefficient)
β factor in the Debye model (low-temperature limit)
Δ S reaction entropy
Δ S ^ molar reaction entropy
Δ T temperature difference
ε strain
λ open-circuited specific “heat” conductivity
Λ open-circuited specific entropy conductivity
ν i stoichiometric coefficient of substance i
ν j stoichiometric coefficient of substance j
ρ density
σ (mechanical) stress
Θ D Debye temperature

Appendix A. Entropy Capacity and “Heat Capacity” of Graphite and Diamond

Appendix A.1. “Heat Capacity” of Graphite and Diamond According to Vassiliev and Taldrik

Figure A1. Temperature dependence of the molar temperature coefficient of enthalpy C ^ p of graphite and diamond and the molar temperature coefficient of energy C ^ V according to the Dulong–Petit relationship (constant) of the classical ideal gas. C ^ p values were calculated according to the multiparameter model of Vassiliev and Taldrik [35] (see Appendix A.3).

Appendix A.2. Entropy Capacity and “Heat Capacity” of Diamond

Figure A2. (a) Molar entropy capacity of diamond ( K ^ p ) according to the multiparameter model of Vassiliev and Taldrik [35] (see Appendix A.3), compared to the Debye model ( K ^ V ) with two different Debye temperatures (1830 K [11] and 2240 K [38]). The low-temperature Debye model and Dulong–Petit relationship are also displayed. (b) Molar temperature coefficient of enthalpy C ^ p of diamond according to the multiparameter model by Vassiliev and Taldrik [35] (see Appendix A.3), compared to the Debye model ( C ^ V ) with two different Debye temperatures (1830 K [11] and 2240 K [38]). The low-temperature Debye model and Dulong–Petit relationship are also displayed.

Appendix A.3. Multiparameter Modelling of the Entropy Capacity and “Heat Capacity” of Graphite and Diamond

Recently, a reliable description of the C_p of the carbon allotropes diamond and graphite from 0.1 K to the melting point was given by Vassiliev and Taldrik [35] using a Debye–Maier–Kelley hybrid model. In this model, nine parameters must be estimated. Two parameters (a, b) are fixed by fitting the Maier–Kelley model to high-temperature empirical C_p values. Six further parameters are fixed by fitting the low-temperature empirical C_p using a linear combination of Debye functions for C_V with three Debye temperatures ( Θ_D,1 , Θ_D,2 , Θ_D,3 ) and three corresponding prefactors ( A_1 , A_2 , A_3 ). The last parameter ( T_0 ) is fixed to provide a smooth transition from the temperature coefficient of energy C_V to the temperature coefficient of enthalpy C_p. The parameter settings used in this work are listed in Table A1:
C_p = \left( a + b \cdot \frac{T}{1000} + \frac{3 \cdot R - a}{1 + \left( \frac{T}{T_0} \right)^2} \right) \cdot \frac{C_V}{3 \cdot N \cdot R}
With C V being:
C_V = 9 \cdot N \cdot R \cdot \sum_{j=1}^{3} A_j \cdot \tilde{f}_D\left(T, \Theta_{D,j}\right) \cdot \left( \frac{T}{\Theta_{D,j}} \right)^3
Here, f ˜ D is the Debye function:
\tilde{f}_D\left(T, \Theta_{D,j}\right) = \int_0^{\Theta_{D,j}/T} \frac{x^4 \cdot e^x}{\left( e^x - 1 \right)^2} \, \mathrm{d}x = 4 \int_0^{\Theta_{D,j}/T} \frac{x^3}{e^x - 1} \, \mathrm{d}x - \left( \frac{\Theta_{D,j}}{T} \right)^4 \cdot \left( e^{\Theta_{D,j}/T} - 1 \right)^{-1}
Using Equations (4) and (5), the entropy capacities have been estimated according to:
K_p = \left( a + b \cdot \frac{T}{1000} + \frac{3 \cdot R - a}{1 + \left( \frac{T}{T_0} \right)^2} \right) \cdot \frac{K_V}{3 \cdot N \cdot R}
With K V being:
K_V = 9 \cdot N \cdot R \cdot \sum_{j=1}^{3} A_j \cdot \tilde{f}_D\left(T, \Theta_{D,j}\right) \cdot \frac{T^2}{\Theta_{D,j}^3}
Table A2 shows good agreement between this work, based on the multiparameter model of Vassiliev and Taldrik [35], and Wiberg’s book [10] regarding accumulated entropies and reaction entropy. Recall that Wiberg equated the entropy capacity of both carbon allotropes from 600 K upward.
Table A1. Parameters for Equations (A1)–(A5) according to the Debye–Maier–Kelley hybrid model in the range of 0.1 K to the melting point for diamond (Table 6, 1b, in [35]) and graphite (Table 6, 2c, in [35]). Reprinted from Journal of Alloys and Compounds, 872, Vassiliev, V.P., Taldrik, A.F., Description of the heat capacity of solid phases by a multiparameter family of functions, 159682, Copyright (2021), with permission from Elsevier.
Phase      T_0      A_1      Θ_D,1     A_2      Θ_D,2     A_3      Θ_D,3     a        b
Diamond    1366     0.031    1833.6    0.488    1968.7    0.482    1824.5    24.59    0.287
Graphite   282.6    0.773    1949.9    0.114    426.4     0.114    947.9     24.25    0.848
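A Python sketch of the hybrid model, assuming the forms of Equations (A1) and (A2) given above with N = 1 and evaluating the Debye integral of Equation (A3) by simple quadrature; the Table A1 parameters for diamond are used:

```python
import math

# Sketch of the Debye-Maier-Kelley hybrid model with the Table A1
# parameters for diamond (N = 1 mol of atoms).  The equation forms follow
# Equations (A1)-(A3) as assumed here: Maier-Kelley behaviour at high
# temperature, C_p -> C_V at low temperature.
R = 8.314462618  # J mol^-1 K^-1

def f_debye(y, n=4000):
    """Integral of x^4 e^x / (e^x - 1)^2 from 0 to y (trapezoidal rule)."""
    def g(x):
        return x * x if x < 1e-8 else x ** 4 * math.exp(x) / math.expm1(x) ** 2
    h = y / n
    return h * (0.5 * (g(0.0) + g(y)) + sum(g(i * h) for i in range(1, n)))

def C_V(T, A, theta):
    """Equation (A2) with N = 1: weighted sum of three Debye terms."""
    return 9.0 * R * sum(a_j * f_debye(th / T) * (T / th) ** 3
                         for a_j, th in zip(A, theta))

def C_p(T, A, theta, a, b, T0):
    """Equation (A1) as assumed here."""
    prefactor = a + b * T / 1000.0 + (3.0 * R - a) / (1.0 + (T / T0) ** 2)
    return prefactor * C_V(T, A, theta) / (3.0 * R)

# Table A1, diamond:
A = (0.031, 0.488, 0.482)
theta = (1833.6, 1968.7, 1824.5)
a, b, T0 = 24.59, 0.287, 1366.0

print(C_p(300.0, A, theta, a, b, T0))  # molar C_p of diamond near room temperature
```

At 3000 K the prefactor dominates and C_p approaches the Maier–Kelley line a + b·T/1000, while at room temperature C_p is close to the Debye C_V, as Figure A1 and Figure A2 show.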

Appendix A.4. Comparison to Wiberg’s Book

Table A2. Amount of accumulated entropy S Δ T in graphite and diamond in equal temperature intervals Δ T of 300 K and the associated integrated reaction entropy Δ S Δ T .
Temperature interval    S_graphite^ΔT                 S_diamond^ΔT                  ΔS^ΔT
                        Wiberg [10] 1,2  This work 3  Wiberg [10] 1,2  This work 3  Wiberg [10] 1,2  This work 3
1500–1800 K             N/A       4.43                N/A       4.29                N/A       0.14
1200–1500 K             N/A       5.22                N/A       5.07                N/A       0.15
900–1200 K              6.28      6.32                6.28      6.12                0         0.20
600–900 K               7.49      7.80                7.49      7.39                0         0.41
300–600 K               8.83      8.95                7.95      7.56                0.88      1.39
0–300 K                 5.78      5.66                2.43      2.33                3.35      3.33
1 Wiberg [10] presented values of entropy in the unit 1 Clausius = 1 cal·K⁻¹ = 4.1868 J·K⁻¹. 2 Wiberg likely used thermochemical data (pp. 24, 149, [10]) from the Landolt–Börnstein tables [63] to construct entropy capacity versus temperature diagrams in units of Clausius per Kelvin versus Kelvin. 3 Thermochemical data for graphite and diamond used in this work rely on the multiparameter model of Vassiliev and Taldrik [35].
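The interval entropies of Table A2 are integrals of the entropy capacity over temperature. The sketch below illustrates the bookkeeping with a hypothetical hyperbolic Dulong–Petit capacity K(T) = 3R/T instead of the full multiparameter model, so the numbers are illustrative only:

```python
import math

# Sketch: accumulating entropy over an equal temperature interval, as in
# Table A2: S_interval = integral of K(T) dT.  A hypothetical hyperbolic
# Dulong-Petit capacity K(T) = 3R/T stands in for the full
# Vassiliev-Taldrik model, so the result is illustrative only.
R = 8.314462618  # J mol^-1 K^-1

def accumulated_entropy(K, T_lo, T_hi, n=10000):
    """Trapezoidal integral of the entropy capacity K(T) over [T_lo, T_hi]."""
    h = (T_hi - T_lo) / n
    s = 0.5 * (K(T_lo) + K(T_hi)) + sum(K(T_lo + i * h) for i in range(1, n))
    return s * h

def K_dulong_petit(T):
    return 3.0 * R / T

S_300_600 = accumulated_entropy(K_dulong_petit, 300.0, 600.0)
print(S_300_600)  # analytic value: 3R ln 2, approximately 17.3 J K^-1
```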

Appendix B. Entropy Capacity and “Heat Capacity” of Barium Titanate

Figure A3a was deduced from Figure A3b by sampling the graphs for zero field and ℰ = 10 kV·cm⁻¹ in steps of 0.5 K and using Equation (A6).
Figure A3. (a) Graph of the specific entropy capacity K̃ of BaTiO₃ versus temperature for zero field and ℰ = 10 kV·cm⁻¹; and (b) graph of the specific “heat capacity” C̃ of BaTiO₃ versus temperature for zero field and 4 different field strength levels from [52]. Figure (b) was reprinted from Physica Status Solidi A, 209, Bai, Y., Ding, K., Zheng, G.P., Shi, S.Q., Qiao, L., Entropy-change measurement of electrocaloric effect of BaTiO₃ single crystal, 941–944, Copyright (2012), with permission from Wiley-VCH.
The specific “heat capacity” at a constant electrical field is as follows:
\tilde{C}_{\mathcal{E}} = T \cdot \tilde{K}_{\mathcal{E}} = T \cdot \left( \frac{\partial \tilde{S}}{\partial T} \right)_{\mathcal{E}}

References and Notes

1. For clarity, the traditional term “heat” is put into quotation marks when it addresses the thermal energy. When the term heat is left without quotation marks, it can be read as entropy. “Heat capacity” can be read either as the temperature coefficient of energy or as the temperature coefficient of enthalpy, depending on whether isochoric or isobaric conditions apply.
2. Zemansky, M. Heat and Thermodynamics; McGraw-Hill: New York, NY, USA, 1951.
3. Callen, H.B. Thermodynamics—An Introduction to the Physical Theories of Equilibrium Thermostatics and Irreversible Thermodynamics; John Wiley and Sons: New York, NY, USA, 1960.
4. Strunk, C. Quantum transport of particles and entropy. Entropy 2021, 23, 1573.
5. Falk, G.; Ruppel, W. Energie und Entropie; Springer: Berlin, Germany, 1976.
  6. The term energy form should not be taken literally. There is only one kind of energy.
7. “Die Wärme ist aber kein Energieanteil, sondern eine Energieform....Daß aber Wärmeenergie nicht in einem System ‘drinsteckt’, sondern nur bei Energieaustausch auftritt, wie alle Energieformen, ist ein springender Punkt der Thermodynamik, auf den man nicht hartnäckig genug hinweisen kann.” (p. 92). [Translation: “But heat is not a part of the energy; it is a form of energy. ... That thermal energy is not ‘contained’ in a system, but appears only upon energy exchange, like all energy forms, is a crucial point of thermodynamics that cannot be pointed out insistently enough.”]
8. Falk, G.; Herrmann, F.; Schmid, G. Energy forms or energy carriers? Am. J. Phys. 1983, 51, 1074–1077.
  9. Callen’s famous book is dealing a lot with entropy as a central quantity in thermal phenomena.
10. Wiberg, E. Die Chemische Affinität; Walter de Gruyter & Co: Berlin, Germany, 1951.
11. Debye, P. Zur Theorie der spezifischen Wärmen. Ann. Phys. 1912, 344, 789–839.
12. Lunn, A.C. The measurement of heat and the scope of Carnot’s principle. Phys. Rev. (Ser. I) 1919, 14, 1–19.
13. Falk, G. Physik—Zahl und Realität, 1st ed.; Birkhäuser: Basel, Switzerland, 1990.
  14. Here, Falk uses heat as a synonym for entropy. When it goes to heat capacity, the wording is the same regardless if entropy capacity is meant or the traditional term. Careful reading is required, but from the context it is obvious which term is meant in each case. Falk states that the traditional term “heat capacity” is semantically and conceptually a trap.
15. Callendar, H. The caloric theory of heat and Carnot’s principle. Proc. Phys. Soc. Lond. 1911, 23, 153–189.
16. Falk, G. Entropy, a resurrection of caloric—A look at the history of thermodynamics. Eur. J. Phys. 1985, 6, 108–115.
17. Herrmann, F.; Pohlig, M. Which physical quantity deserves the name “quantity of heat”? Entropy 2021, 23, 1078.
18. Herrmann, F.; Hauptmann, H. (Eds.) The Karlsruhe Physics Course—Thermodynamics. 2019. Available online: http://www.physikdidaktik.uni-karlsruhe.de/Parkordner/transit/KPK%20Thermodynamics%20Sec%20II.pdf (accessed on 25 February 2022).
19. Strunk, C. Moderne Thermodynamik—Band 1: Physikalische Systeme und ihre Beschreibung, 2nd ed.; De Gruyter: Berlin, Germany, 2018.
20. Strunk, C. Moderne Thermodynamik—Band 2: Quantenstatistik aus Experimenteller Sicht, 2nd ed.; De Gruyter: Berlin, Germany, 2018.
21. Mareŝ, J.; Hulík, P.; Ŝesták, J.; Ŝpiĉka, V.; Kriŝtofik, J.; Stávek, J. Phenomenological approach to the caloric theory of heat. Thermochim. Acta 2008, 474, 16–24.
  22. Here, Mareŝ et al. use the term caloric capacity in place of entropy capacity.
23. Job, G. Neudarstellung der Wärmelehre—Die Entropie als Wärme; Akademische Verlagsgesellschaft: Frankfurt, Germany, 1972.
  24. Here, Job calls entropy heat* with an asterisks to distinguish it from the traditional use of “heat”.
25. Job, G.; Rüffler, R. Physical Chemistry from a Different Angle, 1st ed.; Springer: Berlin/Heidelberg, Germany, 2016.
26. Job, G.; Rüffler, R. Physical Chemistry from a Different Angle Workbook, 1st ed.; Springer: Berlin/Heidelberg, Germany, 2019.
27. Fuchs, H.U. The Dynamics of Heat—A Unified Approach to Thermodynamics and Heat Transfer, 1st ed.; Springer: New York, NY, USA, 1996.
28. Fuchs, H.U. Solutions Manual for The Dynamics of Heat, 1st ed.; Springer: New York, NY, USA, 1996.
29. Fuchs, H.U. The Dynamics of Heat—A Unified Approach to Thermodynamics and Heat Transfer, 2nd ed.; Springer: New York, NY, USA, 2010.
30. Here, Fuchs uses heat and caloric as synonyms for entropy. He uses heat capacity as a synonym for entropy capacity. When the thermal energy or the temperature coefficient of energy is meant, he puts the terms “heat” or “heat capacity” into quotation marks. In the second edition of the book, Fuchs uses the term entropy capacitance instead of entropy capacity to underline analogies to other branches of physics.
31. Fuchs, H.U.; D’Anna, M.; Corni, F. Entropy and the experience of heat. Entropy 2022, submitted.
  32. Here, Fuchs et al. use caloric as a synonym for entropy and caloric capacity as a synonym for entropy capacity.
33. “Führt man einem chemischen Stoff Wärme zu, so vergrößert man seinen Entropiegehalt. Und in derselben Weise, in der sich beim Füllen eines Wassergefäßes mit Wasser der Flüssigkeitsspiegel im Behälter hebt, also die Höhe der eingefüllten Wassermenge, steigt auch beim Füllen irgendeines Entropiegefäßes (z.B. eines Gases oder einer Flüssigkeit) mit Entropie die Höhe des Entropiespiegels, d.h. die Temperatur des betreffenden Körpers. Die durch eine bestimmte Substanzmenge bewirkte Höhenzunahme hängt in beiden Fällen von der Gestalt des Behälters ab.” (p. 140). [Translation: “If one supplies heat to a chemical substance, one increases its entropy content. And in the same way that the liquid level rises when a water vessel is filled with water, i.e., the height of the filled-in amount of water, the height of the entropy level, i.e., the temperature of the body concerned, also rises when any entropy vessel (e.g., a gas or a liquid) is filled with entropy. The increase in height caused by a given amount of substance depends in both cases on the shape of the vessel.”]
34. Buck, W.; Rudtsch, S. Chapter Thermal Properties. In Springer Handbook of Metrology and Testing; Springer: Berlin, Germany, 2011; pp. 453–484.
35. Vassiliev, V.P.; Taldrik, A.F. Description of the heat capacity of solid phases by a multiparameter family of functions. J. Alloys Compd. 2021, 872, 159682.
36. Ashcroft, N.W.; Mermin, N.D. Solid State Physics; Harcourt College Publishers: Fort Worth, TX, USA, 1976.
37. Bigdeli, S.; Chen, Q.; Selleby, M. A new description of pure C in developing the third generation of Calphad databases. J. Phase Equilibria Diffus. 2018, 39, 832–840.
38. Tohei, T.; Kuwabara, A.; Oba, F.; Tanaka, I. Debye temperature and stiffness of carbon and boron nitride polymorphs from first principles calculations. Phys. Rev. B 2006, 73, 064304.
39. Schmalzried, H.; Navrotsky, A. Festkörperthermodynamik—Chemie des Festen Zustandes; Verlag Chemie: Weinheim, Germany, 1975.
40. Voronin, G.F.; Kutsenok, I.B. Universal method for approximating the standard thermodynamic functions of solids. J. Chem. Eng. Data 2013, 58, 2083–2094.
41. Ohtani, H. Chapter The CALPHAD Method. In Springer Handbook of Metrology and Testing; Springer: Berlin, Germany, 2011; pp. 1061–1090.
42. Davies, G.; Evans, T. Graphitization of diamond at zero pressure and at a high pressure. Proc. R. Soc. A 1972, 382, 413–427.
43. Bundy, F. Direct conversion of graphite to diamond in static pressure apparatus. J. Chem. Phys. 1963, 38, 631–643.
44. Wentorf, R. Behavior of some carbonaceous materials at very high pressures and high temperatures. J. Phys. Chem. 1965, 69, 3063–3069.
45. Boldrin, D. Fantastic barocalorics and where to find them. Appl. Phys. Lett. 2021, 118, 170502.
46. Zarkevich, N.A.; Zverev, V.I. Viable materials with a giant magnetocaloric effect. Crystals 2020, 10, 815.
47. Chauhan, A.; Patel, S.; Vaish, R.; Bowen, C.R. A review and analysis of the elasto-caloric effect for solid-state refrigeration devices: Challenges and opportunities. MRS Energy Sustain. 2015, 2, E16.
48. Scott, J. Electrocaloric materials. Annu. Rev. Mater. Res. 2011, 41, 229–240.
49. Liu, Y.; Scott, J.F.; Dkhil, B. Direct and indirect measurements on electrocaloric effect: Recent developments and perspectives. Appl. Phys. Rev. 2016, 3, 031102.
50. Moya, X.; Stern-Taulats, E.; Crossley, S.; González-Alonso, D.; Kar-Narayan, S.; Planes, A.; Mañosa, L.; Mathur, N.D. Giant electrocaloric strength in single-crystal BaTiO3. Adv. Mater. 2013, 25, 1360–1365.
51. Cao, H.X.; Li, Z.Y. Electrocaloric effect in BaTiO3 thin films. J. Appl. Phys. 2009, 106, 094104.
52. Bai, Y.; Ding, K.; Zheng, G.P.; Shi, S.Q.; Qiao, L. Entropy-change measurement of electrocaloric effect of BaTiO3 single crystal. Phys. Status Solidi A 2012, 209, 941–944.
53. Feldhoff, A. Thermoelectric material tensor derived from the Onsager–de Groot–Callen model. Energy Harvest. Syst. 2015, 2, 5–13.
54. Feldhoff, A. Power conversion and its efficiency in thermoelectric materials. Entropy 2020, 22, 803.
55. Corak, W.S. Atomic heat of copper, silver, and gold from 1 K to 5 K. Phys. Rev. 1955, 98, 1699–1708.
56. Phillips, N.E. Heat capacity of aluminum between 0.1 K and 4.0 K. Phys. Rev. 1959, 114, 676–686.
57. Herrmann, F. The Karlsruhe Physics Course. Eur. J. Phys. 2000, 21, 49–58.
58. Job, G. Der Zwiespalt zwischen Theorie und Anschauung in der heutigen Wärmelehre und seine geschichtlichen Ursachen. Sudhoffs Archiv 1969, 53, 378–396.
59. Ostwald, W. Die Energie; Verlag Johann Ambrosius Barth: Leipzig, Germany, 1908.
60. Herrmann and Pohlig provide an English translation of Ostwald’s note that entropy is concordant with Carnot’s caloric (heat).
61. Hirshfeld, M.A. On “Some current misinterpretations of Carnot’s memoir”. Am. J. Phys. 1955, 23, 103.
62. Chen, M. Comment on ‘A new perspective of how to understand entropy in thermodynamics’. Phys. Educ. 2021, 56, 028002.
63. Landolt, H.; Börnstein, R. Physikalisch-Chemische Tabellen, 5th ed.; Springer: Berlin, Germany, 1923.
Figure 1. The capacity of a glass to store a fluid (here red wine) depends on the shape of the glass and changes with the fluid level. Depending on the shape of the glass vessel, different amounts of fluid are needed to raise the fluid level by 25 mm each. From left to right, the beakers contain 63 mL, 120 mL, 83 mL and 54 mL of fluid, which add to 320 mL when filled into the glass. A video sequence of filling the glass is available as Video S1.
Figure 2. Temperature dependence of isobaric entropy capacity K p of 1 mol carbon allotropes: (a) graphite; (b) diamond; (c) graphite and diamond with differences highlighted. Entropy capacities were calculated according to the multiparameter model by Vassiliev and Taldrik [35] (see Appendix A.3). Following Wiberg [10].
Figure 3. Temperature dependence of molar isobaric entropy capacity K ^ p of graphite and diamond and temperature dependence of molar isochoric entropy capacity K ^ V according to the Dulong–Petit relationship (hyperbolic) of the classical ideal gas. Isobaric entropy capacities were calculated according to the multiparameter model by Vassiliev and Taldrik [35] (see Appendix A.3). Following Wiberg [10].
Figure 3. Temperature dependence of molar isobaric entropy capacity K ^ p of graphite and diamond and temperature dependence of molar isochoric entropy capacity K ^ V according to the Dulong–Petit relationship (hyperbolic) of the classical ideal gas. Isobaric entropy capacities were calculated according to the multiparameter model by Vassiliev and Taldrik [35] (see Appendix A.3). Following Wiberg [10].
Figure 4. Molar reaction entropy of the transformation of diamond into graphite versus temperature as calculated according to the multiparameter model by Vassiliev and Taldrik [35] (see Appendix A.3). Following Wiberg [10].
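Because K_p = (∂S/∂T)_p, the reaction entropy plotted in Figure 4 follows from integrating the difference of the entropy capacities of the two allotropes over temperature. A sketch of this integration on tabulated data, using hypothetical placeholder values for ΔK_p rather than the Vassiliev–Taldrik model:

```python
# Reaction entropy from entropy capacity: since K_p = (dS/dT)_p, the molar
# reaction entropy of diamond -> graphite is
#     Delta_r S(T) = int_0^T [K_p(graphite) - K_p(diamond)] dT'.
# The Delta K_p values below are hypothetical placeholders for illustration,
# NOT the Vassiliev-Taldrik model [35] used for Figure 4.

T_grid = [0.0, 100.0, 200.0, 300.0]   # K
dK = [0.0, 0.004, 0.009, 0.011]       # Delta K_p, J K^-2 mol^-1 (hypothetical)

def trapz(y, x):
    """Composite trapezoidal rule for tabulated data."""
    return sum((y[i] + y[i + 1]) * (x[i + 1] - x[i]) / 2.0 for i in range(len(x) - 1))

delta_S = trapz(dK, T_grid)           # J K^-1 mol^-1
print(f"Delta_r S(300 K) = {delta_S:.2f} J K^-1 mol^-1")
```

Since ΔK_p ≥ 0 over the whole range, the integral grows monotonically with temperature, as the curve in Figure 4 does.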
Figure 5. Examples of the entropy capacity K of caloric materials, each with a different intensive or extensive quantity held constant.
Figure 6. An electrocaloric cycle of barium titanate interpreted using entropy capacity. Irreversibility is omitted for clarity. As an analogue, a video sequence of squeezing and relaxing a fluid-filled vessel is available as Video S2. Following Scott [48].
Figure 7. (a) Graph of the isochoric molar entropy capacity versus absolute temperature; (b) graph of the molar temperature coefficient of energy Ĉ_V versus absolute temperature. The graphs were calculated according to the Debye model for five different Debye temperatures on the examples given in Debye’s original work [11] and include low-temperature approximations (i.e., T² dependence in (a) and T³ dependence in (b)).
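The curves in Figure 7 can be reproduced from the Debye model, C_V(T) = 9R(T/Θ_D)³ ∫₀^{Θ_D/T} x⁴eˣ/(eˣ − 1)⁻² dx with K_V = C_V/T; at low temperature C_V follows the T³ law, so K_V goes as T². A self-contained numerical sketch (midpoint rule for the integral; the Debye temperature is a free parameter here, with the Cu value after [11] used for illustration):

```python
import math

R = 8.314462618  # molar gas constant, J K^-1 mol^-1

def debye_cv(T, theta):
    """Molar isochoric heat capacity C_V from the Debye model (J K^-1 mol^-1)."""
    if T <= 0.0:
        return 0.0
    upper = theta / T
    n = 2000                               # midpoint rule is accurate enough here
    h = upper / n
    integral = 0.0
    for i in range(n):
        x = (i + 0.5) * h                  # midpoint avoids the singularity at x = 0
        ex = math.exp(x)
        integral += x**4 * ex / (ex - 1.0)**2 * h
    return 9.0 * R * (T / theta)**3 * integral

def debye_kv(T, theta):
    """Molar isochoric entropy capacity K_V = C_V / T (J K^-2 mol^-1)."""
    return debye_cv(T, theta) / T

theta = 309.0                              # Debye temperature of Cu after [11], K
print(debye_cv(3000.0, theta))             # approaches the Dulong-Petit value 3R
# Low-temperature limit: C_V ~ (12 pi^4 / 5) R (T/theta)^3, hence K_V ~ T^2
T = 10.0
print(debye_cv(T, theta), (12.0 * math.pi**4 / 5.0) * R * (T / theta)**3)
```

Evaluating `debye_kv` on a temperature grid for several Θ_D values reproduces the family of curves in panel (a).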
Figure 8. Plot of regression lines to the molar entropy capacity versus temperature squared. Based on parameters from [55] (Au, Ag, Cu) and [56] (Al) as given in Table 2.
Table 1. Energy forms in the context of caloric effects and related intensive and extensive quantities.
| Caloric Effect | Energy Form | Conjugated Intensive Quantity | Conjugated Extensive Quantity |
|---|---|---|---|
| magnetocaloric | magnetisation energy | magnetic field H | magnetisation M |
| elastocaloric | elastic energy | stress σ | strain ε |
| electrocaloric | polarisation energy | electrical field E | polarisation P |
| barocaloric | compression energy | pressure p | volume V |
| all | thermal energy ¹ | temperature T | entropy S |

¹ Thermal energy is also called “heat”.
Table 2. Comparison of the molar electronic entropy capacity γ (Sommerfeld coefficient) and Debye temperature Θ_D for the elements displayed in Figure 7 and Figure 8.
| Substance | γ (mJ K⁻² mol⁻¹) | Θ_D (K) | Θ_D (K), after Debye [11] |
|---|---|---|---|
| Au | 0.743 [55] ¹ | 164.57 [55] ¹ | N/A |
| Ag | 0.610 [55] ¹ | 225.3 [55] ¹ | 215 |
| Cu | 0.688 [55] ¹ | 343.8 [55] ¹ | 309 |
| Al | 1.35 [56] ² | 427.7 [56] ² | 396 |

¹ Data from [55] refer to a corrected 1948 helium vapor pressure–temperature scale. ² Data from [56] refer to the 1959 helium vapor pressure–temperature scale.
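The regression analysis behind Figure 8 and Table 2 can be sketched as follows: at low temperatures, K = C/T = γ + βT², so a straight-line fit of K against T² yields the Sommerfeld coefficient γ as the intercept and the Debye temperature from the slope via β = 12π⁴R/(5Θ_D³). The sketch below generates noise-free synthetic data from the copper values in Table 2 and recovers both parameters:

```python
import math

R = 8.314462618          # molar gas constant, J K^-1 mol^-1
GAMMA = 0.688e-3         # Sommerfeld coefficient of Cu, J K^-2 mol^-1 (Table 2)
THETA_D = 343.8          # Debye temperature of Cu, K (Table 2)

# Low-temperature entropy capacity K(T) = C/T = gamma + beta * T^2, where the
# lattice part follows the Debye T^3 law: beta = 12 pi^4 R / (5 Theta_D^3).
beta = 12.0 * math.pi**4 * R / (5.0 * THETA_D**3)

# Noise-free synthetic "measurements" between 1 K and 4 K; real data scatter.
T = [1.0 + 0.25 * i for i in range(13)]
K = [GAMMA + beta * t**2 for t in T]

# Ordinary least squares of K versus T^2: intercept -> gamma, slope -> beta.
x = [t**2 for t in T]
n = len(x)
xm, ym = sum(x) / n, sum(K) / n
slope = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, K)) / \
        sum((xi - xm)**2 for xi in x)
intercept = ym - slope * xm

theta_fit = (12.0 * math.pi**4 * R / (5.0 * slope)) ** (1.0 / 3.0)
print(f"gamma   = {intercept * 1e3:.3f} mJ K^-2 mol^-1")  # recovers 0.688
print(f"Theta_D = {theta_fit:.1f} K")                      # recovers 343.8
```

With measured data the intercept and slope come from the regression lines of Figure 8; the procedure is otherwise identical.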


Feldhoff, A. On the Thermal Capacity of Solids. Entropy 2022, 24, 479. https://doi.org/10.3390/e24040479
