
4.1: Swendsen’s Postulates of Thermodynamics


    Cautionary Remarks on Entropy

    You will search in vain for a mathematical derivation or clear condensed explanation of entropy in textbooks and textbook chapters on statistical thermodynamics. There is a simple reason for this: no such derivation or explanation exists. With entropy being a central concept, probably the central concept of the theory, this may appear very strange. However, the situation is not as bad as it may seem. The theory and the expressions that can be derived from it work quite well and have predictive power. There are definitions of entropy in statistical thermodynamics (unfortunately, more than one) and they make some sense. Hence, while it may be unnerving that we cannot derive the central state function from scratch, we can still do many useful things and gain some understanding.

    Textbooks tend to sweep the problem under the rug. We won’t do that here. We make an honest attempt to clarify what we do know and what we don’t know about entropy before accepting one working definition and basing the rest of the theory on it. It is probably best to start with a set of postulates that explains what we expect from the quantity that we want to define.

    Swendsen’s Postulates

    The following postulates are introduced and briefly discussed in Section 9.6 of Swendsen’s book. We copy the long form of these postulates verbatim, with only small alterations that improve consistency or simplify the wording.

    1. There exist equilibrium states of a macroscopic system that are characterized uniquely by a small number of extensive variables.
    2. The values assumed at equilibrium by the extensive variables of an isolated system in the absence of internal constraints are those that maximize the entropy over the set of all constrained macroscopic states.
    3. The entropy of a composite system is additive over the constituent subsystems.
    4. For equilibrium states the entropy is a monotonically increasing function of the energy.
    5. The entropy is a continuous and differentiable function of the extensive variables.

    We have omitted Swendsen’s last postulate (The entropy is an extensive function of the extensive variables), because, strictly speaking, it is superfluous. If the more general third postulate of additivity is fulfilled, entropy is necessarily an extensive property.
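
    To see why, consider a compound system built from \(\lambda\) identical, non-interacting copies of the same subsystem, each in the same state (a minimal sketch with generic extensive variables \(u\), \(V\), \(n\); the extension from integer to real \(\lambda\) relies on the continuity assumed in the fifth postulate). Additivity then gives

    \[s(\lambda u, \lambda V, \lambda n) = \sum_{i=1}^{\lambda} s(u, V, n) = \lambda \, s(u, V, n) \ ,\]

    which is precisely the statement that \(s\) is a homogeneous function of first order in the extensive variables, i.e. an extensive function.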

    Swendsen’s first postulate (Equilibrium States) establishes the formalism of thermodynamics, while all the remaining postulates constitute a wish list for the quantity entropy that we need to predict the equilibrium states. They are a wish list in the sense that we cannot prove that a quantity with all these properties must exist. We can, however, test any proposed definition of entropy against these postulates.

    Some points need explanation. First, the set of postulates defines entropy as a state function, although this may be hidden. The first postulate implies that in equilibrium thermodynamics some extensive variables are state functions and that a small set of such state functions completely specifies all the knowledge that we can have about a macroscopic system. Because entropy in turn specifies the other state functions for an isolated system at equilibrium, according to the second postulate (Entropy Maximization), it must be a state function itself. It must be an extensive state function because of the third postulate (Additivity), but the third postulate requires more, namely that entropies can be added not only for subsystems of the same type in the same state, but also for entirely different systems. This is required if we want to compute a new equilibrium state (or entropy change) after unifying different systems. Otherwise, the simple calorimetry experiment of equilibrating a hot piece of copper with a colder water bath would already be outside our theory.

    The fourth postulate (Monotonicity) is new compared to what we discussed in phenomenological thermodynamics. For a classical ideal gas this postulate can be shown to hold. The postulate is needed because it ensures that temperature is positive, as made explicit below.

    The fifth postulate is a matter of mathematical convenience, although it may come as a surprise in a theory based on integer numbers of particles. We assume, as at many other points, that the system is sufficiently large for neglecting any errors that arise from treating the particle number as a real rather than an integer number. In other words, these errors must be smaller than the best precision that we can achieve in experiments. As we already know from phenomenological thermodynamics, the fifth postulate does not apply to first-order phase transitions, where entropy has a discontinuity.

    We further note that the second postulate is an alternative way of writing the Second Law of Thermodynamics. The phrase 'in the absence of internal constraints' in the second postulate ensures that the whole state space (or, for systems fully described by Hamiltonian equations of motion, the whole phase space) is accessible.
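
    The link between the fourth postulate and a positive temperature can be made explicit with the definition of temperature from phenomenological thermodynamics (quoted here only for illustration),

    \[\frac{1}{T} = \left( \frac{\partial s}{\partial u} \right)_{V,n} \ ,\]

    so that a monotonically increasing \(s(u)\) for equilibrium states implies \(\partial s / \partial u > 0\) and hence \(T > 0\).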

    Entropy in Phenomenological Thermodynamics

    Textbook authors are generally much more comfortable in discussing entropy as an abstract state function in phenomenological thermodynamics than in discussing its statistical thermodynamics aspects. We recall that the concept of entropy is not unproblematic in phenomenological thermodynamics either. We had accepted the definition of Clausius entropy,

    \[\mathrm{d} s = \frac{\mathrm{d} q_\mathrm{rev}}{T} \ , \label{eq:entropy_phen}\]

    where \(\mathrm{d} q_\mathrm{rev}\) is the differentially exchanged heat for a reversible process that leads to the same differential change in the other state variables as the irreversible process under consideration, and \(T\) is the temperature. We could then show that entropy is a state function (Carnot process and its generalization) and relate entropy via its total differential to other state functions. With this definition we could further show that for closed systems at constant temperature that can exchange heat, but not volume work, with their environment (\(\mathrm{d}V = 0\)), minimization of the Helmholtz free energy \(f = u - T s\) provides the equilibrium state, and that for closed systems at constant temperature and pressure (\(\mathrm{d} p = 0\)), minimization of the Gibbs free energy \(g = h - T s\) provides the equilibrium state. The partial molar Gibbs free energy is the chemical potential \(\mu_{k,\mathrm{molar}}\), and via \(\mu_{k,\mathrm{molecular}} = \mu_{k,\mathrm{molar}}/N_\mathrm{Av}\) it is related to terms in the partition function of the grand canonical ensemble, where we have abbreviated \(\mu_{k,\mathrm{molecular}}\) as \(\mu_k\) (Section [section:grand_canonical]).
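
    For later reference, the total differentials implied by Equation \ref{eq:entropy_phen} for a closed system of fixed composition with volume work as the only work term are (standard results of phenomenological thermodynamics, restated in the notation of this text)

    \[\mathrm{d}u = T\,\mathrm{d}s - p\,\mathrm{d}V \ , \qquad \mathrm{d}f = -s\,\mathrm{d}T - p\,\mathrm{d}V \ , \qquad \mathrm{d}g = -s\,\mathrm{d}T + V\,\mathrm{d}p \ .\]

    Together with the Clausius inequality \(\mathrm{d}s \geq \mathrm{d}q/T\), these relations lead to the criteria \(\mathrm{d}f \leq 0\) at constant \(T\) and \(V\) and \(\mathrm{d}g \leq 0\) at constant \(T\) and \(p\), which are the minimization statements referred to above.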

    In phenomenological thermodynamics we were unable to prove that the definition given in Equation \ref{eq:entropy_phen} ensures fulfillment of the Second Law. We could only give plausibility arguments for why such a quantity should increase in some spontaneous processes, but not more.

    Boltzmann’s Entropy Definition

    Boltzmann provided the first statistical definition of entropy by noting that it is the logarithm of probability, up to a multiplicative and an additive constant. The formula \(s = k \ln W\), written down by Planck, expresses Boltzmann’s definition but omits the additive constant. We shall soon see why.

    We now go on to test Boltzmann’s definition against Swendsen’s postulates. From probability theory and from our considerations on ensembles we know that, for a macroscopic system, probability density distributions for an equilibrium state are sharply peaked at their maximum. In other words, the macrostate with the largest probability is such a good representative of the equilibrium state that it serves to predict state variables with better accuracy than the precision of experimental measurements. It follows strictly that any definition of entropy that fulfills Swendsen’s postulates must make \(s\) a monotonically increasing function of probability density for an isolated system.
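
    A simple example, not part of the original argument, illustrates how sharp this peak is. For \(N\) ideal gas particles that can each reside in either half of a container, the statistical weight of the macrostate with \(N_1\) particles in the left half is

    \[\Omega(N_1) = \binom{N}{N_1} \ ,\]

    which is maximal at \(N_1 = N/2\) with a relative width \(\sigma_{N_1}/\langle N_1 \rangle = 1/\sqrt{N}\). For a macroscopic system with \(N \approx 10^{20}\), relative fluctuations of order \(10^{-10}\) are far below any experimentally achievable precision.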

    Why the logarithm? Let us express probability (for the moment, discrete again) by the statistical weights of the macrostates. We consider the isothermal combination of two independent systems A and B with entropies \(s_\mathrm{A}\) and \(s_\mathrm{B}\) into a total system with entropy \(s = s_\mathrm{A} + s_\mathrm{B}\). The equation for the total entropy is a direct consequence of Swendsen’s third postulate. On combination, the statistical weights \(\Omega_\mathrm{A}\) and \(\Omega_\mathrm{B}\) multiply, since the subsystems are independent. Hence, with a monotonically increasing function \(f(\Omega)\) we must have

    \[s = f(\Omega) = f(\Omega_\mathrm{A}\cdot\Omega_\mathrm{B}) = f(\Omega_\mathrm{A}) + f(\Omega_\mathrm{B}) \ . \label{eq:s_additivity}\]

    The only solutions of this functional equation are logarithm functions. Which logarithm we choose only influences the multiplicative constant. Hence, we can write

    \[s = k \ln \Omega \ , \label{eq:Boltzmann_entropy}\]

    where, for the moment, the constant \(k\) is unknown. Boltzmann’s possible additive constant must vanish at this point, because with such a constant the functional equation \ref{eq:s_additivity}, which specifies additivity of entropy, would not have a solution.
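
    Both claims, that the solutions of Equation \ref{eq:s_additivity} are logarithms and that the additive constant must vanish, follow from a short calculation if we use the differentiability granted by the fifth postulate. Differentiating \(f(\Omega_\mathrm{A}\Omega_\mathrm{B}) = f(\Omega_\mathrm{A}) + f(\Omega_\mathrm{B})\) with respect to \(\Omega_\mathrm{A}\) at fixed \(\Omega_\mathrm{B}\) and then setting \(\Omega_\mathrm{A} = 1\) gives

    \[\Omega_\mathrm{B}\, f'(\Omega_\mathrm{B}) = f'(1) \quad \Rightarrow \quad f(\Omega) = f'(1) \ln \Omega + c \ ,\]

    and inserting this form back into Equation \ref{eq:s_additivity} yields \(c = 2c\), hence \(c = 0\), while \(f'(1)\) is identified with the multiplicative constant \(k\).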

    It is tempting, in the context of phase space problems, to equate \(\Omega\) in Equation \ref{eq:Boltzmann_entropy} with the volume of phase space occupied by the system. Indeed, this concept is known as Gibbs entropy (see Section [Gibbs_entropy]). It is plausible, since the phase space volume specifies a statistical weight for a continuous problem. No problem arises if the Gibbs entropy is used for equilibrium states, as it then coincides with the Boltzmann entropy. There exists a conceptual problem, however, if we consider the approach to equilibrium. The Liouville theorem (see Section [Liouville]) states that the volume in phase space taken up by a system is a constant of motion. Hence, the Gibbs entropy is a constant of motion for an isolated system and the equilibrium state would be impossible to reach from any non-equilibrium state, which would necessarily occupy a smaller phase space volume. This leads to the following cautionary remark:

    Statistical thermodynamics, as we introduce it in this text, does not describe dynamics that leads from non-equilibrium to equilibrium states. Different equilibrium states can be compared and the equilibrium state can be determined, but we have made a number of assumptions that do not allow us to apply our expressions and concepts to non-equilibrium states without further thought. Non-equilibrium statistical thermodynamics is explicitly outside the scope of the theory that we present here.

    A conceptual complication with Boltzmann’s definition is that one might expect \(s\) to be maximal at equilibrium for a closed system, too, not only for an isolated system. In classical thermodynamics we have seen, however, that the equilibrium condition for a closed system is related to free energy. Broadly, we could say that for a closed system the probability must be maximized for the system and its environment together. Unfortunately, this cannot be done mathematically, as the environment is very large (in fact, for mathematical purposes, infinite). The solution to this problem lies in the treatment of the canonical ensemble (Section [section_canonical]). In that treatment we have seen that energy enters the maximization problem via the boundary condition of constant total energy, which specifies what exactly is meant by thermal contact between the system and its environment. We can, therefore, conclude that Boltzmann’s entropy definition, as further specified in Equation \ref{eq:Boltzmann_entropy}, fulfills those of Swendsen’s postulates that we have already tested, and that the core idea behind it, maximization of probability (density) at equilibrium, is consistent with our derivation of the partition function for a canonical ensemble at thermal equilibrium. We can thus fix \(k\) in Equation \ref{eq:Boltzmann_entropy} by deriving \(s\) from the partition function.
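
    As a preview of that derivation (a sketch only; the canonical probabilities \(p_i = e^{-\beta \epsilon_i}/z\) and the statistical entropy expression \(s = -k \sum_i p_i \ln p_i\), one of the definitions alluded to above, are assumed here), inserting these probabilities gives

    \[s = -k \sum_i p_i \ln p_i = k \beta u + k \ln z \ ,\]

    which reproduces the phenomenological relation \(f = u - Ts = -k_\mathrm{B} T \ln z\) if \(\beta = 1/(k_\mathrm{B} T)\) and \(k\) is identified with the Boltzmann constant \(k_\mathrm{B}\).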


    This page titled 4.1: Swendsen’s Postulates of Thermodynamics is shared under a CC BY-NC 3.0 license and was authored, remixed, and/or curated by Gunnar Jeschke via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.