4.8: The Statistical Interpretation of Entropy
Because entropy is such an important state function, it is natural to seek a description of its meaning on the microscopic level.
Entropy is sometimes said to be a measure of “disorder.” According to this idea, the entropy increases whenever a closed system becomes more disordered on a microscopic scale. This description of entropy as a measure of disorder is highly misleading. It does not explain why entropy is increased by reversible heating at constant volume or pressure, or why it increases during the reversible isothermal expansion of an ideal gas. Nor does it seem to agree with the freezing of a supercooled liquid or the formation of crystalline solute in a supersaturated solution; these processes can take place spontaneously in an isolated system, yet are accompanied by an apparent decrease of disorder.
Thus we should not interpret entropy as a measure of disorder. We must look elsewhere for a satisfactory microscopic interpretation of entropy.
A rigorous interpretation is provided by the discipline of statistical mechanics, which derives a precise expression for entropy based on the behavior of macroscopic amounts of microscopic particles. Suppose we focus our attention on a particular macroscopic equilibrium state. Over a period of time, while the system is in this equilibrium state, the system at each instant is in a microstate, or stationary quantum state, with a definite energy. The microstate is one that is accessible to the system—that is, one whose wave function is compatible with the system’s volume and with any other conditions and constraints imposed on the system. The system, while in the equilibrium state, continually jumps from one accessible microstate to another, and the macroscopic state functions described by classical thermodynamics are time averages of the properties of these microstates.
The fundamental assumption of statistical mechanics is that accessible microstates of equal energy are equally probable, so that the system, while in an equilibrium state, spends an equal fraction of its time in each such microstate. The statistical entropy of the equilibrium state then turns out to be given by the equation \begin{equation} S\subs{stat} = k \ln W + C \label{4.8.1} \end{equation} where \(k\) is the Boltzmann constant \(k=R/N\subs{A}\), \(W\) is the number of accessible microstates, and \(C\) is a constant.
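As a concrete illustration of Eq. 4.8.1, the following sketch evaluates \(S\subs{stat} = k \ln W\) (taking the constant \(C\) to be zero) for a hypothetical toy system: \(N = 100\) two-state spins whose fixed total energy admits exactly the arrangements with 50 spins up, so \(W = \binom{100}{50}\). The model and the numbers are illustrative assumptions, not part of the text.

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def statistical_entropy(W):
    """S_stat = k ln W, i.e. Eq. 4.8.1 with the constant C set to zero."""
    return k * math.log(W)

# Toy system (an assumption for illustration): 100 two-state spins with a
# fixed total energy that admits exactly the arrangements having 50 spins up.
W = math.comb(100, 50)          # number of accessible microstates
S = statistical_entropy(W)      # statistical entropy in J/K

# The dimensionless ratio S/k = ln W is about 66.8 for this W,
# close to N ln 2 = 69.3 for all 2**100 spin arrangements.
```

Even for this microscopic system \(W\) is astronomically large, which hints at why \(\ln W\) for a macroscopic sample yields an entropy of ordinary laboratory magnitude once multiplied by the tiny constant \(k\).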
In the case of an equilibrium state of a perfectly isolated system of constant internal energy \(U\), the accessible microstates are the ones that are compatible with the constraints and whose energies all have the same value, equal to the value of \(U\).
It is more realistic to treat an equilibrium state with the assumption that the system is in thermal equilibrium with an external constant-temperature heat reservoir. The internal energy then fluctuates over time with extremely small deviations from the average value \(U\), and the accessible microstates are the ones with energies close to this average value. In the language of statistical mechanics, the results for an isolated system are derived with a microcanonical ensemble, and for a system of constant temperature with a canonical ensemble.
A change \(\Del S\subs{stat}\) of the statistical entropy function given by Eq. 4.8.1 is the same as the change \(\Del S\) of the macroscopic second-law entropy, because the derivation of Eq. 4.8.1 is based on the macroscopic relation \(\dif S\subs{stat}=\dq/T=(\dif U-\dw)/T\) with \(\dif U\) and \(\dw\) given by statistical theory. If the integration constant \(C\) is set equal to zero, \(S\subs{stat}\) becomes the third-law entropy \(S\) to be described in Chap. 6.
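The agreement between \(\Del S\subs{stat}\) and the macroscopic \(\Del S\) can be checked in a simple case: the reversible isothermal doubling of the volume of an ideal gas, mentioned at the start of this section. A minimal numerical sketch, assuming each of the \(N\) independent particles gains twice as many accessible positions so that \(W\) increases by the factor \(2^N\):

```python
import math

k = 1.380649e-23      # Boltzmann constant, J/K
N_A = 6.02214076e23   # Avogadro constant, 1/mol
R = k * N_A           # gas constant, J/(K mol), from k = R/N_A

# Isothermal doubling of the volume of n mol of ideal gas.
# Assumption for illustration: each of the N = n*N_A particles has twice
# as many accessible positions, so W2/W1 = 2**N and
#     Delta S_stat = k ln(W2/W1) = N k ln 2.
n = 1.0               # amount of gas, mol
N = n * N_A
dS_stat = N * k * math.log(2)

# Macroscopic second-law result for the same reversible process:
#     Delta S = n R ln(V2/V1) = n R ln 2.
dS_macro = n * R * math.log(2)
# The two values are identical, since k = R/N_A.
```

The statistical and macroscopic values coincide exactly (about 5.76 J/K per mole), illustrating that the increase in entropy during the expansion corresponds to an increase in the number of accessible microstates.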
Equation 4.8.1 shows that a reversible process in which entropy increases is accompanied by an increase in the number of accessible microstates of equal, or nearly equal, internal energies. This interpretation of entropy increase has been described as the spreading and sharing of energy (Harvey S. Leff, Am. J. Phys., 64, 1261–1271, 1996) and as the dispersal of energy (Frank L. Lambert, J. Chem. Educ., 79, 1241–1246, 2002). It has even been proposed that entropy should be thought of as a “spreading function” with its symbol \(S\) suggesting spreading (Frank L. Lambert and Harvey S. Leff, J. Chem. Educ., 86, 94–98, 2009). The symbol \(S\) for entropy seems originally to have been an arbitrary choice by Clausius.