# 2.10: Entropy and Equilibrium in an Isolated System

In an isolated system, the probability of population set $$\{N_1,\ N_2,\dots ,N_i,\dots \}$$ is $$W\left(N_i,g_i\right){\rho }_{MS,N,\left\langle E\right\rangle }$$, where $${\rho }_{MS,N,\left\langle E\right\rangle }$$ is a constant. It follows that $$W=W\left(N_i,g_i\right)$$ is proportional to the probability that the system is in one of the microstates associated with the population set $$\{N_1,\ N_2,\dots ,N_i,\dots \}$$. Likewise, $$W^{\#}=W\left(N^{\#}_i,g_i\right)$$ is proportional to the probability that the system is in one of the microstates associated with the population set $$\{N^{\#}_1,N^{\#}_2,\dots N^{\#}_i,\dots \}$$. Suppose that we observe the isolated system for a long time. Let $$F$$ be the fraction of the time that the system is in microstates of population set $$\{N_1,\ N_2,\dots ,N_i,\dots \}$$ and $$F^{\#}$$ be the fraction of the time that the system is in microstates of the population set $$\{N^{\#}_1,N^{\#}_2,\dots N^{\#}_i,\dots \}$$. The principle of equal a priori probabilities implies that we would find

$\frac{F^{\#}}{F}=\frac{W^{\#}}{W}$

Suppose that $$W^{\#}$$ is much larger than $$W$$. This means there are many more microstates for $$\{N^{\#}_1,N^{\#}_2,\dots N^{\#}_i,\dots \}$$ than there are for $$\{N_1,\ N_2,\dots ,N_i,\dots \}$$. The fraction of the time that the population set $$\{N^{\#}_1,N^{\#}_2,\dots N^{\#}_i,\dots \}$$ characterizes the system is much greater than the fraction of the time $$\{N_1,\ N_2,\dots ,N_i,\dots \}$$ characterizes it. Alternatively, if we examine the system at an arbitrary instant, we are much more likely to find the population set $$\{N^{\#}_1,N^{\#}_2,\dots N^{\#}_i,\dots \}$$ than the population set $$\{N_1,\ N_2,\dots ,N_i,\dots \}$$. The larger $$W\left(N_1,g_1,\ N_2,g_2,\dots ,N_i,g_i,\dots \right)$$, the more likely it is that the system will be in one of the microstates associated with the population set $$\{N_1,\ N_2,\dots ,N_i,\dots \}$$. In short, $$W$$ predicts the state of the system; it is a measure of the probability that the macroscopic properties of the system are those of the population set $$\{N_1,\ N_2,\dots ,N_i,\dots \}$$.
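The counting behind this claim is easy to make concrete. The sketch below evaluates $$W$$ exactly for two population sets of a small hypothetical three-level system; the degeneracies and populations are invented for illustration, not taken from the text.

```python
from math import factorial, prod

def W(populations, degeneracies):
    """Thermodynamic probability W(N_i, g_i): the number of microstates
    for population set {N_i} over levels with degeneracies {g_i},
    W = N!/(N_1! N_2! ...) * g_1**N_1 * g_2**N_2 * ...  (exact integers)."""
    N = sum(populations)
    multinomial = factorial(N)
    for n in populations:
        multinomial //= factorial(n)  # exact: multinomial coefficients are integers
    return multinomial * prod(g**n for g, n in zip(degeneracies, populations))

# Hypothetical three-level system with degeneracies g = (1, 2, 2) and N = 3.
g = (1, 2, 2)
W_concentrated = W((3, 0, 0), g)   # all three molecules in level 1
W_spread = W((1, 1, 1), g)         # one molecule in each level

# Principle of equal a priori probabilities: the ratio of times the two
# population sets characterize the system is F_spread/F_conc = W_spread/W_conc.
print(W_concentrated, W_spread, W_spread / W_concentrated)  # 1 24 24.0
```

Even for three molecules the spread-out set is 24 times more probable; for macroscopic $$N$$ these ratios become astronomically large.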

If an isolated system can undergo change, and we re-examine it after a few molecules have moved to different energy levels, we expect to find it in one of the microstates of a more-probable population set; that is, in one of the microstates of a population set for which $$W$$ is larger. At still later times, we expect to see a more-or-less smooth progression: the system is in microstates of population sets for which the values of $$W$$ are increasingly larger. This can continue only until the system occupies one of the microstates of the population set for which $$W$$ is a maximum or a microstate of one of the population sets whose macroscopic properties are essentially the same as those of the constant-$$N$$-$$V$$-$$E$$ population set for which $$W$$ is a maximum.

Once this occurs, later inspection may find the system in other microstates, but it is overwhelmingly probable that the new microstate will still be one of those belonging to the largest-$$W$$ population set or one of those that are macroscopically indistinguishable from it. Any of these microstates will belong to a population set for which $$W$$ is very well approximated by $$W\left(\ N^{\textrm{⦁}}_1,g_1,\ N^{\textrm{⦁}}_2,g_2,\dots ,N^{\textrm{⦁}}_i,g_i,\dots \right)$$. Evidently, the largest-$$W$$ population set characterizes the equilibrium state of either the constant-$$N$$-$$V$$-$$T$$ system or the constant-$$N$$-$$V$$-$$E$$ system. Either system can undergo change until $$W$$ reaches a maximum. Thereafter, it is at equilibrium and can undergo no further macroscopically observable change.

Boltzmann recognized this relationship between $$W$$, the thermodynamic probability, and equilibrium. He noted that the unidirectional behavior of $$W$$ in an isolated system undergoing spontaneous change is like the behavior we found for the entropy function. Boltzmann proposed that, for an isolated (constant energy) system, $$S$$ and $$W$$ are related by the equation $$S=k{ \ln W\ }$$, where $$k$$ is Boltzmann’s constant. This relationship associates an entropy value with every population set. For an isolated macroscopic system, equilibrium corresponds to a state of maximum entropy. In our microscopic model, equilibrium corresponds to the population set for which $$W$$ is a maximum. By the argument we make in §6, this population set must be well approximated by the most probable population set, $$\{N^{\textrm{⦁}}_1,N^{\textrm{⦁}}_2,\dots ,N^{\textrm{⦁}}_i,\dots \}$$. That is, the entropy of the equilibrium state of the macroscopic system is

\begin{align*} S &= k \ln W_{max} \\[4pt] &= k \ln \frac{N!}{N^{\textrm{⦁}}_1!\,N^{\textrm{⦁}}_2!\dots N^{\textrm{⦁}}_i!\dots } + k\sum^{\infty }_{i=1}{N^{\textrm{⦁}}_i \ln g_i } \end{align*}
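Because $$N!$$ overflows ordinary arithmetic for macroscopic $$N$$, a numerical evaluation of $$S=k\ln W_{max}$$ works with $$\ln W$$ directly. The sketch below does this with the log-gamma function ($$\ln n!=\ln \Gamma (n+1)$$); the populations and degeneracies are hypothetical.

```python
from math import lgamma, log

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def entropy(populations, degeneracies):
    """S = k ln W, with ln W = ln N! - sum_i ln N_i! + sum_i N_i ln g_i.
    lgamma(n + 1) = ln n! keeps the sum finite even for macroscopic N."""
    N = sum(populations)
    lnW = lgamma(N + 1) - sum(lgamma(n + 1) for n in populations)
    lnW += sum(n * log(g) for n, g in zip(populations, degeneracies) if n > 0)
    return k_B * lnW

# Cross-check on a small system where W is countable by hand:
# W((1,1,1), (1,2,2)) = 3! * 2 * 2 = 24, so S should equal k ln 24.
S = entropy((1, 1, 1), (1, 2, 2))
print(abs(S - k_B * log(24)) < 1e-35)  # True
```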

This equation can be taken as the definition of entropy. Clearly, this definition is different from the thermochemical definition, $$S={q^{rev}}/{T}$$. We can characterize—imperfectly—the situation by saying that the two definitions provide alternative scales for measuring the same physical property. As we see below, our statistical theory enables us to define entropy in still more ways, all of which prove to be functionally equivalent. Gibbs characterized these alternatives as “entropy analogues;” that is, functions whose properties parallel those of the thermochemically defined entropy.

We infer that the most probable population set characterizes the equilibrium state of either the constant-temperature or the constant-energy system. Since our procedure for isolating the constant-temperature system affects only the thermal interaction between the system and its surroundings, the entropy of the constant-temperature system must be the same as that of the constant-energy system. Using $$N^{\textrm{⦁}}_i=NP_i=Ng_i\rho \left({\epsilon }_i\right)$$ and assuming that the approximation $${ \ln N^{\textrm{⦁}}_i!\ }=N^{\textrm{⦁}}_i{ \ln N^{\textrm{⦁}}_i\ }-N^{\textrm{⦁}}_i$$ is adequate for all of the energy levels that make a significant contribution to $$S$$, substitution shows that the entropy of either system depends only on probabilities:

\begin{align*} S &= kN \ln N - kN - k\sum^{\infty }_{i=1}{\left[NP_i \ln \left(NP_i\right) - NP_i\right]} + k\sum^{\infty }_{i=1}{NP_i \ln g_i } \\[4pt] &= kN \ln N - kN - kN\sum^{\infty }_{i=1}{\left[P_i \ln N + P_i \ln P_i - P_i - P_i \ln g_i \right]} \\[4pt] &= k\left(N \ln N - N\right) - k\left(N \ln N - N\right)\sum^{\infty }_{i=1}{P_i} - kN\sum^{\infty }_{i=1}{P_i\left[\ln P_i - \ln g_i \right]} \\[4pt] &= -kN\sum^{\infty }_{i=1}{P_i \ln \rho \left({\epsilon }_i\right) } \end{align*}
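The final equality can be checked numerically: for large $$N$$, $$\left( \ln W_{max} \right)/N$$ computed from exact log-factorials should agree with $$-\sum_i P_i \ln \rho \left({\epsilon }_i\right)$$. The sketch below assumes a hypothetical three-level system with $$\rho \left({\epsilon }_i\right)\propto e^{-{\epsilon }_i}$$ and $$N=10^6$$.

```python
from math import exp, lgamma, log

# Hypothetical three-level system; take rho(eps_i) proportional to
# exp(-eps_i), i.e. a Boltzmann distribution with beta = 1.
eps = (0.0, 1.0, 2.0)
g = (1, 2, 1)
Z = sum(gi * exp(-e) for gi, e in zip(g, eps))
P = [gi * exp(-e) / Z for gi, e in zip(g, eps)]   # P_i = g_i * rho(eps_i)
rho = [p / gi for p, gi in zip(P, g)]

# Left side: (1/N) ln W for the most probable populations N_i = N P_i.
N = 10**6
pops = [round(N * p) for p in P]
pops[-1] += N - sum(pops)                         # force sum(N_i) = N exactly
lnW = lgamma(N + 1) - sum(lgamma(n + 1) for n in pops)
lnW += sum(n * log(gi) for n, gi in zip(pops, g))
S_per_molecule = lnW / N                          # entropy per molecule, units of k

# Right side: -sum_i P_i ln rho(eps_i).
S_prob = -sum(p * log(r) for p, r in zip(P, rho))

print(abs(S_per_molecule - S_prob) < 1e-4)  # True: the two forms agree
```

The residual difference, of order $$\left( \ln N \right)/N$$, comes from the Stirling approximation used in the derivation; it vanishes in the macroscopic limit.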

The entropy per molecule, $${S}/{N}$$, is proportional to the expected value of $${ \ln \rho \left({\epsilon }_i\right)\ }$$; Boltzmann’s constant is the proportionality constant. At constant temperature, $$\rho \left({\epsilon }_i\right)$$ depends only on $${\epsilon }_i$$. The entropy per molecule depends only on the quantum state properties, $$g_i$$ and $${\epsilon }_i$$.