3.1: Finding the Boltzmann Equation
The probabilities of the energy levels of a constant-temperature system at equilibrium must depend only on the intensive variables that serve to characterize the equilibrium state. In Section 20.8, we introduce the principle of equal a priori probabilities, which asserts that any two microstates of an isolated system have the same probability. From the central limit theorem, we infer that an isolated system is functionally equivalent to a constant-temperature system when the system contains a sufficiently large number of molecules. From these ideas, we can now find the relationship between the energy values, \({\epsilon }_i\), and the corresponding probabilities,
\[P_i=P\left({\epsilon }_i\right)=g_i\rho \left({\epsilon }_i\right).\nonumber \]
Let us consider the microstates of an isolated system whose energy is \(E^{\#}\). For any population set, \(\{N_1,\ N_2,\dots ,N_i,\dots \}\), that has energy \(E^{\#}\), the following relationships apply.
- The sum of the energy-level populations is the total number of molecules: \[N=N_1+N_2+\dots +N_i+\dots =\displaystyle \sum^{\infty }_{j=1}{N_j}\nonumber \]
- The energy of the system is the sum of the energies of its constituent molecules: \[E^{\#}=N_1{\epsilon }_1+N_2{\epsilon }_2+\dots +N_i{\epsilon }_i+\dots =\displaystyle \sum^{\infty }_{j=1}{N_j}{\epsilon }_j\nonumber \]
- The product of powers of quantum-state probabilities is a constant: \[{\rho \left({\epsilon }_1\right)}^{N_1}{\rho \left({\epsilon }_2\right)}^{N_2}\dots {\rho \left({\epsilon }_i\right)}^{N_i}\dots =\kappa \nonumber \] or, equivalently, \[\begin{align*} N_1{ \ln \rho \left({\epsilon }_1\right)\ }+N_2{ \ln \rho \left({\epsilon }_2\right)\ }+\dots +N_i{ \ln \rho \left({\epsilon }_i\right)\ }+\dots &=\displaystyle \sum^{\infty }_{i=1}{N_i{ \ln \rho \left({\epsilon }_i\right)\ }} \\[4pt] &={ \ln \kappa \ } \end{align*}\nonumber \]
- For the system at constant temperature, the sum of the energy-level probabilities is one. When we infer that the constant-temperature system and the isolated system are functionally equivalent, we assume that this is true also for the isolated system: \[1=P\left({\epsilon }_1\right)+P\left({\epsilon }_2\right)+\dots +P\left({\epsilon }_i\right)+\dots =\displaystyle \sum^{\infty }_{j=1}{P\left({\epsilon }_j\right)}\nonumber \]
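To make these constraints concrete, the short sketch below enumerates every population set of a small isolated system that satisfies the first two conditions. The three-level system, its energies, \(N\), and \(E^{\#}\) are hypothetical values chosen only for illustration.

```python
from itertools import product

# Hypothetical example: three energy levels with energies 0, 1, 2
# (arbitrary units), N = 4 molecules, and total energy E# = 4.
energies = [0, 1, 2]
N_total = 4
E_total = 4

# Enumerate every candidate population set {N_1, N_2, N_3} and keep
# those that satisfy condition 1 (populations sum to N) and
# condition 2 (sum of N_i * epsilon_i equals E#).
population_sets = [
    counts
    for counts in product(range(N_total + 1), repeat=len(energies))
    if sum(counts) == N_total
    and sum(n * e for n, e in zip(counts, energies)) == E_total
]

print(population_sets)  # every {N_i} consistent with this N and E#
```

For these values, three population sets qualify, which illustrates that an isolated system's energy can generally be distributed over its molecules in more than one way.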
We want to find a function, \(\rho \left(\epsilon \right)\), that satisfies all four of these conditions. One way is to keep trying functions that look like they might work until we find one that does. A slightly more sophisticated version of this approach is to try the most general possible version of each such function and see if any set of restrictions will make it work. We could even try an infinite series. Suppose that we are clever (or lucky) enough to try the series solution
\[{ \ln \rho \left(\epsilon \right)\ }=c_0+c_1\epsilon +\dots +c_i{\epsilon }^i+\dots =\displaystyle \sum^{\infty }_{k=0}{c_k}{\epsilon }^k\nonumber \]
Then the third condition becomes
\[\begin{align*} {\ln \kappa \ } &=\displaystyle \sum^{\infty }_{i=1}{N_i}{ \ln \rho \ }\left({\epsilon }_i\right) \\[4pt]&=\displaystyle \sum^{\infty }_{i=1}{N_i}\displaystyle \sum^{\infty }_{k=0}{\left[c_k{\epsilon }^k_i\right]}\\[4pt]&=\displaystyle \sum^{\infty }_{k=0}{\displaystyle \sum^{\infty }_{i=1}{c_kN_i{\epsilon }^k_i}}=c_0\displaystyle \sum^{\infty }_{i=1}{N_i}{\epsilon }^0_i+c_1\displaystyle \sum^{\infty }_{i=1}{N_i}{\epsilon }^1_i+\displaystyle \sum^{\infty }_{k=2}{c_k\displaystyle \sum^{\infty }_{i=1}{N_i{\epsilon }^k_i}}\\[4pt]&=c_0N+c_1E^{\#}+\displaystyle \sum^{\infty }_{k=2}{c_k\displaystyle \sum^{\infty }_{i=1}{N_i{\epsilon }^k_i}} \end{align*}\nonumber \]
We see that the coefficient of \(c_0\) is \(N\) and the coefficient of \(c_1\) is the total energy, \(E^{\#}\). Therefore, the sum of the first two terms is a constant. We can make the trial function satisfy the third condition if we set \(c_k=0\) for all \(k>1\). We find
\[{ \ln \kappa \ }=\displaystyle \sum^{\infty }_{i=1}{N_i}{ \ln \rho \ }\left({\epsilon }_i\right)=\displaystyle \sum^{\infty }_{i=1}{N_i}\left(c_0+c_1{\epsilon }_i\right)\nonumber \]
The last equality is satisfied if, for each quantum state, we have
\[{ \ln \rho \ }\left({\epsilon }_i\right)=c_0+c_1{\epsilon }_i\nonumber \] or \[\rho \left({\epsilon }_i\right)=\alpha \ \mathrm{exp}\left(c_1{\epsilon }_i\right)\nonumber \]
where \(\alpha =\mathrm{exp}\left(c_0\right)\). Since the \({\epsilon }_i\) are positive and the probabilities \(\rho \left({\epsilon }_i\right)\) lie in the interval \(0<\rho \left({\epsilon }_i\right)<1\), we must have \(c_1<0\). Following custom, we let \(c_1=-\beta\), where \(\beta\) is a constant, and \(\beta >0\). Then,
\[\rho \left({\epsilon }_i\right)=\alpha \ \mathrm{exp}\left(-\beta {\epsilon }_i\right)\nonumber \] and \[P_i=g_i\rho \left({\epsilon }_i\right)=\alpha g_i\ \mathrm{exp}\left(-\beta {\epsilon }_i\right)\nonumber \]
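With this exponential form, the third condition follows directly: for any population set, \(\prod_i{\rho \left({\epsilon }_i\right)}^{N_i}={\alpha }^N\mathrm{exp}\left(-\beta E^{\#}\right)\), which depends only on \(N\) and \(E^{\#}\) and is therefore the same constant \(\kappa\) for every population set of the isolated system. A minimal numerical check, assuming hypothetical values of \(\alpha\), \(\beta\), and the energy levels:

```python
import math

# Hypothetical parameters and energy levels, for illustration only.
alpha, beta = 0.6, 1.3
energies = [0.0, 1.0, 2.0]

def rho(eps):
    # Quantum-state probability rho(eps) = alpha * exp(-beta * eps)
    return alpha * math.exp(-beta * eps)

def product_of_powers(counts):
    # Condition 3: product over levels of rho(eps_i) ** N_i
    result = 1.0
    for n, eps in zip(counts, energies):
        result *= rho(eps) ** n
    return result

# Two different population sets with the same N = 4 and E# = 4.
set_a = (1, 2, 1)   # N = 4, E# = 1*0 + 2*1 + 1*2 = 4
set_b = (2, 0, 2)   # N = 4, E# = 2*0 + 0*1 + 2*2 = 4

kappa_a = product_of_powers(set_a)
kappa_b = product_of_powers(set_b)

# Both equal alpha**N * exp(-beta * E#), so they agree.
print(kappa_a, kappa_b)
```

The two products agree because each reduces to \({\alpha }^4\mathrm{exp}\left(-4\beta \right)\); the exponential form converts the constraint on a product into the two sum constraints the trial series already satisfies.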
The fourth condition is that the energy-level probabilities sum to one. Using this, we have
\[1=\displaystyle \sum^{\infty }_{i=1}{P\left({\epsilon }_i\right)}=\alpha \displaystyle \sum^{\infty }_{i=1}{g_i\ \mathrm{exp}\left(-\beta {\epsilon }_i\right)}\nonumber \]
The sum of exponential terms is so important that it is given a name. It is called the molecular partition function. It is often represented by the letter \(z\). Letting
\[z=\displaystyle \sum^{\infty }_{i=1}{g_i\ \mathrm{exp}\left(-\beta {\epsilon }_i\right)}\nonumber \] we have
\[\alpha =\frac{1}{\displaystyle \sum^{\infty }_{i=1}{g_i\ \mathrm{exp}\left(-\beta {\epsilon }_i\right)}}=z^{-1}\nonumber \]
Thus, we have the Boltzmann probability:
\[\begin{align*} P\left({\epsilon }_i\right) &=g_i\rho \left({\epsilon }_i\right) \\[4pt] &=\frac{g_i\ \mathrm{exp}\left(-\beta {\epsilon }_i\right)}{\displaystyle \sum^{\infty }_{i=1}{g_i\ \mathrm{exp}\left(-\beta {\epsilon }_i\right)}} \\[4pt] &=\frac{g_i}{z}\ \mathrm{exp}\left(-\beta {\epsilon }_i\right) \end{align*} \nonumber \]
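As a numerical sketch of this result, the code below evaluates \(z\) and the Boltzmann probabilities for a hypothetical set of energy levels and degeneracies, truncating the infinite sum at five levels; all values are invented for illustration, and the truncation is justified only when the neglected terms \(g_i\,\mathrm{exp}\left(-\beta {\epsilon }_i\right)\) are negligible.

```python
import math

# Hypothetical energy levels (arbitrary units), degeneracies, and beta.
energies = [0.0, 1.0, 2.0, 3.0, 4.0]
degeneracies = [1, 3, 5, 7, 9]
beta = 1.0

# Molecular partition function: z = sum over levels of g_i * exp(-beta * eps_i),
# truncated at a finite number of levels.
z = sum(g * math.exp(-beta * e) for g, e in zip(degeneracies, energies))

# Boltzmann probability of each level: P_i = (g_i / z) * exp(-beta * eps_i)
probabilities = [g * math.exp(-beta * e) / z
                 for g, e in zip(degeneracies, energies)]

# Condition 4: the level probabilities sum to one (exactly, by construction,
# once z is computed from the same truncated level set).
print(sum(probabilities))
```

Note that, because of the degeneracy factors, the lowest-energy level need not be the most probable level even though each quantum state's probability decreases with its energy.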
The probability of an energy level depends only on its degeneracy, \(g_i\), its energy, \({\epsilon }_i\), and the constant \(\beta\). Since the equilibrium-characterizing population set is determined by the probabilities, we have \(P_i={N^{\bullet }_i}/{N}\), and
\[\frac{N^{\bullet }_i}{N}=\frac{g_i}{z}\ \mathrm{exp}\left(-\beta {\epsilon }_i\right)\nonumber \]
In Section 21.2, we develop Lagrange’s method of undetermined multipliers. In Section 21.3, we obtain the same result by applying Lagrange’s method to our model for the probabilities of the microstates of an isolated system. That is, we find the Boltzmann probability equation by applying Lagrange’s method to the entropy relationship,
\[S=-Nk\displaystyle \sum^{\infty }_{i=1}{P_i}{ \ln \rho \left({\epsilon }_i\right)\ }\nonumber \]
that we first develop in Section 20.11. In Section 21.4, we find the Boltzmann probability equation by using Lagrange’s method to find the values of \(N^{\bullet }_i\) that produce the largest possible value, \(W_{max}\), of \(W\) in an isolated system. This argument requires us to assume that there is a very large number of molecules in each of the occupied energy levels of the most probable population set. Since our other arguments do not assume anything about the magnitudes of the various \(N^{\bullet }_i\), it is evident that some of the assumptions we make when we apply Lagrange’s method to find the \(N^{\bullet }_i\) are not inherent characteristics of our microscopic model.