5.2: The Thermal Boltzmann Distribution


Consider an N-particle ensemble. The particles are not necessarily indistinguishable and may have mutual potential energy. Since this is a large system, there are many different ways to arrange its particles that yield the same thermodynamic state, though only one arrangement can occur at a time. Because each arrangement is equally likely, the probability of the thermodynamic state is the sum of the probabilities of its separate arrangements:

\[ p_N = W_N \, p_i \]

where \(p_N\) is the probability of the system, \(W_N\) is the total number of different possible arrangements of the \(N\) particles in the system, and \(p_i\) is the probability of each separate arrangement. Heisenberg's uncertainty principle states that it is impossible to simultaneously know the momentum and the position of an object with complete precision. In agreement with the uncertainty principle, the total possible number of combinations can be defined as the total number of distinguishable rearrangements of the \(N\) particles.

The most practical ensemble is the canonical ensemble, with \(N\), \(V\), and \(T\) fixed. We can imagine a collection of boxes with equal volumes and numbers of particles, with the entire collection kept in thermal equilibrium. Based on the Boltzmann factor, we know that for a system with states of energies \(\epsilon_1, \epsilon_2, \epsilon_3, \ldots\), the probability \(p_j\) that the system will be in state \(j\) with energy \(\epsilon_j\) is proportional to \(e^{-\epsilon_j/k_BT}\). The partition function plays a very important role in calculating the properties of a system; for example, it can be used to calculate the probability, as well as the energy, heat capacity, and pressure.

The Boltzmann Distribution

We are ultimately interested in the probability that a given distribution will occur, because we must have this information in order to obtain useful thermodynamic averages. Let's consider an ensemble of \(\mathcal{A}\) systems. We will define \(a_j\) as the number of systems in the ensemble that are in the quantum state \(j\); for example, \(a_1\) represents the number of systems in quantum state 1. The number of ways (microstates) of realizing a particular distribution \((a_1, a_2, \ldots)\) is:

\[ W(a_1, a_2, \ldots) = \frac{\mathcal{A}!}{a_1! \, a_2! \cdots} \]
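As an illustration (not part of the original derivation), this count is easy to evaluate numerically for a small, hypothetical set of occupation numbers; a minimal Python sketch:

```python
from math import factorial

def n_arrangements(occupations):
    """W(a1, a2, ...) = A! / (a1! a2! ...) for a list of occupation numbers."""
    A = sum(occupations)
    W = factorial(A)
    for a in occupations:
        W //= factorial(a)
    return W

# Hypothetical distribution: 10 systems occupying three states as (5, 3, 2)
print(n_arrangements([5, 3, 2]))   # 2520 distinct arrangements
```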

The overall probability \(P_j\) that a system is in the \(j\)th quantum state is obtained by averaging \(a_j/\mathcal{A}\) over all the allowed distributions. Thus, \(P_j\) is given by:

\[ P_j = \frac{\langle a_j \rangle}{\mathcal{A}} = \frac{1}{\mathcal{A}} \, \frac{\sum_{\mathbf{a}} W(\mathbf{a}) \, a_j(\mathbf{a})}{\sum_{\mathbf{a}} W(\mathbf{a})} \]

where the angle brackets indicate an ensemble average. Using this definition we can calculate any average property (i.e. any thermodynamic property):

\[ \langle M \rangle = \sum_j M_j P_j \]
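For example, given an assumed set of state probabilities and state energies, any ensemble average follows directly from this sum. A minimal sketch with made-up numbers:

```python
# Sketch of <M> = sum_j M_j P_j with made-up numbers (three states)
P = [0.7, 0.2, 0.1]            # assumed state probabilities; they must sum to 1
E = [0.0, 1.0e-20, 2.0e-20]    # assumed state energies, in joules
E_avg = sum(p * e for p, e in zip(P, E))
print(E_avg)                   # 4.0e-21 J
```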

The method of the most probable distribution is based on the idea that the ensemble-averaged occupation numbers are identical to the most probable ones. Physically, this results from the fact that we have so many particles in a typical system that the fluctuations from the mean are extremely (immeasurably) small. The equivalence of the average occupation number and the most probable occupation number (denoted \(a_j^*\)) is expressed as follows:

\[ P_j = \frac{\langle a_j \rangle}{\mathcal{A}} = \frac{a_j^*}{\mathcal{A}} \]

The probability function is subject to the following constraints:

  • Constraint 1: Conservation of energy requires \(E_{\text{total}} = \sum_j a_j \epsilon_j\), where \(\epsilon_j\) is the energy of the \(j\)th quantum state.
  • Constraint 2: Conservation of mass requires \(\mathcal{A} = \sum_j a_j\), which says only that the total number of all of the systems in the ensemble is \(\mathcal{A}\).

As we will learn in later chapters, the system will tend towards the distribution of \(a_j\) that maximizes the total number of microstates \(W\). This can be expressed as:

\[ \sum_j \left( \frac{\partial \ln W}{\partial a_j} \right) da_j = 0 \]

Our constraints become:

\[ \sum_j \epsilon_j \, da_j = 0 \]

\[ \sum_j da_j = 0 \]

The method of Lagrange multipliers (named after Joseph-Louis Lagrange) is a strategy for finding the local maxima and minima of a function subject to equality constraints. Using the method of Lagrange undetermined multipliers, we have:

\[ \sum_j \left[ \left( \frac{\partial \ln W}{\partial a_j} \right) da_j + \alpha \, da_j - \beta \epsilon_j \, da_j \right] = 0 \]

We can use Stirling's approximation:

\[ \ln x! \approx x \ln x - x \]

to evaluate:

\[ \left( \frac{\partial \ln W}{\partial a_j} \right) \]

to get:

\[ \left( \frac{\partial \ln \mathcal{A}!}{\partial a_j} \right) - \sum_i \left( \frac{\partial \ln a_i!}{\partial a_j} \right) = -\ln\left(\frac{a_j}{\mathcal{A}}\right) \]

as outlined below.
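As an aside, a quick numerical check (not from the original text) shows how accurate Stirling's approximation already is for modest \(x\), and why it is essentially exact for the enormous occupation numbers of a real ensemble:

```python
from math import lgamma, log

def ln_factorial(x):
    """Exact ln(x!) via the log-gamma function."""
    return lgamma(x + 1)

for x in (10, 100, 1000, 1e6):
    exact = ln_factorial(x)
    stirling = x * log(x) - x
    print(f"x = {x:>9g}   ln x! = {exact:16.2f}   x ln x - x = {stirling:16.2f}   "
          f"relative error = {(exact - stirling) / exact:.1e}")
```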

Application of Stirling's Approximation

The first step is to note that:

\[ \ln W = \ln \mathcal{A}! - \sum_j \ln a_j! \approx \mathcal{A}\ln\mathcal{A} - \mathcal{A} - \sum_j a_j \ln a_j + \sum_j a_j \]

Since (from Constraint 2 above):

\[ \mathcal{A} = \sum_j a_j \]

these two cancel to give:

\[ \ln W = \mathcal{A}\ln\mathcal{A} - \sum_j a_j \ln a_j \]

The derivative is:

\[ \left( \frac{\partial \ln W}{\partial a_j} \right) = \frac{\partial \left( \mathcal{A}\ln\mathcal{A} \right)}{\partial a_j} - \frac{\partial \left( \sum_i a_i \ln a_i \right)}{\partial a_j} \]

Therefore we have:

\[ \frac{\partial \left( \mathcal{A}\ln\mathcal{A} \right)}{\partial a_j} = \frac{\partial \mathcal{A}}{\partial a_j}\,\ln\mathcal{A} + \mathcal{A}\,\frac{1}{\mathcal{A}}\,\frac{\partial \mathcal{A}}{\partial a_j} = \ln\mathcal{A} + 1 \]

\[ \frac{\partial \left( \sum_i a_i \ln a_i \right)}{\partial a_j} = \sum_i \frac{\partial a_i}{\partial a_j}\,\ln a_i + \sum_i a_i\,\frac{1}{a_i}\,\frac{\partial a_i}{\partial a_j} = \ln a_j + 1 \]

These latter derivatives result from the fact that:

\[ \frac{\partial a_i}{\partial a_i} = 1 \]

\[ \frac{\partial a_j}{\partial a_i} = 0 \qquad (i \neq j) \]
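These partial derivatives can also be verified symbolically. A minimal sketch, assuming SymPy is available, for a hypothetical three-state ensemble with \(\mathcal{A} = a_1 + a_2 + a_3\):

```python
import sympy as sp

a1, a2, a3 = sp.symbols('a1 a2 a3', positive=True)
A = a1 + a2 + a3   # total number of systems, treated as a function of the a_i

# Stirling form: ln W = A ln A - sum_i a_i ln a_i
lnW = A * sp.log(A) - (a1 * sp.log(a1) + a2 * sp.log(a2) + a3 * sp.log(a3))

# The derivative with respect to a1 should reduce to ln A - ln a1, i.e. -ln(a1/A)
print(sp.simplify(sp.diff(lnW, a1)))
```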

The simple expression that results from these manipulations is:

\[ -\ln\left(\frac{a_j}{\mathcal{A}}\right) + \alpha - \beta\epsilon_j = 0 \]

The most probable distribution is:

\[ \frac{a_j^*}{\mathcal{A}} = e^{\alpha}\, e^{-\beta \epsilon_j} \]

Now we need to find the undetermined multipliers α and β.

Summing the most probable distribution over all \(j\) and using Constraint 2 (\(\sum_j a_j^* = \mathcal{A}\)), the left-hand side equals 1, which fixes \(e^{\alpha} = 1/\sum_j e^{-\beta\epsilon_j}\). Thus, we have:

\[ P_j = \frac{a_j^*}{\mathcal{A}} = \frac{e^{-\beta\epsilon_j}}{\sum_j e^{-\beta\epsilon_j}} \]

This determines \(\alpha\) and defines the Boltzmann distribution. We will show that the \(\beta\) obtained from the Lagrange multiplier optimization is:

\[ \beta = \frac{1}{k_B T} \]

This identification will show the importance of temperature in the Boltzmann distribution. The distribution represents a thermally equilibrated most probable distribution over all energy levels (Figure 17.2.1 ).

Figure 17.2.1: At lower temperatures, the lower energy states are more greatly populated. At higher temperatures, more higher energy states are populated, but each is populated less. \(k_BT \approx 2.5\ \mathrm{kJ\,mol^{-1}}\) at 300 K. (CC SA-BY 3.0; Mysterioso via Wikiversity).
Boltzmann Distribution

The Boltzmann distribution represents a thermally equilibrated most probable distribution over all energy levels. There is always a higher population in a state of lower energy than in one of higher energy.

Once we know the probability distribution for energy, we can calculate thermodynamic properties like the energy, entropy, free energies, and heat capacities, which are all average quantities (via the ensemble-average expression above). To calculate \(P_j\), we need the energy levels of a system (i.e., \(\{\epsilon_j\}\)). The energy levels of a system can be built up from the quantum energy levels of its constituent molecules.

It must always be remembered that no matter how large the energy spacing is, there is always a non-zero probability that the upper level is populated. The only exception is a system at absolute zero. This situation is, however, hypothetical, as absolute zero can be approached but not reached.
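As a concrete illustration, consider a hypothetical two-level system with a spacing of 10 kJ/mol (an assumed value). Evaluating the Boltzmann distribution at a few temperatures shows that the upper level always carries a small but non-zero population at any finite temperature:

```python
from math import exp

k_B = 1.380649e-23       # Boltzmann constant, J/K
N_A = 6.02214076e23      # Avogadro's number, 1/mol

def boltzmann_populations(energies, T):
    """P_j = exp(-e_j / k_B T) / sum_j exp(-e_j / k_B T)."""
    weights = [exp(-e / (k_B * T)) for e in energies]
    q = sum(weights)
    return [w / q for w in weights]

# Hypothetical two-level system separated by 10 kJ/mol (per-molecule energies, J)
levels = [0.0, 10e3 / N_A]
for T in (100, 300, 1000):
    p_lower, p_upper = boltzmann_populations(levels, T)
    print(f"T = {T:4d} K:  P(lower) = {p_lower:.6f},  P(upper) = {p_upper:.6f}")
```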

Partition Function

The sum over all the factors \(e^{-\beta\epsilon_j}\) is given a name. It is called the molecular partition function, \(q\):

\[ q = \sum_j e^{-\beta \epsilon_j} \]

The molecular partition function \(q\) gives an indication of the average number of states that are thermally accessible to a molecule at the temperature of the system. The partition function is a sum over states (with the factor \(\beta\) multiplying the energy in the exponent) and is a pure number. The larger the value of \(q\), the larger the number of states available for the molecular system to occupy (Figure 17.2.2); a short numerical sketch follows the figure.

Figure 17.2.2: At lower temperatures, the lower energy states are more greatly populated. At higher temperatures, more higher energy states are populated, but each is populated less. (CC SA-BY 3.0; Mysterioso via Wikiversity).
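A short numerical sketch (with an assumed ladder of 50 evenly spaced levels, 1 kJ/mol apart) illustrates how \(q\) counts the thermally accessible states and grows with temperature:

```python
from math import exp

k_B = 1.380649e-23       # Boltzmann constant, J/K
N_A = 6.02214076e23      # Avogadro's number, 1/mol

def molecular_partition_function(energies, T):
    """q = sum_j exp(-beta * e_j), with beta = 1 / (k_B T)."""
    beta = 1.0 / (k_B * T)
    return sum(exp(-beta * e) for e in energies)

# Assumed ladder: 50 evenly spaced levels, 1 kJ/mol apart (per-molecule energies)
spacing = 1e3 / N_A
levels = [j * spacing for j in range(50)]
for T in (50, 300, 1000):
    print(f"T = {T:4d} K:  q = {molecular_partition_function(levels, T):.2f}")
```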

We distinguish here between the partition function of the ensemble, \(Q\), and that of an individual molecule, \(q\). Since \(Q\) represents a sum over all states accessible to the system, it can be written as:

\[ Q(N,V,T) = \sum_{i,j,k,\ldots} e^{-\beta(\epsilon_i + \epsilon_j + \epsilon_k + \ldots)} \]

where the indices \(i, j, k\) represent energy levels of different particles.

Regardless of the type of particle, the molecular partition function \(q\) represents the energy levels of one individual molecule. We can rewrite the above sum as:

\[ Q = q_i \, q_j \, q_k \cdots \]

or:

\[ Q = q^N \]

for \(N\) particles. Note that \(q_i\) means a sum over states or energy levels accessible to molecule \(i\), and \(q_j\) means the same for molecule \(j\). The molecular partition function \(q\) counts the energy levels accessible to one molecule only, whereas \(Q\) counts not only the states of all of the molecules but all of the possible combinations of occupations of those states. However, if the particles are indistinguishable, we will have counted \(N!\) times too many states, so that \(Q = q^N/N!\). The factor of \(N!\) is exactly how many times we can swap the indices in \(Q(N,V,T)\) and get the same value (again, provided that the particles are indistinguishable).
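A small brute-force check (illustrative only, with assumed energies in units of \(k_BT\)) confirms that summing the Boltzmann factor over every joint state of \(N\) independent, distinguishable particles reproduces \(q^N\), and shows the \(N!\) correction applied for indistinguishable particles:

```python
from itertools import product
from math import exp, factorial, isclose

# Assumed single-molecule energies, in units of k_B*T (i.e. these are beta*e_j)
eps = [0.0, 1.0, 2.5]
q = sum(exp(-e) for e in eps)

N = 3   # three independent particles
# Brute-force sum of exp(-beta*(e_i + e_j + e_k)) over every joint state
Q_dist = sum(exp(-sum(combo)) for combo in product(eps, repeat=N))

print(isclose(Q_dist, q**N))   # True: Q = q^N for distinguishable particles
# Indistinguishable particles: divide by N! (valid when multiple occupancy is rare)
print(Q_dist / factorial(N))
```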

References

  1. Hakala, R.W. (1967). Simple justification of the form of Boltzmann's distribution law. Journal of Chemical Education. 44(11), 657. doi: 10.1021/ed044p657
  2. Grigorenko, I., Garcia, M.E. (2002). Calculation of the partition function using quantum genetic algorithms. Physica A: Statistical Mechanics and its Applications, 313, 463-470. Retrieved from http://www.sciencedirect.com/science...78437102009883

Problems

  1. Complete the justification of Boltzmann's distribution law by computing the proportionality constant \(a\).
  2. A system contains two energy levels, \(E_1\) and \(E_2\). Using Boltzmann statistics, express the average energy of the system in terms of \(E_1\) and \(E_2\).
  3. Consider a system that contains \(N\) energy levels. Redo Problem #2.
  4. Using the properties of the exponential function, derive Equation (17.9).
  5. What are the uses of partition functions?

5.2: The Thermal Boltzmann Distribution is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by LibreTexts.
