# 11. The Microcanonical Ensemble

The goal of equilibrium statistical mechanics is to calculate the diagonal elements of $$\hat{\rho}_{en}$$ so we can evaluate average observables $$\langle A \rangle = \mathrm{Tr}\{\hat{A}\hat{\rho}_{en}\} = A$$ that give us fundamental relations or equations of state. Just as thermodynamics has its potentials $$U$$, $$A$$, $$H$$, $$G$$ etc., so statistical mechanics has its ensembles, which are useful depending on which macroscopic variables are specified. We first consider the microcanonical ensemble because it is the one directly defined in postulate II of statistical mechanics.

In the microcanonical ensemble $$U$$ is fixed (Postulate I); the other constraints held fixed are the volume $$V$$ and mole number $$n$$ (for a simple system), or other extensive parameters (for more complicated systems).

### 1. Definition of the partition function

The ‘partition function’ of an ensemble describes how probability is partitioned among all the available microstates compatible with the constraints imposed on the ensemble.

In the case of the microcanonical ensemble, every microstate has the same energy and the same probability. According to postulate II, this probability is given by $$p_i=\rho_{ii}^{(eq)}=1/W(U)$$ for each microstate “$$i$$” at energy $$U$$. The microcanonical partition function is $$W(U)$$, the number of microstates at energy $$U$$. Using just this, we can evaluate equations of state and fundamental relations.

### 2. Calculation of thermodynamic quantities from W(U)

Example 11.1

Fundamental relation for a “lattice gas” model: entropy-volume part.

Consider again the model system of a box with $$M=\dfrac{V}{V_0}$$ volume elements of size $$V_0$$ and $$N$$ particles of volume $$V_0$$, so each particle can fill one volume element. The particles can randomly hop among unoccupied volume elements to randomly sample the full volume of the box. This is a simple model of an ideal gas. As shown in the last chapter,

$W=\dfrac{M!}{(M-N)!N!}$

for identical particles, and we can approximate this, if $$N \ll M$$, by

$W \approx \dfrac{1}{N!} \left(\dfrac{V}{V_0}\right)^N$

since $$M!/(M-N)! \approx M^N$$ in that case. Assuming the hopping samples all microstates so the system reaches equilibrium, we compute the equilibrium entropy, as proved in chapter 10 from postulate III, as

$S = k_B\ln W \approx S_0 + Nk_B\ln \left(\dfrac{V}{V_0}\right)$

where $$S_0$$ is independent of volume. This gives the volume dependence of the entropy of an ideal gas. Note that by taking the derivative

$\dfrac{\partial S}{\partial V} = k_B\dfrac{N}{V} = \dfrac{P}{T}$

we can immediately derive the ideal gas law $$PV = Nk_BT = nRT$$.
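This chain from $$W$$ to the ideal gas law can be checked numerically. Below is a minimal sketch (the particular values of $$M$$ and $$N$$ are arbitrary illustrations, in units where $$k_B = 1$$ and $$V_0 = 1$$ so that $$V = M$$): it evaluates $$S = k_B\ln W$$ with log-Gamma functions to avoid overflowing the factorials, differentiates with respect to volume by finite differences, and compares with $$P/T = Nk_B/V$$.

```python
from math import lgamma

def ln_W(M, N):
    """ln W for the lattice gas, W = M!/((M-N)! N!), via log-Gamma."""
    return lgamma(M + 1) - lgamma(M - N + 1) - lgamma(N + 1)

# Units: k_B = 1 and V0 = 1, so V = M (number of volume elements).
N = 1000          # particles
M = 10**7         # volume elements; dilute case N << M
# Finite difference dS/dV, adding one volume element at a time:
dS_dV = ln_W(M + 1, N) - ln_W(M, N)
ideal = N / M     # ideal-gas prediction P/T = N k_B / V
rel_err = abs(dS_dV - ideal) / ideal
# rel_err is tiny when N << M, confirming P/T = N k_B / V
```

The same comparison degrades gracefully as the gas becomes less dilute, which is exactly where the approximation $$M!/(M-N)! \approx M^N$$ breaks down.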

Example 11.2

Fundamental relation for a lattice gas: entropy-energy part.

The above model does not give us the energy dependence, since we did not explicitly consider the energy of the particles, other than to assume there was enough energy for them to randomly hop around. We now remedy this by considering the energy levels of particles in a box. The result will also demonstrate once more that $$\Omega(U)$$, the total number of states with energy up to $$U$$, increases so ferociously fast with $$U$$ that it is equal to $$W(U)$$ with incredibly high accuracy for more than a handful of particles.

Let the total energy $$U$$ be randomly distributed among $$N$$ particles in a box of volume $$L^3 = V$$. The energy is given by

$U = \sum_{i=1}^{3N} \dfrac{p_i^2}{2m},$

where $$i=1,2,3$$ label the $$x,y,z$$ momentum components of particle #1, and so forth up to $$i=3N-2, 3N-1, 3N$$ for the $$x,y,z$$ momentum components of particle #$$N$$. In quantum mechanics, the momentum of a free particle is given by $$p=h/\lambda$$, where $$h$$ is Planck’s constant and $$\lambda$$ is the wavelength. Only certain waves $$\Psi(x)$$ are allowed in the box, such that $$\Psi(x)=0$$ at the boundaries of the box, as shown in the figure below.

Fig. 11.1 Particle-in-a-box wavefunctions can only have wavelengths such that $$\Psi=0$$ at the boundaries. The state space (or quantum number space) with $$3N$$ axes contains a hypersphere of constant energy. In a large number of dimensions, the number of states (black dots) in a layer at the surface of this sphere is essentially equal to the total number of states within that surface.

The allowed wavelengths $$\lambda_n = 2L/n = 2L, L, 2L/3 \ldots$$ can be inserted in the equation for total energy, yielding

$U = \dfrac{h^2}{8mL^2}\sum_{i=1}^{3N} n_i^2, \quad n_i = 1, 2, 3 \ldots,$

the energy for a bunch of particles in a box. $$W(U)$$ is the number of states at energy $$U$$. Looking at the figure again, all the energy levels are “dots” in a $$3N$$-dimensional cartesian space, called the “state space”, or “action space” or sometimes “quantum number space.” The surface of constant energy $$U$$ is the surface of a hypersphere of dimension $$3N-1$$ in state space. The reason is that the above equation is of the form constant = $$x^2+ y^2+ \cdots$$ where the variables are the quantum numbers. The number of states within a thin shell of energy $$U$$ at the surface of the sphere is

$W(U) = \Omega(U) - \Omega(U-\Delta U).$

$$\Omega(U)$$ is the total number of states inside the sphere, which at first glance would seem to be much larger than $$W(U)$$, the states in the shell. In fact, for a very high-dimensional hypervolume, a thin shell at the surface contains all the volume, so $$\Omega(U)$$ is essentially equal to $$W(U)$$, and we can just calculate the former to a good approximation when $$N$$ is large.

If this is hard to believe, consider an analogous example of a hypercube instead of a hypersphere. Its volume is $$L^m$$, where $$m$$ is the number of dimensions. The change in volume with side length $$L$$ is $$\partial V/\partial L = mL^{m-1}$$, so $$\Delta V = mL^{m-1}\Delta L$$ is the volume of a shell of width $$\Delta L$$ at the surface of the cube. The ratio of that volume to the total volume is $$\Delta V/V = m\Delta L/L$$. Let’s take the example our intuition is built on, $$m=3$$, and assume $$\Delta L/L = 0.001$$, just a 0.1% surface layer. Then $$\Delta V/V = 3\times 10^{-3} \ll 1$$, so our intuition works: the shell volume is much less than the total volume. But now consider $$m=10^{20}$$, a typical number of particles in a statistical mechanical system. Now $$\Delta V/V = 10^{20}\times 10^{-3} = 10^{17}$$. The little increment in volume is much greater than the original volume of the cube, and contains essentially all the volume of the new “slightly larger” cube. It may be “slightly” larger in side length, but it is astronomically larger in volume.
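A quick numerical sketch of this hypercube argument: the exact ratio of the outer-shell volume to the cube volume is $$(1+\Delta L/L)^m - 1$$, which reduces to the linear estimate $$m\,\Delta L/L$$ only when that number is small. (We use $$m = 10^5$$ rather than $$10^{20}$$ here simply to stay within floating-point range; the effect at $$10^{20}$$ is vastly more extreme.)

```python
def shell_to_cube_ratio(m, rel_thickness=1e-3):
    """Volume of a shell of width dL outside an m-dimensional cube of
    side L, divided by the cube's volume: ((L+dL)^m - L^m) / L^m."""
    return (1.0 + rel_thickness) ** m - 1.0

r3 = shell_to_cube_ratio(3)         # ~3e-3: matches the estimate m*dL/L
r_big = shell_to_cube_ratio(10**5)  # astronomically large: shell >> cube
```

In three dimensions the shell is a negligible skin; in $$10^5$$ dimensions the “slightly larger” cube already dwarfs the original by tens of orders of magnitude.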

Now back to our hypersphere in Fig. 11.1. Its volume, which is essentially equal to $$W(U)$$, the number of states just at the surface of the sphere, is

$W(U) \approx \Omega(U) = \left(\dfrac{1}{2}\right)^{3N} \dfrac{\pi^{3N/2}}{\Gamma\left(\frac{3N}{2}+1\right)} R^{3N}.$

The $$(1/2)^{3N}$$ is there because all quantum numbers must be greater than zero, so only the positive part of the sphere should be counted. The Gamma function $$\Gamma$$ is related to the factorial function by $$\Gamma(N) = (N-1)! = \int_0^\infty t^{N-1}e^{-t}\,dt$$, and $$R$$ is the radius of the sphere, which is given by

$R = n_{max} = \left(\dfrac{8mL^2U}{h^2}\right)^{1/2},$

where $$n_{max}$$ is the largest quantum number, if all energy is in a single mode. The key is that $$R \sim \sqrt{U}$$, so $$U$$ is raised to the $$3N/2$$ power, where $$N$$ is the number of particles, the 3 is because there are three modes per particle, and the 1/2 is because the energy of a free particle depends on the square of the quantum number. Thus

$S = k_B \ln W \approx S_0 + \dfrac{3}{2}Nk_B \ln U,$

where the constant $$S_0$$ is not the same as in the previous example. We used the volume equation from the previous example to obtain an equation of state ($$PV=nRT$$), and we can obtain another equation of state here:

$\dfrac{\partial S}{\partial U} = \dfrac{3Nk_B}{2U} = \dfrac{1}{T} \;\Rightarrow\; U = \dfrac{3}{2}Nk_BT = \dfrac{3}{2}nRT.$

This equation relates the energy of an ideal gas to its temperature. $$3n$$ is the number of modes or degrees of freedom (3 velocities per particle times $$n$$ moles of particles), whereas the factor of 2 comes directly from the particle-in-a-box energy function (in case you ever wondered where that comes from). So, for a harmonic oscillator, $$n \sim U$$ (as you may recall) instead of $$n \sim U^{1/2}$$, and you might expect $$U = 3nRT$$ for $$N$$ particles (i.e. $$3N$$ oscillator modes) held together by springs into a solid crystal lattice. And indeed, that is true for an ideal lattice at high temperature (in analogy to an ideal gas at high temperature). Unlike free particles, the energy of oscillators does not have the factor of 1/2. The ‘deep’ reason is that an oscillator has two degrees of freedom to store energy in each direction, not just one: there’s still the kinetic energy, but there’s also potential energy.

Example 11.3

A system of $$N$$ uncoupled spins $$s_z = \pm 1/2$$

The Hamiltonian for this system in a magnetic field is given by

$H = B\sum_{i=1}^{N} s_{z,i} + \dfrac{NB}{2},$

where the extra term at the end is added so the energy equals zero when all the spins are pointing down. At energy $$U=0$$, no spin is excited. For each excited spin, the energy increases by $$B$$, so at energy $$U$$, $$U/B$$ spins are excited. These $$U/B$$ excitations are indistinguishable and can be distributed among $$N$$ sites:

$W(U) = \dfrac{N!}{\left(N-\frac{U}{B}\right)!\left(\frac{U}{B}\right)!} = \dfrac{\Gamma(N+1)}{\Gamma\left(N-\frac{U}{B}+1\right)\Gamma\left(\frac{U}{B}+1\right)}.$

This is our usual formula for combinations; the right side is in terms of Gamma functions, which are defined even when $$U/B$$ is not an integer. Gamma functions basically interpolate the factorial function for noninteger values.
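A sketch of this interpolation in code (the particular values of $$N$$ and $$U/B$$ are arbitrary): for integer $$U/B$$ the Gamma-function form reproduces the binomial coefficient exactly, and it remains perfectly well defined for fractional $$U/B$$.

```python
from math import lgamma, comb, exp

def ln_W_spins(N, x):
    """ln W = ln[Gamma(N+1) / (Gamma(N-x+1) * Gamma(x+1))], where
    x = U/B is the number of excitations (need not be an integer)."""
    return lgamma(N + 1) - lgamma(N - x + 1) - lgamma(x + 1)

N = 50
W_int = exp(ln_W_spins(N, 12))    # integer U/B: equals comb(50, 12)
W_frac = ln_W_spins(N, 12.5)      # noninteger U/B: still well defined
```

Working with $$\ln W$$ via `lgamma` is also what makes these counts usable in practice, since $$W$$ itself overflows for any macroscopic $$N$$.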

This formula has a potential problem built-in: clearly, when U starts out at 0 and then increases, W initially increases. But for U=NB (the maximum energy), W=1 again.  In fact, W reaches its maximum for U=NB/2.  But if W(U) is not monotonic in U, then S isn’t either, violating P3 of thermodynamics.  Let’s see how this works out.

For large $$N$$, and temperature neither so low that $$U/B \approx 0$$, nor so high that $$U/B \approx N$$ (so both factorials involve large numbers), we can use the Stirling expansion $$\ln N! \approx N\ln N - N$$, yielding

$S = k_B\ln W \approx k_B\left[N\ln N - \left(N-\dfrac{U}{B}\right)\ln\left(N-\dfrac{U}{B}\right) - \dfrac{U}{B}\ln\dfrac{U}{B}\right]$

after canceling terms as much as possible. We can now calculate the temperature and obtain the equation of state $$U(T)$$:

$\dfrac{1}{T} = \dfrac{\partial S}{\partial U} = \dfrac{k_B}{B}\ln\left(\dfrac{N-U/B}{U/B}\right) \;\Rightarrow\; U = \dfrac{NB}{1+e^{B/k_BT}}.$

In this equation, $$U \to NB/2$$ as $$T \to \infty$$. So even at infinite temperature the energy can only go up to half the maximum value, where $$W(U)$$ is monotonic. At most half the spins can be made to point up by heating. The population cannot be ‘inverted’ to have more spins point up than down. This should come as no surprise: if the number of microstates is maximized by having only half the spins point up when energy is added, then that’s the state you will get (this is true even in the exact solution).
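The saturation at $$U = NB/2$$ is easy to see numerically. A minimal sketch of the equation of state $$U(T) = NB/(1+e^{B/k_BT})$$, in units where $$k_B = B = 1$$ and with an arbitrary illustrative $$N$$:

```python
from math import exp

def U_of_T(T, N=1000, B=1.0):
    """Equilibrium energy of N two-level spins with level spacing B,
    U = N*B / (1 + exp(B/(kB*T))), in units where kB = 1."""
    return N * B / (1.0 + exp(B / T))

u_cold = U_of_T(0.05)   # ~0: almost no spins excited at low temperature
u_hot = U_of_T(1e6)     # approaches N*B/2 = 500, but never exceeds it
```

However hot the bath, $$U$$ only creeps up toward $$NB/2$$ from below, which is the code version of “the population cannot be inverted by heating.”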

This observation does not mean that it is impossible to get all the spins to point up. It is just not an equilibrium state at any temperature between T=0 and T=∞.  Such non-equilibrium states with more spins up (or atoms excited) than down are called “inverted” populations.  In lasers, such states are created by putting the system (like a laser crystal) far out of equilibrium.  Such a state will then relax back to an equilibrium state, releasing a pulse of energy as the spins (or atoms) drop from the excited to the ground state.

The heat capacity of the above example system is

$C = \dfrac{\partial U}{\partial T} = Nk_B\left(\dfrac{B}{k_BT}\right)^2\dfrac{e^{B/k_BT}}{\left(1+e^{B/k_BT}\right)^2},$

peaked at $$k_BT \approx 0.42\,B$$, so we can calculate thermodynamic quantities as input for thermodynamic manipulations.

As we shall see in detail later (actually, we saw it in the previous example!), in any real system the heat capacity must eventually approach $$Nk_B/2$$, where $$N$$ is the number of degrees of freedom. However, a broad peak at $$k_BT \approx 0.42\,B$$ is a sign of two low-lying energy levels spaced by $$B$$. Levels at higher energy will eventually contribute to $$C$$, making sure it does not drop.
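Differentiating the two-level equation of state $$U(T)$$ gives a heat capacity per spin $$C/Nk_B = x^2 e^x/(1+e^x)^2$$ with $$x = B/k_BT$$; a short scan (a sketch in units where $$k_B = B = 1$$) locates the peak of this so-called Schottky anomaly:

```python
from math import exp

def c_per_spin(T, B=1.0):
    """Two-level (Schottky) heat capacity per spin, C/(N kB), with kB = 1."""
    x = B / T
    return x * x * exp(x) / (1.0 + exp(x)) ** 2

# Scan temperatures on a coarse grid to locate the peak:
grid = [0.01 * i for i in range(1, 301)]
T_peak = max(grid, key=c_per_spin)   # lands near kB*T = 0.42 B
```

The peak position scales with the level spacing $$B$$, which is why a measured heat-capacity bump is diagnostic of two low-lying levels.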

Example 11.4

Let us check that $$T$$ derived from

$\dfrac{1}{T} = \dfrac{\partial S}{\partial U}$

indeed agrees with the intuitive concept of temperature. Consider two baths within a closed system, so $$dU = dU_1 + dU_2 = 0$$. If we know

$S_i(U_i), \quad i = 1, 2$

for each bath, then

$dS = \dfrac{\partial S_1}{\partial U_1}dU_1 + \dfrac{\partial S_2}{\partial U_2}dU_2 = \dfrac{1}{T_1}dU_1 + \dfrac{1}{T_2}dU_2 = 0$

at equilibrium because the maximum number of states is already occupied. For this to be true for any infinitesimal energy flow $$dU_1 = -dU_2$$,

$\dfrac{1}{T_1} = \dfrac{1}{T_2} \;\Rightarrow\; T_1 = T_2.$

At equilibrium, the temperatures are equal, fitting our thermodynamic definition that “temperature is equalized when heat flow is allowed.”
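This equilibrium condition can be sketched numerically. Below, two ideal-gas baths (with the entropy $$S_i = \frac{3}{2}N_i k_B \ln U_i$$ from Example 11.2, $$k_B = 1$$, arbitrary illustrative sizes) share a fixed total energy; maximizing the total entropy over the energy split makes the two temperatures $$T_i = 2U_i/3N_i$$ agree.

```python
from math import log

# Two ideal-gas baths sharing a fixed total energy (kB = 1, constants dropped).
N1, N2, U_total = 100.0, 300.0, 400.0

def S_total(U1):
    """Total entropy S1 + S2 = (3/2) N1 ln U1 + (3/2) N2 ln(U_total - U1)."""
    return 1.5 * N1 * log(U1) + 1.5 * N2 * log(U_total - U1)

# Crude grid search for the entropy maximum over the energy split:
U1_best = max((0.1 * i for i in range(1, 4000)), key=S_total)
T1 = 2.0 * U1_best / (3.0 * N1)              # from 1/T = dS/dU = 3N kB/(2U)
T2 = 2.0 * (U_total - U1_best) / (3.0 * N2)
# at the entropy maximum the two temperatures agree: T1 = T2
```

The entropy maximum puts energy in proportion to bath size ($$U_1/N_1 = U_2/N_2$$), which is precisely the statement that both baths end up at the same temperature.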