
1.14.17: Energy and Entropy


    This Topic takes a rather different approach from the other Topics in this Notebook. Across the galaxy of terms used in thermodynamics, two stand out, namely Energy and Entropy. With respect to a given closed system, both are extensive properties, the symbol \(\mathrm{U}\) identifying energy and the symbol \(\mathrm{S}\) identifying entropy.

    The term ‘energy’ is used quite generally in everyday life. One dictionary [1] describes energy as ‘the power and ability to be physically active’. Perhaps we might not be too happy with the use of the term ‘power’ in this context, on the grounds that this term is normally linked to the rate at which energy is supplied [2]. Indeed in everyday life we refer to powerful engines in, for example, fast sports cars. Nevertheless in a thermodynamic context the concepts of energy and energy change are part of the language of chemistry; e.g. bond energy, energy of activation, radiant energy… The First Law of Thermodynamics formalises the concept of energy change using the following ostensibly simple equation.

    \[\Delta \mathrm{U}=\mathrm{q}+\mathrm{w} \tag{a}\]

    Here \(\mathrm{U}\) is the thermodynamic energy, a function of state; \(\Delta \mathrm{U}\) describes the increase in thermodynamic energy of a closed system when heat \(\mathrm{q}\) flows from the surroundings into the system and work \(\mathrm{w}\) is done by the surroundings on that system.
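
    The sign convention in equation (a) is worth a moment's pause. The following minimal sketch (with arbitrary illustrative values of our own, not drawn from the Topic) shows the bookkeeping: heat entering the system is positive, and work done by the system on the surroundings enters as a negative \(\mathrm{w}\).

```python
# Sign convention of equation (a): Delta U = q + w.
# The numbers below are arbitrary, chosen only to illustrate the bookkeeping.

q = 150.0   # J; heat flowing from the surroundings INTO the system (positive)
w = -60.0   # J; work done ON the system is negative here because the system
            # expands, i.e. it does 60 J of work on the surroundings

delta_U = q + w
print(f"Delta U = {delta_U:.1f} J")   # Delta U = 90.0 J
```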

    The distinction between heat and work is crucial. There are many ways in which the surroundings can do work on a system. Caldin [3] lists many examples in which work is given by the product of intensity and capacity factors; e.g. intensity factor pressure, \(\mathrm{p}\), and capacity factor volume, \(\mathrm{V}\), such that \(\mathrm{w}=-\mathrm{p} \, \mathrm{dV}\), the minus sign following from the convention in equation (a) that \(\mathrm{w}\) counts work done on the system.

    In the context of energy, chemical thermodynamics generally describes quite modest changes. Even in the case of an explosion involving, for example, ignition of a mixture of hydrogen and oxygen gases, the energy change turns out to involve transitions between electronic energy levels in atoms and molecules. Much more dramatic are nuclear reactions, which involve the conversion of mass \(\mathrm{m}\) into energy \(\mathrm{E}\) as described by Einstein’s famous equation.

    \[\mathrm{E}=\mathrm{m} \, \mathrm{c}^{2} \tag{b}\]

    Here \(\mathrm{c}\) is the speed of light, \(3.00 \times 10^{8} \mathrm{~m s}^{-1}\). In nuclear fission the nucleus of an atom breaks into two smaller nuclei of similar mass [4]. Thus uranium-235 nuclei bombarded by neutrons split into barium-142 and krypton-92 nuclei. Einstein’s equation shows that \(1.0 \mathrm{~g}\) of this uranium isotope undergoes fission with the release of \(7.5 \times 10^{10} \mathrm{~J}\) [5], an awesome amount of energy.
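
    Equation (b) allows a back-of-envelope check of the quoted figure. The sketch below (our own illustration, not part of the source calculation) inverts the equation to find how much mass must be converted to release \(7.5 \times 10^{10} \mathrm{~J}\): less than a milligram of the one-gram sample.

```python
# Back-of-envelope check of the quoted fission energy using E = m c^2.

c = 3.00e8      # speed of light / m s^-1
E = 7.5e10      # quoted energy from fission of 1.0 g of uranium-235 / J

m = E / c**2    # mass converted to energy / kg
print(f"mass converted: {m * 1e6:.2f} mg")               # ~0.83 mg
print(f"fraction of the 1.0 g sample: {m / 1.0e-3:.2%}")  # ~0.08%
```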

    Here we return to the domain of chemical properties and chemical reactions, where nuclei of atoms are not destroyed. Our interest centres on the thermodynamic variable entropy, \(\mathrm{S}\), an extensive function of state. However, in everyday conversation and in newspaper and magazine articles the term ‘entropy’ is rarely used, suggesting that it is not important. That conclusion is incorrect and the message quite misleading.

    In these Topics we describe the Second Law using an equation based on the formulation given by Clausius as follows.

    \[\mathrm{T} \, \mathrm{dS}=\mathrm{q}+\mathrm{A} \, \mathrm{d} \xi \tag{c}\]

    Here a positive \(\mathrm{q}\) describes heat passing from the surroundings into a closed system; \(\mathrm{A}\) is the affinity for spontaneous change, the change being described by the property \(\xi\). Following De Donder [6] as discussed by Prigogine and Defay [7] the Second Law is simply stated as follows.

    \[A \, d \xi \geq 0 \tag{d}\]

    For a system moving between equilibrium states (i.e. the system and surroundings are at all stages at equilibrium where \(\mathrm{A}\) is zero),

    \[\mathrm{T} \, \mathrm{dS}=\mathrm{q} \tag{e}\]

    Hence \(\mathrm{q}\), measured using a calorimeter, is a direct measure of the change in entropy accompanying a change where the system is always at equilibrium with the surroundings. In fact this statement provides a useful answer to the question ‘what is entropy?’. There is therefore a fundamental link between the two quantities \(\mathrm{dS}\) and heat \(\mathrm{q}\). Indeed we understand immediately the importance of calorimeters in thermodynamics. At the same time we understand the importance of chemical kinetics, because this subject is built around equation (d), which is the basis of the Law of Mass Action.
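
    Equation (e) makes entropy changes experimentally accessible. As a minimal worked illustration (using the standard literature value for the enthalpy of fusion of ice, which is not quoted in this Topic), consider melting one mole of ice reversibly at its normal melting point.

```python
# Entropy change for a reversible, isothermal phase change via equation (e):
# T dS = q, so Delta S = q / T when T is constant.
# The enthalpy of fusion of ice (~6.01 kJ mol^-1) is a standard literature value.

q_fus = 6.01e3    # J mol^-1; heat absorbed reversibly on melting
T_fus = 273.15    # K; normal melting point of ice

dS = q_fus / T_fus
print(f"Delta S(fusion) = {dS:.1f} J K^-1 mol^-1")   # ~22.0
```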

    In summary we see how the two foundation stones of thermodynamics, namely energy and entropy, are formalised in two laws for which there are no exceptions. So we could end the Topics here. But chemists do not stop at this point, although new hazards lie ahead.

    It follows from equation (e) that for a process where \(\mathrm{q} < 0\), the entropy of the system decreases. [It is interesting to note that the unit of entropy, \(\mathrm{J K}^{-1}\), is the same as that for heat capacity.] At this point we review the arguments advanced by Lewis and Randall [8].
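
    The shared unit is no accident. Although the following step is not taken in this Topic, combining equation (e) with the textbook relation \(\mathrm{q}=\mathrm{C} \, \mathrm{dT}\) for reversible heating links the two properties directly; the sketch below assumes a temperature-independent heat capacity.

```python
import math

# For reversible heating, equation (e) with q = C dT gives dS = C dT / T.
# Assuming a temperature-independent heat capacity C, integration yields
# Delta S = C ln(T2 / T1).

C = 75.3                   # J K^-1 mol^-1; roughly the molar heat capacity of liquid water
T1, T2 = 298.15, 323.15    # K; heat water from 25 C to 50 C

dS = C * math.log(T2 / T1)
print(f"Delta S = {dS:.2f} J K^-1 mol^-1")   # ~6.06
```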

    An important reference system in thermodynamics is the perfect gas. No such gas is actually known in the real world but the concept is very valuable. The properties of a perfect gas conform to the two laws [7b].

    We envisage a closed system, volume \(\mathrm{V}\), containing \(\mathrm{n}\) moles of a perfect gas. The first condition states that the thermodynamic energy \(\mathrm{U}\) is a function of temperature only. Thus,

    \[\left(\frac{\partial \mathrm{U}}{\partial \mathrm{V}}\right)_{\mathrm{T}}=0 \tag{f}\]

    The second condition states that the following equation relates the pressure, volume and temperature of \(\mathrm{n}\) moles of a perfect gas.

    \[\mathrm{p} \, \mathrm{V}=\mathrm{n} \, \mathrm{R} \, \mathrm{T} \tag{g}\]

    Thus for one mole of a perfect gas, having molar volume \(\mathrm{V}_{\mathrm{m}}\),

    \[\mathrm{p} \, \mathrm{V}_{\mathrm{m}}=\mathrm{R} \, \mathrm{T} \tag{h}\]
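
    As a quick numerical check of equation (h), the molar volume of a perfect gas at \(273.15 \mathrm{~K}\) and \(1.00 \times 10^{5} \mathrm{~Pa}\) (illustrative conditions of our choosing) is about \(22.7 \mathrm{~L}\).

```python
# Molar volume of a perfect gas from equation (h): V_m = R T / p.

R = 8.314     # gas constant / J K^-1 mol^-1
T = 273.15    # temperature / K
p = 1.00e5    # pressure / Pa

V_m = R * T / p                            # molar volume / m^3 mol^-1
print(f"V_m = {V_m * 1e3:.2f} L mol^-1")   # ~22.71
```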

    A key equation (Topic 2500) relates the change in thermodynamic energy \(\mathrm{U}\) to the changes in entropy, volume and composition. Thus

    \[\mathrm{dU}=\mathrm{T} \, \mathrm{dS}-\mathrm{p} \, \mathrm{dV}-\mathrm{A} \, \mathrm{d} \xi \tag{i}\]

    For an equilibrium transformation, the affinity for spontaneous change is zero. Hence for an equilibrium process,

    \[\mathrm{dU}=\mathrm{T} \, \mathrm{dS}-\mathrm{p} \, \mathrm{dV} \tag{j}\]

    For 1 mole of an ideal gas,

    \[\mathrm{dU}_{\mathrm{m}}=\mathrm{T} \, \mathrm{dS}_{\mathrm{m}}-\mathrm{p} \, \mathrm{dV}_{\mathrm{m}} \tag{k}\]

    Or,

    \[\mathrm{dS}_{\mathrm{m}}=\frac{1}{\mathrm{T}} \, \mathrm{dU}_{\mathrm{m}}+\frac{\mathrm{p}}{\mathrm{T}} \, \mathrm{dV}_{\mathrm{m}} \tag{l}\]

    Molar isochoric heat capacity \(\mathrm{C}_{\mathrm{Vm}}\) is related to \(\mathrm{dU}_{\mathrm{m}}\) by equation (m).

    \[\mathrm{C}_{\mathrm{Vm}}=\left(\frac{\partial \mathrm{U}_{\mathrm{m}}}{\partial \mathrm{T}}\right)_{\mathrm{V}_{\mathrm{m}}} \tag{m}\]

    Then, noting from equation (f) that \(\mathrm{U}_{\mathrm{m}}\) for a perfect gas depends only on temperature, so that \(\mathrm{dU}_{\mathrm{m}}=\mathrm{C}_{\mathrm{Vm}} \, \mathrm{dT}\),

    \[\mathrm{dS}_{\mathrm{m}}=\frac{\mathrm{C}_{\mathrm{Vm}}}{\mathrm{T}} \, \mathrm{dT}+\frac{\mathrm{p}}{\mathrm{T}} \, \mathrm{dV}_{\mathrm{m}} \tag{n}\]

    We define the molar entropy of an ideal gas using equation (o).

    \[\mathrm{S}_{\mathrm{m}}=\mathrm{S}_{\mathrm{m}}\left[\mathrm{T}, \mathrm{V}_{\mathrm{m}}\right] \tag{o}\]

    The total differential of equation (o) takes the following form.

    \[\mathrm{dS}_{\mathrm{m}}=\left(\frac{\partial \mathrm{S}_{\mathrm{m}}}{\partial \mathrm{T}}\right)_{\mathrm{V}_{\mathrm{m}}} \mathrm{dT}+\left(\frac{\partial \mathrm{S}_{\mathrm{m}}}{\partial \mathrm{V}_{\mathrm{m}}}\right)_{\mathrm{T}} \mathrm{dV}_{\mathrm{m}} \tag{p}\]

    Comparison of equations (n) and (p) reveals the following relation.

    \[\left(\frac{\partial S_{m}}{\partial V_{m}}\right)_{T}=\frac{p}{T} \tag{q}\]

    Hence, using equation (h),

    \[\left(\frac{\partial S_{m}}{\partial V_{m}}\right)_{T}=\frac{R}{V_{m}} \tag{r}\]

    Thus at constant temperature,

    \[\mathrm{dS}_{\mathrm{m}}=\mathrm{R} \, \mathrm{d} \ln \left(\mathrm{V}_{\mathrm{m}}\right) \tag{s}\]

    Hence, integrating equation (s) at constant temperature, the change in entropy for the isothermal expansion of an ideal gas between states A and B, where the molar volumes are \(\mathrm{V}_{\mathrm{m}}(\mathrm{A})\) and \(\mathrm{V}_{\mathrm{m}}(\mathrm{B})\), is given by equation (t).

    \[S_{m}(B)-S_{m}(A)=R \, \ln \left[\frac{V_{m}(B)}{V_{m}(A)}\right] \tag{t}\]
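
    For a concrete illustration (our own numbers), isothermally doubling the volume of one mole of an ideal gas increases its entropy by \(\mathrm{R} \ln 2 \approx 5.76 \mathrm{~J K}^{-1} \mathrm{~mol}^{-1}\), as the sketch below confirms.

```python
import math

# Equation (t): Delta S = R ln[V_m(B) / V_m(A)] for an isothermal expansion.

R = 8.314      # gas constant / J K^-1 mol^-1
ratio = 2.0    # V_m(B) / V_m(A); the gas doubles its volume

dS = R * math.log(ratio)
print(f"Delta S = {dS:.2f} J K^-1 mol^-1")   # ~5.76
```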

    We turn now to a consideration of changes in entropy from a statistical point of view.

    A given experiment [8] uses two glass flasks of equal volume connected by a glass tube which includes a tap, all at the same temperature \(\mathrm{T}\). The system contains \(\mathrm{N}\) gas molecules; e.g. oxygen. The gas molecules pass freely between the two flasks through the open tap. On examining the contents of the two flasks we would not be surprised to discover that there are equal numbers of gas molecules in the two flasks. The probability of this result, from experiment A, is expressed by stating that \(\mathrm{P}_{\mathrm{Y}}^{\mathrm{A}}\) is unity.

    We return to the two flasks and close the tap. The probability that all the oxygen molecules are to be found in one flask is \((1/2)^{\mathrm{N}}\); i.e. a very low probability. If the total system contained only 20 molecules this probability signals a chance of 1 in \(2^{20}\), about one in a million. Thus the probability \(\mathrm{P}_{\mathrm{Y}}^{\mathrm{B}}\) for experiment B is very small; effectively zero.
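
    The collapse of this probability with increasing \(\mathrm{N}\) is dramatic, as the short sketch below (our own illustration) shows by working in base-10 logarithms to avoid numerical underflow.

```python
import math

# Probability that all N molecules occupy one of the two flasks: (1/2)^N.
# Work with log10 so that very small probabilities do not underflow to zero.

for N in (20, 100, 1000):
    log10_P = -N * math.log10(2)
    print(f"N = {N:>4}: P = 10^({log10_P:.0f})")

# N = 20 gives 1 chance in 2^20 = 1,048,576; for a chemically realistic
# number of molecules (~10^23) the probability is utterly negligible.
```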

    An interesting exercise characterises these probabilities by a property \(\sigma\). Then,

    \[\sigma=\frac{R}{N} \, \ln \left(P_{Y}\right) \tag{u}\]

    Note that the auxiliary property \(\sigma\) is generally negative because statistical probabilities vary between zero and unity. For the two experiments,

    \[\sigma_{B}-\sigma_{A}=\frac{R}{N} \, \ln \left(\frac{P_{Y}^{B}}{P_{Y}^{A}}\right) \tag{v}\]

    Hence,

    \[\sigma_{B}-\sigma_{A}=\frac{R}{N} \, \ln \left(\frac{(1 / 2)^{N}}{1}\right) \tag{w}\]

    Or,

    \[\sigma_{\mathrm{B}}-\sigma_{\mathrm{A}}=-\mathrm{R} \, \ln (2) \tag{x}\]

    We can express this result in general terms describing the expansion of one mole of gas from molar volume \(\mathrm{V}_{\mathrm{m}}(\mathrm{A})\) to \(\mathrm{V}_{\mathrm{m}}(\mathrm{B})\). Then,

    \[\sigma_{B}-\sigma_{A}=R \, \ln \left[V_{m}(B) / V_{m}(A)\right] \tag{y}\]

    At this point comparison between equations (t) and (y) is rewarding. Thus we may write the following equation.

    \[\mathrm{S}_{\mathrm{m}}(\mathrm{B})-\mathrm{S}_{\mathrm{m}}(\mathrm{A})=\frac{\mathrm{R}}{\mathrm{N}} \,\left[\ln \left(\mathrm{P}_{\mathrm{Y}}^{\mathrm{B}}\right)-\ln \left(\mathrm{P}_{\mathrm{Y}}^{\mathrm{A}}\right)\right] \tag{z}\]
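
    A quick numerical check (ours, not in the source) confirms that the statistical expression reproduces equation (x) for the two-flask experiment, where \(\mathrm{P}_{\mathrm{Y}}^{\mathrm{B}}/\mathrm{P}_{\mathrm{Y}}^{\mathrm{A}}=(1/2)^{\mathrm{N}}\), whatever the value of \(\mathrm{N}\).

```python
import math

# Check: (R/N) * ln((1/2)^N) = -R ln 2, independent of N, matching equation (x).

R = 8.314    # gas constant / J K^-1 mol^-1

for N in (10, 100, 10_000):
    dS = (R / N) * (N * math.log(0.5))    # (R/N) * ln[(1/2)^N]
    print(f"N = {N:>6}: Delta S = {dS:.3f} J K^-1 mol^-1")   # always -5.763
```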

    In other words the difference between the entropies in the ideal gas state is related to a probability. Thus we might conclude that \(\mathrm{S}_{\mathrm{m}}(\mathrm{B})\) is larger than \(\mathrm{S}_{\mathrm{m}}(\mathrm{A})\) because there are more ways of arranging molecules in system B than in system A. The state with the more ordered arrangement is the state with the lower entropy. It is a small step (but a very dangerous step) to draw a comparison between entropy and (if there is such a word) the ‘muddled-up-ness’ of a given system. But these are treacherous waters, outside the province of the classical thermodynamics which forms the basis of these Topics. Indeed strong feelings are aroused. McGlashan [9,10], for example, takes to task chemists who assume that an increase in entropy implies an increase of disorder or of randomness or of ‘mixed-upness’. We leave the debate here except to note that both authors of these Topics favour the view advanced by McGlashan [9,10], although this view would not win a popularity contest. But ‘popularity’ is not an acceptable criterion in thermodynamics.

    Footnotes

    [1] Cambridge International Dictionary of English, Cambridge University Press, Cambridge, 1995.

    [2] P. W. Atkins, Concepts in Physical Chemistry, Oxford University Press, Oxford, 1995.

    [3] E. F. Caldin, An Introduction to Chemical Thermodynamics, Oxford University Press, Oxford, 1958.

    [4] S. Glasstone, Sourcebook of Atomic Energy, MacMillan, London, 1954.

    [5] P. W. Atkins and L. Jones, Chemistry: Molecules, Matter and Change, W H Freeman, New York, 3rd edition, 1997, p.875.

    [6] Th. de Donder, Bull. Acad. Roy. Belg. (Cl. Sc.), 1922, 7, 197, 205.

    [7] I. Prigogine and R. Defay, Chemical Thermodynamics, transl. D. H. Everett, Longmans Green, London, 1954, (a) chapter 3; (b) chapter 4.

    [8] G. N. Lewis and M. Randall, Thermodynamics, McGraw-Hill, New York, 1923, chapter VI.

    [9] M. L. McGlashan, Chemical Thermodynamics, Academic Press, London, 1979, pages 112-113.

    [10] M. L. McGlashan, J. Chem. Educ., 1966, 43, 226.


