1.9.1: Entropy - Second Law of Thermodynamics
In addition to the thermodynamic energy \(\mathrm{U}\), a closed system is characterised by two further functions of state.
- Temperature, \(\mathrm{T}\): an intensive variable.
- Entropy, \(\mathrm{S}\): an extensive variable.
The concept of entropy is particularly valuable in commenting on the direction of spontaneous chemical reaction [1].
The Second Law of Thermodynamics states that, for spontaneous chemical reaction in a closed system [2],
\[\mathrm{T} \, \mathrm{dS}=\mathrm{q}+\mathrm{A} \, \mathrm{d} \xi \tag{a} \]
where
\[\mathrm{A} \, \mathrm{d} \xi \geq 0 \tag{b} \]
These two equations comprise the Second Law [3]. The product of the affinity for spontaneous chemical reaction and the change in the extent of chemical reaction (i.e. the accompanying change in composition) can never be negative. This is the thermodynamic 'selection rule' for which there are absolutely no exceptions. Chemists base their analysis of chemical processes on the certainty of this rule (or axiom). The key point is the sense of direction of spontaneous change which emerges [4].
In the event that the affinity for spontaneous change is zero, no change in chemical composition occurs in a closed system; i.e. \(\mathrm{d}\xi\) is zero and the rate of change \(\mathrm{d}\xi / \mathrm{dt}\) is zero. The system and surroundings are in equilibrium. Hence
\[\mathrm{T} \, \mathrm{dS}=\mathrm{q} \quad(\text { at } \mathrm{A}=0) \nonumber \]
The latter equation has a particular set of applications. We imagine a closed system for which the affinity for spontaneous change is zero. We perturb the system by a change in pressure such that there is a corresponding change in composition/molecular organisation in the system. However, as we change the pressure along a certain pathway, we assert that the affinity for spontaneous change is always zero. Then between states I and II, at constant temperature \(\mathrm{T}\),
\[\mathrm{T} \, \mathrm{S}(\mathrm{II})-\mathrm{T} \, \mathrm{S}(\mathrm{I})=\mathrm{T} \, \int_{\text {state I }}^{\text {state II }} \mathrm{dS}=\mathrm{q} \nonumber \]
The pathway between these two states is called a reversible, or equilibrium, transformation. In fact the change in pressure must be carried out infinitely slowly because we must allow the chemical composition/molecular organisation to hold to the condition that there is no affinity for spontaneous change.
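The reversible isothermal case can be made concrete with a standard textbook example: the reversible isothermal expansion of an ideal gas, for which \(\mathrm{S}(\mathrm{II})-\mathrm{S}(\mathrm{I})=\mathrm{n} \, \mathrm{R} \, \ln(\mathrm{V}_{2}/\mathrm{V}_{1})\) and hence \(\mathrm{q}=\mathrm{T} \, \Delta\mathrm{S}\). The following sketch (not part of the original text; the amount, temperature and volumes are illustrative values chosen here) shows the arithmetic.

```python
import math

# Reversible isothermal expansion of an ideal gas: along the pathway the
# affinity for spontaneous change is held at zero, so T*dS = q applies.
# For n moles expanding from V1 to V2 at temperature T:
#   S(II) - S(I) = n*R*ln(V2/V1)   and   q = T*(S(II) - S(I))
# The numbers (1 mol, 298.15 K, doubling of volume) are illustrative only.
R = 8.314            # gas constant, J K^-1 mol^-1
n, T = 1.0, 298.15   # amount of substance / mol, temperature / K
V1, V2 = 1.0e-3, 2.0e-3  # initial and final volumes / m^3

delta_S = n * R * math.log(V2 / V1)  # entropy change, J K^-1  (≈ 5.76)
q = T * delta_S                      # heat absorbed reversibly, J

print(f"delta_S = {delta_S:.3f} J K^-1")
print(f"q       = {q:.1f} J")
```

Because the expansion is carried out infinitely slowly with the affinity always zero, the heat \(\mathrm{q}\) equals \(\mathrm{T} \, \Delta\mathrm{S}\) exactly; any faster (irreversible) pathway between the same two states would transfer less heat to the system.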
All processes in the real world (i.e. all natural processes) are irreversible; there is a defined direction for spontaneous changes [5].
Footnotes
[1] Many authors offer an explanation of the property, entropy. One view is that to attempt an explanation of the "meaning" of entropy is a complete waste of time (M. L. McGlashan, J. Chem. Educ., 1966, 43, 226). A wide-ranging discussion is given by P. L. Huyskens and G. G. Siegel, Bull. Soc. Chem. Belg., 1988, 97, 809, 815 and 823.
E.A. Guggenheim [Thermodynamics, North-Holland, Amsterdam, 1950]. This monograph is often cited for the following bold statement (page 11): "There exists a function \(\mathrm{S}\) of the state of a system called the entropy of the system .....".
H. Margenau [The Nature of Physical Reality, McGraw-Hill, New York, 1950] states 'Entropy is as definite and clear a thing as other thermodynamic quantities'.
The common view in introductory chemistry textbooks for many years has been that entropy is a measure of randomness and/or disorder. However, this view is unhelpful if not meaningless [E. T. Jaynes, Am. J. Phys., 1965, 33, 391; F. L. Lambert, J. Chem. Educ., 1999, 76, 1385; 2002, 79, 187], indeed a myth [W. Brostow, Science, 1972, 178, 211] and an educational disaster [M. Sozbilir and J. K. Bennett, J. Chem. Educ., 2007, 84, 1204].
The generally accepted view [F. L. Lambert, J. Chem. Educ., 2002, 79, 1241] is that an entropy increase results from the energy of molecular motion becoming more dispersed or 'spread out'; e.g. in the two classic examples of a system being warmed by hotter surroundings or, isothermally, when a system's molecules have greater volume for their energetic movement. The concept is exceptionally valuable because entropy increase can be seen by chemists as simply involving the energy associated with mobile molecules spreading out more in three-dimensional space, whether in a new total system of 'less hot plus once-cooler' or isothermally in a larger volume. This simple view is equivalent to the dispersal of energy in phase space. In quantum mechanical terms, 'energy dispersal' means that a system will come to equilibrium in a final state that is optimal because it affords a maximal number of accessible energy arrangements. Even though the system can be in only one arrangement at one instant, its energy is truly dispersed because at the next instant it can be in a different arrangement: this amounts to a 'temporal dance' over a very small fraction of the hyper-astronomical number of microstates predicted by the Boltzmann relation. The account given here is based on written comments in correspondence from F. L. Lambert.
[2] From equation (a), each term has the units of energy: \(\mathrm{T} \, \mathrm{dS}=[\mathrm{K}] \, \left[\mathrm{J} \, \mathrm{K}^{-1}\right]=[\mathrm{J}]\); \(\mathrm{q}=[\mathrm{J}]\); \(\mathrm{A} \, \mathrm{d} \xi=\left[\mathrm{J} \, \mathrm{mol}^{-1}\right] \, [\mathrm{mol}]=[\mathrm{J}]\).
[3] Equations (a) and (b) can be re-expressed in terms of the contribution to the change in entropy \(\mathrm{dS}\) by a process (e.g. chemical reaction) within the system \(\mathrm{d}_{\mathrm{i}}\mathrm{S}\). Then
\[\mathrm{T} \, \mathrm{dS}=\mathrm{q}+\mathrm{T} \, \mathrm{d}_{\mathrm{i}} \mathrm{S} \tag{A} \]
where
\[\mathrm{d}_{\mathrm{i}} \mathrm{S} \geq 0 \tag{B} \]
Equation (B) is the Second Law in that \(\mathrm{d}_{\mathrm{i}}\mathrm{S}\) cannot be negative. For a reversible process \(\mathrm{d}_{\mathrm{i}}\mathrm{S}=0\). But for all processes in the real world, \(\mathrm{d}_{\mathrm{i}}\mathrm{S}\) is positive. In other words all spontaneous processes occur in the direction whereby there is a positive contribution from \(\mathrm{d}_{\mathrm{i}}\mathrm{S}\) to the change in entropy \(\mathrm{dS}\).
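The sign of \(\mathrm{d}_{\mathrm{i}}\mathrm{S}\) can be illustrated with the simplest irreversible internal process: a quantity of heat leaking from a hotter region to a colder region within a composite system. The sketch below (an illustration added here, not from the original text; the temperatures and heat quantity are arbitrary example values) evaluates the entropy so produced.

```python
# Internal entropy production d_iS for an irreversible process inside a
# composite system: heat dq leaks from a hot region (T_hot) to a cold
# region (T_cold). The hot region loses dq/T_hot of entropy and the cold
# region gains dq/T_cold, so
#   d_iS = dq/T_cold - dq/T_hot >= 0,
# with equality only in the reversible limit T_hot = T_cold.
def entropy_production(dq, T_hot, T_cold):
    """Entropy produced (J K^-1) when heat dq (J) flows from T_hot to T_cold (K)."""
    return dq / T_cold - dq / T_hot

# Illustrative values: 100 J flowing from 400 K to 300 K.
print(entropy_production(100.0, 400.0, 300.0))  # positive: the spontaneous direction
print(entropy_production(100.0, 350.0, 350.0))  # zero: the reversible limit
```

Heat flowing in the opposite direction (cold to hot) would make \(\mathrm{d}_{\mathrm{i}}\mathrm{S}\) negative, which equation (B) forbids; this is the 'selection rule' at work.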
[4] This concept of spontaneous change, coupled with the idea that changes occur in a defined direction, is linked to the notion that time is "one-sided". (a) I. Prigogine, From Being to Becoming, Freeman, San Francisco, 1980, page 6. (b) See also G. Nicolis and I. Prigogine, Self-Organization in Non-Equilibrium Systems, Wiley, New York, 1977.
[5] Equation (a) forms the basis of an oft-quoted comment. For an isolated system, \(\mathrm{q}\) is zero. Then \(\mathrm{T} \, \mathrm{dS}=\mathrm{A} \, \mathrm{d} \xi\) where \(\mathrm{A} \, \mathrm{d} \xi \geq 0\)
So for all spontaneous processes in an isolated system, \(\mathrm{dS} >0\). This is the basis of the statement that the entropy of the universe is increasing if the universe can be treated as an isolated system. But these comments stray from immediate interests of chemists.
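A standard worked example of \(\mathrm{dS}>0\) in an isolated system is the thermal equilibration of two identical blocks initially at different temperatures. The sketch below (added here for illustration; the heat capacity and temperatures are arbitrary example values, and temperature-independent heat capacity is assumed) computes the total entropy change.

```python
import math

# Two identical blocks, each of heat capacity C (J K^-1), at initial
# temperatures T1 and T2, equilibrate inside an isolated enclosure at
# Tf = (T1 + T2)/2 (no heat crosses the boundary, so q = 0 overall).
# The total entropy change of the isolated composite system is
#   dS_total = C*ln(Tf/T1) + C*ln(Tf/T2) = C*ln(Tf**2 / (T1*T2)),
# which is positive unless T1 = T2 (arithmetic mean > geometric mean).
def equilibration_entropy(C, T1, T2):
    """Total entropy change (J K^-1) when two blocks of heat capacity C equilibrate."""
    Tf = 0.5 * (T1 + T2)  # final common temperature / K
    return C * math.log(Tf * Tf / (T1 * T2))

# Illustrative values: C = 100 J K^-1, blocks at 350 K and 300 K.
print(equilibration_entropy(100.0, 350.0, 300.0))  # positive for T1 != T2
print(equilibration_entropy(100.0, 325.0, 325.0))  # zero when already at equilibrium
```

The positive result for any \(\mathrm{T}_{1} \neq \mathrm{T}_{2}\) is the \(\mathrm{dS}>0\) statement for an isolated system in miniature.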