19.2: The Concept of Entropy
Learning Objectives
- Define entropy
- Explain the relationship between entropy and the number of microstates
- Predict the sign of the entropy change for chemical and physical processes
The first law of thermodynamics governs changes in the state function we have called internal energy (U). Changes in the internal energy (ΔU) are closely related to changes in the enthalpy (ΔH), which is a measure of the heat flow between a system and its surroundings at constant pressure. You also learned previously that the enthalpy change for a chemical reaction can be calculated using tabulated values of enthalpies of formation. This information, however, does not tell us whether a particular process or reaction will occur spontaneously.
Let’s consider a familiar example of spontaneous change. If a hot frying pan that has just been removed from the stove is allowed to come into contact with a cooler object, such as cold water in a sink, heat will flow from the hotter object to the cooler one, in this case usually releasing steam. Eventually both objects will reach the same temperature, at a value between the initial temperatures of the two objects. This transfer of heat from a hot object to a cooler one obeys the first law of thermodynamics: energy is conserved.
Now consider the same process in reverse. Suppose that a hot frying pan in a sink of cold water were to become hotter while the water became cooler. As long as the same amount of thermal energy was gained by the frying pan and lost by the water, the first law of thermodynamics would be satisfied. Yet we all know that such a process cannot occur: heat always flows from a hot object to a cold one, never in the reverse direction. That is, by itself the magnitude of the heat flow associated with a process does not predict whether the process will occur spontaneously.
For many years, chemists and physicists tried to identify a single measurable quantity that would enable them to predict whether a particular process or reaction would occur spontaneously. Initially, many of them focused on enthalpy changes and hypothesized that an exothermic process would always be spontaneous. But although it is true that many, if not most, spontaneous processes are exothermic, there are also many spontaneous processes that are not exothermic. For example, at a pressure of 1 atm, ice melts spontaneously at temperatures greater than 0°C, yet this is an endothermic process because heat is absorbed. Similarly, many salts (such as NH4NO3, NaCl, and KBr) dissolve spontaneously in water even though they absorb heat from the surroundings as they dissolve (i.e., ΔHsoln > 0). Reactions can also be both spontaneous and highly endothermic, like the reaction of barium hydroxide with ammonium thiocyanate shown in Figure \(\PageIndex{1}\).
Thus enthalpy is not the only factor that determines whether a process is spontaneous. For example, after a cube of sugar has dissolved in a glass of water so that the sucrose molecules are uniformly dispersed in a dilute solution, they never spontaneously come back together in solution to form a sugar cube. Moreover, the molecules of a gas remain evenly distributed throughout the entire volume of a glass bulb and never spontaneously assemble in only one portion of the available volume. Explaining why these phenomena proceed spontaneously in only one direction requires an additional state function called entropy (S), a thermodynamic property of all substances that is proportional to their degree of disorder. In Chapter 13, we introduced the concept of entropy in relation to solution formation. Here we further explore the nature of this state function and define it mathematically.
Entropy and Microstates
In 1824, at the age of 28, Nicolas Léonard Sadi Carnot (Figure \(\PageIndex{2}\)) published the results of an extensive study regarding the efficiency of steam heat engines. In a later review of Carnot’s findings, Rudolf Clausius introduced a new thermodynamic property that relates the spontaneous heat flow accompanying a process to the temperature at which the process takes place. This new property was expressed as the ratio of the reversible heat (qrev) to the kelvin temperature (T). The term reversible process refers to a process that takes place at such a slow rate that it is always at equilibrium and its direction can be changed (it can be “reversed”) by an infinitesimally small change in some condition. Note that the idea of a reversible process is a formalism required to support the development of various thermodynamic concepts; no real process is truly reversible, and all real processes are therefore classified as irreversible.
Similar to other thermodynamic properties, this new quantity is a state function, and so its change depends only upon the initial and final states of a system. In 1865, Clausius named this property entropy (S) and defined its change for any process as the following:
\[ΔS=\dfrac{q_\ce{rev}}{T} \label{Eq1}\]
The entropy change for a real, irreversible process is then equal to that for the theoretical reversible process that involves the same initial and final states.
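As a quick numerical illustration of the equation above, the Python sketch below estimates the entropy change for reversibly melting one mole of ice at its normal melting point, where qrev equals the enthalpy of fusion. The numerical values are standard literature figures assumed for illustration; they are not tabulated in this section.

```python
# Entropy change for a reversible, isothermal process: ΔS = q_rev / T.
# Assumed literature values: melting 1 mol of ice at its normal melting
# point absorbs q_rev = ΔH_fus ≈ 6.01 kJ.

q_rev = 6010.0   # J, reversible heat absorbed (enthalpy of fusion of ice)
T = 273.15       # K, normal melting point of ice

delta_S = q_rev / T
print(f"ΔS = {delta_S:.1f} J/K")   # ≈ 22.0 J/K; positive, as melting increases disorder
```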
Following the work of Carnot and Clausius, Ludwig Boltzmann developed a molecular-scale statistical model that related the entropy of a system to the number of microstates possible for the system. A microstate (W) is a specific configuration of the locations and energies of the atoms or molecules that comprise a system. The relationship between a system’s entropy and the number of possible microstates is:
\[S=k \ln W \label{Eq2}\]
Here k is the Boltzmann constant and has a value of \(1.38 \times 10^{-23}\) J/K.
As for other state functions, the change in entropy for a process is the difference between its final (Sf) and initial (Si) values:
\[ΔS=S_\ce{f}−S_\ce{i}=k \ln W_\ce{f} − k \ln W_\ce{i}=k \ln\dfrac{W_\ce{f}}{W_\ce{i}} \label{Eq2a}\]
For processes involving an increase in the number of microstates, Wf > Wi, the entropy of the system increases, ΔS > 0. Conversely, processes that reduce the number of microstates, Wf < Wi, yield a decrease in system entropy, ΔS < 0. This molecular-scale interpretation of entropy provides a link to the probability that a process will occur as illustrated in the next paragraphs.
Consider the general case of a system comprised of N particles distributed among n boxes. The number of microstates possible for such a system is \(n^N\). For example, distributing four particles among two boxes will result in \(2^4 = 16\) different microstates as illustrated in Figure \(\PageIndex{3}\). Microstates with equivalent particle arrangements (not considering individual particle identities) are grouped together and are called distributions. The probability that a system will exist with its components in a given distribution is proportional to the number of microstates within the distribution. Since entropy increases logarithmically with the number of microstates, the most probable distribution is therefore the one of greatest entropy.
For this system, the most probable configuration is one of the six microstates associated with distribution (c) where the particles are evenly distributed between the boxes, that is, a configuration of two particles in each box. The probability of finding the system in this configuration is
\[\dfrac{6}{16}=\dfrac{3}{8}\]
The least probable configuration of the system is one in which all four particles are in one box, corresponding to distributions (a) and (d), each with a probability of
\[\dfrac{1}{16}\]
The probability of finding all particles in only one box (either the left box or right box) is then
\[\dfrac{1}{16}+\dfrac{1}{16}=\dfrac{2}{16}=\dfrac{1}{8}\]
As you add more particles to the system, the number of possible microstates increases exponentially (\(2^N\)). A macroscopic (laboratory-sized) system would typically consist of moles of particles (\(N \sim 10^{23}\)), and the corresponding number of microstates would be staggeringly huge. Regardless of the number of particles in the system, however, the distributions in which roughly equal numbers of particles are found in each box are always the most probable configurations.
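These counts are small enough to verify by brute force. The following sketch (a minimal Python enumeration; the variable names are illustrative, not from the text) lists all \(2^4 = 16\) microstates for four labeled particles in two boxes and groups them into distributions:

```python
from itertools import product
from collections import Counter

# Each microstate assigns every labeled particle (1-4) to box 'L' or 'R'.
microstates = list(product("LR", repeat=4))   # 2**4 = 16 microstates
assert len(microstates) == 16

# A distribution ignores particle identities: just count particles per box.
distributions = Counter(state.count("L") for state in microstates)
for n_left in sorted(distributions):
    count = distributions[n_left]
    print(f"{n_left} left / {4 - n_left} right: {count}/16 microstates")
# 2-2 split: 6/16 = 3/8; all four in one box: 1/16 each, 2/16 = 1/8 combined.

# The total number of microstates grows exponentially with particle count N:
for N in (4, 10, 20):
    print(f"N = {N}: {2**N} microstates")
```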
The previous description of an ideal gas expanding into a vacuum is a macroscopic example of this particle-in-a-box model. For this system, the most probable distribution is confirmed to be the one in which the matter is most uniformly dispersed or distributed between the two flasks. The spontaneous process whereby the gas contained initially in one flask expands to fill both flasks equally therefore yields an increase in entropy for the system.
A similar approach may be used to describe the spontaneous flow of heat. Consider a system consisting of two objects, each containing two particles, and two units of energy (represented as “*”), as shown in Figure \(\PageIndex{4}\). The hot object is comprised of particles A and B and initially contains both energy units. The cold object is comprised of particles C and D and initially has no energy units. Distribution (a) shows the three microstates possible for the initial state of the system, with both units of energy contained within the hot object. If one of the two energy units is transferred, the result is distribution (b), consisting of four microstates. If both energy units are transferred, the result is distribution (c), consisting of three microstates. Thus, we may describe this system by a total of ten microstates. The probability that the heat does not flow when the two objects are brought into contact, that is, that the system remains in distribution (a), is \(\dfrac{3}{10}\). More likely is the flow of heat to yield one of the other two distributions, with a combined probability of \(\dfrac{7}{10}\). The most likely result is the flow of heat to yield the uniform dispersal of energy represented by distribution (b), the probability of this configuration being \(\dfrac{4}{10}\). As for the previous example of matter dispersal, extrapolating this treatment to macroscopic collections of particles dramatically increases the probability of the uniform distribution relative to the other distributions. This supports the common observation that placing hot and cold objects in contact results in spontaneous heat flow that ultimately equalizes the objects’ temperatures. And, again, this spontaneous process is also characterized by an increase in system entropy.
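The ten microstates can be generated the same way. In the sketch below (again illustrative Python, not part of the original text), each microstate places the two indistinguishable energy units on the four particles, and microstates are classified by how many units remain on the hot object:

```python
from itertools import combinations_with_replacement
from collections import Counter

# Place two indistinguishable energy units on particles A-D (both may land
# on the same particle). A and B form the hot object; C and D the cold one.
microstates = list(combinations_with_replacement("ABCD", 2))
assert len(microstates) == 10   # the ten microstates described in the text

# Classify each microstate by the number of units still on the hot object.
hot_units = Counter(sum(p in "AB" for p in state) for state in microstates)
print(hot_units[2], "/ 10: distribution (a), no heat flows")      # 3/10
print(hot_units[1], "/ 10: distribution (b), energy shared")      # 4/10
print(hot_units[0], "/ 10: distribution (c), all energy moved")   # 3/10
print("P(some heat flows) =", hot_units[1] + hot_units[0], "/ 10")  # 7/10
```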
Consider the system shown here. What is the change in entropy for a process that converts the system from distribution (a) to (c)?
Solution
We are interested in the following change:
The initial number of microstates is one, the final six:
\[ΔS=k\ln\dfrac{W_\ce{c}}{W_\ce{a}}=\mathrm{1.38×10^{−23}\:J/K×\ln\dfrac{6}{1}=2.47×10^{−23}\:J/K}\]
The sign of this result is consistent with expectation; since there are more microstates possible for the final state than for the initial state, the change in entropy should be positive.
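For readers who want to verify the arithmetic, here is a short check using the values from the example:

```python
from math import log

k = 1.38e-23               # Boltzmann constant, J/K
delta_S = k * log(6 / 1)   # ΔS = k ln(W_c / W_a), with W_c = 6 and W_a = 1
print(f"ΔS = {delta_S:.2e} J/K")   # ≈ 2.47e-23 J/K, positive as expected
```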
Consider the system shown in Figure \(\PageIndex{4}\). What is the change in entropy for the process where all the energy is transferred from the hot object (AB) to the cold object (CD)?
Answer: 0 J/K
Entropy: https://youtu.be/dkanY87VsjY
Predicting the Sign of ΔS
The relationships between entropy, microstates, and matter/energy dispersal described previously allow us to make generalizations regarding the relative entropies of substances and to predict the sign of entropy changes for chemical and physical processes. Consider the phase changes illustrated in Figure \(\PageIndex{5}\). In the solid phase, the atoms or molecules are restricted to nearly fixed positions with respect to each other and are capable of only modest oscillations about these positions. With essentially fixed locations for the system’s component particles, the number of microstates is relatively small. In the liquid phase, the atoms or molecules are free to move over and around each other, though they remain in relatively close proximity to one another. This increased freedom of motion results in a greater variation in possible particle locations, so the number of microstates is correspondingly greater than for the solid. As a result, Sliquid > Ssolid and the process of converting a substance from solid to liquid (melting) is characterized by an increase in entropy, ΔS > 0. By the same logic, the reciprocal process (freezing) exhibits a decrease in entropy, ΔS < 0.
Now consider the vapor or gas phase. The atoms or molecules occupy a much greater volume than in the liquid phase; therefore each atom or molecule can be found in many more locations than in the liquid (or solid) phase. Consequently, for any substance, Sgas > Sliquid > Ssolid, and the processes of vaporization and sublimation likewise involve increases in entropy, ΔS > 0. Likewise, the reciprocal phase transitions, condensation and deposition, involve decreases in entropy, ΔS < 0.
According to kinetic-molecular theory, the temperature of a substance is proportional to the average kinetic energy of its particles. Raising the temperature of a substance will result in more extensive vibrations of the particles in solids and more rapid translations of the particles in liquids and gases. At higher temperatures, the distribution of kinetic energies among the atoms or molecules of the substance is also broader (more dispersed) than at lower temperatures. Thus, the entropy for any substance increases with temperature (Figure \(\PageIndex{6}\) ).
The entropy of a substance is influenced by the structure of the particles (atoms or molecules) that comprise the substance. With regard to atomic substances, heavier atoms possess greater entropy at a given temperature than lighter atoms, a consequence of the relation between a particle’s mass and the spacing of its quantized translational energy levels (a topic beyond the scope of this treatment). For molecules, greater numbers of atoms (regardless of their masses) increase the ways in which the molecules can vibrate and thus the number of possible microstates and the system entropy.
Finally, variations in the types of particles affect the entropy of a system. Compared to a pure substance, in which all particles are identical, the entropy of a mixture of two or more different particle types is greater. This is because of the additional orientations and interactions that are possible in a system comprised of nonidentical components. For example, when a solid dissolves in a liquid, the particles of the solid experience both a greater freedom of motion and additional interactions with the solvent particles. This corresponds to a more uniform dispersal of matter and energy and a greater number of microstates. The process of dissolution therefore involves an increase in entropy, ΔS > 0.
Considering the various factors that affect entropy allows us to make informed predictions of the sign of ΔS for various chemical and physical processes, as illustrated in the following example.
Predict the sign of the entropy change for the following processes. Indicate the reason for each of your predictions.
- One mole liquid water at room temperature \(⟶\) one mole liquid water at 50 °C
- \(\ce{Ag+}(aq)+\ce{Cl-}(aq)⟶\ce{AgCl}(s)\)
- \(\ce{C6H6}(l)+\dfrac{15}{2}\ce{O2}(g)⟶\ce{6CO2}(g)+\ce{3H2O}(l)\)
- \(\ce{NH3}(s)⟶\ce{NH3}(l)\)
Solution
- positive, temperature increases
- negative, reduction in the number of ions (particles) in solution, decreased dispersal of matter
- negative, net decrease in the amount of gaseous species
- positive, phase transition from solid to liquid, net increase in dispersal of matter
Predict the sign of the entropy change for the following processes. Give a reason for your prediction.
- \(\ce{NaNO3}(s)⟶\ce{Na+}(aq)+\ce{NO3-}(aq)\)
- the freezing of liquid water
- \(\ce{CO2}(s)⟶\ce{CO2}(g)\)
- \(\ce{CaCO3}(s)⟶\ce{CaO}(s)+\ce{CO2}(g)\)
Answer:
(a) Positive; The solid dissolves to give an increase of mobile ions in solution. (b) Negative; The liquid becomes a more ordered solid. (c) Positive; The relatively ordered solid becomes a gas. (d) Positive; There is a net production of one mole of gas.
Entropy (S) is a thermodynamic property of all substances. The greater the number of possible microstates for a system, the greater the disorder and the higher the entropy.
Experiments show that the magnitude of ΔSvap is 80–90 J/(mol•K) for a wide variety of liquids with different boiling points. However, liquids that have highly ordered structures due to hydrogen bonding or other intermolecular interactions tend to have significantly higher values of ΔSvap. For instance, ΔSvap for water is 102 J/(mol•K).

Another process that is accompanied by entropy changes is the formation of a solution. As illustrated in Figure \(\PageIndex{4}\), the formation of a liquid solution from a crystalline solid (the solute) and a liquid solvent is expected to result in an increase in the number of available microstates of the system and hence its entropy. Indeed, dissolving a substance such as NaCl in water disrupts both the ordered crystal lattice of NaCl and the ordered hydrogen-bonded structure of water, leading to an increase in the entropy of the system. At the same time, however, each dissolved Na+ ion becomes hydrated by an ordered arrangement of at least six water molecules, and the Cl− ions also cause the water to adopt a particular local structure. Both of these effects increase the order of the system, leading to a decrease in entropy. The overall entropy change for the formation of a solution therefore depends on the relative magnitudes of these opposing factors. In the case of an NaCl solution, disruption of the crystalline NaCl structure and the hydrogen-bonded interactions in water is quantitatively more important, so ΔSsoln > 0.
Dissolving NaCl in water results in an increase in the entropy of the system. Each hydrated ion, however, forms an ordered arrangement with water molecules, which decreases the entropy of the system. The magnitude of the increase is greater than the magnitude of the decrease, so the overall entropy change for the formation of an NaCl solution is positive.
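Because vaporization at the normal boiling point can be treated as a reversible, constant-temperature process, the ΔSvap values quoted above follow from ΔS = qrev/T with qrev = ΔHvap. The sketch below applies this relation to two liquids; the ΔHvap and boiling-point values are approximate literature figures assumed for illustration, not data tabulated in this section:

```python
# Trouton's-rule-style estimate: ΔS_vap = ΔH_vap / T_b at the boiling point.
# Approximate literature values, assumed here for illustration only.
liquids = {
    "benzene":              (30_800.0, 353.2),   # (ΔH_vap in J/mol, T_b in K)
    "carbon tetrachloride": (29_800.0, 349.9),
}

for name, (dH_vap, T_b) in liquids.items():
    print(f"{name}: ΔS_vap ≈ {dH_vap / T_b:.0f} J/(mol·K)")
# Both results land in the 80-90 J/(mol·K) range noted above; hydrogen-bonded
# liquids such as water lie noticeably higher.
```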
Predict which substance in each pair has the higher entropy and justify your answer.
- 1 mol of NH3(g) or 1 mol of He(g), both at 25°C
- 1 mol of Pb(s) at 25°C or 1 mol of Pb(l) at 800°C
Given: amounts of substances and temperature
Asked for: higher entropy
Strategy:
From the number of atoms present and the phase of each substance, predict which has the greater number of available microstates and hence the higher entropy.
Solution:
- Both substances are gases at 25°C, but one consists of He atoms and the other consists of NH3 molecules. With four atoms instead of one, the NH3 molecules have more motions available, leading to a greater number of microstates. Hence we predict that the NH3 sample will have the higher entropy.
- The nature of the atomic species is the same in both cases, but the phase is different: one sample is a solid, and one is a liquid. Based on the greater freedom of motion available to atoms in a liquid, we predict that the liquid sample will have the higher entropy.
Predict which substance in each pair has the higher entropy and justify your answer.
- 1 mol of He(g) at 10 K and 1 atm pressure or 1 mol of He(g) at 250°C and 0.2 atm
- a mixture of 3 mol of H2(g) and 1 mol of N2(g) at 25°C and 1 atm or a sample of 2 mol of NH3(g) at 25°C and 1 atm
Answer
- 1 mol of He(g) at 250°C and 0.2 atm (higher temperature and lower pressure indicate greater volume and more microstates)
- a mixture of 3 mol of H2(g) and 1 mol of N2(g) at 25°C and 1 atm (more molecules of gas are present)
Reversible and Irreversible Changes
Changes in entropy (ΔS), together with changes in enthalpy (ΔH), enable us to predict in which direction a chemical or physical change will occur spontaneously. Before discussing how to do so, however, we must understand the difference between a reversible process and an irreversible one. In a reversible process, every intermediate state between the extremes is an equilibrium state, regardless of the direction of the change. In contrast, an irreversible process is one in which the intermediate states are not equilibrium states, so change occurs spontaneously in only one direction. As a result, a reversible process can change direction at any time, whereas an irreversible process cannot. When a gas expands reversibly against an external pressure such as a piston, for example, the expansion can be reversed at any time by reversing the motion of the piston; once the gas is compressed, it can be allowed to expand again, and the process can continue indefinitely. In contrast, the expansion of a gas into a vacuum (Pext = 0) is irreversible because the external pressure is measurably less than the internal pressure of the gas. No equilibrium states exist, and the gas expands irreversibly. When gas escapes from a microscopic hole in a balloon into a vacuum, for example, the process is irreversible; the direction of airflow cannot change.
Because the work done during the expansion of a gas depends on the opposing external pressure (w = −PextΔV), the work done by a gas in a reversible expansion is always greater than or equal in magnitude to the work done in a corresponding irreversible expansion: |wrev| ≥ |wirrev|. Whether a process is reversible or irreversible, ΔU = q + w. Because U is a state function, ΔU does not depend on reversibility and is independent of the path taken. So
\[ΔU = q_{rev} + w_{rev} = q_{irrev} + w_{irrev} \label{Eq1a}\]
The work done by a gas in a reversible expansion is always greater than or equal in magnitude to the work done in a corresponding irreversible expansion: |wrev| ≥ |wirrev|.
In other words, ΔU for a process is the same whether that process is carried out in a reversible manner or an irreversible one. We now return to our earlier definition of entropy, using the magnitude of the heat flow for a reversible process (qrev) to define entropy quantitatively.
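The statement |wrev| ≥ |wirrev| can be made concrete for the isothermal expansion of an ideal gas: expanding reversibly, the gas does work −nRT ln(V2/V1); expanding in a single step against a constant external pressure equal to its final pressure, it does only −Pext(V2 − V1). A minimal Python sketch, with conditions chosen arbitrarily for illustration:

```python
from math import log

R = 8.314                # gas constant, J/(mol·K)
n, T = 1.0, 298.15       # 1 mol of ideal gas at 25 °C (illustrative conditions)
V1, V2 = 0.010, 0.020    # m^3; the gas doubles its volume

# Reversible isothermal expansion: P_ext always matches the gas pressure.
w_rev = -n * R * T * log(V2 / V1)

# Irreversible one-step expansion against constant P_ext = final gas pressure.
P_ext = n * R * T / V2
w_irrev = -P_ext * (V2 - V1)

print(f"w_rev   = {w_rev:.0f} J")    # ≈ -1718 J
print(f"w_irrev = {w_irrev:.0f} J")  # ≈ -1239 J
# |w_rev| > |w_irrev|: the gas does more work on its surroundings along the
# reversible path, yet ΔU = q + w is identical for both (U is a state function).
```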
Quantum States and Energy Spreading
At the atomic and molecular level, all energy is quantized; each particle possesses discrete states of kinetic energy and is able to accept thermal energy only in packets whose values correspond to the energies of one or more of these states. Polyatomic molecules can store energy in rotational and vibrational motions, and all molecules (even monatomic ones) will possess translational kinetic energy (thermal energy) at all temperatures above absolute zero. The energy difference between adjacent translational states is so minute that translational kinetic energy can be regarded as continuous (non-quantized) for most practical purposes.
The number of ways in which thermal energy can be distributed amongst the allowed states within a collection of molecules is easily calculated from simple statistics, but we will confine ourselves to an example here. Suppose that we have a system consisting of three molecules and three quanta of energy to share among them. We can give all the kinetic energy to any one molecule, leaving the others with none, we can give two units to one molecule and one unit to another, or we can share out the energy equally and give one unit to each molecule. All told, there are ten possible ways of distributing three units of energy among three identical molecules as shown here:
Each of these ten possibilities represents a distinct microstate that will describe the system at any instant in time. Those microstates that possess identical distributions of energy among the accessible quantum levels (and differ only in which particular molecules occupy the levels) are known as configurations. Because all microstates are equally probable, the probability of any one configuration is proportional to the number of microstates that can produce it. Thus in the system shown above, the configuration labeled ii will be observed 60% of the time, while iii will occur only 10% of the time.
As the number of molecules and the number of quanta increase, the number of accessible microstates grows explosively; if 1000 quanta of energy are shared by 1000 molecules, the number of available microstates will be around \(10^{600}\), a number that greatly exceeds the number of atoms in the observable universe! The number of possible configurations (as defined above) also increases, but in such a way as to greatly reduce the probability of all but the most probable configurations. Thus for a sample of a gas large enough to be observable under normal conditions, only a single configuration (energy distribution amongst the quantum states) need be considered; even the second-most-probable configuration can be neglected.
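Both counts quoted above are easy to reproduce. The sketch below enumerates the ten microstates for three quanta among three molecules, groups them into configurations, and then uses the standard "stars and bars" counting formula, C(q + N − 1, N − 1), for the 1000-quanta case (the formula itself is not derived in this text):

```python
from itertools import product
from collections import Counter
from math import comb, log10

# All ways to give 0-3 quanta to each of three molecules, totaling 3 quanta.
microstates = [s for s in product(range(4), repeat=3) if sum(s) == 3]
print(len(microstates))   # 10, matching the enumeration in the text

# A configuration is an energy distribution ignoring molecular identity.
configs = Counter(tuple(sorted(s)) for s in microstates)
for occupancy, count in sorted(configs.items()):
    print(occupancy, f"{count}/10 = {count / 10:.0%}")
# (0, 1, 2) appears 6/10 = 60% of the time; (1, 1, 1) only 1/10 = 10%.

# Stars and bars: q quanta among N molecules give C(q + N - 1, N - 1) microstates.
q, N = 1000, 1000
print(f"about 10^{log10(comb(q + N - 1, N - 1)):.0f} microstates")   # ~10^600
```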
The bottom line: any collection of molecules numerous enough to have chemical significance will have its thermal energy distributed over an unimaginably large number of microstates. The number of microstates increases exponentially as more energy states become accessible owing to:
- the addition of energy quanta (higher temperature);
- an increase in the number of molecules (resulting from dissociation, for example);
- an increase in the volume of the system (which decreases the spacing between energy states, allowing more of them to be populated at a given temperature).
Key Concepts and Summary
- For a given system, the greater the number of microstates, the higher the entropy.
- During a spontaneous process, the entropy of the universe increases.
\[\Delta S=\frac{q_{\textrm{rev}}}{T}\]
Entropy (S) is a state function whose value increases with an increase in the number of available microstates. A reversible process is one for which all intermediate states between extremes are equilibrium states; it can change direction at any time. In contrast, an irreversible process occurs in one direction only. The change in entropy of the system or the surroundings is the quantity of heat transferred divided by the temperature. Entropy (S) may be interpreted as a measure of the dispersal or distribution of matter and/or energy in a system, and it is often described as representing the “disorder” of the system.
For a given substance, Ssolid < Sliquid < Sgas. For different substances in the same physical state at a given temperature, entropy is typically greater for heavier atoms or more complex molecules. Entropy increases when a system is heated and when solutions form. Using these guidelines, the sign of entropy changes for some chemical reactions may be reliably predicted.
Key Equations
- \(ΔS=\dfrac{q_\ce{rev}}{T}\)
- \(S = k \ln W\)
- \(ΔS=k\ln\dfrac{W_\ce{f}}{W_\ce{i}}\)
Glossary
- entropy (S)
- state function that is a measure of the matter and/or energy dispersal within a system, determined by the number of system microstates; often described as a measure of the disorder of the system
- microstate (W)
- possible configuration or arrangement of matter and energy within a system
- reversible process
- process that takes place so slowly as to be capable of reversing direction in response to an infinitesimally small change in conditions; a hypothetical construct that can only be approximated by real processes
Contributors and Attributions
Paul Flowers (University of North Carolina - Pembroke), Klaus Theopold (University of Delaware) and Richard Langley (Stephen F. Austin State University) with contributing authors. Textbook content produced by OpenStax College is licensed under a Creative Commons Attribution License 4.0 license. Download for free at http://cnx.org/contents/85abf193-2bd...a7ac8df6@9.110.