
7.9: Entropy and the Second Law of Thermodynamics


    Skills to Develop

    • To understand the relationship between internal energy and entropy.

    The first law of thermodynamics governs changes in the state function we have called internal energy (\(U\)). Changes in the internal energy (ΔU) are closely related to changes in the enthalpy (ΔH), which is a measure of the heat flow between a system and its surroundings at constant pressure. This information, however, does not tell us whether a particular process or reaction will occur spontaneously. (In chemistry, the word spontaneous means "very likely to proceed in the forward direction, as written, in order to reach equilibrium.")

    Let’s consider a familiar example of spontaneous change. If a hot frying pan that has just been removed from the stove is allowed to come into contact with a cooler object, such as cold water in a sink, heat will flow from the hotter object to the cooler one, in this case usually releasing steam. Eventually both objects will reach the same temperature, at a value between the initial temperatures of the two objects. This transfer of heat from a hot object to a cooler one obeys the first law of thermodynamics: energy is conserved.

    Now consider the same process in reverse. Suppose that a hot frying pan in a sink of cold water were to become hotter while the water became cooler. As long as the same amount of thermal energy was gained by the frying pan and lost by the water, the first law of thermodynamics would be satisfied. Yet we all know that such a process is very unlikely to occur: heat is overwhelmingly likely to flow from a hot object to a cold one and exceedingly unlikely to flow in the reverse direction. That is, by itself the magnitude of the heat flow associated with a process does not predict whether the process will occur spontaneously.

    For many years, chemists and physicists tried to identify a single measurable quantity that would enable them to predict whether a particular process or reaction would occur spontaneously. Initially, many of them focused on enthalpy changes and hypothesized that an exothermic process would always be spontaneous. But although it is true that many, if not most, spontaneous processes are exothermic, there are also many spontaneous processes that are not exothermic. For example, at a pressure of 1 atm, ice melts spontaneously at temperatures greater than 0°C, yet this is an endothermic process because heat is absorbed. Similarly, many salts (such as NH4NO3, NaCl, and KBr) dissolve spontaneously in water even though they absorb heat from the surroundings as they dissolve (i.e., ΔHsoln > 0). Reactions can also be both spontaneous and highly endothermic, like the reaction of barium hydroxide with ammonium thiocyanate shown in Figure \(\PageIndex{1}\).


    Figure \(\PageIndex{1}\): An Endothermic Reaction. The reaction of barium hydroxide with ammonium thiocyanate is spontaneous but highly endothermic, so water, one product of the reaction, quickly freezes into slush. If the flask is placed on a wet block of wood, the reaction absorbs enough heat to freeze the water under the flask, so the flask becomes frozen to the wood. For a full video, see https://www.youtube.com/watch?v=GQkJI-Nq3Os.

    Thus enthalpy is not the only factor that determines whether a process is spontaneous. For example, after a cube of sugar has dissolved in a glass of water so that the sucrose molecules are uniformly dispersed in a dilute solution, they never spontaneously come back together in solution to form a sugar cube. Moreover, the molecules of a gas remain evenly distributed throughout the entire volume of a glass bulb and never spontaneously assemble in only one portion of the available volume. Explaining why these phenomena proceed spontaneously in only one direction requires an additional state function called entropy (S), a thermodynamic property of all substances that is proportional to their degree of "possibilities".

    Entropy

    Chemical and physical changes in a system may be accompanied by either an increase or a decrease in the possible arrangements (sometimes called disorder, randomness, or freedom) of the system, corresponding to an increase in entropy (ΔS > 0) or a decrease in entropy (ΔS < 0), respectively. As with any other state function, the change in entropy is defined as the difference between the entropies of the final and initial states: ΔS = Sf − Si.

    When a gas expands into a vacuum, its entropy increases because the increased volume allows for greater atomic or molecular possibilities. The greater the number of atoms or molecules in the gas, the greater the number of possible arrangements. The magnitude of the entropy of a system depends on the number of microscopic states, or microstates, associated with it (in this case, the number of atoms or molecules); that is, the greater the number of microstates, the greater the entropy.

    We can illustrate the concepts of microstates and entropy using a deck of playing cards, as shown in Figure \(\PageIndex{2}\). There are approximately \(10^{68}\) (that is, 52!) different ways the 52 cards might be arranged, which corresponds to \(10^{68}\) different microscopic states. Card games assign a higher value to a hand that has a low degree of probability. In games such as five-card poker, only 4 of the 2,598,960 different possible hands, or microstates, contain the valued arrangement of cards called a royal flush, almost 1.1 million hands contain one pair, and more than 1.3 million hands are completely disordered (according to the rules of poker) and therefore have no value. Because the last two arrangements are far more probable than the first, the value of a poker hand is inversely proportional to its entropy.
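    These counts can be verified directly. The minimal Python sketch below reproduces them with math.factorial and math.comb; the hand categories are simply the ones named above, counted with standard combinatorics.

```python
from math import factorial, comb

# Number of distinct orderings of a 52-card deck (microstates of the shuffled deck)
deck_arrangements = factorial(52)          # ~8.07e67, i.e. roughly 10^68

# Five-card poker hands
total_hands = comb(52, 5)                  # 2,598,960 possible hands

royal_flushes = 4                          # one royal flush per suit

# One pair: choose the paired rank and its two suits, then three other ranks
# with one suit each
one_pair = comb(13, 1) * comb(4, 2) * comb(12, 3) * 4**3   # 1,098,240

# "No pair" (high-card) hands: five distinct ranks that do not form a straight,
# in suits that are not all the same
high_card = (comb(13, 5) - 10) * (4**5 - 4)                # 1,302,540

print(f"52! ≈ {deck_arrangements:.2e}")
print(f"P(royal flush) = {royal_flushes / total_hands:.2e}")
print(f"P(one pair)    = {one_pair / total_hands:.3f}")
print(f"P(high card)   = {high_card / total_hands:.3f}")
```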


    Figure \(\PageIndex{2}\): Illustrating High- and Low-Entropy States with a Deck of Playing Cards. A randomly shuffled deck of 52 cards can have any one of approximately \(10^{68}\) different arrangements, which correspond to \(10^{68}\) different microstates. The deck above shows a single arrangement, and thus represents one of these microstates. If this is the sole, desired arrangement of cards, the arrangement has a very low entropy because there is only one possible way to obtain this arrangement (1 out of 52!). The card game of canasta uses 108 cards, and so a canasta deck has a higher number of possible microstates (108!), and could be said to have a higher entropy than a standard deck.

    Image used with permission (CC BY-3.0 ; Trainler).

    We can see how to calculate these kinds of probabilities for a chemical system by considering the possible arrangements of a sample of four gas molecules in a two-bulb container (Figure \(\PageIndex{3}\)). There are five possible arrangements: all four molecules in the left bulb (I); three molecules in the left bulb and one in the right bulb (II); two molecules in each bulb (III); one molecule in the left bulb and three molecules in the right bulb (IV); and four molecules in the right bulb (V). If we assign a different color to each molecule to keep track of it for this discussion (remember, however, that in reality the molecules are indistinguishable from one another), we can see that there are 16 different ways the four molecules can be distributed in the bulbs, each corresponding to a particular microstate. As shown in Figure \(\PageIndex{3}\), arrangement I is associated with a single microstate, as is arrangement V, so each arrangement has a probability of 1/16. Arrangements II and IV each have a probability of 4/16 because each can exist in four microstates. Similarly, six different microstates can occur as arrangement III, making the probability of this arrangement 6/16. Thus the arrangement that we would expect to encounter most often, with half the gas molecules in each bulb, is the most probable arrangement. The other states are not impossible but simply less likely.


    Figure \(\PageIndex{3}\): The Possible Microstates for a Sample of Four Gas Molecules in Two Bulbs of Equal Volume

    There are 16 different ways to distribute four gas molecules between the bulbs, with each distribution corresponding to a particular microstate. Arrangements I and V each produce a single microstate with a probability of 1/16; these two arrangements are the least likely to be observed. Arrangements II and IV each produce four microstates, with a probability of 4/16. Arrangement III, with half the gas molecules in each bulb, has a probability of 6/16. It is the one encompassing the most microstates, so it is the most probable.
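    A brute-force enumeration confirms the counting in Figure \(\PageIndex{3}\). The short Python sketch below lists all \(2^4 = 16\) microstates and groups them by how many molecules sit in the left bulb.

```python
from itertools import product
from collections import Counter

# Label each of the four molecules (distinguishable only for bookkeeping) as
# sitting in the Left or Right bulb; every labeling is one microstate.
microstates = list(product("LR", repeat=4))      # 2**4 = 16 microstates

# Group microstates into arrangements by the number of molecules in the left bulb
arrangements = Counter(state.count("L") for state in microstates)

for n_left in sorted(arrangements, reverse=True):
    count = arrangements[n_left]
    print(f"{n_left} molecules in left bulb: {count} microstates, "
          f"probability {count}/16")
```

    The output, 1, 4, 6, 4, 1 microstates for arrangements I through V, matches the probabilities quoted above.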

    Instead of four molecules of gas, let’s now consider 1 L of an ideal gas at 0°C and 1 atm, which contains \(2.69 \times 10^{22}\) molecules. If we allow the sample of gas to expand into a second 1 L container, the probability of finding all \(2.69 \times 10^{22}\) molecules in one container and none in the other at any given time is extremely small, approximately \(\left(\frac{1}{2}\right)^{2.69 \times 10^{22}}\). The probability of such an occurrence is effectively zero. Although nothing prevents the molecules in the gas sample from occupying only one of the two bulbs, that particular arrangement is so improbable that it is never actually observed. The probability of arrangements with essentially equal numbers of molecules in each bulb is quite high, however, because there are many equivalent microstates in which the molecules are distributed equally. Hence a macroscopic sample of a gas occupies all of the space available to it, simply because this is the most probable arrangement.
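    To get a feel for how small this probability is, the sketch below evaluates its base-10 logarithm; the number itself is far too small to represent as an ordinary floating-point value.

```python
from math import log10

N = 2.69e22          # molecules in the 1 L sample
# Probability that every molecule is found in one particular bulb: (1/2)**N.
# The value underflows to zero, so work with its logarithm instead.
log10_probability = N * log10(0.5)
print(f"log10 P(all molecules in one bulb) ≈ {log10_probability:.3g}")
# ≈ -8.1e21, i.e. P ≈ 10^(-8,100,000,000,000,000,000,000): effectively zero
```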

    A system that has a large number of possible microstates because the particles are free to move has a high entropy. This is most clearly seen in the entropy changes that accompany phase transitions, such as solid to liquid or liquid to gas. As you know, a crystalline solid is composed of an ordered array of molecules, ions, or atoms that occupy fixed positions in a lattice, whereas the molecules in a liquid are free to move and tumble within the volume of the liquid; molecules in a gas have even more freedom to move than those in a liquid. Each degree of motion increases the number of available microstates, resulting in a higher entropy. Thus the entropy of a system must increase during melting (ΔSfus > 0). Similarly, when a liquid is converted to a vapor, the greater freedom of motion of the molecules in the gas phase means that ΔSvap > 0. Conversely, the reverse processes (condensing a vapor to form a liquid or freezing a liquid to form a solid) must be accompanied by a decrease in the entropy of the system: ΔS < 0.

    Entropy (S) is a thermodynamic property of all substances that is proportional to their degree of possibilities. The greater the number of possible microstates for a system, the greater the disorder and the higher the entropy.

    Experiments show that the magnitude of ΔSvap is 80–90 J/(mol•K) for a wide variety of liquids with different boiling points. However, liquids that have highly ordered structures due to hydrogen bonding or other intermolecular interactions tend to have significantly higher values of ΔSvap. For instance, ΔSvap for water is 102 J/(mol•K). Another process that is accompanied by entropy changes is the formation of a solution. As illustrated in Figure \(\PageIndex{4}\), the formation of a liquid solution from a crystalline solid (the solute) and a liquid solvent is expected to result in an increase in the number of available microstates of the system and hence its entropy. Indeed, dissolving a substance such as NaCl in water disrupts both the ordered crystal lattice of NaCl and the ordered hydrogen-bonded structure of water, leading to an increase in the entropy of the system. At the same time, however, each dissolved Na+ ion becomes hydrated by an ordered arrangement of at least six water molecules, and the Cl− ions also cause the water to adopt a particular local structure. Both of these effects increase the order of the system, leading to a decrease in entropy. The overall entropy change for the formation of a solution therefore depends on the relative magnitudes of these opposing factors. In the case of an NaCl solution, disruption of the crystalline NaCl structure and the hydrogen-bonded interactions in water is quantitatively more important, so ΔSsoln > 0.
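    These vaporization entropies reflect the heat absorbed during vaporization at the boiling point, ΔSvap = ΔHvap/Tb (the relation between entropy change and heat transfer is developed more generally below). As a rough illustration with made-up numbers rather than data for any particular liquid, a liquid with ΔHvap = 30.0 kJ/mol that boils at 350 K would have

    \[\Delta S_{\textrm{vap}}=\dfrac{\Delta H_{\textrm{vap}}}{T_{\textrm{b}}}=\dfrac{30{,}000\textrm{ J/mol}}{350\textrm{ K}}\approx 86\textrm{ J/(mol}\cdot\textrm{K)}\]

    which falls within the 80–90 J/(mol•K) range quoted above.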


    Figure \(\PageIndex{4}\): The Effect of Solution Formation on Entropy

    Dissolving NaCl in water results in an increase in the entropy of the system. Each hydrated ion, however, forms an ordered arrangement with water molecules, which decreases the entropy of the system. The magnitude of the increase is greater than the magnitude of the decrease, so the overall entropy change for the formation of an NaCl solution is positive.

    Example \(\PageIndex{1}\)

    Predict which substance in each pair has the higher entropy and justify your answer.

    1. 1 mol of NH3(g) or 1 mol of He(g), both at 25°C
    2. 1 mol of Pb(s) at 25°C or 1 mol of Pb(l) at 800°C

    Given: amounts of substances and temperature

    Asked for: higher entropy

    Strategy:

    From the number of atoms present and the phase of each substance, predict which has the greater number of available microstates and hence the higher entropy.

    Solution:

    1. Both substances are gases at 25°C, but one consists of He atoms and the other consists of NH3 molecules. With four atoms instead of one, the NH3 molecules have more motions available, leading to a greater number of microstates. Hence we predict that the NH3 sample will have the higher entropy.
    2. The nature of the atomic species is the same in both cases, but the phase is different: one sample is a solid, and one is a liquid. Based on the greater freedom of motion available to atoms in a liquid, we predict that the liquid sample will have the higher entropy.

    Exercise \(\PageIndex{1}\)

    Predict which substance in each pair has the higher entropy and justify your answer.

    1. 1 mol of He(g) at 10 K and 1 atm pressure or 1 mol of He(g) at 250°C and 0.2 atm
    2. a mixture of 3 mol of H2(g) and 1 mol of N2(g) at 25°C and 1 atm or a sample of 2 mol of NH3(g) at 25°C and 1 atm
    Answer a

    1 mol of He(g) at 250°C and 0.2 atm (higher temperature and lower pressure indicate greater volume and more microstates)

    Answer b

    a mixture of 3 mol of H2(g) and 1 mol of N2(g) at 25°C and 1 atm (more molecules of gas are present)


    The Second Law of Thermodynamics

    The entropy of the universe increases during a spontaneous process. It also increases during a non-spontaneous process that is driven to occur by the addition of work to the system. The entropy of the universe will continue to increase until equilibrium is reached. Although we do not know what it means for the universe to be at equilibrium, we do know that there is no change in entropy for any chemical or physical process that is at equilibrium. Thus, equilibrium must be the most likely situation for any system. A system at equilibrium is likely to stay at equilibrium, and a system that is not at equilibrium is likely to change so that it can reach equilibrium. The most likely thing to happen does not have to happen, nor is it guaranteed to happen. Simply put, the most likely thing to happen is the most likely thing to happen.

    As an example, consider the entropy changes that accompany the transfer of heat from a hot object to a cold one, as occurs when lava spewed from a volcano flows into cold ocean water. The cold substance, the water, gains heat (q > 0), so the change in the entropy of the water can be written as ΔScold = q/Tcold. Similarly, the hot substance, the lava, loses heat (q < 0), so its entropy change can be written as ΔShot = −q/Thot, where Tcold and Thot are the temperatures of the cold and hot substances, respectively. The total entropy change of the universe accompanying this process is therefore

    \[\Delta S_{\textrm{univ}}=\Delta S_{\textrm{cold}}+\Delta S_{\textrm{hot}}=\dfrac{q}{T_{\textrm{cold}}}+\left(-\dfrac{q}{T_{\textrm{hot}}}\right) \label{Eq6}\]

    The numerators on the right side of Equation \(\ref{Eq6}\) are the same in magnitude but opposite in sign. Whether ΔSuniv is positive or negative depends on the relative magnitudes of the denominators. By definition, Thot > Tcold, so the magnitude of q/Thot must be less than the magnitude of q/Tcold; the positive term therefore dominates, and ΔSuniv must be positive. As predicted by the second law of thermodynamics, the entropy of the universe increases during this process. Any process for which ΔSuniv is positive is, by definition, a spontaneous one that will (very likely) occur as written. Conversely, any process for which ΔSuniv is negative will not occur as written but will (very likely) occur spontaneously in the reverse direction. We see, therefore, that heat is spontaneously transferred from a hot substance, the lava, to a cold substance, the ocean water. In fact, if the lava is hot enough (e.g., if it is molten), so much heat can be transferred that the water is converted to steam (Figure \(\PageIndex{7}\)).
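    A quick numerical check of Equation \(\ref{Eq6}\) makes the sign argument concrete. The values used in the sketch below are illustrative only, not measured lava or ocean data.

```python
# Illustrative numbers: suppose 1000 J of heat flows from lava at 1300 K
# into ocean water at 300 K.
q      = 1000.0   # J, heat gained by the cold water
T_hot  = 1300.0   # K, temperature of the hot substance (lava)
T_cold = 300.0    # K, temperature of the cold substance (water)

dS_cold = q / T_cold          # +3.33 J/K  (the water gains entropy)
dS_hot  = -q / T_hot          # -0.77 J/K  (the lava loses entropy)
dS_univ = dS_cold + dS_hot    # +2.56 J/K  > 0, so the process is spontaneous

print(f"ΔS_cold = {dS_cold:+.2f} J/K")
print(f"ΔS_hot  = {dS_hot:+.2f} J/K")
print(f"ΔS_univ = {dS_univ:+.2f} J/K")
```

    Because the cold reservoir sits in the denominator of the positive term, its entropy gain always outweighs the entropy lost by the hotter reservoir, and ΔSuniv comes out positive.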


    Figure \(\PageIndex{7}\): Spontaneous Transfer of Heat from a Hot Substance to a Cold Substance

    Summary

    A measure of the possible arrangements of a system is its entropy (S), a state function whose value increases with an increase in the number of available microstates. For a given system, the greater the number of microstates, the higher the entropy. The change in entropy of the system or the surroundings is the quantity of heat transferred divided by the temperature. The second law of thermodynamics states that for a process occurring at equilibrium there is no change in the entropy of the universe, whereas for a spontaneous process that is not at equilibrium, such as the transfer of heat from a hot object to a cold object, the entropy of the universe increases.

    Contributors

    • Modified by Tom Neils (Grand Rapids Community College)

    7.9: Entropy and the Second Law of Thermodynamics is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by LibreTexts.
