
6.8: Entropy and the Second Law of Thermodynamics

    Learning Objectives
    • Define entropy and calculate the increase of entropy in a system with reversible and irreversible processes.
    • Calculate the increase in disorder of a system.
    Figure \(\PageIndex{1}\): The ice in this drink is slowly melting. Eventually the liquid will reach thermal equilibrium, as predicted by the second law of thermodynamics. (credit: Jon Sullivan, PDPhoto.org)

    There is yet another way of expressing the second law of thermodynamics. This version relates to a concept called entropy. By examining it, we shall see that the directions associated with the second law—heat transfer from hot to cold, for example—are related to the tendency in nature for systems to become disordered and for less energy to be available for use as work. The entropy of a system can in fact be shown to be a measure of its disorder and of the unavailability of energy to do work. We will first present the physical way in which this can be measured, and then discuss additional aspects of it in the subsections. In the first subsection we will discuss why entropy works the way it does. In the second subsection we will discuss some of the implications of this for us, the world we live in, and the universe as a whole.

    MAKING CONNECTIONS: ENTROPY, ENERGY, AND WORK

    Recall that the simple definition of energy is the ability to do work. Entropy is a measure of how much energy is not available to do work. Although all forms of energy are interconvertible, and all can be used to do work, it is not always possible, even in principle, to convert the entire available energy into work. That unavailable energy is of interest in thermodynamics, because the field of thermodynamics arose from efforts to convert heat to work.

    An expression for the change in entropy can be derived from the Carnot cycle:

    \[\Delta S=\left(\frac{Q}{T}\right)_{\mathrm{rev}}, \nonumber \]

    where \(Q\) is the heat transfer, which is positive for heat transfer into the system and negative for heat transfer out of it, and \(T\) is the absolute temperature at which the reversible process takes place. The SI unit for entropy is joules per kelvin (J/K).
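    The definition \(\Delta S = (Q/T)_{\mathrm{rev}}\) can be sketched as a short computation. The numbers below are illustrative values, not taken from the text:

    ```python
    # Entropy change for a reversible process at constant temperature,
    # using Delta_S = Q / T. The values used here are hypothetical.

    def entropy_change(q_joules, temp_kelvin):
        """Return Delta_S in J/K. q_joules is positive for heat transfer
        into the system and negative for heat transfer out of it."""
        return q_joules / temp_kelvin

    # 4000 J transferred reversibly into a system held at 400 K:
    print(entropy_change(4000.0, 400.0))  # 10.0 J/K
    ```

    Note the sign convention: heat leaving the system would make \(Q\), and hence \(\Delta S\), negative.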

    The definition of \(\Delta S\) is strictly valid only for reversible processes, such as used in a Carnot engine. However, we can find \(\Delta S\) precisely even for real, irreversible processes. The reason is that the entropy \(S\) of a system, like internal energy \(U\), depends only on the state of the system and not how it reached that condition. Entropy is a property of state. Thus, the change in entropy \(\Delta S\) of a system between state 1 and state 2 is the same no matter how the change occurs. We just need to find or imagine a reversible process that takes us from state 1 to state 2 and calculate \(\Delta S\) for that process. That will be the change in entropy for any process going from state 1 to state 2. (See Figure \(\PageIndex{2}\).)

    Figure \(\PageIndex{2}\): When a system goes from state 1 to state 2, its entropy changes by the same amount \(\Delta S\), whether a hypothetical reversible path is followed or a real irreversible path is taken.

    We can also say that the total change in entropy for a system in any reversible process is zero. The entropy of various parts of the system may change, but the total change is zero. Furthermore, any heat transfer between the system and its surroundings occurs reversibly, so the entropy lost by one is exactly gained by the other. Thus, the reversible process changes neither the total entropy of the system nor the entropy of its surroundings. Sometimes this is stated as follows: Reversible processes do not affect the total entropy of the universe. Real processes are not reversible, though, and they do change total entropy.

    It is reasonable that entropy increases for heat transfer from hot to cold. Since the change in entropy is \(Q / T\), there is a larger change at lower temperatures. The decrease in entropy of the hot object is therefore less than the increase in entropy of the cold object, producing an overall increase. This result is very general:

    There is an increase in entropy for any system undergoing an irreversible process.

    With respect to entropy, there are only two possibilities: entropy is constant for a reversible process, and it increases for an irreversible process.
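    The argument that hot-to-cold heat transfer increases total entropy can be checked numerically. The reservoir temperatures and the amount of heat below are hypothetical values chosen for illustration:

    ```python
    # Net entropy change when heat Q flows from a hot reservoir to a cold one.
    # Each reservoir is large enough that its temperature stays constant,
    # so Delta_S = Q / T applies to each. Values here are hypothetical.

    Q = 1000.0      # J, heat transferred from hot to cold
    T_hot = 600.0   # K
    T_cold = 300.0  # K

    dS_hot = -Q / T_hot    # hot reservoir loses entropy (Q leaves it)
    dS_cold = Q / T_cold   # cold reservoir gains more, since T_cold < T_hot
    dS_total = dS_hot + dS_cold

    print(round(dS_total, 2))  # 1.67 J/K -- positive, as the second law requires
    ```

    Because \(T_{\text{cold}} < T_{\text{hot}}\), the gain \(Q/T_{\text{cold}}\) always exceeds the loss \(Q/T_{\text{hot}}\), so the total is positive for any such transfer.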

    THE SECOND LAW OF THERMODYNAMICS (FOURTH EXPRESSION)

    The total entropy of the universe either increases or remains constant in any process; it never decreases.

    For example, heat transfer cannot occur spontaneously from cold to hot, because entropy would decrease.

    Entropy is very different from energy. Entropy is not conserved but increases in all real processes. Reversible processes (such as in Carnot engines) are the processes in which the greatest fraction of heat transfer is converted to work, and they are also the ones that keep entropy constant. Thus, we are led to make a connection between entropy and the availability of energy to do work.

    Order to Disorder

    Entropy is related not only to the unavailability of energy to do work—it can also be thought of as a measure of disorder (perhaps more accurately described as energy dispersal). This notion was initially postulated by Ludwig Boltzmann in the 1800s, and will be examined in more detail in the following subsection. For example, melting a block of ice means taking a highly structured and orderly system of water molecules and converting it into a disorderly liquid in which molecules have no fixed positions. (See Figure \(\PageIndex{4}\).) There is a large increase in entropy in the process, as seen in the following example.

    Example \(\PageIndex{2}\): Entropy Associated with Disorder

    Find the increase in entropy of 1.00 kg of ice originally at \(0^{\circ} \mathrm{C}\) that is melted to form water at \(0^{\circ} \mathrm{C}\).

    Strategy

    As before, the change in entropy can be calculated from the definition of \(\Delta S\) once we find the energy \(Q\) needed to melt the ice.

    Solution

    The change in entropy is defined as:

    \[\Delta S=\frac{Q}{T}. \nonumber\]

    Here \(Q\) is the heat transfer necessary to melt 1.00 kg of ice and is given by

    \[Q=m L_{\mathrm{f}}, \nonumber\]

    where \(m\) is the mass and \(L_{\mathrm{f}}\) is the latent heat of fusion. \(L_{\mathrm{f}}=334 \mathrm{~kJ} / \mathrm{kg}\) for water, so that

    \[Q=(1.00 \mathrm{~kg})(334 \mathrm{~kJ} / \mathrm{kg})=3.34 \times 10^{5} \mathrm{~J}. \nonumber\]

    Now the change in entropy is positive, since heat transfer occurs into the ice to cause the phase change; thus,

    \[\Delta S=\frac{Q}{T}=\frac{3.34 \times 10^{5} \mathrm{~J}}{T}. \nonumber\]

    \(T\) is the melting temperature of ice. That is, \(T=0^{\circ} \mathrm{C}=273 \mathrm{~K}\). So the change in entropy is

    \[\begin{aligned}
    \Delta S &=\frac{3.34 \times 10^{5} \mathrm{~J}}{273 \mathrm{~K}} \\
    &=1.22 \times 10^{3} \mathrm{~J} / \mathrm{K}.
    \end{aligned} \nonumber\]

    Discussion

    This is a significant increase in entropy accompanying an increase in disorder.
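    The arithmetic of this example can be reproduced directly, using the mass, latent heat of fusion, and melting temperature given above:

    ```python
    # Reproduce Example 2: entropy increase when 1.00 kg of ice melts at 0 C.
    m = 1.00      # kg, mass of ice (from the example)
    L_f = 334e3   # J/kg, latent heat of fusion of water (from the example)
    T = 273.0     # K, melting temperature of ice

    Q = m * L_f            # heat needed to melt the ice: 3.34e5 J
    delta_S = Q / T        # entropy change at constant temperature
    print(round(delta_S))  # 1223 J/K, i.e. 1.22e3 J/K as in the text
    ```

    Because melting happens entirely at 273 K, the constant-temperature form \(\Delta S = Q/T\) applies without any integration.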

    Figure \(\PageIndex{4}\): When ice melts, it becomes more disordered and less structured. The systematic arrangement of molecules in a crystal structure is replaced by a more random and less orderly movement of molecules without fixed locations or orientations. Its entropy increases because heat transfer occurs into it. Entropy is a measure of disorder.

    Section Summary

    • Entropy is a measure of the energy in a system that is unavailable to do work.
    • Another form of the second law of thermodynamics states that the total entropy of a system either increases or remains constant; it never decreases.
    • Change of entropy is zero in a reversible process; it increases in an irreversible process.
    • Entropy is also associated with the tendency toward disorder in an isolated system.

    Glossary

    entropy
    a measure of a system's disorder and of the unavailability of its energy to do work
    change in entropy
    the ratio of heat transfer to temperature \(Q/T\)
    second law of thermodynamics stated in terms of entropy
    the total entropy of a system either increases or remains constant; it never decreases

    This page titled 6.8: Entropy and the Second Law of Thermodynamics is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by Jamie MacArthur.