
6.13: The Laws of Thermodynamics


    We usually consider that the first, second, and third laws of thermodynamics are basic postulates. One of our primary objectives is to understand the ideas that are embodied in these laws. We introduce these ideas here, using statements of the laws of thermodynamics that are immediately applicable to chemical systems. In the next three chapters, we develop some of the most important consequences of these ideas. In the course of doing so, we examine other ways that these laws have been stated.

    The first law deals with the definition and properties of energy. The second and third laws deal with the definition and properties of entropy. The laws of thermodynamics assert that energy and entropy are state functions. In the next chapter, we discuss the mathematical properties of state functions. Energy and entropy changes are defined in terms of the heat and work exchanged between a system and its surroundings. We adopt the convention that heat and work are positive if they increase the energy of the system. In a process in which a closed system accepts increments of heat, \(dq\), and work, \(dw\), from its surroundings, we define the changes in the energy, \(dE\), and the entropy, \(dS\), of the system in terms of \(dq\), \(dw\), and the temperature.

    The meaning of the first law is intimately related to a crucial distinction between the character of energy on the one hand and that of the variables heat and work on the other. When we say that energy is a state function, we mean that the energy is a property of the system. In contrast, heat and work are not properties of the system; rather they describe a process in which the system changes. When we say that the heat exchanged in a process is \(q\), we mean that \(q\) units of thermal energy are transferred from the surroundings to the system. If \(q>0\), the energy of the system increases by this amount, and the energy of the surroundings decreases by the same amount. \(q\) has meaning only as a description of one aspect of the process.

    When the process is finished, the system has an energy, but \(q\) exists only as an accounting record. Like the amount on a cancelled check that records how much we paid for something, \(q\) is just a datum about a past event. Likewise, \(w\) is the record of the amount of non-thermal energy that is transferred. Because we can effect the same change in the energy of a system in many different ways, we have to measure \(q\) and \(w\) for a particular process as the process is taking place. We cannot find them by making measurements on the system after the process has gone to completion.

In Section 6.1, we introduce a caret (“hat”) over a symbol to denote a property (state function) of the surroundings. Thus, \(E\) is the energy of the system; \(\hat{E}\) is the energy of the surroundings; \(dE\) is an incremental change in the energy of the system; and \(d\hat{E}\) is an incremental change in the energy of the surroundings. If we are careful to remember that heat and work are not state functions, it is useful to extend this notation to increments of heat and work. If \(q\) units of energy are transmitted to the system as heat, we let \(\hat{q}\) be the thermal energy transferred to the surroundings in the same process. Then \(\hat{q}=-q\), and \(\hat{q}+q=0\). Likewise, we let \(w\) be the work done on the system and \(\hat{w}\) be the work done on the surroundings in the same process, so that \(\hat{w}=-w\), and \(\hat{w}+w=0\). Unlike \(E\) and \(\hat{E}\), which are properties of different systems, \(q\) and \(\hat{q}\) (or \(w\) and \(\hat{w}\)) are merely alternative expressions of the same thing—the quantity of energy transferred as heat (or work).

We define the incremental change in the energy of a closed system as \(dE=dq+dw\). The accompanying change in the energy of the surroundings is \(d\hat{E}=d\hat{q}+d\hat{w}\), so that \(dE+d\hat{E}=0\). Whereas \(\hat{q}+q=0\) (or \(d\hat{q}+dq=0\)) is a tautology, because it merely defines \(\hat{q}\) as \(-q\), the first law asserts that \(dE_{universe}=dE+d\hat{E}=0\) expresses a fundamental property of nature. Any increase in the energy of the system is accompanied by an equal decrease in the energy of the surroundings, and conversely. Energy is conserved; heat is not; work is not.

    The first law of thermodynamics

In a process in which a closed system accepts increments of heat, \(dq\), and work, \(dw\), from its surroundings, the change in the energy of the system, \(dE\), is \(dE = dq + dw\). Energy is a state function. For any process, \(dE_{universe}=0\).

    For a reversible process in which a system passes from state A to state B, the amount by which the energy of the system changes is the line integral of \(dE\) along the path followed. Denoting an incremental energy change along this path as \(d_{AB}E\), we have \(\Delta_{AB}E=\int^B_A{d_{AB}E}\). (We review line integrals in the next chapter.) The energy change for the surroundings is the line integral of \(d\hat{E}\) along the path followed by the surroundings during the same process:

    \[\Delta_{AB}\hat{E}=\int^B_A{d_{AB}\hat{E}}. \nonumber \]

    For any process in which energy is exchanged with the surroundings, the change in the system’s energy is \(\Delta E=q+w\), where \(q\) and \(w\) are the amounts of thermal and non-thermal energy delivered to the system. We can compute \(\Delta E\) from \(q\) and \(w\) whether the process is reversible or irreversible.
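To make the sign conventions concrete, the short Python sketch below works through one hypothetical process; the numbers (\(q = 150\ \mathrm{J}\) of heat accepted while the system does \(60\ \mathrm{J}\) of work, so \(w = -60\ \mathrm{J}\)) are chosen only for illustration.

```python
# Hypothetical first-law bookkeeping for a single process.
# Sign convention (as in the text): q and w are positive when they
# increase the energy of the system.

q = 150.0   # J, heat accepted by the system (hypothetical value)
w = -60.0   # J, work done on the system (the system does 60 J of work)

q_hat = -q  # heat delivered to the surroundings in the same process
w_hat = -w  # work done on the surroundings in the same process

delta_E = q + w              # energy change of the system
delta_E_hat = q_hat + w_hat  # energy change of the surroundings

print(f"Delta E (system)       = {delta_E:+.1f} J")                # +90.0 J
print(f"Delta E (surroundings) = {delta_E_hat:+.1f} J")            # -90.0 J
print(f"Delta E (universe)     = {delta_E + delta_E_hat:+.1f} J")  #  +0.0 J
```

Many other combinations of \(q\) and \(w\) would produce the same \(\Delta E\); only their sum is fixed by the initial and final states of the system.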

In contrast, the definition of entropy change applies only to reversible processes. In a process in which a system reversibly accepts an increment of heat, \(dq^{rev}\), from its surroundings, the entropy change is defined by \(dS={dq^{rev}}/{T}\). (We introduce the superscript, “rev”, to distinguish heat and work exchanged in reversible processes from heat and work exchanged in irreversible, “irrev”, or spontaneous, “spon”, processes.) When a system passes reversibly from state A to state B, the entropy change for the system is the line integral of \(d_{AB}S={d_{AB}q^{rev}}/{T}\) along the path followed:

    \[\Delta_{AB}S=\int^B_A{d_{AB}q^{rev}/T}. \nonumber \]

    The entropy change for the surroundings is defined by the same relationship,

    \[d_{AB}\hat{S}=d_{AB}{\hat{q}^{rev}}/{T}. \nonumber \]

    Every system has an entropy. The entropies of the system and of its surroundings can change whenever a system undergoes a change. If the change is reversible, \(\Delta S=-\Delta \hat{S}\).
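As a familiar special case (not developed in this section, and used here only as an illustrative sketch), consider the reversible isothermal expansion of \(n\) moles of an ideal gas from \(V_1\) to \(V_2\). The energy of an ideal gas is unchanged at constant temperature, so the system absorbs \(q^{rev}=nRT\ln\left({V_2}/{V_1}\right)\) from its surroundings, and the definition \(dS=dq^{rev}/T\) integrates to \(\Delta S=nR\ln\left({V_2}/{V_1}\right)\), with \(\Delta \hat{S}=-\Delta S\).

```python
import math

R = 8.314              # J mol^-1 K^-1, gas constant
n, T = 1.0, 298.15     # mol, K (hypothetical conditions)
V1, V2 = 0.010, 0.020  # m^3, hypothetical initial and final volumes

# Reversible isothermal expansion of an ideal gas: the system absorbs
# q_rev = nRT ln(V2/V1) from its surroundings at constant T.
q_rev = n * R * T * math.log(V2 / V1)

dS_system = q_rev / T         # dS = dq_rev / T, with T constant
dS_surroundings = -q_rev / T  # the surroundings give up the same heat at the same T

print(f"q_rev             = {q_rev:9.1f} J")        # ~1718.2 J
print(f"dS (system)       = {dS_system:9.3f} J/K")  # ~+5.763 J/K, equal to nR ln(V2/V1)
print(f"dS (surroundings) = {dS_surroundings:9.3f} J/K")
print(f"dS (universe)     = {dS_system + dS_surroundings:9.3f} J/K")  # 0 for a reversible process
```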

    The second law of thermodynamics

In a reversible process in which a closed system accepts an increment of heat, \(dq^{rev}\), from its surroundings, the change in the entropy of the system, \(dS\), is \(dS=dq^{rev}/T\). Entropy is a state function. For any reversible process, \(dS_{universe}=0\), and conversely. For any spontaneous process, \(dS_{universe}>0\), and conversely.

We define the entropy change of the universe by \(dS_{universe}=dS+d\hat{S}\); it follows that \(\Delta_{AB}S_{universe}=\Delta_{AB}S+\Delta_{AB}\hat{S}\) for any process in which a system passes from a state A to a state B, whether the process is reversible or not. Since \(dS_{universe}=0\) for every part of a reversible process, we have \(\Delta S_{universe}=0\) for any reversible process. Likewise, since \(dS_{universe}>0\) for every part of a spontaneous process, we have \(\Delta S_{universe}>0\) for any spontaneous process.
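A standard illustration of the inequality (not worked in this section) is the spontaneous flow of a quantity of heat \(q\) from a hot body at \(T_{hot}\) to a cold body at \(T_{cold}\). Because entropy is a state function, each body's entropy change can be evaluated along a reversible path connecting the same initial and final states; for a body large enough that its temperature remains constant, that change is \(\pm q/T\). The values below are hypothetical.

```python
# Spontaneous heat flow between two thermal reservoirs (hypothetical values).
# Because entropy is a state function, each reservoir's entropy change is
# computed as q/T along a reversible path between the same two states.

q = 500.0       # J, heat that flows from the hot body to the cold body
T_hot = 400.0   # K
T_cold = 300.0  # K

dS_hot = -q / T_hot    # the hot body loses heat
dS_cold = q / T_cold   # the cold body gains heat
dS_universe = dS_hot + dS_cold

print(f"dS (hot body)  = {dS_hot:+.3f} J/K")       # -1.250 J/K
print(f"dS (cold body) = {dS_cold:+.3f} J/K")      # +1.667 J/K
print(f"dS (universe)  = {dS_universe:+.3f} J/K")  # +0.417 J/K > 0
```

The reverse flow, heat passing from the cold body to the hot one with no other change, would give \(\Delta S_{universe}<0\), which the second law excludes.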

    The third law deals with the properties of entropy at temperatures in the neighborhood of absolute zero. It is possible to view the third law as a statement about the properties of the temperature function. It is also possible to view it as a statement about the properties of heat capacities. A statement in which the third law attributes particular properties to the entropy of pure substances is directly applicable to chemical systems. This statement is that of Lewis and Randall\({}^{3}\):

The third law of thermodynamics

    If the entropy of each element in some crystalline state be taken as zero at the absolute zero of temperature, every substance has a positive finite entropy; but at the absolute zero of temperature the entropy may become zero, and does so become in the case of perfect crystalline substances.

    The Lewis and Randall statement focuses on the role that the third law plays in our efforts to express the thermodynamic properties of pure substances in useful ways. To do so, it incorporates a matter of definition when it stipulates that “the entropy of each element be taken as zero at the absolute zero of temperature.” The third law enables us to find thermodynamic properties (“absolute entropies” and Gibbs free energies of formation) from which we can make useful predictions about the equilibrium positions of reactions. The third law can be inferred from experimental observations on macroscopic systems. It also arises in a natural way when we develop the theory of statistical thermodynamics. In both developments, the choice of zero for the entropy of “each element in some crystalline state” at absolute zero is—while arbitrary—logical, natural, and compellingly convenient.
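One practical consequence is the way absolute entropies are evaluated. For reversible heating at constant pressure, \(dq^{rev}=C_P\,dT\), so the definition \(dS=dq^{rev}/T\) gives \(S(T)=\int^T_0{\left({C_P}/{T'}\right)dT'}\) once the third-law convention \(S(0)=0\) is adopted for a perfect crystalline substance. The sketch below is purely illustrative: it assumes a hypothetical Debye-like heat capacity, \(C_P=aT^3\), at low temperature, for which the integral can also be evaluated analytically (\(S(T)=aT^3/3\)) as a check on the numerical result.

```python
# Hypothetical low-temperature heat capacity, Debye-like: Cp(T) = a * T**3.
# The coefficient and temperature range are illustrative only.
a = 2.0e-4      # J mol^-1 K^-4
T_max = 20.0    # K
n_steps = 200_000

# Third-law absolute entropy: S(T_max) = integral from 0 to T_max of (Cp/T) dT,
# taking S(0) = 0 for a perfect crystalline substance.  For Cp = a*T^3 the
# integrand Cp/T = a*T^2 stays finite as T approaches 0.
dT = T_max / n_steps
S_numeric = 0.0
for i in range(n_steps):
    T_mid = (i + 0.5) * dT          # midpoint rule
    S_numeric += a * T_mid**2 * dT

S_exact = a * T_max**3 / 3.0        # analytic check: S(T) = a*T^3/3

print(f"S(20 K), numerical integration = {S_numeric:.4f} J mol^-1 K^-1")
print(f"S(20 K), analytic result       = {S_exact:.4f} J mol^-1 K^-1")
```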

