
5.5: Thermodynamics and Systems


    The study of how energy in its various forms moves through a system is called thermodynamics; in chemistry specifically, it is called thermochemistry. The first law of thermodynamics tells us that energy can be neither created nor destroyed, but it can be transferred from a system to its surroundings and vice versa.[15] For any system, if we add up the kinetic and potential energies of all of the particles that make up the substance, we get the total energy. This is called the system’s internal energy, abbreviated as \(\mathrm{E}\) in chemistry.[16] It turns out that it is not really possible to measure the total internal energy of a system, but we can measure and calculate the change in internal energy, represented as \(\Delta \mathrm{E}\) (we use the Greek letter \(\Delta\) to stand for change). There are two ways that the internal energy of a system can change: we can change the total amount of thermal energy in the system (denoted as \(q\)), or the system can do work or have work done to it (denoted as \(w\)). The change in internal energy is therefore: \[\Delta \mathrm{E} = q + w\]

    At the molecular level, it should now be relatively easy to imagine the effects of adding or removing thermal energy from a system. Work (usually defined as force multiplied by distance) done on or by the system, however, is a macroscopic phenomenon. If the system comes to occupy a larger or smaller volume, work must be done on the surroundings or on the system, respectively. With the exception of gases, most systems we study in chemistry do not expand or contract significantly; in these situations, \(\Delta \mathrm{E} = q\), the change in thermal energy (heat). In addition, most of the systems we study in chemistry and biology are under constant pressure (usually atmospheric pressure). The heat transferred at constant pressure equals the change in a state function called enthalpy (\(\mathrm{H}\)). A state function is a property of a system that does not depend upon the path taken to reach a particular state. Typically we use upper-case symbols (for example, \(\mathrm{H}\), \(\mathrm{T}\), \(\mathrm{P}\), \(\mathrm{E}\), \(\mathrm{G}\), \(\mathrm{S}\)) to signify state functions, and lower-case symbols for properties that depend on the path by which the change is made (for example, \(q\) and \(w\)). You may be wondering what the difference between a state function and a path function is. Imagine you are climbing Mount Everest. If you were able to fly in a balloon from the base camp to the top, you would travel a certain distance, and the height change would be \(29,029\) feet. If, in contrast, you traveled on foot via any one of the recorded paths, which wind around the mountain, you would travel a very different distance, but the height change would still be \(29,029\) feet. That is, the distance traveled is a path function, while the height of Mount Everest is a state function. Similarly, both \(q\) and \(\Delta \mathrm{H}\) describe thermal energy changes, but \(q\) depends on the path and \(\Delta \mathrm{H}\) does not.
In a system at constant pressure with no volume change, it is the change in enthalpy (\(\Delta \mathrm{H}\)) that we will be primarily interested in (together with the change in entropy (\(\Delta \mathrm{S}\)), which we examine shortly in greater detail).

    Because we cannot measure energy changes directly, we have to use some observable (and measurable) change in the system. Typically we measure the temperature change and then relate it to the energy change. For changes that occur at constant pressure (and essentially constant volume), this energy change is the enthalpy change, \(\Delta \mathrm{H}\). If we know the temperature change (\(\Delta \mathrm{T}\)), the amount (mass) of material, and its specific heat, we can calculate the enthalpy change: \[\Delta H\,(\mathrm{J})=\text { mass }(\mathrm{g}) \times \text { specific heat }\left(\mathrm{J} / \mathrm{g}\,{ }^{\circ} \mathrm{C}\right) \times \Delta T\left({ }^{\circ} \mathrm{C}\right)\][17]
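    The formula above can be put to work in a few lines. Here is a minimal sketch in Python; the numbers (100 g of a sample warmed by 5 °C) are illustrative rather than taken from the text, and water's specific heat of 4.184 J/g·°C is the only tabulated value assumed:

```python
def enthalpy_change(mass_g, specific_heat, delta_t):
    """Return ΔH in joules: mass (g) × specific heat (J/g·°C) × ΔT (°C)."""
    return mass_g * specific_heat * delta_t

# Illustrative example: warming 100 g of water (c ≈ 4.184 J/g·°C) by 5 °C.
print(enthalpy_change(100.0, 4.184, 5.0), "J")  # 2092 J

# A negative ΔT (the system cools) gives a negative ΔH.
print(enthalpy_change(100.0, 4.184, -5.0), "J")
```

    The sign convention here anticipates the next paragraph: a positive \(\Delta \mathrm{H}\) means the system gained thermal energy, a negative one means it lost thermal energy to the surroundings.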

    When considering the enthalpy change for a process, the direction of energy transfer is important. By convention, if thermal energy goes out of the system to the surroundings (that is, the surroundings increase in temperature), the sign of \(\Delta \mathrm{H}\) is negative and we say the process is exothermic (literally, “heat out”). Combustion reactions, such as burning wood or gasoline in air, are probably the most common examples of exothermic processes. In contrast, if a process requires thermal energy from the surroundings to make it happen, the sign of \(\Delta \mathrm{H}\) is positive and we say the process is endothermic (energy is transferred from the surroundings to the system).

    Questions

    Questions to Answer

    • You have systems (at \(10 { }^{\circ}\mathrm{C}\)) composed of water, methanol, ethanol, or propanol. Predict the final temperature of each system if equal amounts of thermal energy (\(q\)) are added to equal masses (\(m\)) of each substance. What do you need to know to do this calculation?
    • Draw a simple sketch of a system and surroundings. Indicate by the use of arrows what we mean by an endothermic process and an exothermic process. What is the sign of \(\Delta \mathrm{H}\) for each process?
    • Draw a similar diagram and show the direction and sign of work (\(w\)) when the system does work on the surroundings (expands), and when the surroundings do work on the system (contracts).
    • Draw a diagram to show the molecular level mechanism by which thermal energy is transferred in or out of a system. For example how is thermal energy transferred as an ice cube melts in a glass of water?
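    The temperature-prediction question above rests on rearranging \(\Delta H = m \times c \times \Delta T\) to solve for the temperature change. A minimal sketch (Python; the heat and mass values are made up for illustration, and water's specific heat of 4.184 J/g·°C is the only tabulated value assumed):

```python
def final_temperature(t_initial, q_joules, mass_g, specific_heat):
    """T_final = T_initial + q / (m × c); a smaller c gives a larger rise."""
    return t_initial + q_joules / (mass_g * specific_heat)

# Add 1000 J to 50 g of water starting at 10 °C:
t = final_temperature(10.0, 1000.0, 50.0, 4.184)
print(f"{t:.1f} °C")  # 14.8 °C
```

    For methanol, ethanol, or propanol you would substitute their specific heats from a data table; since those values are smaller than water's, the same \(q\) produces a larger temperature rise.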

    Questions to Ponder

    • What does the difference in behavior of water, methanol, ethanol, and propanol tell us about their molecular behavior/organization/structure?

    The Second Law of Thermodynamics

    Whereas the first law of thermodynamics states that you cannot get more energy out of a system than is already present in some form, the second law of thermodynamics tells us that we cannot even get back the energy that we use to bring about a change in a system. The idea in the second law is captured by the phrase “for any change in a system, the total entropy of the universe must increase.” As we will see, this means that some of the energy is changed into a form that is no longer useful (that is, it cannot do work).

    There are lots of subtle and not so subtle implications captured by this statement and we will need to look at them carefully to identify them. You may already have some idea of what entropy means, but can you define it? As you might imagine it is not a simple idea. The word entropy is often used to designate randomness or disorder but this is not a very useful or accurate way to define entropy (although randomly disordered systems do have high entropy). A better way to think about entropy is in terms of probabilities: how to measure, calculate, and predict outcomes. Thermal energy transfers from hot to cold systems because the outcome is the most probable outcome. A drop of dye disperses in water because the resulting dispersed distribution of dye molecules is the most probable. Osmosis occurs when water passes through a membrane from a dilute to a more concentrated solution because the resulting system is more probable. In fact whenever a change occurs, the overall entropy of the universe always increases.[18] The second law has (as far as we know) never, ever been violated. In fact the direction of entropy change has been called “time’s arrow”; the forward direction of time is determined by the entropy change. At this point you should be shaking your head. All this cannot possibly be true! First of all, if entropy is always increasing, then was there a time in the past when entropy was 0?[19] Second, are there not situations where entropy decreases and things become more ordered, like when you clean up a room? Finally, given that common sense tells us that time flows in only one direction (to the future), how is it possible that at the atomic and molecular scale all events are reversible?

    Probability and Entropy

    Before we look at entropy in detail, let us look at a few systems and think about what we already know about probability. For example if you take a deck of cards and shuffle it, which is more probable: that the cards will fall into the order ace, king, queen, jack, 10, 9, etc. for each suit, or that they will end up in some random, jumbled order? Of course the answer is obvious—the random order is much more probable because there are many sequences of cards that count as “random order” but only one that counts as “ordered.” This greater probability is true even though any pre-specified random sequence is just as unlikely as the perfectly ordered one. It is because we care about a particular order that we lump all other possible orders of the cards together as “random” and do not distinguish between them.

    We can calculate, mathematically, the probability of the result we care about. To determine the probability of an event (for example, a particular order of cards), we divide the number of outcomes we care about by the total number of possible outcomes. For 52 cards there are \(52!\) (52 factorial, or \(52 \times 51 \times 50 \times 49 \ldots\)) ways that the cards can be arranged.[20] This number is \(\sim 8.07 \times 10^{67}\), a number on the same order of magnitude as the number of atoms in our galaxy. So the probability of shuffling cards to produce any one particular order is \(\frac{1}{52!}\) – a very small number indeed. But because the probability is greater than zero, this is an event that can happen. In fact, some particular arrangement must occur every time you shuffle, because the probability that the cards end up in one order or another is 1; each shuffle therefore realizes an outcome whose individual probability was \(\frac{1}{52!}\). That is a mind bender, but true nevertheless. Highly improbable events occur all the time![21]
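    The arithmetic in this paragraph is easy to verify directly (a quick check, not part of the original text):

```python
import math

# Number of distinct orderings of a standard 52-card deck.
arrangements = math.factorial(52)
print(f"52! ≈ {arrangements:.3e}")  # on the order of 8.07 × 10^67

# Probability of shuffling into any one pre-specified order.
probability = 1 / arrangements
print(f"P ≈ {probability:.3e}")
```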

    This idea of entropy, in terms of probabilities, can help us understand why different substances or systems have different entropies. We can actually calculate entropies for many systems from the formula \(\mathrm{S} = k \ln \mathrm{W}\), where \(\mathrm{S}\) is the entropy, \(k\) is the Boltzmann constant, and \(\mathrm{W}\) is the number of distinguishable arrangements (or states) that the system has.[22] So the greater the value of \(\mathrm{W}\) (the number of arrangements), the greater the entropy.
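    With this formula, the entropy of a system with a known number of arrangements follows in one line. A minimal sketch (Python; the value of the Boltzmann constant is the SI-defined one):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact by SI definition)

def entropy(w):
    """S = k ln W, for W distinguishable arrangements."""
    return K_B * math.log(w)

# Doubling W adds only k ln 2 — entropy grows logarithmically with W.
print(entropy(10), entropy(20))
```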

    In some cases it is relatively easy to figure out which system has more possible arrangements. For example, in a solid substance such as ice the molecules are fixed in place and can only vibrate. In a liquid, the molecules are free to roam around; it is possible for each molecule to be anywhere within the liquid mass and not confined to one position. In a gas, the molecules are not confined at all and can be found anywhere, or at least anywhere in the container. In general, gases have more entropy than liquids and solids. This so-called positional entropy can be extended to mixtures. In most mixtures (but not all, as we will see in the case of emulsions and colloids) the number of distinguishable arrangements is larger for the mixed compared to the unmixed components. The entropy of a mixture is usually larger.

    So let us return to the idea that the direction of change in a system is determined by probabilities. We will consider the transfer of thermal energy (heat) and see if we can make sense of it. First, remember that at the atomic-molecular level energy is quantized, so for any substance at a particular temperature there will be a certain number of energy quanta. To make things simpler, we will consider a four-atom solid that contains two quanta of energy. These quanta can be distributed so that a particular atom has 0, 1, or 2 quanta of energy. You can either calculate or determine by trial and error the number of different possible arrangements of these quanta (there are 10). Remember that \(\mathrm{W}\) is the number of distinguishable arrangements, so for this system \(\mathrm{W} = 10\) and \(\mathrm{S} = k \ln 10\). Now, what happens if we consider two similar four-atom systems, one with 4 quanta and the other with 8 quanta? The system with 4 quanta will be at a lower temperature than the system with 8 quanta. Counting arrangements in the same way gives \(\mathrm{W} = 35\) for the 4-atom, 4-quanta system and \(\mathrm{W} = 165\) for the 4-atom, 8-quanta system. Taken together, the total number of arrangements of the two systems is \(35 \times 165 = 5775\).[23]

    But what about temperature? The 4-quanta system is at a lower temperature than the 8-quanta system because it has less energy. What happens if we put the two systems in contact? Energy will transfer from the hotter (8-quanta) to the colder (4-quanta) system until the temperatures are equal, at which point each will have 6 quanta (which corresponds to \(\mathrm{W} = 84\)). Because there are two systems, each with 6 quanta, the total number of arrangements for the combined systems is \(84 \times 84 = 7056\) states. Note that \(7056\) is greater than \(5775\): there are more distinguishable arrangements of the quanta in the two systems after the energy transfer than before. The final state is more probable and therefore has a higher entropy.
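    All of the counts used in this argument (10, 35, 165, 84) follow from a standard combinatorial result: the number of ways to distribute \(q\) indistinguishable quanta over \(n\) distinguishable atoms is \(\binom{n+q-1}{q}\), the "stars and bars" formula (which the text uses implicitly but does not name). A quick check:

```python
from math import comb

def arrangements(n_atoms, n_quanta):
    """Ways to distribute indistinguishable quanta over distinguishable atoms:
    C(n_atoms + n_quanta - 1, n_quanta), the stars-and-bars count."""
    return comb(n_atoms + n_quanta - 1, n_quanta)

print(arrangements(4, 2))  # 10: the 4-atom, 2-quanta solid

w_before = arrangements(4, 4) * arrangements(4, 8)  # 35 × 165 = 5775
w_after = arrangements(4, 6) * arrangements(4, 6)   # 84 × 84 = 7056
print(w_before, w_after)  # transferring energy raises the total number of states
```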

    Now you might well object, given that we are working with systems of only a few atoms each. It is easy to imagine that random fluctuations could lead to the movement of quanta from cold to hot, and that is true; that is why behavior at the nanoscale is reversible. But when we are talking about macroscopic systems, such a fluctuation rapidly becomes improbable as the number of atoms or molecules increases. Remember that even a very small drop of water, weighing \(0.05\) grams, contains approximately \(1.7 \times 10^{21}\) molecules (perhaps you can also calculate the volume of such a drop). Events that are reversible at the nanoscale are irreversible at the macroscopic scale – yet another wacky and counterintuitive fact. We are generally driven to seek a purpose for why things happen, so the overarching idea that change in the universe is driven simply by the move to more probable states can be difficult to accept, but it is true – even when we consider living systems in the context of their surroundings.[24] The presence of a living system (which is itself highly organized) increases the entropy of the universe as a whole.
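    The molecule count for that drop can be reproduced with Avogadro's number; the molar mass of water (≈ 18.0 g/mol) and its density (≈ 1.0 g/mL) are assumptions supplied here, not values stated in the text:

```python
AVOGADRO = 6.02214076e23   # mol⁻¹ (exact by SI definition)
MOLAR_MASS_WATER = 18.015  # g/mol (assumed)
DENSITY_WATER = 1.0        # g/mL (assumed, near room temperature)

mass_g = 0.05
molecules = mass_g / MOLAR_MASS_WATER * AVOGADRO
volume_ml = mass_g / DENSITY_WATER
print(f"{molecules:.2e} molecules in about {volume_ml} mL")
```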

    Questions

    Questions to Answer

    • Which has more entropy (in each case, explain your choice):
      • A new deck of cards or a shuffled deck?
      • Separated dye and water or a mixed-up solution?
      • \(\mathrm{H}_{2}\mathrm{O}\)(\(s\)) or \(\mathrm{H}_{2}\mathrm{O}\)(\(l\))?
      • \(\mathrm{CaCO}_{3}\)(\(s\)) or \(\mathrm{CaO}\)(\(s\)) + \(\mathrm{CO}_{2}\)(\(g\))?
      • \(\mathrm{H}_{2}\mathrm{O}\)(\(l\)) (at \(25 { }^{\circ}\mathrm{C}\)) or \(\mathrm{H}_{2}\mathrm{O}\)(\(l\)) (at \(50 { }^{\circ}\mathrm{C}\))?
    • Do you think that the structure of a compound affects its entropy? Why?
    • Predict the relative entropies of diamond, sodium chloride, carbon dioxide, oxygen, and \(\mathrm{HF}\). What factors influenced your prediction? Look up the entropies. Were you correct?

    Questions to Ponder

    • Can you think of any changes that occur that seem to produce more order?
    • Why don’t living systems (organisms) violate the second law of thermodynamics?
    • Does the second law rule out evolution?
    • Does it argue for a supernatural soul running the brain?

    This page titled 5.5: Thermodynamics and Systems is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by Melanie M. Cooper & Michael W. Klymkowsky via source content that was edited to the style and standards of the LibreTexts platform.