Chemistry LibreTexts

5.5.2: Probability and Entropy


Before we look at entropy in detail, let us look at a few systems and think about what we already know about probability. For example if you take a deck of cards and shuffle it, which is more probable: that the cards will fall into the order ace, king, queen, jack, 10, 9, etc. for each suit, or that they will end up in some random, jumbled order? Of course the answer is obvious: the random order is much more probable because there are many sequences of cards that count as “random order” but only one that counts as “ordered.” This greater probability is true even though any pre-specified random sequence is just as unlikely as the perfectly ordered one. It is because we care about a particular order that we lump all other possible orders of the cards together as “random” and do not distinguish between them.

We can calculate, mathematically, the probability of the result we care about. To determine the probability of an event (for example, a particular order of cards), we divide the number of outcomes we care about by the total number of possible outcomes. For 52 cards there are 52! (52 factorial, or 52 × 51 × 50 × 49 ...) ways that the cards can be arranged.104 This number is ~8.07 × 10^67, on the same order of magnitude as the number of atoms in our galaxy. So the probability of shuffling the cards into any one particular order is 1/52! – a very small number indeed. But because the probability is greater than zero, this is an event that can happen. In fact, some such improbable arrangement must happen on every shuffle: each shuffle produces one of the 52! possible orders, and every one of them is equally unlikely. That is a mind bender, but true nevertheless. Highly improbable events occur all the time!105
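The arithmetic above is easy to reproduce. A few lines of Python (a sketch added here, not part of the original text) compute 52! and the probability of any one particular order:

```python
import math

# Number of distinct orderings of a standard 52-card deck
orderings = math.factorial(52)
print(f"52! = {orderings:.3e}")  # about 8.07e67

# Probability that a fair shuffle produces one particular, pre-specified order
p_specific = 1 / orderings
print(f"P(specific order) = {p_specific:.3e}")  # about 1.24e-68
```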

Thinking of entropy in terms of probabilities can help us understand why different substances or systems have different entropies. We can actually calculate entropies for many systems from the formula S = k ln W, where S is the entropy, k is the Boltzmann constant, and W is the number of distinguishable arrangements (or states) that the system has.106 So the greater the value of W (the number of arrangements), the greater the entropy.

    In some cases it is relatively easy to figure out which system has more possible arrangements. For example, in a solid substance such as ice the molecules are fixed in place and can only vibrate. In a liquid, the molecules are free to roam around; it is possible for each molecule to be anywhere within the liquid mass and not confined to one position. In a gas, the molecules are not confined at all and can be found anywhere, or at least anywhere in the container. In general, gases have more entropy than liquids and solids. This so-called positional entropy can be extended to mixtures. In most mixtures (but not all, as we will see in the case of emulsions and colloids) the number of distinguishable arrangements is larger for the mixed compared to the unmixed components. The entropy of a mixture is usually larger.

So let us return to the idea that the direction of change in a system is determined by probabilities. We will consider the transfer of thermal energy (heat) and see if we can make sense of it. First, remember that at the atomic-molecular level energy is quantized, so any substance at a particular temperature contains a certain number of energy quanta. To make things simpler, we will consider a four-atom solid that contains two quanta of energy. These quanta can be distributed so that a particular atom has 0, 1, or 2 quanta. You can either calculate or determine by trial and error the number of different possible arrangements of these quanta (there are 10). Remember that W is the number of distinguishable arrangements, so for this system W = 10 and S = k ln 10. Now, what happens if we consider two similar systems, one with 4 quanta and the other with 8 quanta? The system with 4 quanta will be at a lower temperature than the system with 8 quanta. We can calculate W for the 4-atom, 4-quanta system by counting the possible ways to arrange the quanta over the 4 atoms: W = 35. The same calculation for the 4-atom, 8-quanta system gives W = 165. Taken together, the total number of arrangements of the two systems is 35 × 165 = 5775.107
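The counts quoted above can be checked with the standard "stars and bars" formula: the number of ways to distribute q indistinguishable quanta over n distinguishable atoms is C(q + n − 1, n − 1). A short Python sketch (the function name here is ours, not the text's):

```python
from math import comb

def arrangements(atoms: int, quanta: int) -> int:
    """Count the distinguishable ways to distribute `quanta`
    indistinguishable energy quanta over `atoms` distinguishable atoms
    ("stars and bars": C(quanta + atoms - 1, atoms - 1))."""
    return comb(quanta + atoms - 1, atoms - 1)

print(arrangements(4, 2))  # 10  (the 4-atom, 2-quanta example)
print(arrangements(4, 4))  # 35
print(arrangements(4, 8))  # 165
```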

But what about temperature? The 4-quanta system is at a lower temperature than the 8-quanta system because the 8-quanta system has more energy. What happens if we put the two systems in contact? Energy will transfer from the hotter (8-quanta) to the colder (4-quanta) system until the temperatures are equal. At this point each will have 6 quanta, which corresponds to W = 84. Because there are two systems (each with 6 quanta), the total number of arrangements for the combined systems is 84 × 84 = 7056 states. Note that 7056 is greater than 5775: there are more distinguishable arrangements of the quanta in the two systems after the energy transfer than before. The final state is more probable and therefore has a higher entropy.
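Putting the numbers together, the before-and-after comparison can be sketched in Python (W is computed with the same stars-and-bars count; the Boltzmann constant is the CODATA value):

```python
import math
from math import comb

k_B = 1.380649e-23  # Boltzmann constant, J/K

def W(atoms: int, quanta: int) -> int:
    # Distinguishable arrangements of quanta over atoms ("stars and bars")
    return comb(quanta + atoms - 1, atoms - 1)

# Before contact: one 4-atom system with 4 quanta, another with 8 quanta
W_before = W(4, 4) * W(4, 8)   # 35 * 165 = 5775
# After equilibration: each 4-atom system holds 6 quanta
W_after = W(4, 6) * W(4, 6)    # 84 * 84 = 7056

# S = k ln W for each combined state
S_before = k_B * math.log(W_before)
S_after = k_B * math.log(W_after)
print(W_before, W_after, S_after > S_before)  # 5775 7056 True
```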

Now you might well object, given that we are working with systems of only a few atoms each. It is easy to imagine that random fluctuations could lead to the movement of quanta from cold to hot, and that is true; this is why behavior at the nanoscale is reversible. But when we are talking about macroscopic systems, such a fluctuation becomes overwhelmingly improbable as the number of atoms or molecules increases. Remember that even a very small drop of water, with a mass of 0.05 grams, contains approximately 1.7 × 10^21 molecules (perhaps you can also calculate the volume of such a drop). Events that are reversible at the nanoscale are irreversible at the macroscopic scale – yet another wacky and counterintuitive fact. We are generally driven to seek a purpose for why things happen, so the idea that change in the universe is driven simply by the move to more probable states can be difficult to accept, but it is true – even when we consider living systems in the context of their surroundings.108 The presence of a living system (which is itself highly organized) increases the entropy of the universe as a whole.
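The molecule count for the drop follows from the molar mass of water and Avogadro's number; a back-of-the-envelope Python check (constants rounded):

```python
# Number of molecules in a 0.05 g drop of water
molar_mass = 18.02      # g/mol for H2O
avogadro = 6.022e23     # molecules per mole
mass = 0.05             # g

molecules = (mass / molar_mass) * avogadro
print(f"{molecules:.2e} molecules")   # roughly 1.7e21

# The drop's volume, taking water's density as ~1.0 g/mL
volume_mL = mass / 1.0
print(f"{volume_mL} mL")
```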

    Questions to Answer

    • Which has more entropy (in each case, explain your choice):

      • A new deck of cards or a shuffled deck?

      • Separated dye and water or a mixed-up solution?

      • H2O(s) or H2O(l)?

      • CaCO3(s) or CaO(s) + CO2(g)?

      • H2O(l) (at 25 °C) or H2O(l) (at 50 °C)?

    • Do you think that the structure of a compound affects its entropy? Why?
• Predict the relative entropies of diamond, sodium chloride, carbon dioxide, oxygen, and HF. What factors influenced your prediction? Look up the entropies. Were you correct?

    Questions to Ponder

    • Can you think of any changes that occur that seem to produce more order?
• Why don’t living systems (organisms) violate the second law of thermodynamics?
    • Does the second law rule out evolution?
    • Does it argue for a supernatural soul running the brain?

    References

    104 http://www.schuhmacher.at/weblog/52cards.html

105 A realistic understanding of the probability of something happening is a great asset (but would put the gambling and lottery industries, and perhaps part of the investment community, out of business). Listen to: http://www.wnyc.org/shows/radiolab/episodes/2009/09/11

106 Some texts write Ω instead of W.

107 When we combine systems, we multiply, rather than add, their W values.

108 A great lecture by Richard Feynman on this topic: http://research.microsoft.com/apps/tools/tuva/#data=5%7C0%7C%7Cd88d1dbd-a736-4c3f-b832-2b0df62e4eca%7C%7C


    5.5.2: Probability and Entropy is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by LibreTexts.
