
16.12: Measuring the Entropy


    Up to this point we have often quoted values of the entropy without giving any indication of how such values may be obtained. Alas, there is no convenient black box labeled entropy meter into which we can put a substance and read off its entropy value on a dial. Determining the entropy turns out to be both difficult and laborious. In the case of a simple gas, if we know enough about its molecular structure and enough quantum mechanics, we can actually calculate its entropy. For most substances, though, we are forced to derive the entropy from a series of calorimetric measurements, most of them at very low temperatures. 

    This method for determining the entropy centers around a very simple relationship between q, the heat energy absorbed by a body, the temperature T at which this absorption takes place, and ΔS, the resultant increase in entropy: 

    \[\Delta S=\frac{q}{T} \label{1} \] It is possible to derive this relationship from our original definition of entropy, namely, S = k ln W, but the proof is beyond the level of this text. 

    It is easy to see how Eq. \(\ref{1}\) can be used to measure the entropy. We start with our substance as close to the absolute zero of temperature as is technically feasible and heat it in many stages, measuring the heat absorbed at each stage, until we arrive at the desired temperature, say 298 K. The initial value of the entropy is zero, and we can calculate the entropy increase for each stage by means of Eq. \(\ref{1}\); the sum of all these increases is then the entropy value at 298 K. In the case of simple gases, values of entropy measured in this way agree very well with those calculated from knowledge of molecular structure.
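    The stepwise summation just described can be sketched in a few lines of code. This is an illustrative sketch, not a real measurement: the function name and the list of (heat absorbed, temperature) pairs are hypothetical stand-ins for actual calorimetric data.

    ```python
    def entropy_from_heating(steps):
        """Sum the entropy increases ΔS = q/T over a sequence of
        heating stages, each given as (q in J, T in K).

        Assumes the entropy is zero at the starting temperature
        (near absolute zero), as in the procedure described above.
        """
        return sum(q / T for q, T in steps)

    # Hypothetical calorimetric data: heat absorbed (J) at each stage
    # and the temperature (K) at which it was absorbed.
    steps = [(5.0, 10.0), (20.0, 50.0), (60.0, 150.0), (120.0, 298.0)]

    total_entropy = entropy_from_heating(steps)  # in J K^-1
    ```

    Note how the low-temperature stages dominate the sum: the same amount of heat contributes far more entropy when absorbed at 10 K than at 298 K, which is why measurements at very low temperatures are so important in this procedure.
    
    
    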

    Equation \(\ref{1}\) was discovered long before the statistical nature of entropy was realized. Scientists and engineers began to appreciate the importance of the quantity q/T very early in the nineteenth century because of its connection with the efficiency of steam engines. These arguments were developed by both Lord Kelvin in England and Rudolf Clausius (1822 to 1888) in Germany. It was Clausius who first formulated the second law in terms of the entropy S, but Clausius had only a vague idea that entropy was in any way connected with molecules or probability. The statistical nature of entropy was first suggested by Boltzmann in 1877 and then developed into an elegant system in 1902 by Josiah Willard Gibbs (1839 to 1903), one of the real giants among American scientists. 

    An important feature of Eq. \(\ref{1}\) is the inverse relationship between the entropy increase and the temperature. A given quantity of heat energy produces a very large change of entropy when absorbed at a very low temperature but only a small change when absorbed at a high temperature. 

    Example \(\PageIndex{1}\): Entropy

    Calculate the increase in entropy which a substance undergoes when it absorbs 1 kJ of heat energy at the following temperatures: (a) 3 K; (b) 300 K; (c) 3000 K.


    a) At 3 K we have \[\Delta S=\frac{1000\text{ J}}{3\text{ K}}=333.3\text{ J K}^{-1} \nonumber \]

    b) At 300 K, similarly, \[\Delta S=\frac{1000\text{ J}}{300\text{ K}}=3.33\text{ J K}^{-1} \nonumber \]

    c) At 3000 K, \[\Delta S=\frac{1000\text{ J}}{3000\text{ K}}=0.33\text{ J K}^{-1} \nonumber \]
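    The three results above follow directly from Eq. \(\ref{1}\) with q = 1000 J, as this short check illustrates:

    ```python
    # ΔS = q/T for 1 kJ of heat absorbed at three temperatures.
    q = 1000.0  # heat absorbed, in J

    for T in (3.0, 300.0, 3000.0):  # temperatures in K
        delta_S = q / T
        print(f"T = {T:.0f} K: dS = {delta_S:.2f} J/K")
    ```

    The inverse dependence on T is plain: each thousandfold increase in temperature cuts the entropy change by the same factor of a thousand.
    
    
    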

    An amusing analogy to this behavior can be drawn from everyday life. If a 10-year-old boy is allowed to play in his bedroom for half an hour, the increase in disorder is scarcely noticeable because the room is already disordered (i.e., at a higher “temperature”). By contrast, if the boy is let loose for half an hour in a neat and tidy living room (i.e., at a lower “temperature”), the effect is much more dramatic.

    This page titled 16.12: Measuring the Entropy is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by Ed Vitz, John W. Moore, Justin Shorb, Xavier Prat-Resina, Tim Wendorff, & Adam Hahn.
