20.9: The Statistical Definition of Entropy is Analogous to the Thermodynamic Definition


    We learned earlier, in Section 20-5, that the entropy, \(S\), is related to the number of microstates, \(W\), of an ensemble containing \(A\) systems:

    \[S_{ensemble}=k_B \ln{W} \label{eq1}\]

    and

    \[W=\frac{A!}{\prod_j{a_j}!} \label{eq2}\]
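    The multiplicity \(W\) is just a multinomial coefficient, which we can evaluate directly for a small ensemble. The occupation numbers below are a made-up illustrative example:

```python
from math import factorial

def multiplicity(occupations):
    """W = A! / prod_j(a_j!): the number of ways to distribute the
    A = sum(a_j) systems of an ensemble over states with occupation
    numbers a_j."""
    W = factorial(sum(occupations))
    for a in occupations:
        W //= factorial(a)  # each a_j! divides W exactly
    return W

# Hypothetical ensemble: A = 4 systems, 3 in one state and 1 in another.
print(multiplicity([3, 1]))  # → 4 (which of the 4 systems is the odd one out)
```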

    Combining Equations \ref{eq1} and \ref{eq2} gives:

    \[\begin{split} S_{ensemble} &= k_B \ln{\frac{A!}{\prod_j{a_j!}}} \\ &= k_B \ln{A!}-k_B\sum_j{\ln{a_j!}} \end{split}\]

    Using Stirling's approximation:

    \[\ln{A!} \approx A\ln{A}-A\]
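    A short numerical sketch shows why this approximation is safe for the enormous \(A\) of a typical ensemble. Here `lgamma` supplies the exact value, since \(\ln{\Gamma(A+1)} = \ln{A!}\):

```python
from math import lgamma, log

def ln_factorial(A):
    # exact ln(A!) via the log-gamma function: ln Γ(A + 1) = ln A!
    return lgamma(A + 1)

def stirling(A):
    # Stirling's approximation: ln A! ≈ A ln A − A
    return A * log(A) - A

for A in (10, 1_000, 1_000_000):
    exact, approx = ln_factorial(A), stirling(A)
    print(f"A = {A}: relative error {(exact - approx) / exact:.2e}")
```

    The neglected correction grows only like \(\tfrac{1}{2}\ln{(2\pi A)}\), so the relative error becomes negligible long before \(A\) reaches the size of a real ensemble.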

    We obtain:

    \[S_{ensemble} = k_B A \ln{A}-k_BA-k_B\sum_j{a_j\ln{a_j}}+k_B\sum{a_j}\]

    Since:

    \[A=\sum_j{a_j}\]

    The expression simplifies to:

    \[S_{ensemble} = k_B A \ln{A}-k_B\sum_j{a_j\ln{a_j}}\]
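    We can check numerically that this simplified expression tracks the exact \(\ln{W}\) (working in units of \(k_B\)); the 70/30 split over two states below is an arbitrary illustrative choice:

```python
from math import lgamma, log

def ln_W_exact(occupations):
    # ln W = ln A! − Σ_j ln a_j!, computed exactly via log-gamma
    A = sum(occupations)
    return lgamma(A + 1) - sum(lgamma(a + 1) for a in occupations)

def ln_W_stirling(occupations):
    # the Stirling-simplified form: A ln A − Σ_j a_j ln a_j
    A = sum(occupations)
    return A * log(A) - sum(a * log(a) for a in occupations if a > 0)

occ = [700_000, 300_000]  # hypothetical ensemble of 10^6 systems
print(ln_W_exact(occ), ln_W_stirling(occ))  # the two values nearly coincide
```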

    We can make use of the fact that the number of systems in state \(j\) is equal to the total number of systems multiplied by the probability of finding a system in state \(j\), \(p_j\):

    \[a_j=p_jA\]

    Plugging in, we obtain

    \[\begin{split}S_{ensemble} &= k_B A \ln{A}-k_B\sum_j{p_jA\ln{p_jA}} \\ &= k_B A \ln{A}-k_B\sum_j{p_jA\ln{p_j}}-k_B\sum_j{p_jA\ln{A}} \end{split}\]

    Since \(A\) is a constant and the sum of the probabilities of finding the system in state \(j\) is always 1:

    \[\sum_j{p_j}=1\]

    the last term reduces to \(-k_BA\ln{A}\) and cancels the first term:

    \[S_{ensemble} = -k_BA\sum_j{p_j\ln{p_j}}\]

    Because entropy is extensive, the entropy of a single system is the entropy of the ensemble divided by the number of systems:

    \[S_{system}=S_{ensemble}/A\]

    Dividing by \(A\), we obtain:

    \[S_{system} = -k_B\sum_j{p_j\ln{p_j}}\]
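    This is the Gibbs formula for the entropy. As a sanity check, a uniform distribution over \(W\) equally likely states should recover the Boltzmann result \(S = k_B\ln{W}\); the sketch below uses SI units:

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant in J/K

def gibbs_entropy(probs):
    """S = -k_B Σ_j p_j ln p_j for a normalized probability distribution."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -k_B * sum(p * log(p) for p in probs if p > 0)

# Uniform distribution over W = 8 states:
W = 8
S = gibbs_entropy([1 / W] * W)
print(S / k_B, log(W))  # both equal ln 8 ≈ 2.079
```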

    Differentiating this equation and dropping the subscript, we get:

    \[dS = -k_B\sum_j{\left(dp_j+\ln{p_j}dp_j\right)}\]

    Since \(\sum_j{p_j}=1\), its differential vanishes, \(\sum_j{dp_j}=0\), leaving:

    \[dS = -k_B\sum_j{\ln{p_j}dp_j}\]

    For a system at equilibrium, the probabilities are given by the Boltzmann distribution, \(p_j=e^{-E_j/k_BT}/Q\), so that \(\ln{p_j}=-E_j/k_BT-\ln{Q}\). Because \(\sum_j{dp_j}=0\), the \(\ln{Q}\) term contributes nothing, and the remaining sum contains the reversible heat, \(\delta q_{rev}=\sum_j{E_j\,dp_j}\):

    \[\sum_j{\ln{p_j}\,dp_j}=-\frac{1}{k_BT}\sum_j{E_j\,dp_j}=-\frac{\delta q_{rev}}{k_BT}\]

    Plugging in:

    \[dS = \frac{\delta q_{rev}}{T}\]
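    A numerical sketch can confirm this identity. The hypothetical two-level system below (in reduced units with \(k_B=1\)) is given a small temperature step, and the resulting entropy change is compared with \(\delta q_{rev}/T\), where \(\delta q_{rev}=\sum_j{E_j\,dp_j}\):

```python
from math import exp, log

k_B = 1.0  # reduced units: k_B = 1

def boltzmann_probs(energies, T):
    # equilibrium probabilities p_j = exp(-E_j / k_B T) / Q
    weights = [exp(-E / (k_B * T)) for E in energies]
    Q = sum(weights)
    return [w / Q for w in weights]

def entropy(probs):
    # Gibbs entropy S = -k_B Σ_j p_j ln p_j
    return -k_B * sum(p * log(p) for p in probs)

energies = [0.0, 1.0]      # hypothetical two-level system
T, dT = 1.0, 1e-6          # small reversible temperature step

p_before = boltzmann_probs(energies, T)
p_after = boltzmann_probs(energies, T + dT)

dS = entropy(p_after) - entropy(p_before)
dq_rev = sum(E * (q - p) for E, p, q in zip(energies, p_before, p_after))

print(dS, dq_rev / T)  # the two agree to leading order in dT
```

    Because the energy levels are held fixed, all of the energy change is heat, and the two printed values differ only by terms of order \(dT^2\).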


    20.9: The Statistical Definition of Entropy is Analogous to the Thermodynamic Definition is shared under a not declared license and was authored, remixed, and/or curated by LibreTexts.