# 20.9: The Statistical Definition of Entropy is Analogous to the Thermodynamic Definition


We learned earlier in Section 20-5 that the entropy, $$S$$, is related to the number of microstates, $$W$$, of an ensemble of $$A$$ systems:

$S_{ensemble}=k_B \ln{W} \label{eq1}$

and

$W=\frac{A!}{\prod_j{a_j!}} \label{eq2}$
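The multinomial coefficient in Equation \ref{eq2} is easy to evaluate numerically. The short Python sketch below (the occupation numbers $$a_j$$ are arbitrary illustrative values) counts the microstates for a four-system ensemble:

```python
from math import factorial, prod

def num_microstates(occupations):
    """Multinomial coefficient W = A! / prod(a_j!) for an ensemble
    whose j-th state holds occupations[j] systems."""
    A = sum(occupations)
    return factorial(A) // prod(factorial(a) for a in occupations)

# Example: A = 4 systems distributed as a_1 = 2, a_2 = 1, a_3 = 1
print(num_microstates([2, 1, 1]))  # → 12
```

Note that when all systems occupy the same state ($$a_1 = A$$) there is only one microstate, so $$W = 1$$ and the entropy of Equation \ref{eq1} vanishes.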

Combining Equations \ref{eq1} and \ref{eq2} gives:

$\begin{split} S_{ensemble} &= k_B \ln{\frac{A!}{\prod_j{a_j!}}} \\ &= k_B \ln{A!}-k_B\sum_j{\ln{a_j!}} \end{split} \nonumber$

Using Stirling's approximation for the factorials,

$\ln{A!} \approx A\ln{A}-A \nonumber$
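Stirling's approximation becomes increasingly accurate as $$A$$ grows, which is why it is safe for ensembles of many systems. A short Python check (using the log-gamma function, $$\ln{n!} = \ln{\Gamma(n+1)}$$, for the exact value) illustrates this:

```python
from math import lgamma, log

def ln_factorial(n):
    # exact ln(n!) via the log-gamma function, ln Γ(n+1)
    return lgamma(n + 1)

def stirling(n):
    # Stirling's approximation: ln(n!) ≈ n ln(n) - n
    return n * log(n) - n

# relative error shrinks as n grows
for n in (10, 100, 10_000):
    exact, approx = ln_factorial(n), stirling(n)
    print(n, exact, approx, abs(exact - approx) / exact)
```

For $$n = 10$$ the relative error is already only a few percent, and by $$n = 10^4$$ it is below $$10^{-4}$$; for ensembles of $$\sim 10^{23}$$ systems it is utterly negligible.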

We obtain:

$S_{ensemble} = k_B A \ln{A}-k_BA-k_B\sum_j{a_j\ln{a_j}}+k_B\sum_j{a_j} \nonumber$

Since:

$A=\sum_j{a_j} \nonumber$

The expression simplifies to:

$S_{ensemble} = k_B A \ln{A}-k_B\sum_j{a_j\ln{a_j}} \nonumber$

We can make use of the fact that the number of systems in state $$j$$ is equal to the total number of systems multiplied by the probability of finding a system in state $$j$$, $$p_j$$:

$a_j=p_jA \nonumber$

Plugging in, we obtain

$\begin{split}S_{ensemble} &= k_B A \ln{A}-k_B\sum_j{p_jA\ln{p_jA}} \\ &= k_B A \ln{A}-k_B\sum_j{p_jA\ln{p_j}}-k_B\sum_j{p_jA\ln{A}} \end{split} \nonumber$

Since $$A$$ is a constant and the probabilities of finding the system in its states always sum to one,

$\sum_j{p_j}=1 \nonumber$

the first and last terms cancel:

$S_{ensemble} = -k_BA\sum_j{p_j\ln{p_j}} \nonumber$
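This expression can be spot-checked numerically: for a large ensemble, the exact $$k_B\ln{W}$$ computed from the multinomial coefficient should approach $$-k_B A\sum_j{p_j\ln{p_j}}$$. A minimal Python sketch (the probabilities and ensemble size are arbitrary choices; log-gamma is used so the factorials do not overflow):

```python
from math import lgamma, log

def ln_W(occupations):
    # exact ln W = ln A! - sum_j ln a_j!, via the log-gamma function
    A = sum(occupations)
    return lgamma(A + 1) - sum(lgamma(a + 1) for a in occupations)

A = 1_000_000
p = [0.5, 0.3, 0.2]                       # assumed state probabilities
occ = [round(pj * A) for pj in p]          # a_j = p_j * A

exact = ln_W(occ)                          # ln W (Eq. for S_ensemble / k_B)
approx = -A * sum(pj * log(pj) for pj in p)
print(exact, approx)                       # the two agree to better than 0.01%
```

The small residual difference is exactly the part of $$\ln{A!}$$ that Stirling's approximation discards, and it becomes negligible relative to $$S_{ensemble}$$ as $$A$$ grows.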

Because entropy is extensive, the entropy of a single system is the entropy of the ensemble divided by the number of systems:

$S_{system}=S_{ensemble}/A \nonumber$

Dividing by $$A$$, we obtain:

$S_{system} = -k_B\sum_j{p_j\ln{p_j}} \nonumber$
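This is the Gibbs formula for the entropy in terms of state probabilities. As a quick sanity check in Python (the number of states $$n$$ is an arbitrary choice), a uniform distribution over $$n$$ states recovers the Boltzmann result $$S = k_B\ln{n}$$ for $$n$$ equally likely states:

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def gibbs_entropy(probs):
    """S = -k_B * sum_j p_j ln p_j (states with p_j = 0 contribute nothing)."""
    return -k_B * sum(p * log(p) for p in probs if p > 0)

# Uniform distribution over n states: S should equal k_B ln n
n = 4
print(gibbs_entropy([1 / n] * n), k_B * log(n))  # both ≈ k_B ln 4
```

A distribution concentrated entirely in one state ($$p_1 = 1$$) gives $$S = 0$$, the minimum; the uniform distribution maximizes $$S$$.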

Differentiating this equation and dropping the subscript, we obtain:

$dS = -k_B\sum_j{\left(dp_j+\ln{p_j}dp_j\right)} \nonumber$

Since $$\sum_j{p_j}=1$$, its differential vanishes, $$\sum_j{dp_j}=0$$, leaving:

$dS = -k_B\sum_j{\ln{p_j}dp_j} \nonumber$

For a system with fixed energy levels $$E_j$$ (constant volume), the reversible heat is $$\delta q_{rev}=\sum_j{E_j\,dp_j}$$. Substituting the Boltzmann distribution, $$\ln{p_j}=-\frac{E_j}{k_BT}-\ln{Q}$$, and again using $$\sum_j{dp_j}=0$$ to drop the $$\ln{Q}$$ term:

$\sum_j{\ln{p_j}dp_j}=-\frac{1}{k_BT}\sum_j{E_j\,dp_j}=-\frac{\delta q_{rev}}{k_BT} \nonumber$

Plugging in:

$dS = \frac{\delta q_{rev}}{T} \nonumber$
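This final identity can be verified numerically for a concrete model. The Python sketch below uses a hypothetical two-level system with an assumed gap $$\varepsilon$$; at constant volume the reversible heat equals the change in average energy, so $$dS$$ computed from the Gibbs formula should match $$\delta q_{rev}/T$$ by finite differences:

```python
from math import exp, log

kB = 1.380649e-23  # Boltzmann constant, J/K

def probs(eps, T):
    # Boltzmann probabilities for levels E_0 = 0 and E_1 = eps
    b = exp(-eps / (kB * T))
    return [1 / (1 + b), b / (1 + b)]

def S(eps, T):
    # Gibbs entropy S = -k_B sum_j p_j ln p_j
    return -kB * sum(p * log(p) for p in probs(eps, T))

def E(eps, T):
    # average energy <E> = p_1 * eps  (E_0 = 0)
    return probs(eps, T)[1] * eps

eps, T, dT = 1e-21, 300.0, 1e-4   # assumed gap and temperature
dS = S(eps, T + dT) - S(eps, T - dT)
dq = E(eps, T + dT) - E(eps, T - dT)   # reversible heat = dE at constant V
print(dS, dq / T)  # the two agree
```

The agreement (to the accuracy of the finite differences) illustrates that the statistical entropy $$-k_B\sum_j{p_j\ln{p_j}}$$ reproduces the thermodynamic relation $$dS=\delta q_{rev}/T$$.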

20.9: The Statistical Definition of Entropy is Analogous to the Thermodynamic Definition is shared under a not declared license and was authored, remixed, and/or curated by LibreTexts.