# 3: The Second and Third Laws of Thermodynamics

- 3.1: Introduction to the Second Law
- The second law of thermodynamics, which introduces us to the topic of entropy, is remarkable in how it constrains what we can experience and what we can do in the universe. A spontaneous process is one that will occur without external forces pushing it. A process can be spontaneous even if it happens very slowly. Thermodynamics is silent on the topic of how fast processes occur, but it provides us with a powerful toolbox for predicting which processes will be spontaneous.

- 3.2: Heat Engines and the Carnot Cycle
- To simplify his analysis of the inner workings of an engine, Carnot devised a useful construct for examining what affects engine efficiency. His construct is the heat engine. The idea behind a heat engine is that it will take energy in the form of heat and transform it into an equivalent amount of work. Unfortunately, such a device is impractical. As it turns out, nature prevents the complete conversion of energy into work with perfect efficiency. This leads to an important statement of the Second Law.
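The limit the Carnot cycle places on efficiency depends only on the two reservoir temperatures. A minimal sketch of that result (the function name and example temperatures are illustrative, not from the text):

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum fraction of absorbed heat convertible to work by an
    engine running between two reservoirs: eta = 1 - T_cold / T_hot.
    Temperatures must be absolute (kelvin)."""
    if not (0 < t_cold < t_hot):
        raise ValueError("require 0 < t_cold < t_hot (in kelvin)")
    return 1.0 - t_cold / t_hot

# An engine running between 500 K and 300 K can convert at most 40%
# of the heat it absorbs into work; perfect efficiency (eta = 1)
# would require a cold reservoir at absolute zero.
print(carnot_efficiency(500.0, 300.0))  # ≈ 0.4
```

Note that efficiency approaches 1 only as the cold-reservoir temperature approaches 0 K, which is why complete conversion of heat to work is unattainable.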

- 3.3: Entropy
- In addition to learning that the efficiency of a Carnot engine depends only on the high and low temperatures, more interesting things can be derived through the exploration of this system.

- 3.4: Calculating Entropy Changes
- Entropy changes are fairly easy to calculate so long as one knows the initial and final states. For example, if the initial and final volumes are the same, the entropy change can be calculated by assuming a reversible, isochoric pathway and determining an expression for dq/T. That term can then be integrated from the initial condition to the final condition to determine the entropy change.
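For the isochoric case described above, dq_rev = n C_V dT, so integrating dq/T gives ΔS = n C_V ln(T₂/T₁) when C_V is constant. A small sketch under that assumption (the function name and the monatomic-gas example are illustrative):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def delta_s_isochoric(n, cv, t1, t2):
    """Entropy change for heating n moles at constant volume, assuming
    a temperature-independent molar heat capacity cv:
        dS = dq_rev/T = n*cv*dT/T  ->  ΔS = n*cv*ln(t2/t1)."""
    return n * cv * math.log(t2 / t1)

# 1 mol of a monatomic ideal gas (cv = 3R/2) heated from 300 K to 600 K:
print(delta_s_isochoric(1.0, 1.5 * R, 300.0, 600.0))  # ≈ 8.64 J/K
```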

- 3.5: Comparing the System and the Surroundings
- It is often important to calculate the entropy change of the surroundings as well as that of the system. Depending on their size, the surroundings can provide or absorb as much heat as a process requires without changing temperature. As such, it is often a very good approximation to treat the changes in the surroundings as isothermal, even when that is not the case for the system (which is usually much smaller).
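Under that isothermal approximation, the surroundings simply absorb the heat the system releases, so ΔS_surr = q_surr/T_surr = -q_sys/T_surr. A minimal sketch (function name and example values are illustrative):

```python
def delta_s_surroundings(q_sys, t_surr):
    """Entropy change of large surroundings held at constant temperature
    t_surr (kelvin), which absorb the heat the system releases:
        ΔS_surr = q_surr / T_surr = -q_sys / T_surr."""
    return -q_sys / t_surr

# An exothermic process releases 40.0 kJ (q_sys = -40000 J) into
# surroundings at 298 K, increasing their entropy:
print(delta_s_surroundings(-40000.0, 298.0))  # ≈ +134 J/K
```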

- 3.6: Entropy and Disorder
- A common interpretation of entropy is that it is somehow a measure of chaos or randomness. There is some utility in that concept. Given that entropy is a measure of the dispersal of energy in a system, the more chaotic a system is, the greater the dispersal of energy will be, and thus the greater the entropy will be.

- 3.7: The Third Law of Thermodynamics
- One important consequence of Boltzmann’s proposal is that a perfectly ordered crystal (i.e., one that has only one energetic arrangement in its lowest energy state) will have an entropy of 0. This makes entropy qualitatively different from other thermodynamic functions. For example, in the case of enthalpy, it is impossible to have a zero on the scale without setting an arbitrary reference (i.e., the enthalpy of formation of elements in their standard states is zero). But entropy has a natural zero!
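Boltzmann’s proposal relates entropy to the number of microstates W via S = k_B ln W, so a perfectly ordered crystal (W = 1) has S = 0 exactly. A small numerical sketch of that relation (names and example values are illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(w):
    """Statistical entropy from the number of microstates W:
        S = k_B * ln(W)."""
    return K_B * math.log(w)

# A perfectly ordered crystal has exactly one arrangement (W = 1),
# so its entropy is exactly zero -- the natural zero of the scale.
print(boltzmann_entropy(1))      # 0.0
# Any disorder (W > 1) gives a small but strictly positive entropy:
print(boltzmann_entropy(10**20) > 0)  # True
```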

- 3.8: Adiabatic Compressibility
- The isothermal compressibility is a very useful quantity, as it can be measured for many different substances and tabulated. Also, as we will see in the next chapter, it can be used to evaluate several different partial derivatives involving thermodynamic variables.
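The isothermal compressibility is defined as κ_T = -(1/V)(∂V/∂p)_T. For an ideal gas it reduces to a particularly simple closed form, sketched below (the function name and the 1 atm example are illustrative):

```python
def kappa_t_ideal_gas(p):
    """Isothermal compressibility of an ideal gas.
    With V = nRT/p, (dV/dp)_T = -nRT/p**2, so
        kappa_T = -(1/V)(dV/dp)_T = 1/p."""
    return 1.0 / p

# At 1 atm (101325 Pa), an ideal gas compresses readily:
print(kappa_t_ideal_gas(101325.0))  # ≈ 9.87e-6 Pa^-1
```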

- 3.9: Putting the Second Law to Work
- In the previous chapter, we saw that for a spontaneous process, \( \Delta S_{univ} > 0 \). While this is a useful criterion for determining whether or not a process is spontaneous, it is rather cumbersome, as it requires one to calculate not only the entropy change for the system, but also that of the surroundings. It would be much more convenient if there were a single criterion that would do the job and focus only on the system. As it turns out, there is!
- 3.9.1: Free Energy Functions
- 3.9.2: Combining the First and Second Laws - Maxwell's Relations
- 3.9.3: ΔA, ΔG, and Maximum Work
- 3.9.4: Volume Dependence of Helmholtz Energy
- 3.9.5: Pressure Dependence of Gibbs Energy
- 3.9.6: Temperature Dependence of A and G
- 3.9.7: When Two Variables Change at Once
- 3.9.8: The Difference between Cp and Cv
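That system-only criterion, at constant temperature and pressure, is the Gibbs energy: ΔG = ΔH − TΔS, with ΔG < 0 for a spontaneous process. A minimal sketch using the melting of ice (the function name is illustrative; ΔH ≈ 6.01 kJ/mol and ΔS ≈ 22.0 J/(mol K) are approximate literature values):

```python
def delta_g(dh, t, ds):
    """Gibbs energy change at constant temperature and pressure:
        ΔG = ΔH - T*ΔS
    Spontaneous (system-only criterion) when ΔG < 0."""
    return dh - t * ds

# Melting of ice: dh ≈ +6010 J/mol, ds ≈ +22.0 J/(mol K).
# Below 273 K the enthalpy term dominates; above it, T*ΔS wins:
print(delta_g(6010.0, 263.0, 22.0))  # > 0: melting not spontaneous at 263 K
print(delta_g(6010.0, 283.0, 22.0))  # < 0: melting spontaneous at 283 K
```

The sign change near 273 K reproduces the familiar melting point, showing how a single system-side function encodes the competition between ΔH and TΔS.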

- 3.E: The Second Law (Exercises)
- Exercises for Chapter 5 "The Second Law" in Fleming's Physical Chemistry Textmap.