6: Entropy, Part I
 6.1: Introduction to the Second Law
 The second law of thermodynamics, which introduces the topic of entropy, is remarkable in how it constrains what we can experience and what we can do in the universe. A spontaneous process is one that occurs without external forces driving it; a process can be spontaneous even if it happens very slowly. Thermodynamics is silent on the question of how fast processes occur, but it provides a powerful toolbox for predicting which processes will be spontaneous.
 6.2: Heat Engines and the Carnot Cycle
 To simplify his analysis of the inner workings of an engine, Carnot devised a useful construct for examining what affects engine efficiency: the heat engine. The idea behind a heat engine is that it takes in energy in the form of heat and transforms it into an equivalent amount of work. Unfortunately, such a device is impossible: nature prevents the complete conversion of heat into work with perfect efficiency. This leads to an important statement of the Second Law.
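The limit Carnot derived is that no engine operating between a hot reservoir at \(T_h\) and a cold reservoir at \(T_c\) can exceed the efficiency \(\eta = 1 - T_c/T_h\). A minimal sketch of this relationship (the function name and numerical values are illustrative, not from the text):

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum efficiency of a heat engine operating between two
    reservoirs (temperatures in kelvin): eta = 1 - T_c / T_h."""
    if t_cold <= 0 or t_hot <= t_cold:
        raise ValueError("require 0 < T_c < T_h (temperatures in kelvin)")
    return 1.0 - t_cold / t_hot

# An engine running between 500 K and 300 K can convert at most
# 40% of the heat drawn from the hot reservoir into work.
print(carnot_efficiency(500.0, 300.0))  # 0.4
```

Note that the efficiency depends only on the two reservoir temperatures, not on the working substance, which is the key result of Carnot's analysis.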
 6.3: Entropy
 In addition to showing that the efficiency of a Carnot engine depends only on the temperatures of the hot and cold reservoirs, exploring this system yields further interesting results.
 6.4: Calculating Entropy Changes
 Entropy changes are fairly easy to calculate so long as one knows the initial and final states. For example, if the initial and final volumes are the same, the entropy change can be calculated by assuming a reversible, isochoric pathway and determining an expression for dq/T. That expression can then be integrated from the initial state to the final state to determine the entropy change.
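For the isochoric case described above, dq_rev = n C_v dT, so integrating dq/T gives ΔS = n C_v ln(T₂/T₁). A minimal sketch, assuming a temperature-independent molar heat capacity (the numbers below are illustrative, not from the text):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def delta_s_isochoric(n, c_v, t1, t2):
    """Entropy change for heating n moles at constant volume, assuming
    a constant molar heat capacity C_v:
        dS = dq_rev / T = n C_v dT / T  ->  dS integrates to n C_v ln(T2/T1).
    """
    return n * c_v * math.log(t2 / t1)

# Illustrative: 1 mol of a monatomic ideal gas (C_v = 3/2 R)
# heated from 300 K to 600 K.
print(delta_s_isochoric(1.0, 1.5 * R, 300.0, 600.0))  # ~8.64 J/K
```

If C_v varies with temperature, the same integral must be carried out with C_v(T) inside it rather than pulled out as a constant.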
 6.5: Comparing the System and the Surroundings
 It is often important to calculate the entropy change of the surroundings as well as that of the system. Because the surroundings are typically much larger than the system, they can provide or absorb as much heat as a process requires without changing temperature. It is therefore usually a very good approximation to treat changes in the surroundings as occurring isothermally, even when that is not true of the (often much smaller) system.
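Under that isothermal approximation, the heat leaving the system enters the surroundings, so ΔS_surr = -q_sys/T_surr. A minimal sketch (the freezing-of-water numbers are illustrative approximations, not from the text):

```python
def delta_s_surroundings(q_system, t_surr):
    """Entropy change of surroundings treated as an isothermal reservoir:
    heat released by the system (q_system < 0) flows into the
    surroundings, so dS_surr = -q_system / T_surr."""
    return -q_system / t_surr

# Illustrative: water freezing at 273.15 K releases roughly 6010 J/mol
# to the surroundings (q_system = -6010 J), so dS_surr is positive.
ds_surr = delta_s_surroundings(-6010.0, 273.15)
print(ds_surr)  # ~ +22.0 J/K
```

Adding this to the (negative) entropy change of the freezing water gives the total entropy change of the universe, whose sign decides spontaneity.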
 6.6: Entropy and Disorder
 A common interpretation of entropy is that it is somehow a measure of chaos or randomness. There is some utility in that concept. Given that entropy is a measure of the dispersal of energy in a system, the more chaotic a system is, the greater the dispersal of energy will be, and thus the greater the entropy will be.
 6.7: The Third Law of Thermodynamics
 One important consequence of Boltzmann’s proposal is that a perfectly ordered crystal (i.e., one that has only one energetic arrangement in its lowest energy state) will have an entropy of zero. This makes entropy qualitatively different from other thermodynamic functions. In the case of enthalpy, for example, it is impossible to place a zero on the scale without setting an arbitrary reference (e.g., the enthalpy of formation of elements in their standard states is defined as zero). But entropy has a natural zero!
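Boltzmann's proposal is S = k_B ln W, where W counts the microstates consistent with the macrostate. A minimal sketch showing why W = 1 pins the natural zero:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(w):
    """S = k_B ln W, where W is the number of microstates available
    to the system in a given macrostate."""
    return K_B * math.log(w)

# A perfectly ordered crystal has exactly one arrangement (W = 1),
# so its entropy is exactly zero -- the natural zero of the scale.
print(boltzmann_entropy(1))      # 0.0
print(boltzmann_entropy(2) > 0)  # True: any extra arrangement raises S
```

Because ln 1 = 0 regardless of the value of k_B, no arbitrary reference point is needed, in contrast to enthalpy.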
 6.8: Adiabatic Compressibility
 The isothermal compressibility is a very useful quantity, as it can be measured for many different substances and tabulated. Also, as we will see in the next chapter, it can be used to evaluate several different partial derivatives involving thermodynamic variables.
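The isothermal compressibility is defined as κ_T = -(1/V)(∂V/∂p)_T; for an ideal gas, V = nRT/p gives κ_T = 1/p exactly. A minimal sketch checking that result numerically (function names and values are illustrative, not from the text):

```python
R = 8.314  # gas constant, J/(mol K)

def kappa_t_ideal(n, t, p, dp=1e-3):
    """Isothermal compressibility kappa_T = -(1/V)(dV/dp) at constant T,
    estimated with a central finite difference on the ideal-gas
    equation of state V(p) = nRT/p."""
    volume = lambda pressure: n * R * t / pressure
    dv_dp = (volume(p + dp) - volume(p - dp)) / (2.0 * dp)
    return -dv_dp / volume(p)

# For an ideal gas, kappa_T should equal 1/p (pressure in Pa):
print(kappa_t_ideal(1.0, 298.0, 101325.0))  # ~9.87e-6 per Pa
```

The adiabatic compressibility κ_S is defined analogously but with entropy rather than temperature held constant; the two are related through the heat capacities, as the section explores.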
 6.E: The Second Law (Exercises)
 Exercises for Chapter 5 "The Second Law" in Fleming's Physical Chemistry Textmap.
 6.S: The Second Law (Summary)
 Summary for Chapter 5 "The Second Law" in Fleming's Physical Chemistry Textmap.