# 3.5: Entropy defined

The work we did to get to this description of the efficiency of the Carnot engine has a side benefit. The fact that we could cancel everything about the heat expressions but temperature means that the heat transferred at the two isothermal steps of the Carnot engine’s function is proportional to the temperature at those steps (with a negative sign embedded because of the sign flip we did in the final substitution). We can build out an expression to describe this better:

$\frac{{q}_{3}}{{T}_{3}}=-\frac{{q}_{1}}{{T}_{1}}\quad\text{or}\quad\frac{{q}_{\text{OUT}}}{{T}_{\text{COLD}}}=-\frac{{q}_{\text{IN}}}{{T}_{\text{HOT}}}$

The ratio of heat to temperature will be a useful expression going forward; in real engines, the equivalency will not be neat, and the magnitude of the ratio of heat out to the cold temperature will *always* be greater than the magnitude of the ratio of heat in to the hot temperature, because in real engines heat will always be lost in the energy transformation. This ratio of heat to temperature is how we formalize the definition of *entropy change* at a single temperature:
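A quick numeric sketch can make the reversible relation concrete. The reservoir temperatures and the input heat below are assumed values chosen for illustration, not data from the text:

```python
# Hypothetical Carnot engine: absorbs 1000 J from a 500 K hot reservoir
# and rejects heat to a 300 K cold reservoir. The reversible relation
# q_out / T_cold = -q_in / T_hot fixes how much heat must be rejected.
T_hot, T_cold = 500.0, 300.0   # kelvin (assumed values)
q_in = 1000.0                  # joules absorbed from the hot reservoir

q_out = -q_in * T_cold / T_hot  # heat rejected to the cold reservoir
print(q_out)                    # -600.0 J (negative: heat leaves the system)
print(q_out / T_cold + q_in / T_hot)  # 0.0: the two ratios cancel exactly
```

In a real engine the rejected heat would be larger in magnitude than this reversible minimum, which is exactly the inequality described above.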

$\mathrm{\Delta}S=\frac{q}{T}$

A positive entropy change always corresponds with heat being absorbed by a system, and the disorder in that system increasing. A negative entropy change always corresponds with heat being released by that system, and the disorder in that system decreasing.
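As a brief worked example of $\Delta S = q/T$ for a process at a single temperature, consider melting one mole of ice; the heat absorbed equals the enthalpy of fusion (about 6.01 kJ/mol, a standard literature value assumed here):

```python
# Entropy change for an isothermal process: ΔS = q / T.
# Assumed data: one mole of ice melting at 273.15 K, absorbing
# q = ΔH_fus ≈ 6010 J (heat in, so q is positive).
q = 6010.0    # J, heat absorbed by the system
T = 273.15    # K, constant melting temperature

delta_S = q / T
print(round(delta_S, 1))  # 22.0 J/K — positive: heat absorbed, disorder up
```

The sign works out as the paragraph above says: heat flows in, the solid becomes a more disordered liquid, and the entropy change is positive.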

One of the things that becomes clear when going over a multitude of sources for studying physical chemistry (even when comparing these notes to textbooks I've adopted for physical chemistry in the past) is that there is a spectrum of approaches to introducing and discussing entropy. I have obviously focused on engines, and the disorder that must be generated by the proper function of an engine. An introductory physical chemistry text will typically focus on the phases of matter; a gas phase has more entropy than a liquid phase, which has more entropy than a solid phase. (There is an absolute definition of entropy, which is statistical; we will spend very little time on that definition. We will far more often talk of entropy change than we will of absolute entropy.)

However, one thing that will become plain as we move forward in our dealings with entropy is that entropy does *not *always change at a single temperature, and we will not be able to make the assumption that all processes where entropy changes are isothermal. We need to work with differential language:

$dS=\frac{\delta q}{T}$

Remember that we write differential heat as *δq* and not as *dq* because heat is a process variable, and we can’t integrate over heat unless we integrate the entire process. If we are going to engage in a differential process, and we're going to be able to identify the limits of that integration, we need to be able to transform differential heat into something that we can properly integrate over.

We know, for example, what heat transfer looks like at constant pressure:

${q}_{P}=n\overline{{C}_{P}}\mathrm{\Delta}T$

If temperature change is only as large as a differential, though, we can express that heat transfer by means of a differential that will be integrable while *δq* is not:

$\delta {q}_{P}=n\overline{{C}_{P}}dT$

This is an expression we can integrate and find a change in entropy for!

$dS=\frac{\delta {q}_{P}}{T}=n\overline{{C}_{P}}\frac{dT}{T}$

${\int}_{{S}_{i}}^{{S}_{f}}dS=n\overline{{C}_{P}}{\int}_{{T}_{i}}^{{T}_{f}}\frac{dT}{T}$

$\mathrm{\Delta}S=n\overline{{C}_{P}}\mathrm{ln}\left(\frac{{T}_{f}}{{T}_{i}}\right)$

We find the entropy change for matter that’s changing its temperature through heat transfer by taking the natural logarithm of the ratio of the final temperature to the initial temperature. Note that while this result is temperature-dependent, the logarithm itself is unitless; the units of entropy change are energy units per unit temperature, or J K^{-1} in base SI units.
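The final formula is easy to put to work numerically. The sketch below assumes a monatomic ideal gas, for which $\overline{C_P} = \tfrac{5}{2}R$, and an arbitrary amount and temperature range chosen for illustration:

```python
from math import log  # natural logarithm

# ΔS = n · C̄_P · ln(T_f / T_i) for heating at constant pressure.
# Assumed data: 2.00 mol of a monatomic ideal gas (C̄_P = 5/2 R),
# heated from 300 K to 600 K.
R = 8.314        # J K⁻¹ mol⁻¹, gas constant
n = 2.00         # mol
Cp = 2.5 * R     # molar constant-pressure heat capacity, J K⁻¹ mol⁻¹

delta_S = n * Cp * log(600.0 / 300.0)
print(round(delta_S, 1))  # ≈ 28.8 J/K — positive, as heating should be
```

Doubling the temperature gives $\ln 2$ regardless of the units used for temperature, which is the unitless-logarithm point made above; the J K^{-1} units come entirely from $n\overline{C_P}$.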