Lecture: Introducing entropy

Contemporary Challenges 2021
This lecture introduces the idea of entropy, including the relationship between entropy and multiplicity as well as the relationship between changes in entropy and heat.

This lecture has a student group activity embedded in it. Should the Monte Carlo video be a separate "activity", so it could be done live instead of as a pre-class video?

See this video for a Monte Carlo simulation showing how the multiplicity tends to increase.

As we have seen, the multiplicity \(\Omega\) of a material is of considerable interest, because the combined multiplicity of a system plus its surroundings tends to increase. We have also seen that for an Einstein solid the multiplicity grows very rapidly (faster than exponentially) with system size: for around \(10^{23}\) atoms (a few grams of any material), the multiplicity would be closer to \(10^{10^{20}}\), which is a bit much. In addition, to combine the multiplicities of two systems, we have to multiply rather than add.
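To get a feel for how fast this growth is, here is a small Python sketch using the standard counting result for an Einstein solid, \(\Omega = \binom{q+N-1}{q}\) for \(N\) oscillators sharing \(q\) units of energy (the choice \(q=N\) below is just an illustrative assumption):

```python
from math import comb, log10

# Multiplicity of an Einstein solid with N oscillators and q energy units,
# using the standard counting result Omega = C(q + N - 1, q).
def multiplicity(N, q):
    return comb(q + N - 1, q)

# log10(Omega) itself grows much faster than linearly in N,
# so Omega grows faster than exponentially with system size.
for N in [10, 100, 1000]:
    print(N, log10(multiplicity(N, N)))  # q = N: one energy unit per oscillator
```

Even at \(N=1000\) the multiplicity already has hundreds of digits, which is why we never try to tabulate \(\Omega\) directly for macroscopic systems.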

Instead of working with multiplicity, we far prefer the entropy \(S\) \begin{align} S \equiv \text{(constant)} \ln \Omega \end{align} where the constant is called Boltzmann's constant and in SI units is \(k_B=1.38\times 10^{-23}\text{ J/K}\). (It is also possible to choose units such that temperature is expressed in energy units, and \(k_B=1\). I tend to do this in my own research.)
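As a quick numerical sketch, we can see how the logarithm tames even the \(10^{10^{20}}\) multiplicity mentioned above (note that we can't form that number directly in a float, so we use \(\ln 10^{10^{20}} = 10^{20}\ln 10\)):

```python
import math

k_B = 1.38e-23  # Boltzmann's constant in J/K

def entropy(omega):
    """S = k_B ln(Omega), for multiplicities small enough to hold in a float."""
    return k_B * math.log(omega)

# The multiplicity 10^(10^20) is far too large to represent directly,
# but ln(10^(10^20)) = 10^20 * ln(10), giving a perfectly ordinary entropy:
S = k_B * 1e20 * math.log(10)
print(S)  # a few millijoules per kelvin
```

The astronomically large multiplicity becomes an entropy of a few mJ/K, a number we can actually work with.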

The natural log function ensures \(S\) is not astronomically large, and even more importantly it makes entropy an additive quantity. If we have two subsystems with multiplicity \(\Omega_1\) and \(\Omega_2\), then the combined system has multiplicity \begin{align} \Omega_{\text{combined}} &= \Omega_1\cdot\Omega_2 \\ S_{\text{combined}} &= k_B\ln(\Omega_{\text{combined}}) \\ &= k_B\ln(\Omega_1\cdot\Omega_2) \\ &= k_B\ln\Omega_1 + k_B\ln\Omega_2 \\ &= S_1 + S_2 \end{align} If you're hazy on the behavior of logarithms, this wouldn't be a bad time to review those properties. Most crucially: \begin{align} \ln(ab) &= \ln a + \ln b \\ \ln\left(\frac{a}{b}\right) &= \ln a - \ln b \\ \ln(a^b) &= b \ln a \end{align}
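A two-line numerical check of this additivity (the multiplicities \(10^{30}\) and \(10^{40}\) are arbitrary illustrative values):

```python
import math

k_B = 1.38e-23  # Boltzmann's constant in J/K

omega1, omega2 = 1e30, 1e40  # arbitrary illustrative multiplicities
S1 = k_B * math.log(omega1)
S2 = k_B * math.log(omega2)
S_combined = k_B * math.log(omega1 * omega2)

# Because ln(ab) = ln a + ln b, entropies add while multiplicities multiply.
print(S_combined, S1 + S2)
```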

Analyzing a real system's entropy

Imagine dropping a hot block of metal into a room-temperature tub of water. How would you answer if I asked you to find \(\Omega_{\text{metal,initial}}\), \(\Omega_{\text{metal,final}}\), \(\Omega_{\text{water,initial}}\), and \(\Omega_{\text{water,final}}\)? You'd be stumped. I'd be stumped too, if I tried to compute these multiplicities.

Does this mean that entropy is unhelpful in practice? Absolutely not! Entropy was actually discovered before its connection to multiplicity was understood, and changes in entropy are readily measurable! We can't easily measure the absolute entropy (although it is possible and is actually done), but changes in entropy are often quite straightforward to measure.

The key is the property that \begin{align} \Delta S &= \frac{Q}{T} \end{align} where \(Q\) is the energy entering the object through heating (positive means it gains energy), and \(T\) is the temperature of the object we are considering, provided \(Q\) is small enough that we can neglect changes in temperature. \(T\) must be measured on an absolute scale (e.g. Kelvin).
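Here is a sketch applying this to the hot-metal-in-water scenario above. The specific numbers (350 K metal, 300 K water, 100 J transferred) are made up for illustration, and we assume \(Q\) is small enough that both temperatures stay roughly constant:

```python
# Hypothetical numbers: a metal block at 350 K transfers Q = 100 J of energy
# to water at 300 K. Q is assumed small enough that neither temperature
# changes appreciably, so Delta S = Q / T applies to each object.
Q = 100.0        # joules
T_metal = 350.0  # kelvin (absolute scale required)
T_water = 300.0  # kelvin

dS_metal = -Q / T_metal  # metal loses energy, so its entropy decreases
dS_water = +Q / T_water  # water gains energy, so its entropy increases
dS_total = dS_metal + dS_water
print(dS_metal, dS_water, dS_total)
```

Because the water is colder, it gains more entropy than the metal loses, so the combined entropy increases, exactly the tendency we expect for heat flowing from hot to cold.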

Why does this equation work? It's related to the definition of temperature.
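One way to sketch the connection (assuming the statistical definition of temperature, \(\frac{1}{T} \equiv \left(\frac{\partial S}{\partial U}\right)_{V}\)): if a small amount of energy enters by heating at fixed volume, then \(\Delta U = Q\), and \begin{align} \Delta S &\approx \left(\frac{\partial S}{\partial U}\right)_{V} \Delta U = \frac{Q}{T} \end{align} which is exact in the limit of small \(Q\), where the temperature doesn't change appreciably.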

Learning Outcomes