Activities
Students are placed into small groups and asked to create an experimental setup they can use to measure the partial derivative they are given, in which entropy changes.
Transitioning from the PDM back to thermodynamic systems

Heating
In the partial derivative machine, the change in internal energy corresponds to the work done on the left string and the right string: \begin{align} dU &= F_L dx_L + F_R dx_R \end{align} The “thing we changed” was \(dx_L\) or \(dx_R\). From that we could determine the change in internal energy.
When we transfer energy to something by heating, it's hard to measure the “thing we changed,” which was entropy. It is, however, possible in some cases to measure the amount of energy transferred by heating, and from that we can work backwards to find out how much the entropy changed.
The amount of energy transferred into a system by heating is generally written as \(Q\). (There is a historical misconception built deeply into our language that heat is a property of a material, i.e. a state property. This is caloric theory. You don't need to know any of this history, but you do have to be careful when using the word “heat”.)
An infinitesimal amount of energy transferred by heating is called \({\mathit{\unicode{273}}} Q\). The symbol \({\mathit{\unicode{273}}} \) indicates an inexact differential, which you can think of as a “small chunk” that is not the change of something. \({\mathit{\unicode{273}}} Q\) is not a small change in the amount of energy transferred by heating, but rather is a small amount of energy transferred by heating.
When playing with the partial derivative machine, we can say the work done on the left string, \(F_Ldx_L\), is analogous to heat entering a thermodynamic system.
Latent heat
A phase transition is when a material changes state of matter, as in melting or boiling. At most phase transitions (technically, abrupt phase transitions, as you will learn in the Capstone), the temperature remains constant while the material changes from one state to the other. So you know that as long as you have ice and water coexisting in equilibrium at one atmosphere of pressure, the temperature must be \(0^\circ\)C. Similarly, as long as water is boiling at one atmosphere of pressure, the temperature must be \(100^\circ\)C. In both of these cases, you can transfer energy to the system by heating (as we will) without changing the temperature! This relates to why I keep awkwardly saying “transfer energy to a system by heating” rather than just “heating a system,” which means the same thing. We have deeply ingrained the idea that “heating” is synonymous with “raising the temperature,” which does not align with the physics meaning.
So now let me define the latent heat. The latent heat is the amount of energy that must be transferred to a material by heating in order to change it from one phase to another. The latent heat of fusion is the amount of energy required to melt a solid, and the latent heat of vaporization is the amount of energy required to turn a liquid into a gas. We will be measuring both of these for water.
A question you may ask is whether the latent heat is extensive or intensive. Technically the latent heat is extensive, since if you have more material then more energy is required to melt/boil it. However, when you hear a latent heat quoted, it is almost always the specific latent heat, which is the energy transferred by heating required per unit mass. It can be confusing that people use the same words to refer to both quantities. Fortunately, dimensional checking always gives you a way to verify which is being referred to. If \(L\) is an energy per mass, then it must be the specific latent heat, while if it is an energy, then it must be the latent heat.
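As a quick sanity check on this distinction, here is a minimal Python sketch (the mass is an illustrative value; the specific latent heat of fusion for water is approximately 334 kJ/kg):

```python
# Distinguishing the specific latent heat (intensive, J/kg) from
# the latent heat (extensive, J) by dimensional bookkeeping.

L_fusion = 334e3   # J/kg: approximate specific latent heat of fusion for water
mass = 0.5         # kg of ice; illustrative value

# The extensive latent heat scales with the amount of material:
Q_melt = L_fusion * mass   # J of energy transferred by heating to melt the ice

print(f"Energy to melt {mass} kg of ice: {Q_melt:.0f} J")  # 167000 J
```

Doubling the mass doubles \(Q\), confirming that the total latent heat is extensive while \(L\) itself is a property of the material.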
Heat capacity and specific heat
The heat capacity is the amount of energy transfer required per temperature to raise the temperature of a system. If we hold the pressure fixed (as in our experiment) we can write this as: \begin{align} {\mathit{\unicode{273}}} Q &= C_p dT \end{align} where \(C_p\) is the heat capacity at fixed pressure. You might think to rewrite this expression as a derivative, but we can't do that since the energy transferred by heating is not a state function.
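For a rough illustration, we can integrate \({\mathit{\unicode{273}}} Q = C_p dT\) over a finite temperature change. This is a sketch under assumed values: the specific heat of liquid water is about 4186 J/(kg K), and we take \(C_p\) to be independent of temperature:

```python
# With C_p (approximately) constant, the energy transferred by heating
# over a finite temperature change is Q = C_p * (T_final - T_initial).

c_water = 4186.0      # J/(kg K): approximate specific heat of liquid water
mass = 0.5            # kg; illustrative value
Cp = c_water * mass   # J/K: extensive heat capacity of this sample

T_initial = 273.15    # K (0 C)
T_final = 373.15      # K (100 C)

Q = Cp * (T_final - T_initial)   # J transferred by heating
print(f"Q = {Q:.0f} J")  # Q = 209300 J
```

Note that \(Q\) here depends on the process (fixed pressure, constant \(C_p\)), not just on the initial and final states, which is exactly why we cannot write it as a derivative of a state function.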
Note that the heat capacity, like the latent heat, is an extensive quantity. The specific heat is the heat capacity per unit mass, which is an intensive quantity that we can consider a property of a material independently of the quantity of that material.
I'll just mention as an aside that the term “heat capacity” is another one of those unfortunate phrases that reflect the inaccurate idea that heat is a property of a system.
Entropy
Finally, we can get to entropy. Entropy is the “thing that changes” when you transfer energy by heating. I'll just give this away: \begin{align} {\mathit{\unicode{273}}} Q &= TdS \end{align} where this equation is only true if you make the change quasistatically (see another lecture). This allows us to find the change in entropy if we know how much energy was transferred by heating, and the temperature during the process. \begin{align} \Delta S &= \int \frac1T {\mathit{\unicode{273}}} Q \end{align} where again, we need to know the temperature as we add heat.
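As a sketch of how this integral works in practice (assuming a constant heat capacity \(C_p\), so \({\mathit{\unicode{273}}} Q = C_p dT\) and \(\Delta S = C_p\ln(T_f/T_i)\); the numbers are illustrative values for 0.5 kg of water), we can check the integral numerically:

```python
import numpy as np

# Entropy change when heating 0.5 kg of water from 0 C to 100 C,
# assuming a constant C_p so that dQ = Cp dT (values are approximate).
Cp = 0.5 * 4186.0                        # J/K
T = np.linspace(273.15, 373.15, 100001)  # K: temperatures along the quasistatic path

# Numerically integrate dS = dQ / T = Cp dT / T using midpoint temperatures:
T_mid = 0.5 * (T[:-1] + T[1:])
dQ = Cp * np.diff(T)
Delta_S = np.sum(dQ / T_mid)

# Analytic result for constant Cp:
Delta_S_exact = Cp * np.log(373.15 / 273.15)

print(Delta_S, Delta_S_exact)
```

The numerical sum and the logarithm agree, and note that \(\Delta S\) is smaller than \(Q/T_i\) would suggest, because the later increments of heat enter at higher temperature.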
In this entire problem, keep results to first order in the van der Waals correction terms \(a\) and \(b\).
Show that the entropy of the van der Waals gas is \begin{align} S &= Nk\left\{\ln\left(\frac{n_Q(V-Nb)}{N}\right)+\frac52\right\} \end{align}
Show that the energy is \begin{align} U &= \frac32 NkT - \frac{N^2a}{V} \end{align}
Show that the enthalpy \(H\equiv U+pV\) is \begin{align} H(T,V) &= \frac52NkT + \frac{N^2bkT}{V} - 2\frac{N^2a}{V} \\ H(T,p) &= \frac52NkT + Nbp - \frac{2Nap}{kT} \end{align}
Show that a Fermi electron gas in the ground state exerts a pressure \begin{align} p = \frac{\left(3\pi^2\right)^{\frac23}}{5} \frac{\hbar^2}{m}\left(\frac{N}{V}\right)^{\frac53} \end{align} In a uniform decrease of the volume of a cube every orbital has its energy raised: The energy of each orbital is proportional to \(\frac1{L^2}\) or to \(\frac1{V^{\frac23}}\).
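To get a feel for the scale of this pressure, here is a minimal Python sketch that evaluates the formula at an assumed electron density (8.5×10²⁸ m⁻³ is roughly the conduction-electron density of copper; the constants are standard CODATA values):

```python
import math

# Evaluate the ground-state Fermi pressure
#   p = ((3 pi^2)^(2/3) / 5) * (hbar^2 / m) * (N/V)^(5/3)
# at an illustrative electron density.
hbar = 1.054571817e-34   # J s: reduced Planck constant
m_e = 9.1093837e-31      # kg: electron mass
n = 8.5e28               # m^-3: assumed density, roughly that of copper's conduction electrons

p = (3 * math.pi**2) ** (2 / 3) / 5 * hbar**2 / m_e * n ** (5 / 3)
print(f"Degeneracy pressure: {p:.2e} Pa")
```

The result is on the order of 10¹⁰ Pa, i.e. tens of GPa, which is why electron degeneracy pressure matters for the mechanical properties of metals.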
Find an expression for the entropy of a Fermi electron gas in the region \(kT\ll \varepsilon_F\). Notice that \(S\rightarrow 0\) as \(T\rightarrow 0\).
Problem
Suppose that a system of \(N\) atoms of type \(A\) is placed in diffusive contact with a system of \(N\) atoms of type \(B\) at the same temperature and volume.
Show that after diffusive equilibrium is reached the total entropy is increased by \(2Nk\ln 2\). The entropy increase \(2Nk\ln 2\) is known as the entropy of mixing.
If the atoms are identical (\(A=B\)), show that there is no increase in entropy when diffusive contact is established. The difference has been called the Gibbs paradox.
Since the Helmholtz free energy is lower for the mixed \(AB\) than for the separated \(A\) and \(B\), it should be possible to extract work from the mixing process. Construct a process that could extract work as the two gasses are mixed at fixed temperature. You will probably need to use walls that are permeable to one gas but not the other.
Note
This course has not yet covered work, but it was covered in Energy and Entropy, so you may need to stretch your memory to finish part (c).
Problem
Suppose \(g(U) = CU^{3N/2}\), where \(C\) is a constant and \(N\) is the number of particles.
Show that \(U=\frac32 N k_BT\).
Show that \(\left(\frac{\partial^2S}{\partial U^2}\right)_N\) is negative. This form of \(g(U)\) actually applies to a monatomic ideal gas.
Problem
Consider two noninteracting systems \(A\) and \(B\). We can either treat these systems as separate, or as a single combined system \(AB\). We can enumerate all states of the combined system by enumerating all states of each separate system. The probability of the combined state \((i_A,j_B)\) is given by \(P_{ij}^{AB} = P_i^AP_j^B\). In other words, the probabilities combine in the same way as two dice rolls would, or the probabilities of any other uncorrelated events.
- Show that the entropy of the combined system \(S_{AB}\) is the sum of entropies of the two separate systems considered individually, i.e. \(S_{AB} = S_A+S_B\). This means that entropy is extensive. Use the Gibbs entropy for this computation. You need make no approximation in solving this problem.
- Show that if you have \(N\) identical non-interacting systems, their total entropy is \(NS_1\) where \(S_1\) is the entropy of a single system.
Note
In real materials, we treat properties as being extensive even when there are interactions in the system. In this case, extensivity is a property of large systems, in which surface effects may be neglected.
The goal of this problem is to show that once we have maximized the entropy and found the microstate probabilities in terms of a Lagrange multiplier \(\beta\), we can prove that \(\beta=\frac1{kT}\) based on the statistical definitions of energy and entropy and the thermodynamic definition of temperature embodied in the thermodynamic identity.
The internal energy and entropy are each defined as a weighted average over microstates: \begin{align} U &= \sum_i E_i P_i & S &= -k_B\sum_i P_i \ln P_i \end{align} We saw in class that the probability of each microstate can be given in terms of a Lagrange multiplier \(\beta\) as \begin{align} P_i &= \frac{e^{-\beta E_i}}{Z} & Z &= \sum_i e^{-\beta E_i} \end{align} Put these probabilities into the above weighted averages in order to relate \(U\) and \(S\) to \(\beta\). Then make use of the thermodynamic identity \begin{align} dU = TdS - pdV \end{align} to show that \(\beta = \frac1{kT}\).
This lecture introduces the idea of entropy, including the relationship between entropy and multiplicity as well as the relationship between changes in entropy and heat.
These lecture notes for the second week of https://paradigms.oregonstate.edu/courses/ph441 involve relating entropy and temperature in the microcanonical ensemble, using a paramagnet as an example. These notes include a few small group activities.
These lecture notes for the first week of https://paradigms.oregonstate.edu/courses/ph441 include a couple of small group activities in which students work with the Gibbs formulation of the entropy.
Problem
A 100W light bulb is left burning inside a Carnot refrigerator that draws 100W. Can the refrigerator cool below room temperature?
Problem
In our week on radiation, we saw that the Helmholtz free energy of a box of radiation at temperature \(T\) is \begin{align} F &= -8\pi \frac{V(kT)^4}{h^3c^3}\frac{\pi^4}{45} \end{align} From this we also found the internal energy and entropy \begin{align} U &= 24\pi \frac{(kT)^4}{h^3c^3}\frac{\pi^4}{45} V \\ S &= 32\pi kV\left(\frac{kT}{hc}\right)^3 \frac{\pi^4}{45} \end{align} Given these results, let us consider a Carnot engine that uses an empty metallic piston (i.e. a photon gas).
Given \(T_H\) and \(T_C\), as well as \(V_1\) and \(V_2\) (the two volumes at \(T_H\)), determine \(V_3\) and \(V_4\) (the two volumes at \(T_C\)).
What is the heat \(Q_H\) taken up and the work done by the gas during the first isothermal expansion? Are they equal to each other, as for the ideal gas?
Does the work done on the two isentropic stages cancel each other, as for the ideal gas?
Calculate the total work done by the gas during one cycle. Compare it with the heat taken up at \(T_H\) and show that the energy conversion efficiency is the Carnot efficiency.
Show that for a reversible heat pump the energy required per unit of heat delivered inside the building is given by the Carnot efficiency: \begin{align} \frac{W}{Q_H} &= \eta_C = \frac{T_H-T_C}{T_H} \end{align} What happens if the heat pump is not reversible?
Assume that the electricity consumed by a reversible heat pump must itself be generated by a Carnot engine operating between the even hotter temperature \(T_{HH}\) and the cold (outdoors) temperature \(T_C\). What is the ratio \(\frac{Q_{HH}}{Q_H}\) of the heat consumed at \(T_{HH}\) (i.e. fuel burned) to the heat delivered at \(T_H\) (in the house we want to heat)? Give numerical values for \(T_{HH}=600\text{K}\); \(T_{H}=300\text{K}\); \(T_{C}=270\text{K}\).
Draw an energy-entropy flow diagram for the combination heat engine-heat pump, similar to Figures 8.1, 8.2 and 8.4 in the text (or the equivalent but sloppier figures in the course notes). However, in this case we will involve no external work at all, only energy and entropy flows at three temperatures, since the work done is all generated from heat.
Problem
Find the chemical potential of an ideal monatomic gas in two dimensions, with \(N\) atoms confined to a square of area \(A=L^2\). The spin is zero.
Find an expression for the energy \(U\) of the gas.
Find an expression for the entropy \(\sigma\). The temperature is \(\tau=kT\).
Problem
Consider one mole of an ideal monatomic gas at 300K and 1 atm. First, let the gas expand isothermally and reversibly to twice the initial volume; second, let this be followed by an isentropic expansion from twice to four times the original volume.
How much heat (in joules) is added to the gas in each of these two processes?
What is the temperature at the end of the second process?
Suppose the first process is replaced by an irreversible expansion into a vacuum, to a total volume twice the initial volume. What is the increase of entropy in the irreversible expansion, in J/K?
Problem
- Solve for the heat capacity of a vacuum, given the above, and assuming that photons represent all the energy present in vacuum.
- Compare the heat capacity of vacuum at room temperature with the heat capacity of an equal volume of water.
Problem
Consider an ideal gas of \(N\) particles, each of mass \(M\), confined to a one-dimensional line of length \(L\). The particles have spin zero (so you can ignore spin) and do not interact with one another. Find the entropy at temperature \(T\). You may assume that the temperature is high enough that \(k_B T\) is much greater than the ground state energy of one particle.
Find an expression for the free energy as a function of \(T\) of a system with two states, one at energy 0 and one at energy \(\varepsilon\).
From the free energy, find expressions for the internal energy \(U\) and entropy \(S\) of the system.
Plot the entropy versus \(T\). Explain its asymptotic behavior as the temperature becomes high.
Plot \(S(T)\) versus \(U(T)\). Explain the maximum value of the energy \(U\).
Problem
Find the entropy of a set of \(N\) oscillators of frequency \(\omega\) as a function of the total quantum number \(n\). Use the multiplicity function: \begin{equation} g(N,n) = \frac{(N+n-1)!}{n!(N-1)!} \end{equation} and assume that \(N\gg 1\). This means you can make the Stirling approximation that \(\log N! \approx N\log N - N\). It also means that \(N-1 \approx N\).
Let \(U\) denote the total energy \(n\hbar\omega\) of the oscillators. Express the entropy as \(S(U,N)\). Show that the total energy at temperature \(T\) is \begin{equation} U = \frac{N\hbar\omega}{e^{\frac{\hbar\omega}{kT}}-1} \end{equation} This is the Planck result found the hard way. We will get to the easy way soon, and you will never again need to work with a multiplicity function like this.
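The quality of the Stirling approximation used above can be checked numerically; a minimal Python sketch (using `math.lgamma`, which gives \(\ln N!\) directly as \(\ln\Gamma(N+1)\)):

```python
import math

# Check the Stirling approximation log N! ~ N log N - N for large N.
# math.lgamma(N + 1) computes ln(N!) exactly (up to floating-point error).
for N in (10, 1000, 10**6):
    exact = math.lgamma(N + 1)
    stirling = N * math.log(N) - N
    rel_err = abs(exact - stirling) / exact
    print(f"N = {N:>7}: relative error {rel_err:.2e}")
```

The relative error shrinks rapidly with \(N\), which is why the approximation is safe for thermodynamic systems where \(N\gg 1\).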
Problem
Consider a three-state system with energies \((-\epsilon,0,\epsilon)\).
- At infinite temperature, what are the probabilities of the three states being occupied? What is the internal energy \(U\)? What is the entropy \(S\)?
- At very low temperature, what are the three probabilities?
- What are the three probabilities at zero temperature? What is the internal energy \(U\)? What is the entropy \(S\)?
- What happens to the probabilities if you allow the temperature to be negative?
These lecture notes covering week 8 of https://paradigms.oregonstate.edu/courses/ph441 include a small group activity in which students derive the Carnot efficiency.
These notes, from the third week of https://paradigms.oregonstate.edu/courses/ph441 cover the canonical ensemble and Helmholtz free energy. They include a number of small group activities.
This very quick lecture reviews the content taught in https://paradigms.oregonstate.edu/courses/ph423, and is the first content in https://paradigms.oregonstate.edu/courses/ph441.
Students generate a list of properties a glass of water might have. The class then discusses and categorizes those properties.
Students struggle with understanding that entropy can be created. It is an extensive quantity, and is the only one that isn't normally conserved, which makes it pretty weird. We (professors) don't always realize how very weird this is, and students don't have the vocabulary to explain it to us, and are often afraid to try.
In this remote-friendly activity, students use a microwave oven (and optionally a thermometer) to measure the latent heat of melting for water (and optionally the heat capacity). From these they compute changes in entropy. See also Ice Calorimetry Lab.
This lab gives students a chance to take data on the first day of class (or later, but I prefer to do it the first day of class). It provides an immediate context for thermodynamics, and also gives them a chance to experimentally measure a change in entropy. Students are required to measure the energy required to melt ice and raise the temperature of water, and measure the change in entropy by integrating the heat capacity.