group Small Group Activity

30 min.

Thermodynamic States (Remote)

Thermo

Try doing this activity as a follow-up to the Changes in Internal Energy (Remote) activity, which addresses the first law of thermodynamics.

assignment Homework

Bottle in a Bottle 2
heat entropy ideal gas Energy and Entropy 2021 (2 years)

Consider the bottle in a bottle problem in a previous problem set, summarized here.

A small bottle of helium is placed inside a large bottle, which otherwise contains vacuum. The inner bottle has a slow leak, so that the helium leaks into the outer bottle. The inner bottle has one tenth the volume of the outer bottle, which is insulated.

The volume of the small bottle is 0.001 m\(^3\) and the volume of the big bottle is 0.01 m\(^3\). The initial state of the gas in the small bottle was \(p=10^6\) Pa and its temperature \(T=300\) K. Approximate the helium gas as an ideal gas with equations of state \(pV=Nk_BT\) and \(U=\frac32 Nk_BT\).

  1. How many molecules of gas does the large bottle contain? What is the final temperature of the gas?

  2. Compute the integral \(\int \frac{{\mathit{\unicode{273}}} Q}{T}\) and the change of entropy \(\Delta S\) between the initial state (gas in the small bottle) and the final state (gas leaked in the big bottle).

  3. Discuss your results.
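
A minimal numerical sketch for part 1, assuming the values stated above (the Python here is my own check, not part of the assignment):

    # A rough check of part 1 under the stated values.
    k_B = 1.380649e-23                    # Boltzmann constant, J/K
    p_i, V_i, T_i = 1e6, 1e-3, 300.0      # initial pressure (Pa), volume (m^3), temperature (K)

    N = p_i * V_i / (k_B * T_i)           # ideal gas law, pV = N k_B T
    print(N)                              # roughly 2.4e23 molecules

    # The outer bottle is insulated and initially evacuated, so the leaking gas does
    # no work and exchanges no heat; U = (3/2) N k_B T is therefore unchanged, and for
    # an ideal gas an unchanged U means the final temperature is still 300 K.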

group Small Group Activity

30 min.

Using \(pV\) and \(TS\) Plots
Energy and Entropy 2021 (2 years)

work heat first law energy

Students work out heat and work for rectangular paths on \(pV\) and \(TS\) plots. This gives practice with computing heat and work, applying the First Law, and recognizing that internal energy is a state function, which therefore cannot change over a cyclic process.
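
For reference, the relations the activity leans on (with the sign convention that \(W\) is the work done on the gas during a quasistatic process) are \begin{align} W &= -\int p\,dV, & Q &= \int T\,dS, & \Delta U &= Q + W, \end{align} so on a closed rectangular path the net change in \(U\) must vanish even though \(Q\) and \(W\) individually do not.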

assignment Homework

Photon carnot engine
Carnot engine Work Energy Entropy Thermal and Statistical Physics 2020

In our week on radiation, we saw that the Helmholtz free energy of a box of radiation at temperature \(T\) is \begin{align} F &= -8\pi \frac{V(kT)^4}{h^3c^3}\frac{\pi^4}{45} \end{align} From this we also found the internal energy and entropy \begin{align} U &= 24\pi \frac{(kT)^4}{h^3c^3}\frac{\pi^4}{45} V \\ S &= 32\pi kV\left(\frac{kT}{hc}\right)^3 \frac{\pi^4}{45} \end{align} Given these results, let us consider a Carnot engine that uses an empty metallic piston (i.e., a photon gas).

  1. Given \(T_H\) and \(T_C\), as well as \(V_1\) and \(V_2\) (the two volumes at \(T_H\)), determine \(V_3\) and \(V_4\) (the two volumes at \(T_C\)).

  2. What is the heat \(Q_H\) taken up and the work done by the gas during the first isothermal expansion? Are they equal to each other, as for the ideal gas?

  3. Does the work done during the two isentropic stages cancel, as it does for the ideal gas?

  4. Calculate the total work done by the gas during one cycle. Compare it with the heat taken up at \(T_H\) and show that the energy conversion efficiency is the Carnot efficiency.
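
One hedged way into part 1 (assuming the usual labeling of the Carnot cycle, with the isentropic strokes running from \(V_2\) to \(V_3\) and from \(V_4\) back to \(V_1\)): since \(S\propto VT^3\), holding \(S\) fixed along each isentropic stroke gives \begin{align} V_2T_H^3 &= V_3T_C^3, & V_4T_C^3 &= V_1T_H^3, \end{align} which determines \(V_3\) and \(V_4\) in terms of \(V_1\), \(V_2\), and the temperature ratio.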

assignment Homework

Paramagnetism
Energy Temperature Paramagnetism Thermal and Statistical Physics 2020 Find the equilibrium value at temperature \(T\) of the fractional magnetization \begin{equation} \frac{\mu_{tot}}{Nm} \equiv \frac{2\langle s\rangle}{N} \end{equation} of a system of \(N\) spins each of magnetic moment \(m\) in a magnetic field \(B\). The spin excess is \(2s\). The energy of this system is given by \begin{align} U &= -\mu_{tot}B \end{align} where \(\mu_{tot}\) is the total magnetization. Take the entropy as the logarithm of the multiplicity \(g(N,s)\) as given in (1.35 in the text): \begin{equation} S(s) \approx k_B\log g(N,0) - k_B\frac{2s^2}{N} \end{equation} for \(|s|\ll N\), where \(s\) is the spin excess, which is related to the magnetization by \(\mu_{tot} = 2sm\). Hint: Show that in this approximation \begin{equation} S(U) = S_0 - k_B\frac{U^2}{2m^2B^2N}, \end{equation} with \(S_0=k_B\log g(N,0)\). Further, show that \(\frac1{kT} = -\frac{U}{m^2B^2N}\), where \(U\) denotes \(\langle U\rangle\), the thermal average energy.
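
A minimal sketch of the hinted substitution, using \(U=-\mu_{tot}B=-2smB\) and the thermodynamic definition of temperature: \begin{align} s &= -\frac{U}{2mB} & S(U) &= S_0 - k_B\frac{U^2}{2m^2B^2N} & \frac1T &= \left(\frac{\partial S}{\partial U}\right)_{N,B} = -\frac{k_B U}{m^2B^2N}, \end{align} which is the stated relation \(\frac1{kT}=-\frac{U}{m^2B^2N}\); solving for \(U\) and then for \(\mu_{tot}\) gives the fractional magnetization.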

assignment Homework

Energy fluctuations
energy Boltzmann factor statistical mechanics heat capacity Thermal and Statistical Physics 2020 Consider a system of fixed volume in thermal contact with a reservoir. Show that the mean-square fluctuation in the energy of the system is \begin{equation} \left<\left(\varepsilon-\langle\varepsilon\rangle\right)^2\right> = k_BT^2\left(\frac{\partial U}{\partial T}\right)_{V} \end{equation} Here \(U\) is the conventional symbol for \(\langle\varepsilon\rangle\). Hint: Use the partition function \(Z\) to relate \(\left(\frac{\partial U}{\partial T}\right)_V\) to the mean square fluctuation. Also, multiply out the term \((\cdots)^2\).
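
A hedged sketch of how the hint can be organized: writing \(\beta=\frac1{k_BT}\), the partition function gives \begin{align} U &= -\frac{\partial \ln Z}{\partial\beta}, & \left<\left(\varepsilon-\langle\varepsilon\rangle\right)^2\right> &= \frac{\partial^2\ln Z}{\partial\beta^2} = -\left(\frac{\partial U}{\partial\beta}\right)_V, \end{align} and converting the \(\beta\) derivative into a \(T\) derivative then yields the stated result.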

face Lecture

120 min.

Thermal radiation and Planck distribution
Thermal and Statistical Physics 2020

Planck distribution blackbody radiation photon statistical mechanics

These notes from the fourth week of Thermal and Statistical Physics cover blackbody radiation and the Planck distribution. They include a number of small group activities.

face Lecture

120 min.

Work, Heat, and cycles
Thermal and Statistical Physics 2020

work heat engines Carnot thermodynamics entropy

These lecture notes covering week 8 of Thermal and Statistical Physics include a small group activity in which students derive the Carnot efficiency.

face Lecture

120 min.

Chemical potential and Gibbs distribution
Thermal and Statistical Physics 2020

chemical potential Gibbs distribution grand canonical ensemble statistical mechanics

These notes from the fifth week of Thermal and Statistical Physics cover the grand canonical ensemble. They include several small group activities.

assignment Homework

Entropy, energy, and enthalpy of van der Waals gas
Van der Waals gas Enthalpy Entropy Thermal and Statistical Physics 2020

In this entire problem, keep results to first order in the van der Waals correction terms \(a\) and \(b\).
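
For reference, the equation of state I take to be expanded here is the van der Waals form \begin{align} \left(p+\frac{N^2a}{V^2}\right)\left(V-Nb\right) &= NkT, \end{align} and \(n_Q\) in part 1 is the quantum concentration, which I take to be \(n_Q=\left(\frac{MkT}{2\pi\hbar^2}\right)^{3/2}\).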

  1. Show that the entropy of the van der Waals gas is \begin{align} S &= Nk\left\{\ln\left(\frac{n_Q(V-Nb)}{N}\right)+\frac52\right\} \end{align}

  2. Show that the energy is \begin{align} U &= \frac32 NkT - \frac{N^2a}{V} \end{align}

  3. Show that the enthalpy \(H\equiv U+pV\) is \begin{align} H(T,V) &= \frac52NkT + \frac{N^2bkT}{V} - 2\frac{N^2a}{V} \\ H(T,p) &= \frac52NkT + Nbp - \frac{2Nap}{kT} \end{align}

Effects of High Altitude by Randall Munroe, at xkcd.

assignment Homework

Radiation in an empty box
Thermal physics Radiation Free energy Thermal and Statistical Physics 2020

As discussed in class, we can consider a black body as a large box with a small hole in it. If we treat the large box as a metal cube with side length \(L\) and metal walls, the frequency of each normal mode will be given by: \begin{align} \omega_{n_xn_yn_z} &= \frac{\pi c}{L}\sqrt{n_x^2 + n_y^2 + n_z^2} \end{align} where each of \(n_x\), \(n_y\), and \(n_z\) will have positive integer values. This simply comes from the fact that a half wavelength must fit in the box. There is an additional quantum number for polarization, which has two possible values, but does not affect the frequency. Note that in this problem I'm using different boundary conditions from what I use in class. It is worth learning to work with either set of quantum numbers. Each normal mode is a harmonic oscillator, with energy eigenstates \(E_n = n\hbar\omega\) where we will not include the zero-point energy \(\frac12\hbar\omega\), since that energy cannot be extracted from the box. (See the Casimir effect for an example where the zero point energy of photon modes does have an effect.)

Note
This is a slight approximation, as the boundary conditions for light are a bit more complicated. However, for large \(n\) values this gives the correct result.

  1. Show that the free energy is given by \begin{align} F &= 8\pi \frac{V(kT)^4}{h^3c^3} \int_0^\infty \ln\left(1-e^{-\xi}\right)\xi^2d\xi \\ &= -\frac{8\pi^5}{45} \frac{V(kT)^4}{h^3c^3} \\ &= -\frac{\pi^2}{45} \frac{V(kT)^4}{\hbar^3c^3} \end{align} provided the box is big enough that \(\frac{\hbar c}{LkT}\ll 1\). Note that you may end up with a slightly different dimensionless integral that numerically evaluates to the same result, which would be fine. I also do not expect you to solve this definite integral analytically; a numerical confirmation is fine (see the sketch after this list). However, you must manipulate your integral until it is dimensionless and has all the dimensionful quantities removed from it!

  2. Show that the entropy of this box full of photons at temperature \(T\) is \begin{align} S &= \frac{32\pi^5}{45} k V \left(\frac{kT}{hc}\right)^3 \\ &= \frac{4\pi^2}{45} k V \left(\frac{kT}{\hbar c}\right)^3 \end{align}

  3. Show that the internal energy of this box full of photons at temperature \(T\) is \begin{align} \frac{U}{V} &= \frac{8\pi^5}{15}\frac{(kT)^4}{h^3c^3} \\ &= \frac{\pi^2}{15}\frac{(kT)^4}{\hbar^3c^3} \end{align}
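
A minimal numerical sketch of the confirmation mentioned in part 1 (Python and scipy are my choice of tool here, not a requirement of the assignment):

    # Numerically confirm that the dimensionless integral equals -pi^4/45.
    import numpy as np
    from scipy.integrate import quad

    def integrand(xi):
        return xi**2 * np.log(-np.expm1(-xi))   # xi^2 ln(1 - e^{-xi}), written stably

    value, err = quad(integrand, 0, np.inf)
    print(value, -np.pi**4 / 45)                # both come out near -2.1646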

group Small Group Activity

30 min.

Optical depth of atmosphere
Contemporary Challenges 2022 (4 years) In this activity students estimate the optical depth of the atmosphere at the infrared wavelength where carbon dioxide has peak absorption.

face Lecture

120 min.

Boltzmann probabilities and Helmholtz
Thermal and Statistical Physics 2020

ideal gas entropy canonical ensemble Boltzmann probability Helmholtz free energy statistical mechanics

These notes, from the third week of Thermal and Statistical Physics cover the canonical ensemble and Helmholtz free energy. They include a number of small group activities.

assignment Homework

Ideal gas in two dimensions
Ideal gas Entropy Chemical potential Thermal and Statistical Physics 2020
  1. Find the chemical potential of an ideal monatomic gas in two dimensions, with \(N\) atoms confined to a square of area \(A=L^2\). The spin is zero.

  2. Find an expression for the energy \(U\) of the gas.

  3. Find an expression for the entropy \(\sigma\). The temperature is \(kT\).
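
One hedged starting point for this problem (assuming the standard canonical-ensemble route and writing the 2D quantum concentration as \(n_Q=\frac{MkT}{2\pi\hbar^2}\), a notation introduced here): \begin{align} Z_1 &= A\,n_Q, & F &= -kT\ln\frac{Z_1^N}{N!}, & \mu &= \left(\frac{\partial F}{\partial N}\right)_{T,A}, \end{align} from which the energy and entropy follow by the usual thermodynamic derivatives.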

assignment Homework

Energy, Entropy, and Probabilities
Energy Entropy Probabilities Thermodynamic identity

The goal of this problem is to show that once we have maximized the entropy and found the microstate probabilities in terms of a Lagrange multiplier \(\beta\), we can prove that \(\beta=\frac1{kT}\) based on the statistical definitions of energy and entropy and the thermodynamic definition of temperature embodied in the thermodynamic identity.

The internal energy and entropy are each defined as a weighted average over microstates: \begin{align} U &= \sum_i E_i P_i & S &= -k_B\sum_i P_i \ln P_i \end{align} We saw in class that the probability of each microstate can be given in terms of a Lagrange multiplier \(\beta\) as \begin{align} P_i &= \frac{e^{-\beta E_i}}{Z} & Z &= \sum_i e^{-\beta E_i} \end{align} Put these probabilities into the above weighted averages in order to relate \(U\) and \(S\) to \(\beta\). Then make use of the thermodynamic identity \begin{align} dU = TdS - pdV \end{align} to show that \(\beta = \frac1{kT}\).
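
A hedged intermediate step (one way to organize the algebra): substituting \(P_i=\frac{e^{-\beta E_i}}{Z}\) into the entropy sum gives \begin{align} S &= -k_B\sum_i P_i\left(-\beta E_i - \ln Z\right) = k_B\beta U + k_B\ln Z, \end{align} and comparing the differential of this expression (at fixed volume, so the \(E_i\) are fixed) with \(dU=TdS-pdV\) pins down \(\beta\).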

assignment Homework

Active transport
Active transport Concentration Chemical potential Thermal and Statistical Physics 2020

The concentration of potassium \(\text{K}^+\) ions in the internal sap of a plant cell (for example, a fresh water alga) may exceed by a factor of \(10^4\) the concentration of \(\text{K}^+\) ions in the pond water in which the cell is growing. The chemical potential of the \(\text{K}^+\) ions is higher in the sap because their concentration \(n\) is higher there. Estimate the difference in chemical potential at \(300\text{K}\) and show that it is equivalent to a voltage of \(0.24\text{V}\) across the cell wall. Take \(\mu\) as for an ideal gas. Because the values of the chemical potential are different, the ions in the cell and in the pond are not in diffusive equilibrium. The plant cell membrane is highly impermeable to the passive leakage of ions through it. Important questions in cell physics include these: How is the high concentration of ions built up within the cell? How is metabolic energy applied to energize the active ion transport?

David adds
You might wonder why it is even remotely plausible to consider the ions in solution as an ideal gas. The key idea here is that the ideal gas entropy incorporates the entropy due to position dependence, and thus due to concentration. Since concentration is what differs between the cell and the pond, the ideal gas entropy describes this pretty effectively. In contrast to the concentration dependence, the temperature-dependence of the ideal gas chemical potential will not be so great.
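
A quick numerical sketch of the stated \(0.24\text{V}\), treating the ions as an ideal gas so that the chemical potential difference is \(k_BT\ln\left(n_{\text{sap}}/n_{\text{pond}}\right)\) (the Python here is just my own check):

    # Chemical potential difference from the ideal-gas (concentration) term,
    # converted to an equivalent voltage per singly charged ion.
    from math import log

    k_B = 1.380649e-23    # J/K
    q_e = 1.602177e-19    # elementary charge, C
    T = 300.0             # K
    ratio = 1e4           # K+ concentration ratio, sap : pond water

    delta_mu = k_B * T * log(ratio)   # chemical potential difference per ion, J
    print(delta_mu / q_e)             # roughly 0.24 (volts)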

group Small Group Activity

30 min.

Applying the equipartition theorem
Contemporary Challenges 2022 (4 years)

equipartition theorem

Students count the quadratic degrees of freedom of a few toy molecules to predict their internal energy at temperature \(T\).
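
For reference, the counting the activity relies on is the equipartition result that each quadratic degree of freedom contributes \(\frac12k_BT\), so \begin{align} U &\approx \frac{f}{2}Nk_BT \end{align} for \(N\) molecules with \(f\) quadratic degrees of freedom each (for example, a rigid diatomic molecule has \(f=5\): three translational plus two rotational).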

face Lecture

5 min.

Energy and Entropy review
Thermal and Statistical Physics 2020 (3 years)

thermodynamics statistical mechanics

This very quick lecture reviews the content taught in Energy and Entropy, and is the first content in Thermal and Statistical Physics.

assignment Homework

Energy, Entropy, and Probabilities
Thermal and Statistical Physics 2020

The goal of this problem is to show that once we have maximized the entropy and found the microstate probabilities in terms of a Lagrange multiplier \(\beta\), we can prove that \(\beta=\frac1{kT}\) based on the statistical definitions of energy and entropy and the thermodynamic definition of temperature embodied in the thermodynamic identity.

The internal energy and entropy are each defined as a weighted average over microstates: \begin{align} U &= \sum_i E_i P_i & S &= -k_B\sum_i P_i \ln P_i \end{align} We saw in class that the probability of each microstate can be given in terms of a Lagrange multiplier \(\beta\) as \begin{align} P_i &= \frac{e^{-\beta E_i}}{Z} & Z &= \sum_i e^{-\beta E_i} \end{align} Put these probabilities into the above weighted averages in order to relate \(U\) and \(S\) to \(\beta\). Then make use of the thermodynamic identity \begin{align} dU = TdS - pdV \end{align} to show that \(\beta = \frac1{kT}\).

face Lecture

120 min.

Gibbs entropy approach
Thermal and Statistical Physics 2020

Gibbs entropy information theory probability statistical mechanics

These lecture notes for the first week of Thermal and Statistical Physics include a couple of small group activities in which students work with the Gibbs formulation of the entropy.