assignment Homework

Gibbs sum for a two level system
Gibbs sum Microstate Thermal average energy Thermal and Statistical Physics 2020
  1. Consider a system that may be unoccupied with energy zero, or occupied by one particle in either of two states, one of energy zero and one of energy \(\varepsilon\). Find the Gibbs sum for this system in terms of the activity \(\lambda\equiv e^{\beta\mu}\). Note that the system can hold a maximum of one particle.

  2. Solve for the thermal average occupancy of the system in terms of \(\lambda\).

  3. Show that the thermal average occupancy of the state at energy \(\varepsilon\) is \begin{align} \langle N(\varepsilon)\rangle = \frac{\lambda e^{-\frac{\varepsilon}{kT}}}{\mathcal{Z}} \end{align}

  4. Find an expression for the thermal average energy of the system.

  5. Allow the possibility that the orbitals at \(0\) and at \(\varepsilon\) may each be occupied by one particle at the same time. Show that \begin{align} \mathcal{Z} &= 1 + \lambda + \lambda e^{-\frac{\varepsilon}{kT}} + \lambda^2 e^{-\frac{\varepsilon}{kT}} \\ &= (1+\lambda)\left(1+\lambda e^{-\frac{\varepsilon}{kT}}\right) \end{align} Because \(\mathcal{Z}\) can be factored as shown, we have in effect two independent systems.
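
For a quick numerical check of your answers (not part of the assignment), the short Python sketch below tabulates the quantities asked for in parts 1-4 from the single-occupancy Gibbs sum and verifies the factored form in part 5. The values of \(\varepsilon\), \(T\), and \(\mu\) are arbitrary, and Boltzmann's constant is set to 1.

    import numpy as np

    # Arbitrary illustrative values; k is Boltzmann's constant, set to 1 here.
    k = 1.0
    epsilon, T, mu = 1.0, 0.5, -0.3
    lam = np.exp(mu / (k * T))          # activity lambda = e^{mu/kT}
    boltz = np.exp(-epsilon / (k * T))  # Boltzmann factor e^{-epsilon/kT}

    # Parts 1-4: at most one particle in the system.
    Z = 1 + lam + lam * boltz           # Gibbs sum: empty, occupied at 0, occupied at epsilon
    N_avg = (lam + lam * boltz) / Z     # thermal average occupancy
    N_eps = lam * boltz / Z             # average occupancy of the state at energy epsilon
    E_avg = epsilon * lam * boltz / Z   # thermal average energy
    print(Z, N_avg, N_eps, E_avg)

    # Part 5: both orbitals may be occupied at once; the Gibbs sum factors.
    Z2 = 1 + lam + lam * boltz + lam**2 * boltz
    assert np.isclose(Z2, (1 + lam) * (1 + lam * boltz))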

face Lecture

120 min.

Gibbs entropy approach
Thermal and Statistical Physics 2020

Gibbs entropy information theory probability statistical mechanics

These lecture notes for the first week of Thermal and Statistical Physics include a couple of small group activities in which students work with the Gibbs formulation of the entropy.

face Lecture

120 min.

Entropy and Temperature
Thermal and Statistical Physics 2020

paramagnet entropy temperature statistical mechanics

These lecture notes for the second week of Thermal and Statistical Physics involve relating entropy and temperature in the microcanonical ensemble, using a paramagnet as an example. These notes include a few small group activities.

assignment Homework

Energy, Entropy, and Probabilities
Energy Entropy Probabilities Thermodynamic identity Thermal and Statistical Physics 2020

The goal of this problem is to show that once we have maximized the entropy and found the microstate probabilities in terms of a Lagrange multiplier \(\beta\), we can prove that \(\beta=\frac1{kT}\) based on the statistical definitions of energy and entropy and the thermodynamic definition of temperature embodied in the thermodynamic identity.

The internal energy and entropy are each defined as a weighted average over microstates: \begin{align} U &= \sum_i E_i P_i & S &= -k_B\sum_i P_i \ln P_i \end{align} We saw in class that the probability of each microstate can be given in terms of a Lagrange multiplier \(\beta\) as \begin{align} P_i &= \frac{e^{-\beta E_i}}{Z} & Z &= \sum_i e^{-\beta E_i} \end{align} Put these probabilities into the above weighted averages in order to relate \(U\) and \(S\) to \(\beta\). Then make use of the thermodynamic identity \begin{align} dU = TdS - pdV \end{align} to show that \(\beta = \frac1{kT}\).
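
This is not the requested derivation, but the sketch below illustrates the result numerically: for a fixed spectrum (fixed volume), \(U\) and \(S\) computed from the Boltzmann probabilities satisfy \(dU/dS = 1/(k\beta)\), which together with \(dU = T\,dS\) at fixed volume identifies \(\beta = \frac1{kT}\). The three-level spectrum and the value of \(\beta\) are arbitrary choices, with \(k=1\).

    import numpy as np

    k = 1.0
    E = np.array([0.0, 1.0, 2.5])   # arbitrary microstate energies (the "volume" is held fixed)

    def U_and_S(beta):
        P = np.exp(-beta * E)
        P /= P.sum()                                      # P_i = e^{-beta E_i} / Z
        return (P * E).sum(), -k * (P * np.log(P)).sum()  # U and S as weighted averages

    beta, dbeta = 1.3, 1e-6
    U_lo, S_lo = U_and_S(beta - dbeta)
    U_hi, S_hi = U_and_S(beta + dbeta)
    print((U_hi - U_lo) / (S_hi - S_lo), 1 / (k * beta))  # dU/dS matches 1/(k beta)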

assignment Homework

Free energy of a two state system
Helmholtz free energy entropy statistical mechanics Thermal and Statistical Physics 2020
  1. Find an expression for the free energy as a function of \(T\) of a system with two states, one at energy 0 and one at energy \(\varepsilon\).

  2. From the free energy, find expressions for the internal energy \(U\) and entropy \(S\) of the system.

  3. Plot the entropy versus \(T\). Explain its asymptotic behavior as the temperature becomes high.

  4. Plot \(S(T)\) versus \(U(T)\). Explain the maximum value of the energy \(U\).
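
Not part of the assignment, but a minimal plotting sketch for parts 3 and 4, built from the canonical relations \(Z=\sum_i e^{-E_i/kT}\), \(F=-kT\ln Z\), \(U=\sum_i E_i P_i\), and \(S=(U-F)/T\). Units with \(k=\varepsilon=1\) are an arbitrary choice.

    import numpy as np
    import matplotlib.pyplot as plt

    k, eps = 1.0, 1.0                     # arbitrary units
    T = np.linspace(0.01, 5, 400)
    Z = 1 + np.exp(-eps / (k * T))        # two-state partition function
    F = -k * T * np.log(Z)                # Helmholtz free energy
    U = eps * np.exp(-eps / (k * T)) / Z  # U = sum_i E_i P_i
    S = (U - F) / T                       # entropy from F = U - TS

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
    ax1.plot(T, S); ax1.set_xlabel("T"); ax1.set_ylabel("S")  # S approaches k ln 2 at high T
    ax2.plot(U, S); ax2.set_xlabel("U"); ax2.set_ylabel("S")  # U approaches eps/2 at high T
    plt.tight_layout()
    plt.show()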

face Lecture

120 min.

Boltzmann probabilities and Helmholtz
Thermal and Statistical Physics 2020

ideal gas entropy canonical ensemble Boltzmann probability Helmholtz free energy statistical mechanics

These notes, from the third week of Thermal and Statistical Physics, cover the canonical ensemble and Helmholtz free energy. They include a number of small group activities.

assignment Homework

Pressure and entropy of a degenerate Fermi gas
Fermi gas Pressure Entropy Thermal and Statistical Physics 2020
  1. Show that a Fermi electron gas in the ground state exerts a pressure \begin{align} p = \frac{\left(3\pi^2\right)^{\frac23}}{5} \frac{\hbar^2}{m}\left(\frac{N}{V}\right)^{\frac53} \end{align} In a uniform decrease of the volume of a cube, every orbital has its energy raised, since the energy of each orbital is proportional to \(\frac1{L^2}\), or to \(\frac1{V^{\frac23}}\). (A numerical check of this pressure formula appears after part 2.)

  2. Find an expression for the entropy of a Fermi electron gas in the region \(kT\ll \varepsilon_F\). Notice that \(S\rightarrow 0\) as \(T\rightarrow 0\).
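
A quick numerical sanity check of part 1 (not a substitute for the derivation): it compares \(p=-\partial U_0/\partial V\), computed by a finite difference from the ground-state energy \(U_0=\frac35 N\varepsilon_F\), with the closed form quoted above. The values of \(N\) and \(V\) are arbitrary.

    import numpy as np
    from scipy.constants import hbar, m_e

    N, V = 1e23, 1e-6   # arbitrary particle number and volume (m^3)

    def U0(V):
        # Ground-state energy of N free electrons: U0 = (3/5) N eps_F
        eps_F = hbar**2 / (2 * m_e) * (3 * np.pi**2 * N / V) ** (2 / 3)
        return 0.6 * N * eps_F

    dV = 1e-12
    p_numeric = -(U0(V + dV) - U0(V - dV)) / (2 * dV)   # p = -dU0/dV at fixed N
    p_formula = (3 * np.pi**2) ** (2 / 3) / 5 * hbar**2 / m_e * (N / V) ** (5 / 3)
    print(p_numeric, p_formula)                          # these agree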

assignment Homework

Radiation in an empty box
Thermal physics Radiation Free energy Thermal and Statistical Physics 2020

As discussed in class, we can consider a black body as a large box with a small hole in it. If we treat the large box as a metal cube with side length \(L\) and metal walls, the frequency of each normal mode will be given by: \begin{align} \omega_{n_xn_yn_z} &= \frac{\pi c}{L}\sqrt{n_x^2 + n_y^2 + n_z^2} \end{align} where each of \(n_x\), \(n_y\), and \(n_z\) will have positive integer values. This comes from the fact that a whole number of half wavelengths must fit in the box. There is an additional quantum number for polarization, which has two possible values, but does not affect the frequency. Note that in this problem I'm using different boundary conditions from what I use in class. It is worth learning to work with either set of quantum numbers. Each normal mode is a harmonic oscillator, with energy eigenstates \(E_n = n\hbar\omega\), where we will not include the zero-point energy \(\frac12\hbar\omega\), since that energy cannot be extracted from the box. (See the Casimir effect for an example where the zero point energy of photon modes does have an effect.)

Note
This is a slight approximation, as the boundary conditions for light are a bit more complicated. However, for large \(n\) values this gives the correct result.

  1. Show that the free energy is given by \begin{align} F &= 8\pi \frac{V(kT)^4}{h^3c^3} \int_0^\infty \ln\left(1-e^{-\xi}\right)\xi^2d\xi \\ &= -\frac{8\pi^5}{45} \frac{V(kT)^4}{h^3c^3} \\ &= -\frac{\pi^2}{45} \frac{V(kT)^4}{\hbar^3c^3} \end{align} provided the box is big enough that \(\frac{\hbar c}{LkT}\ll 1\). Note that you may end up with a slightly different dimensionless integral that numerically evaluates to the same result, which would be fine. I also do not expect you to solve this definite integral analytically; a numerical confirmation is fine (a sketch of such a check follows this problem). However, you must manipulate your integral until it is dimensionless, with all the dimensionful quantities pulled out in front!

  2. Show that the entropy of this box full of photons at temperature \(T\) is \begin{align} S &= \frac{32\pi^5}{45} k V \left(\frac{kT}{hc}\right)^3 \\ &= \frac{4\pi^2}{45} k V \left(\frac{kT}{\hbar c}\right)^3 \end{align}

  3. Show that the internal energy of this box full of photons at temperature \(T\) is \begin{align} \frac{U}{V} &= \frac{8\pi^5}{15}\frac{(kT)^4}{h^3c^3} \\ &= \frac{\pi^2}{15}\frac{(kT)^4}{\hbar^3c^3} \end{align}
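
The sketch mentioned in part 1: a numerical confirmation that \(\int_0^\infty \ln\left(1-e^{-\xi}\right)\xi^2\,d\xi = -\frac{\pi^4}{45}\), which is what turns the first expression for \(F\) into the second.

    import numpy as np
    from scipy.integrate import quad

    # Numerically confirm the dimensionless integral from part 1.
    value, _ = quad(lambda xi: np.log(1 - np.exp(-xi)) * xi**2, 0, np.inf)
    print(value, -np.pi**4 / 45)   # both approximately -2.1646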

assignment Homework

Boltzmann probabilities
Energy Entropy Boltzmann probabilities Thermal and Statistical Physics 2020 (3 years)

Consider a three-state system with energies \((-\epsilon,0,\epsilon)\).

  1. At infinite temperature, what are the probabilities of the three states being occupied? What is the internal energy \(U\)? What is the entropy \(S\)?
  2. At very low temperature, what are the three probabilities?
  3. What are the three probabilities at zero temperature? What is the internal energy \(U\)? What is the entropy \(S\)?
  4. What happens to the probabilities if you allow the temperature to be negative?
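
A small sketch (not part of the assignment) that evaluates the three Boltzmann probabilities numerically, which you can use to check your reasoning about the limits in parts 1-4. Units with \(k=\epsilon=1\) are an arbitrary choice.

    import numpy as np

    eps = 1.0
    E = np.array([-eps, 0.0, eps])            # the three energies

    def probs(T):
        w = np.exp(-E / T)                    # Boltzmann factors
        return w / w.sum()

    for T in [1e6, 0.01, -0.5]:               # "infinite", very low, and negative temperature
        P = probs(T)
        U = np.dot(P, E)                      # internal energy
        S = -np.sum(P * np.log(P))            # entropy in units of k
        print(T, P, U, S)
    # T -> infinity: P -> (1/3, 1/3, 1/3), U -> 0,    S -> ln 3
    # T -> 0+:       P -> (1, 0, 0),       U -> -eps, S -> 0
    # T < 0:         the highest-energy state becomes the most probable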

assignment Homework

Entropy of mixing
Entropy Equilibrium Sackur-Tetrode Thermal and Statistical Physics 2020

Suppose that a system of \(N\) atoms of type \(A\) is placed in diffusive contact with a system of \(N\) atoms of type \(B\) at the same temperature and volume.

  1. Show that after diffusive equilibrium is reached the total entropy is increased by \(2Nk\ln 2\). The entropy increase \(2Nk\ln 2\) is known as the entropy of mixing.

  2. If the atoms are identical (\(A=B\)), show that there is no increase in entropy when diffusive contact is established. The difference has been called the Gibbs paradox.

  3. Since the Helmholtz free energy is lower for the mixed \(AB\) than for the separated \(A\) and \(B\), it should be possible to extract work from the mixing process. Construct a process that could extract work as the two gases are mixed at fixed temperature. You will probably need to use walls that are permeable to one gas but not the other.

Note

This course has not yet covered work, but it was covered in Energy and Entropy, so you may need to stretch your memory to finish part 3.
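
Not part of the assignment, but a quick numerical check of parts 1 and 2 using the Sackur-Tetrode expression for the ideal-gas entropy. The particle number, volume, temperature, and atomic mass below are arbitrary illustrative values, and the two species are given the same mass so that only distinguishability matters.

    import numpy as np
    from scipy.constants import k, h

    # Arbitrary illustrative values (roughly an argon-like atom).
    N, V, T, m = 1e22, 1e-3, 300.0, 6.6e-26   # particles, m^3, K, kg

    def S(N, V, T):
        lam = h / np.sqrt(2 * np.pi * m * k * T)         # thermal de Broglie wavelength
        return N * k * (np.log(V / (N * lam**3)) + 2.5)  # Sackur-Tetrode entropy

    # Part 1: distinct gases A and B each expand from V into the full volume 2V.
    print(2 * S(N, 2 * V, T) - 2 * S(N, V, T), 2 * N * k * np.log(2))   # these agree

    # Part 2: identical atoms; the final state is just 2N atoms in 2V.
    print(S(2 * N, 2 * V, T) - 2 * S(N, V, T))                          # essentially zero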

face Lecture

120 min.

Thermal radiation and Planck distribution
Thermal and Statistical Physics 2020

Planck distribution blackbody radiation photon statistical mechanics

These notes from the fourth week of Thermal and Statistical Physics cover blackbody radiation and the Planck distribution. They include a number of small group activities.

face Lecture

5 min.

Energy and Entropy review
Thermal and Statistical Physics 2020 (3 years)

thermodynamics statistical mechanics

This very quick lecture reviews the content taught in Energy and Entropy, and is the first content in Thermal and Statistical Physics.

face Lecture

120 min.

Chemical potential and Gibbs distribution
Thermal and Statistical Physics 2020

chemical potential Gibbs distribution grand canonical ensemble statistical mechanics

These notes from the fifth week of Thermal and Statistical Physics cover the grand canonical ensemble. They include several small group activities.

assignment Homework

Ideal gas in two dimensions
Ideal gas Entropy Chemical potential Thermal and Statistical Physics 2020
  1. Find the chemical potential of an ideal monatomic gas in two dimensions, with \(N\) atoms confined to a square of area \(A=L^2\). The spin is zero.

  2. Find an expression for the energy \(U\) of the gas.

  3. Find an expression for the entropy \(\sigma\). The temperature is \(\tau=kT\).
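
Not part of the assignment: a numerical consistency check of one candidate answer to part 1, \(\mu = kT\ln\left(N\lambda^2/A\right)\) with \(\lambda\) the thermal de Broglie wavelength, obtained by comparing it with a finite-difference derivative \(\left(\partial F/\partial N\right)_{T,A}\) of the two-dimensional free energy \(F=-kT\ln\left(Z_1^N/N!\right)\), \(Z_1=A/\lambda^2\). All numerical values below are arbitrary.

    import numpy as np
    from scipy.constants import k, h
    from scipy.special import gammaln

    # Arbitrary illustrative values (a dilute argon-like gas in two dimensions).
    N, A, T, m = 1e16, 1e-4, 300.0, 6.6e-26    # particles, m^2, K, kg
    lam = h / np.sqrt(2 * np.pi * m * k * T)   # thermal de Broglie wavelength

    def F(N):
        # F = -kT ln(Z1^N / N!) with Z1 = A / lambda^2; ln N! via gammaln.
        return -k * T * (N * np.log(A / lam**2) - gammaln(N + 1))

    dN = 1e8                                         # small compared with N
    mu_numeric = (F(N + dN) - F(N - dN)) / (2 * dN)  # (dF/dN) at fixed T and A
    mu_formula = k * T * np.log(N * lam**2 / A)
    print(mu_numeric, mu_formula)                    # these agree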

face Lecture

120 min.

Ideal Gas
Thermal and Statistical Physics 2020

ideal gas particle in a box grand canonical ensemble chemical potential statistical mechanics

These notes from week 6 of Thermal and Statistical Physics cover the ideal gas from a grand canonical standpoint, starting with the solutions for a particle in a three-dimensional box. They include a number of small group activities.

face Lecture

120 min.

Phase transformations
Thermal and Statistical Physics 2020

phase transformation Clausius-Clapeyron mean field theory thermodynamics

These lecture notes from the ninth week of Thermal and Statistical Physics cover phase transformations, the Clausius-Clapeyron relation, mean field theory and more. They include a number of small group activities.