##### Energy, Entropy, and Probabilities
Homework

Energy, Entropy, Probabilities, Thermodynamic identity

The goal of this problem is to show that once we have maximized the entropy and found the microstate probabilities in terms of a Lagrange multiplier \(\beta\), we can prove that \(\beta=\frac1{kT}\) based on the statistical definitions of energy and entropy and the thermodynamic definition of temperature embodied in the thermodynamic identity.

The internal energy and entropy are each defined as a weighted average over microstates: \begin{align} U &= \sum_i E_i P_i & S &= -k_B\sum_i P_i \ln P_i \end{align} We saw in class that the probability of each microstate can be given in terms of a Lagrange multiplier \(\beta\) as \begin{align} P_i &= \frac{e^{-\beta E_i}}{Z} & Z &= \sum_i e^{-\beta E_i} \end{align} Put these probabilities into the above weighted averages in order to relate \(U\) and \(S\) to \(\beta\). Then make use of the thermodynamic identity \begin{align} dU = TdS - pdV \end{align} to show that \(\beta = \frac1{kT}\).
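One way to begin (a sketch of the first step, not the full assigned derivation): substitute \(P_i = e^{-\beta E_i}/Z\), so that \(\ln P_i = -\beta E_i - \ln Z\), into the entropy sum:

```latex
\begin{align}
  S &= -k_B \sum_i P_i \left(-\beta E_i - \ln Z\right) \\
    &= k_B \beta \sum_i E_i P_i + k_B \ln Z \sum_i P_i
     = k_B \beta\, U + k_B \ln Z .
\end{align}
```

Holding the volume (and hence the \(E_i\)) fixed and differentiating, the \(d\ln Z = -U\,d\beta\) term cancels the \(k_B U\,d\beta\) piece, leaving \(dS = k_B\beta\,dU\); comparing with the thermodynamic identity at constant volume, \(dU = T\,dS\), gives \(\beta = \frac{1}{k_B T}\).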

##### Energy fluctuations
Homework

Thermal and Statistical Physics 2020: energy, Boltzmann factor, statistical mechanics, heat capacity

Consider a system of fixed volume in thermal contact with a reservoir. Show that the mean square fluctuation in the energy of the system is \begin{equation} \left<\left(\varepsilon-\langle\varepsilon\rangle\right)^2\right> = k_BT^2\left(\frac{\partial U}{\partial T}\right)_{V} \end{equation} Here \(U\) is the conventional symbol for \(\langle\varepsilon\rangle\).

*Hint:* Use the partition function \(Z\) to relate \(\left(\frac{\partial U}{\partial T}\right)_V\) to the mean square fluctuation. Also, multiply out the term \((\cdots)^2\).
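This identity is easy to verify numerically for a small system. The sketch below uses a made-up four-level spectrum and units where \(k_B = 1\); the energies and temperature are arbitrary illustrative values, not part of the problem.

```python
# Numerical check of the fluctuation relation
# <(eps - <eps>)^2> = k_B T^2 (dU/dT)_V
# for a small system with a made-up four-level spectrum.
import numpy as np

k_B = 1.0                            # work in units where k_B = 1
E = np.array([0.0, 0.3, 0.7, 1.5])   # hypothetical microstate energies

def averages(T):
    """Return U = <eps> and <eps^2> at temperature T."""
    w = np.exp(-E / (k_B * T))       # Boltzmann factors
    P = w / w.sum()                  # microstate probabilities
    return (P * E).sum(), (P * E**2).sum()

T = 0.9
U, E2 = averages(T)
var = E2 - U**2                      # <(eps - <eps>)^2> = <eps^2> - <eps>^2

dT = 1e-6                            # centered finite difference for (dU/dT)_V
dUdT = (averages(T + dT)[0] - averages(T - dT)[0]) / (2 * dT)

print(var, k_B * T**2 * dUdT)        # the two sides should agree
```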

##### Gibbs entropy approach
Lecture, 120 min.

Thermal and Statistical Physics 2020: Gibbs entropy, information theory, probability, statistical mechanics

These lecture notes for the first week of Thermal and Statistical Physics include a couple of small group activities in which students work with the Gibbs formulation of the entropy.

##### Chemical potential and Gibbs distribution
Lecture, 120 min.

Thermal and Statistical Physics 2020: chemical potential, Gibbs distribution, grand canonical ensemble, statistical mechanics

These notes from the fifth week of Thermal and Statistical Physics cover the grand canonical ensemble. They include several small group activities.

##### Energy and Entropy review
Lecture, 5 min.

Thermal and Statistical Physics 2020 (3 years): thermodynamics, statistical mechanics

This very quick lecture reviews the content taught in Energy and Entropy, and is the first content in Thermal and Statistical Physics.

##### Gibbs sum for a two level system
Homework

Thermal and Statistical Physics 2020: Gibbs sum, microstate, thermal average energy

Consider a system that may be unoccupied with energy zero, or occupied by one particle in either of two states, one of energy zero and one of energy \(\varepsilon\). Find the Gibbs sum for this system in terms of the activity \(\lambda\equiv e^{\beta\mu}\). Note that the system can hold a maximum of one particle.

Solve for the thermal average occupancy of the system in terms of \(\lambda\).

Show that the thermal average occupancy of the state at energy \(\varepsilon\) is \begin{align} \langle N(\varepsilon)\rangle = \frac{\lambda e^{-\frac{\varepsilon}{kT}}}{\mathcal{Z}} \end{align}

Find an expression for the thermal average energy of the system.

Allow the possibility that the orbitals at \(0\) and at \(\varepsilon\) may each be occupied by one particle at the same time; show that \begin{align} \mathcal{Z} &= 1 + \lambda + \lambda e^{-\frac{\varepsilon}{kT}} + \lambda^2 e^{-\frac{\varepsilon}{kT}} \\ &= (1+\lambda)\left(1+\lambda e^{-\frac{\varepsilon}{kT}}\right) \end{align} Because \(\mathcal{Z}\) can be factored as shown, we have in effect two independent systems.
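A quick numerical sanity check of these sums (a sketch; the values of the activity, level energy, and temperature below are arbitrary and made up for the check):

```python
# Numerical sanity check of the Gibbs sums above, with arbitrary
# illustrative values for the activity lam, the level energy eps,
# and the temperature kT.
import math

lam, eps, kT = 0.6, 1.2, 1.0
x = math.exp(-eps / kT)              # Boltzmann factor of the level at eps

# At most one particle: empty, particle at 0, or particle at eps.
Z1 = 1 + lam + lam * x               # Gibbs sum
N_avg = (lam + lam * x) / Z1         # thermal average occupancy
N_eps = lam * x / Z1                 # occupancy of the state at eps
U = eps * lam * x / Z1               # thermal average energy

# Both orbitals may be occupied at the same time:
Z2 = 1 + lam + lam * x + lam**2 * x
print(Z2, (1 + lam) * (1 + lam * x))   # the factored form agrees
```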

##### Boltzmann probabilities and Helmholtz
Lecture, 120 min.

Thermal and Statistical Physics 2020: ideal gas, entropy, canonical ensemble, Boltzmann probability, Helmholtz free energy, statistical mechanics

These notes, from the third week of Thermal and Statistical Physics, cover the canonical ensemble and Helmholtz free energy. They include a number of small group activities.

##### Review of Thermal Physics
Lecture, 30 min.

Thermal and Statistical Physics 2020: thermodynamics, statistical mechanics

These are notes, essentially the equation sheet, from the final review session for Thermal and Statistical Physics.

##### Pressure and entropy of a degenerate Fermi gas
Homework

Thermal and Statistical Physics 2020: Fermi gas, pressure, entropy

Show that a Fermi electron gas in the ground state exerts a pressure \begin{align} p = \frac{\left(3\pi^2\right)^{\frac23}}{5} \frac{\hbar^2}{m}\left(\frac{N}{V}\right)^{\frac53} \end{align} In a uniform decrease of the volume of a cube every orbital has its energy raised: the energy of each orbital is proportional to \(\frac1{L^2}\) or to \(\frac1{V^{\frac23}}\).

Find an expression for the entropy of a Fermi electron gas in the region \(kT\ll \varepsilon_F\). Notice that \(S\rightarrow 0\) as \(T\rightarrow 0\).
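The pressure formula can be sanity-checked numerically from \(p = -\left(\partial U/\partial V\right)_N\) with the ground-state energy \(U = \frac35 N \varepsilon_F\). The sketch below uses units where \(\hbar = m = 1\); the values of \(N\) and \(V\) are arbitrary.

```python
# Numerical check of the ground-state Fermi pressure via p = -(dU/dV)_N,
# in units where hbar = m = 1; N and V below are arbitrary made-up values.
import math

hbar = m = 1.0
N = 1000.0

def U(V):
    """Ground-state energy U = (3/5) N eps_F of the Fermi electron gas."""
    eps_F = (hbar**2 / (2 * m)) * (3 * math.pi**2 * N / V) ** (2 / 3)
    return 0.6 * N * eps_F

V = 2.0
dV = 1e-6
p_numeric = -(U(V + dV) - U(V - dV)) / (2 * dV)   # p = -(dU/dV)_N
p_formula = (3 * math.pi**2) ** (2 / 3) / 5 * hbar**2 / m * (N / V) ** (5 / 3)
print(p_numeric, p_formula)                        # should agree
```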

##### Ideal Gas
Lecture, 120 min.

Thermal and Statistical Physics 2020: ideal gas, particle in a box, grand canonical ensemble, chemical potential, statistical mechanics

These notes from week 6 of Thermal and Statistical Physics cover the ideal gas from a grand canonical standpoint, starting with the solutions to a particle in a three-dimensional box. They include a number of small group activities.