## Energy, Entropy, and Probabilities

Homework

##### Energy, Entropy, and Probabilities
Energy Entropy Probabilities Thermodynamic identity

The goal of this problem is to show that once we have maximized the entropy and found the microstate probabilities in terms of a Lagrange multiplier $\beta$, we can prove that $\beta=\frac1{kT}$ based on the statistical definitions of energy and entropy and the thermodynamic definition of temperature embodied in the thermodynamic identity.

The internal energy and entropy are each defined as a weighted average over microstates: \begin{align} U &= \sum_i E_i P_i & S &= -k_B\sum_i P_i \ln P_i \end{align} We saw in class that the probability of each microstate can be given in terms of a Lagrange multiplier $\beta$ as \begin{align} P_i &= \frac{e^{-\beta E_i}}{Z} & Z &= \sum_i e^{-\beta E_i} \end{align} Put these probabilities into the weighted averages above in order to relate $U$ and $S$ to $\beta$. Then make use of the thermodynamic identity \begin{align} dU = TdS - pdV \end{align} to show that $\beta = \frac1{kT}$.
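The key intermediate result of this derivation (substituting $P_i = e^{-\beta E_i}/Z$ into the Gibbs entropy gives $S = k_B\beta U + k_B\ln Z$) can be sanity-checked numerically. A minimal sketch, assuming an arbitrary toy spectrum and units with $k_B = 1$:

```python
import math

# Check the identity S = k_B*beta*U + k_B*ln(Z) that follows from
# substituting P_i = e^{-beta E_i}/Z into the Gibbs entropy.
# Units with k_B = 1 and the spectrum below are assumptions for illustration.
def thermo(energies, beta):
    Z = sum(math.exp(-beta * E) for E in energies)
    P = [math.exp(-beta * E) / Z for E in energies]
    U = sum(E * p for E, p in zip(energies, P))
    S = -sum(p * math.log(p) for p in P)   # Gibbs entropy with k_B = 1
    return U, S, Z

energies = [0.0, 1.0, 2.5]   # arbitrary toy spectrum
beta = 0.7
U, S, Z = thermo(energies, beta)
print(abs(S - (beta * U + math.log(Z))) < 1e-12)
```

From this identity, $dS = k_B\beta\,dU$ at fixed spectrum, so matching $dU = T\,dS$ forces $\beta = 1/(k_B T)$.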

Lecture

120 min.

##### Gibbs entropy approach
Thermal and Statistical Physics 2020

These lecture notes for the first week of Thermal and Statistical Physics include a couple of small group activities in which students work with the Gibbs formulation of the entropy.

Lecture

5 min.

##### Energy and Entropy review
Thermal and Statistical Physics 2020 (3 years)

This very quick lecture reviews the content taught in Energy and Entropy, and is the first content in Thermal and Statistical Physics.

Homework

##### Energy fluctuations
energy Boltzmann factor statistical mechanics heat capacity Thermal and Statistical Physics 2020 Consider a system of fixed volume in thermal contact with a reservoir. Show that the mean square fluctuation in the energy of the system is $$\left<\left(\varepsilon-\langle\varepsilon\rangle\right)^2\right> = k_BT^2\left(\frac{\partial U}{\partial T}\right)_{V}$$ Here $U$ is the conventional symbol for $\langle\varepsilon\rangle$. Hint: Use the partition function $Z$ to relate $\left(\frac{\partial U}{\partial T}\right)_V$ to the mean square fluctuation. Also, multiply out the term $(\cdots)^2$.
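The fluctuation-dissipation relation being derived here can be checked numerically on a small spectrum, comparing the variance of the energy against $k_BT^2(\partial U/\partial T)_V$ via a finite difference. A sketch (not the requested derivation), assuming $k_B = 1$ and an arbitrary toy spectrum:

```python
import math

# Compare <(e - <e>)^2> against k_B T^2 (dU/dT)_V for a toy spectrum.
# k_B = 1, the spectrum, and the temperature are assumptions for illustration.
def moments(energies, T):
    Z = sum(math.exp(-E / T) for E in energies)
    P = [math.exp(-E / T) / Z for E in energies]
    U = sum(E * p for E, p in zip(energies, P))          # <e>
    U2 = sum(E * E * p for E, p in zip(energies, P))     # <e^2>
    return U, U2

energies = [0.0, 0.8, 1.3, 2.0]
T, h = 1.5, 1e-5
U, U2 = moments(energies, T)
# central finite difference for (dU/dT)_V
dUdT = (moments(energies, T + h)[0] - moments(energies, T - h)[0]) / (2 * h)
print(abs((U2 - U * U) - T * T * dUdT) < 1e-6)
```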

Lecture

120 min.

##### Chemical potential and Gibbs distribution
Thermal and Statistical Physics 2020

These notes from the fifth week of Thermal and Statistical Physics cover the grand canonical ensemble. They include several small group activities.

Homework

##### Gibbs sum for a two level system
Gibbs sum Microstate Thermal average energy Thermal and Statistical Physics 2020
1. Consider a system that may be unoccupied with energy zero, or occupied by one particle in either of two states, one of energy zero and one of energy $\varepsilon$. Find the Gibbs sum for this system in terms of the activity $\lambda\equiv e^{\beta\mu}$. Note that the system can hold a maximum of one particle.

2. Solve for the thermal average occupancy of the system in terms of $\lambda$.

3. Show that the thermal average occupancy of the state at energy $\varepsilon$ is \begin{align} \langle N(\varepsilon)\rangle = \frac{\lambda e^{-\frac{\varepsilon}{kT}}}{\mathcal{Z}} \end{align}

4. Find an expression for the thermal average energy of the system.

5. Allow the possibility that the orbitals at $0$ and at $\varepsilon$ may each be occupied by one particle at the same time. Show that \begin{align} \mathcal{Z} &= 1 + \lambda + \lambda e^{-\frac{\varepsilon}{kT}} + \lambda^2 e^{-\frac{\varepsilon}{kT}} \\ &= (1+\lambda)\left(1+e^{-\frac{\varepsilon}{kT}}\right) \end{align} Because $\mathcal{Z}$ can be factored as shown, we have in effect two independent systems.

Lecture

120 min.

##### Boltzmann probabilities and Helmholtz
Thermal and Statistical Physics 2020

These notes, from the third week of Thermal and Statistical Physics cover the canonical ensemble and Helmholtz free energy. They include a number of small group activities.

Lecture

30 min.

##### Review of Thermal Physics
Thermal and Statistical Physics 2020

These are notes, essentially the equation sheet, from the final review session for Thermal and Statistical Physics.

Homework

##### Boltzmann probabilities
Energy Entropy Boltzmann probabilities Thermal and Statistical Physics 2020 (3 years) Consider a three-state system with energies $(-\epsilon,0,\epsilon)$.
1. At infinite temperature, what are the probabilities of the three states being occupied? What is the internal energy $U$? What is the entropy $S$?
2. At very low temperature, what are the three probabilities?
3. What are the three probabilities at zero temperature? What is the internal energy $U$? What is the entropy $S$?
4. What happens to the probabilities if you allow the temperature to be negative?
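The limits in parts 1 and 2 can be explored numerically: at very high temperature the Boltzmann weights approach equality, and at very low temperature the ground state dominates. A sketch, assuming $\epsilon = k_B = 1$ for illustration:

```python
import math

# Boltzmann probabilities for the three-state system (-eps, 0, eps)
# at large and small T. eps = k_B = 1 is an assumption for illustration.
def probs(T, eps=1.0):
    weights = [math.exp(-E / T) for E in (-eps, 0.0, eps)]
    Z = sum(weights)
    return [w / Z for w in weights]

hot = probs(1e6)       # near-infinite T: all three states equally likely
cold = probs(1e-2)     # very low T: ground state (-eps) dominates
print(all(abs(p - 1/3) < 1e-5 for p in hot))
print(cold[0] > 0.999)
```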

Homework

##### Pressure and entropy of a degenerate Fermi gas
Fermi gas Pressure Entropy Thermal and Statistical Physics 2020
1. Show that a Fermi electron gas in the ground state exerts a pressure \begin{align} p = \frac{\left(3\pi^2\right)^{\frac23}}{5} \frac{\hbar^2}{m}\left(\frac{N}{V}\right)^{\frac53} \end{align} In a uniform decrease of the volume of a cube, every orbital has its energy raised: the energy of each orbital is proportional to $\frac1{L^2}$, or to $\frac1{V^{\frac23}}$.

2. Find an expression for the entropy of a Fermi electron gas in the region $kT\ll \varepsilon_F$. Notice that $S\rightarrow 0$ as $T\rightarrow 0$.
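As a consistency check on part 1, the quoted pressure can be compared against the equivalent form $p = \frac25 n\,\varepsilon_F$ with $\varepsilon_F = \frac{\hbar^2}{2m}(3\pi^2 n)^{2/3}$. A sketch in natural units ($\hbar = m = 1$, arbitrary density, both assumptions for illustration); only the algebraic consistency is being tested:

```python
import math

# Verify that (3 pi^2)^{2/3}/5 * hbar^2/m * n^{5/3} equals (2/5) n eps_F
# with eps_F = (hbar^2 / 2m)(3 pi^2 n)^{2/3}.
# hbar = m = 1 and n = 2.7 are assumptions for illustration.
hbar = m = 1.0
n = 2.7   # number density N/V, arbitrary

p_formula = (3 * math.pi**2)**(2/3) / 5 * hbar**2 / m * n**(5/3)
eps_F = hbar**2 / (2 * m) * (3 * math.pi**2 * n)**(2/3)
p_from_epsF = 0.4 * n * eps_F

print(abs(p_formula - p_from_epsF) < 1e-9)
```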
