## Exponential and Logarithm Identities

##### Energy and Entropy review
Lecture, 5 min. · Thermal and Statistical Physics 2020 (3 years)

This very quick lecture reviews the content taught in Energy and Entropy, and is the first content in Thermal and Statistical Physics.

##### Boltzmann probabilities and Helmholtz
Lecture, 120 min. · Thermal and Statistical Physics 2020

These notes, from the third week of Thermal and Statistical Physics, cover the canonical ensemble and the Helmholtz free energy. They include a number of small group activities.

##### Energy, Entropy, and Probabilities
Homework · Energy Entropy Probabilities Thermodynamic identity

The goal of this problem is to show that once we have maximized the entropy and found the microstate probabilities in terms of a Lagrange multiplier $\beta$, we can prove that $\beta=\frac1{kT}$ based on the statistical definitions of energy and entropy and the thermodynamic definition of temperature embodied in the thermodynamic identity.

The internal energy and entropy are each defined as a weighted average over microstates: \begin{align} U &= \sum_i E_i P_i & S &= -k_B\sum_i P_i \ln P_i \end{align} We saw in class that the probability of each microstate can be given in terms of a Lagrange multiplier $\beta$ as \begin{align} P_i &= \frac{e^{-\beta E_i}}{Z} & Z &= \sum_i e^{-\beta E_i} \end{align} Put these probabilities into the above weighted averages in order to relate $U$ and $S$ to $\beta$. Then make use of the thermodynamic identity \begin{align} dU = TdS - pdV \end{align} to show that $\beta = \frac1{kT}$.
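One way the algebra can go (a sketch of the key steps, holding the volume fixed so that the energies $E_i$ do not vary): substituting $\ln P_i = -\beta E_i - \ln Z$ into the entropy sum gives
\begin{align}
S = -k_B\sum_i P_i\left(-\beta E_i - \ln Z\right) = k_B\beta U + k_B\ln Z.
\end{align}
Since $\frac{\partial \ln Z}{\partial \beta} = -U$ at fixed volume, the $d\beta$ terms cancel when we take the differential:
\begin{align}
dS = k_B\beta\, dU + k_B U\, d\beta - k_B U\, d\beta = k_B\beta\, dU.
\end{align}
Comparing with the thermodynamic identity at fixed volume, $dU = T\,dS$, yields $k_B\beta = \frac1T$, i.e. $\beta = \frac1{kT}$ (writing $k$ for $k_B$).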

##### Gibbs entropy is extensive
Homework · Gibbs entropy Probability Thermal and Statistical Physics 2020

Consider two noninteracting systems $A$ and $B$. We can either treat these systems as separate, or as a single combined system $AB$. We can enumerate all states of the combined system by enumerating all states of each separate system. The probability of the combined state $(i_A,j_B)$ is given by $P_{ij}^{AB} = P_i^AP_j^B$. In other words, the probabilities combine in the same way as the probabilities of two dice rolls, or of any other uncorrelated events.

1. Show that the entropy of the combined system $S_{AB}$ is the sum of entropies of the two separate systems considered individually, i.e. $S_{AB} = S_A+S_B$. This means that entropy is extensive. Use the Gibbs entropy for this computation. You need not make any approximation in solving this problem.
2. Show that if you have $N$ identical non-interacting systems, their total entropy is $NS_1$ where $S_1$ is the entropy of a single system.

##### Note
In real materials, we treat properties as being extensive even when there are interactions in the system. In this case, extensivity is a property of large systems, in which surface effects may be neglected.
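A quick numerical sanity check of part 1 (not a proof), using made-up probability vectors and setting $k_B = 1$:

```python
import numpy as np

def gibbs_entropy(p, k_B=1.0):
    """Gibbs entropy S = -k_B * sum_i p_i ln p_i for a probability vector p."""
    p = np.asarray(p)
    return -k_B * np.sum(p * np.log(p))

# Two arbitrary (hypothetical) normalized probability distributions.
p_A = np.array([0.5, 0.3, 0.2])
p_B = np.array([0.6, 0.4])

# Combined system: P_ij = P_i^A * P_j^B, since the systems are uncorrelated.
p_AB = np.outer(p_A, p_B).ravel()

S_A = gibbs_entropy(p_A)
S_B = gibbs_entropy(p_B)
S_AB = gibbs_entropy(p_AB)

# Extensivity: the combined entropy equals the sum of the parts.
assert np.isclose(S_AB, S_A + S_B)
```

The check works for any choice of normalized `p_A` and `p_B`, which is what the exact (approximation-free) result in part 1 predicts.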

##### Entropy of mixing
Homework · Entropy Equilibrium Sackur-Tetrode Thermal and Statistical Physics 2020

Suppose that a system of $N$ atoms of type $A$ is placed in diffusive contact with a system of $N$ atoms of type $B$ at the same temperature and volume.

1. Show that after diffusive equilibrium is reached the total entropy is increased by $2Nk\ln 2$. The entropy increase $2Nk\ln 2$ is known as the entropy of mixing.

2. If the atoms are identical ($A=B$), show that there is no increase in entropy when diffusive contact is established. The difference has been called the Gibbs paradox.

3. Since the Helmholtz free energy is lower for the mixed $AB$ than for the separated $A$ and $B$, it should be possible to extract work from the mixing process. Construct a process that could extract work as the two gases are mixed at fixed temperature. You will probably need to use walls that are permeable to one gas but not the other.

##### Note

This course has not yet covered work, but it was covered in Energy and Entropy, so you may need to stretch your memory to finish part (c).
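A numerical sanity check of part 1 (a sketch, assuming classical ideal gases, equal particle numbers $N_A = N_B = N$, and $k = 1$), using the ideal-gas mixing-entropy formula $\Delta S = -k\sum_j N_j \ln x_j$ with mole fractions $x_j$:

```python
import math

# Assumptions: k = 1, equal particle numbers N_A = N_B = N (hypothetical N).
k = 1.0
N = 1000
x_A = x_B = 0.5  # mole fractions when the two species are present in equal numbers

# Ideal-gas entropy of mixing: Delta S = -k * (N_A ln x_A + N_B ln x_B).
delta_S = -k * (N * math.log(x_A) + N * math.log(x_B))

# For equal numbers this reduces to 2 N k ln 2, the stated entropy of mixing.
assert math.isclose(delta_S, 2 * N * k * math.log(2))
```

Note also that for identical atoms ($A = B$) there is only one species with mole fraction 1, and the same formula gives $\Delta S = 0$, consistent with part 2.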

##### Power Series Sequence (E&M)
Sequence

The first three activities provide an active-engagement version of the canonical mathematical and geometric fundamentals for power series. The subsequent activities apply these ideas to physical situations that are appropriate for an upper-division electromagnetism course, using concepts, terminology, and techniques that are common among physicists but not often taught in mathematics courses. In particular, students use the memorized formula for the binomial expansion to evaluate various electrostatic and magnetostatic fields in regions of high symmetry. By factoring out a physical quantity that is large compared to another, they manipulate the formulas for these fields into a form where the memorized formulas apply. The results for the different regions of high symmetry are compared and contrasted. A few homework problems that emphasize the meaning of series notation are included.

Note: The first two activities are also included in Power Series Sequence (Mechanics) and can be skipped in E&M if already taught in Mechanics.
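The factoring trick the sequence describes can be illustrated with a hypothetical example (not taken from the activities themselves): the on-axis potential of a point charge at $z = d$ goes like $1/(z - d)$, and for $z \gg d$ one factors out the large quantity $z$ and applies the binomial expansion $(1-u)^{-1} \approx 1 + u + u^2 + \cdots$ with the small ratio $u = d/z$:

```python
import math

def V_exact(z, d):
    """On-axis potential shape 1/(z - d) (constants dropped for illustration)."""
    return 1.0 / (z - d)

def V_series(z, d, n_terms=4):
    """Factor out the large quantity z, then binomially expand in u = d/z."""
    u = d / z  # small parameter after factoring
    return (1.0 / z) * sum(u**k for k in range(n_terms))

z, d = 10.0, 0.1  # hypothetical numbers with z >> d
exact = V_exact(z, d)
approx = V_series(z, d)

# A four-term expansion already agrees to better than one part in 10^7 here.
assert abs(exact - approx) / exact < 1e-7
```

The point of the classroom technique is exactly this: after factoring, the memorized expansion applies term by term, and a few terms suffice in regions of high symmetry.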

##### Ideal Gas
Lecture, 120 min. · Thermal and Statistical Physics 2020

These notes from week 6 of Thermal and Statistical Physics cover the ideal gas from a grand canonical standpoint, starting with the solutions to a particle in a three-dimensional box. They include a number of small group activities.

##### Review of Thermal Physics
Lecture, 30 min. · Thermal and Statistical Physics 2020

These are notes, essentially the equation sheet, from the final review session for Thermal and Statistical Physics.

##### Entropy and Temperature
Lecture, 120 min. · Thermal and Statistical Physics 2020

These lecture notes for the second week of Thermal and Statistical Physics involve relating entropy and temperature in the microcanonical ensemble, using a paramagnet as an example. These notes include a few small group activities.

Static Fields 2022

Make sure that you have memorized the following identities and can use them in simple algebra problems: \begin{align} e^{u+v}&=e^u \, e^v\\ \ln{uv}&=\ln{u}+\ln{v}\\ u^v&=e^{v\ln{u}} \end{align}
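The three identities can be spot-checked numerically at arbitrary test values (the particular `u` and `v` below are chosen only for illustration; `u` must be positive for $\ln u$ to be defined):

```python
import math

u, v = 2.3, 1.7  # arbitrary test values, u > 0

# e^(u+v) = e^u * e^v
assert math.isclose(math.exp(u + v), math.exp(u) * math.exp(v))

# ln(uv) = ln(u) + ln(v)
assert math.isclose(math.log(u * v), math.log(u) + math.log(v))

# u^v = e^(v ln u)
assert math.isclose(u**v, math.exp(v * math.log(u)))
```

A numerical check at a few values is of course no substitute for being able to derive and use these identities in algebra problems, but it is a quick way to catch a misremembered form.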