Activity: Energy and Entropy review

Thermal and Statistical Physics 2020 (3 years)
This very quick lecture reviews the content taught in Energy and Entropy, and is the first material covered in Thermal and Statistical Physics.
  • Media
    • activity_media/02-Equations.pdf

I'll go over all of these equations.

Extensive/intensive (Schroeder 5.2)
If you consider two identical systems taken together (e.g. two cups of water, or two identical cubes of metal), each thermodynamic property either doubles or remains the same.
Extensive
An extensive property, such as mass, will double when you've got twice as much stuff.
Intensive
An intensive property, such as density, will be the same regardless of how much stuff you've got.
We care about extensivity and intensivity for several reasons. In one sense it functions like dimensions as a way to check our work. In another sense, it is a fundamental aspect of each measurable property, and once you are accustomed to this, you will feel very uncomfortable if you don't know whether a property is extensive or intensive.
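As a one-line check of these definitions, imagine combining two identical cups of water: the mass and volume double, while the density is unchanged, \begin{align} \rho &= \frac{2m}{2V} = \frac{m}{V}, \end{align} so mass and volume are extensive and density is intensive.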

How to measure things
I may give you questions which require you to make use of these definitions.
Volume
Measure dimensions and compute it. (extensive)
Pressure
Force per area. Can equalize if systems can exchange volume. (intensive) (Schroeder 1.2)
Temperature
Find something that depends on temperature, and calibrate it. Alternatively use an ideal gas. Equalizes when systems are in contact. (intensive)
Energy
Challenging... measure work and heat (e.g. by measuring the power into a resistor) and use the First Law. (extensive) \begin{align} W = -\int pdV \end{align}
Entropy
(extensive) Measure heat for a quasistatic process and find \begin{align} \Delta S &= \int \frac{{\mathit{\unicode{273}}} Q}{T} \end{align} (Schroeder 3.2; a numerical sketch of this measurement follows this list)
Derivatives
Measure changes of one thing as the other changes, with the right stuff held fixed.
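Here is the numerical sketch promised above for the entropy measurement: a minimal example (with made-up power and temperature data) that approximates \(\Delta S = \int \frac{{\mathit{\unicode{273}}} Q}{T}\) as a discrete sum over a logged quasistatic heating curve.

```python
import numpy as np

# Made-up logged data: power delivered to a resistor in the system (W)
# and the system temperature (K), sampled once per second.
dt = 1.0  # s
power = np.array([10.0, 10.0, 10.0, 10.0, 10.0])             # W
temperature = np.array([273.2, 273.4, 273.6, 273.8, 274.0])  # K

dQ = power * dt                     # heat added in each interval (J)
delta_S = np.sum(dQ / temperature)  # Riemann sum for dQ/T

print(f"Delta S ≈ {delta_S:.4f} J/K")
```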

First Law (Energy conservation, Schroeder 1.4)
\begin{align} dU &= {\mathit{\unicode{273}}} Q + {\mathit{\unicode{273}}} W \end{align}

Second Law (Entropy increases, Schroeder 2.3)
\begin{align} \Delta S_{\text{system}} + \Delta S_{\text{environment}} \ge 0 \end{align}

Thermodynamic identity (Schroeder 3.4)
\begin{align} dU &= TdS - pdV \end{align}

Total differentials
When we have a total differential, the things in front of the \(dS\), \(dV\), etc. are partial derivatives. \begin{align} T &= \left(\frac{\partial U}{\partial S}\right)_V & -p &= \left(\frac{\partial U}{\partial V}\right)_S \end{align} Also, you can integrate along a path using a total differential, and you can do linear algebra with total differential equations, e.g. substituting one into another. Fun example: \begin{align} dS &= \frac{1}{T}dU + \frac{p}{T}dV \\ \frac1{T} &= \left(\frac{\partial S}{\partial U}\right)_V & \frac{p}{T} &= \left(\frac{\partial S}{\partial V}\right)_U \end{align} (the second derivative shown here can also be found using the cyclic chain rule, which appears in the math review below)
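The \(dS\) total differential in this example is nothing more than algebra applied to the thermodynamic identity: \begin{align} dU &= TdS - pdV \\ TdS &= dU + pdV \\ dS &= \frac{1}{T}dU + \frac{p}{T}dV \end{align}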

Thermodynamic potentials (Schroeder 1.6, 5.1)
Helmholtz free energy
\begin{align} F &= U - TS \\ dF &= dU - TdS - SdT \\ &= -SdT -pdV \end{align}
Enthalpy
\begin{align} H &= U + pV \\ dH &= dU + pdV + Vdp \\ &= TdS + Vdp \end{align}
Gibbs free energy
\begin{align} G &= H - TS \\ &= U -TS+pV \\ dG &= dH - TdS - SdT \\ &= -SdT +Vdp \end{align}

Maxwell relations
You should be able to use the fact that mixed partial derivatives do not depend on the order in which they are taken, together with the differentials above, to find Maxwell relations, e.g. \begin{align} \left(\frac{\partial {\left(\frac{\partial {G}}{\partial {T}}\right)_{p}}}{\partial {p}}\right)_{T} &=\left(\frac{\partial {\left(\frac{\partial {G}}{\partial {p}}\right)_{T}}}{\partial {T}}\right)_{p} \\ -\left(\frac{\partial {S}}{\partial {p}}\right)_{T} &= \left(\frac{\partial {V}}{\partial {T}}\right)_{p} \end{align}
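As a sanity check (not part of the original notes), you can verify this Maxwell relation symbolically for a monatomic ideal gas. The entropy expression below, \(S = Nk_B\left[\ln(k_BT/p) + \frac32\ln T + c\right]\), is an assumed standard form used only for this sketch.

```python
import sympy as sp

N, kB, T, p, c = sp.symbols('N k_B T p c', positive=True)

# Assumed monatomic ideal gas entropy as a function of T and p,
# up to an additive constant c (this form is an assumption of the sketch).
S = N * kB * (sp.log(kB * T / p) + sp.Rational(3, 2) * sp.log(T) + c)
V = N * kB * T / p  # ideal gas law solved for V

lhs = -sp.diff(S, p)  # -(dS/dp) at fixed T
rhs = sp.diff(V, T)   # (dV/dT) at fixed p
print(sp.simplify(lhs - rhs))  # prints 0: the Maxwell relation holds
```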

Statistical entropy (Schroeder 2.6, Problem 6.43)

Boltzmann formulation (microcanonical or for large \(N\)): \begin{align} S(E) &= k_B \ln g(E) \end{align} where \(g\) is the number of microstates (or energy eigenstates). We spent little time on the Boltzmann formulation, but it is helpful to know that at sufficiently high temperatures the entropy approaches \(k_B\) times the logarithm of the number of energy eigenstates.

Gibbs formulation (always true): \begin{align} S &= -k_B \sum_{i}^{\text{all states}} P_i \ln P_i \end{align}
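To see how the Gibbs formulation contains the Boltzmann one, take all \(g(E)\) accessible microstates to be equally probable, \(P_i = \frac1g\): \begin{align} S &= -k_B \sum_{i=1}^{g} \frac1g \ln\frac1g = k_B \ln g \end{align}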

Boltzmann ratio (Schroeder 6.1)
\begin{align} \frac{P_i}{P_j} &= e^{-\frac{E_i - E_j}{k_BT}} \\ P_i &= \frac{e^{-\frac{E_i}{k_BT}}}{Z} \\ Z &= \sum_j^{\text{all states}} e^{-\frac{E_j}{k_BT}} \end{align}

Thermal averages (Schroeder 6.2)

The average value of any quantity is given by the weighted average \begin{align} \left<X\right> &= \sum_i^{\text{all states}} P_i X_i \end{align} In particular, the internal energy is given by \begin{align} U &= \sum_i^{\text{all states}} P_i E_i \end{align}

Helmholtz free energy from statistics
\begin{align} F &= -k_BT\ln Z \end{align}
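All of these statistical relations can be checked numerically at once. The sketch below uses an arbitrary made-up three-level spectrum, computes \(Z\), the Boltzmann probabilities, \(U\), and the Gibbs entropy, and confirms that \(F = -k_BT\ln Z\) agrees with \(F = U - TS\).

```python
import numpy as np

kB = 1.380649e-23  # J/K
T = 300.0          # K
E = np.array([0.0, 1.0e-21, 2.5e-21])  # made-up energy levels in J

weights = np.exp(-E / (kB * T))
Z = weights.sum()  # partition function
P = weights / Z    # Boltzmann probabilities

U = np.sum(P * E)                # internal energy (thermal average)
S = -kB * np.sum(P * np.log(P))  # Gibbs entropy
F = -kB * T * np.log(Z)          # Helmholtz free energy from statistics

print(np.isclose(F, U - T * S))  # True: the two routes to F agree
```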
The following is the old review that I wrote.

Math

Total Differentials

\begin{align} dA &= \left(\frac{\partial {A}}{\partial {B}}\right)_{C}dB + \left(\frac{\partial {A}}{\partial {C}}\right)_{B}dC \end{align} You can:
  1. Do algebra
  2. Interpret coefficients as partial derivatives
  3. Integrate

Mixed partial derivatives

\begin{align} \left(\frac{\partial {\left(\frac{\partial {A}}{\partial {B}}\right)_{C}}}{\partial {C}}\right)_{B} = \left(\frac{\partial {\left(\frac{\partial {A}}{\partial {C}}\right)_{B}}}{\partial {B}}\right)_{C} \end{align}

Chain rules (need not be memorized, but may be used)

\begin{align} \left(\frac{\partial {A}}{\partial {B}}\right)_{C} &= \frac{1}{\left(\frac{\partial {B}}{\partial {A}}\right)_{C}} \\ \left(\frac{\partial {A}}{\partial {B}}\right)_{D} &= \left(\frac{\partial {A}}{\partial {C}}\right)_{D}\left(\frac{\partial {C}}{\partial {B}}\right)_{D} \\ \left(\frac{\partial {A}}{\partial {B}}\right)_{C} &= -\frac{\left(\frac{\partial {A}}{\partial {C}}\right)_{B}}{\left(\frac{\partial {B}}{\partial {C}}\right)_{A}} \end{align}
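The last of these, the cyclic chain rule, is the least intuitive. Here is a quick symbolic check using the ideal gas law \(pV = Nk_BT\) as an example relation (this choice is just for illustration); it verifies the equivalent triple-product form \(\left(\frac{\partial p}{\partial V}\right)_T\left(\frac{\partial V}{\partial T}\right)_p\left(\frac{\partial T}{\partial p}\right)_V = -1\).

```python
import sympy as sp

N, kB, p, V, T = sp.symbols('N k_B p V T', positive=True)

# Each variable expressed in terms of the other two via pV = N kB T.
p_of_VT = N * kB * T / V
V_of_pT = N * kB * T / p
T_of_pV = p * V / (N * kB)

# (dp/dV)_T * (dV/dT)_p * (dT/dp)_V, then substitute T so all three
# factors are evaluated at the same state (p, V).
product = sp.diff(p_of_VT, V) * sp.diff(V_of_pT, T) * sp.diff(T_of_pV, p)
product = product.subs(T, p * V / (N * kB))
print(sp.simplify(product))  # prints -1
```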

Thermodynamics

Entropy

\begin{align} \Delta S &= \int \frac{{\mathit{\unicode{273}}} Q_{\text{quasistatic}}}{T} \\ {\mathit{\unicode{273}}} Q &= TdS \\ C_\alpha &= T \left(\frac{\partial {S}}{\partial {T}}\right)_{\alpha} \end{align}
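One immediate consequence worth spelling out: if you measure a heat capacity \(C_\alpha(T)\), you can integrate it to obtain an entropy change, \begin{align} \Delta S &= \int_{T_1}^{T_2} \frac{C_\alpha}{T}\,dT \end{align}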

First Law

\begin{align} \Delta U &= Q + W \\ dU &= {\mathit{\unicode{273}}} Q + {\mathit{\unicode{273}}} W \\ dU &= TdS - pdV \end{align}

Second Law

\begin{align} \Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \ge 0 \end{align}

Legendre Transforms

You can subtract \(TS\) from \(U\) or add \(pV\) to \(U\) to create new thermodynamic potentials that are convenient when \(T\) or \(p\) are held fixed or controlled.

Maxwell Relations

From any thermodynamic potential you can use the equality of mixed partial derivatives to create a relationship between two different partial derivatives.

Statistical Mechanics

\begin{align} P_i &= \frac{e^{-\beta E_i}}{Z} \\ Z &= \sum_i^{\text{all states}} e^{-\beta E_i} \\ \beta &= \frac{1}{k_BT} \\ F &= -k_BT \ln Z \\ U &= \sum_i^{\text{all states}} P_i E_i \\ S &= -k_B\sum_i^{\text{all states}} P_i \ln P_i \end{align}
  • Homework: Gibbs entropy is extensive (Thermal and Statistical Physics 2020)

    Consider two noninteracting systems \(A\) and \(B\). We can either treat these systems as separate, or as a single combined system \(AB\). We can enumerate all states of the combined system by enumerating all states of each separate system. The probability of the combined state \((i_A,j_B)\) is given by \(P_{ij}^{AB} = P_i^AP_j^B\). In other words, the probabilities combine in the same way as two dice rolls would, or the probabilities of any other uncorrelated events.

    1. Show that the entropy of the combined system \(S_{AB}\) is the sum of entropies of the two separate systems considered individually, i.e. \(S_{AB} = S_A+S_B\). This means that entropy is extensive. Use the Gibbs entropy for this computation. You need make no approximation in solving this problem.
    2. Show that if you have \(N\) identical non-interacting systems, their total entropy is \(NS_1\) where \(S_1\) is the entropy of a single system.

    Note
    In real materials, we treat properties as being extensive even when there are interactions in the system. In this case, extensivity is a property of large systems, in which surface effects may be neglected. (A numerical sanity check of this extensivity appears after this activity list.)

  • Homework: Adiabatic Compression (Energy and Entropy 2020)

    A diesel engine requires no spark plug. Rather, the air in the cylinder is compressed so highly that the fuel ignites spontaneously when sprayed into the cylinder.

    In this problem, you may treat air as an ideal gas, which satisfies the equation \(pV = Nk_BT\). You may also use the property of an ideal gas that the internal energy depends only on the temperature \(T\), i.e. the internal energy does not change for an isothermal process. For air at the relevant range of temperatures the heat capacity at fixed volume is given by \(C_V=\frac52Nk_B\), which means the internal energy is given by \(U=\frac52Nk_BT\).

    Note: in this problem you are expected to use only the equations given and fundamental physics laws. Looking up the formula in a textbook is not considered a solution at this level. (A numerical sketch of this sort of adiabatic calculation, with different numbers, appears after this activity list.)

    1. If the air is initially at room temperature (taken as \(20^\circ\text{C}\)) and is then compressed adiabatically to \(\frac1{15}\) of the original volume, what final temperature is attained (before fuel injection)?

    2. By what factor does the pressure increase?

  • Lecture (120 min.): Gibbs entropy approach (Thermal and Statistical Physics 2020)

    These lecture notes for the first week of Thermal and Statistical Physics include a couple of small group activities in which students work with the Gibbs formulation of the entropy.
  • Lecture (120 min.): Chemical potential and Gibbs distribution (Thermal and Statistical Physics 2020)

    These notes from the fifth week of Thermal and Statistical Physics cover the grand canonical ensemble. They include several small group activities.
  • Small Group Activity (60 min.): Ice Calorimetry Lab

    The students will set up a Styrofoam cup with a heating element and a thermometer in it. They will measure the temperature as a function of time, and thus the energy transferred from the power supply, from which they compute changes in entropy.
  • Lecture (120 min.): Phase transformations (Thermal and Statistical Physics 2020)

    These lecture notes from the ninth week of Thermal and Statistical Physics cover phase transformations, the Clausius-Clapeyron relation, mean field theory and more. They include a number of small group activities.
  • Homework: Free energy of a two state system (Thermal and Statistical Physics 2020)

    1. Find an expression for the free energy as a function of \(T\) of a system with two states, one at energy 0 and one at energy \(\varepsilon\).

    2. From the free energy, find expressions for the internal energy \(U\) and entropy \(S\) of the system.

    3. Plot the entropy versus \(T\). Explain its asymptotic behavior as the temperature becomes high.

    4. Plot \(S(T)\) versus \(U(T)\). Explain the maximum value of the energy \(U\).

  • Small Group Activity (30 min.): Name the experiment (Energy and Entropy 2021, 3 years)

    Student groups design an experiment that measures an assigned partial derivative. In a compare-and-contrast wrap-up, groups report on how they would measure their derivatives.
  • Small Group Activity (30 min.): Name the experiment (changing entropy) (Energy and Entropy 2021, 2 years)

    Students are placed into small groups and asked to create an experimental setup they can use to measure an assigned partial derivative in which entropy changes.
  • Homework: Energy fluctuations (Thermal and Statistical Physics 2020)

    Consider a system of fixed volume in thermal contact with a reservoir. Show that the mean square fluctuation in the energy of the system is \begin{equation} \left<\left(\varepsilon-\langle\varepsilon\rangle\right)^2\right> = k_BT^2\left(\frac{\partial U}{\partial T}\right)_{V} \end{equation} Here \(U\) is the conventional symbol for \(\langle\varepsilon\rangle\). Hint: Use the partition function \(Z\) to relate \(\left(\frac{\partial U}{\partial T}\right)_V\) to the mean square fluctuation. Also, multiply out the term \((\cdots)^2\). (A numerical check of this identity appears after this activity list.)
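For the "Gibbs entropy is extensive" problem above, here is a numerical sanity check (not a substitute for the requested proof): build the combined distribution \(P_{ij}^{AB} = P_i^A P_j^B\) from two arbitrary distributions and compare entropies. The random distributions below are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_entropy(P, kB=1.0):
    """Gibbs entropy S = -kB sum_i P_i ln P_i (kB set to 1 for simplicity)."""
    return -kB * np.sum(P * np.log(P))

# Two arbitrary normalized probability distributions (placeholder data).
PA = rng.random(4)
PA /= PA.sum()
PB = rng.random(5)
PB /= PB.sum()

# Combined system: P_ij = P_i^A P_j^B, since the systems are uncorrelated.
PAB = np.outer(PA, PB)

# S_AB = S_A + S_B, with no approximation.
print(np.isclose(gibbs_entropy(PAB), gibbs_entropy(PA) + gibbs_entropy(PB)))
```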
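For the "Adiabatic Compression" problem, the intended reasoning chain is \({\mathit{\unicode{273}}} Q = 0\), so \(dU = {\mathit{\unicode{273}}} W = -p\,dV\) with \(U = \frac52 Nk_BT\) and \(p = Nk_BT/V\). The sketch below integrates the resulting relation numerically, deliberately for a compression to half the volume rather than \(\frac1{15}\), so the actual homework answer is left to you.

```python
# Quasistatic adiabatic compression of air treated as an ideal gas:
# (5/2) N kB dT = -(N kB T / V) dV, i.e. (5/2) dT/T = -dV/V.
T = 293.15      # initial temperature in K (20 C)
V = 1.0         # initial volume (arbitrary units)
V_final = 0.5   # compress to half the volume (illustrative, not 1/15)
steps = 100_000
dV = (V_final - V) / steps

for _ in range(steps):
    T += -(T / V) * dV / 2.5  # dT from (5/2) dT = -(T/V) dV
    V += dV

print(f"T_final ≈ {T:.1f} K")  # consistent with T V^(2/5) = constant
```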
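For the "Energy fluctuations" problem, here is a numerical check (again, not the requested derivation): for an arbitrary made-up spectrum, compute the mean square fluctuation directly and compare it with \(k_BT^2\left(\frac{\partial U}{\partial T}\right)_V\) evaluated by finite difference.

```python
import numpy as np

kB = 1.0  # work in units where kB = 1
E = np.array([0.0, 0.7, 1.3, 2.2])  # arbitrary made-up energy levels

def canonical_U_and_variance(T):
    """Return (U, <(eps - <eps>)^2>) for the Boltzmann distribution at T."""
    P = np.exp(-E / (kB * T))
    P /= P.sum()
    U = np.sum(P * E)
    return U, np.sum(P * (E - U) ** 2)

T, dT = 1.5, 1e-6
U_hi, _ = canonical_U_and_variance(T + dT)
U_lo, _ = canonical_U_and_variance(T - dT)
dU_dT = (U_hi - U_lo) / (2 * dT)  # heat capacity by finite difference

_, variance = canonical_U_and_variance(T)
print(np.isclose(variance, kB * T**2 * dU_dT))  # True
```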
