assignment Homework

Gibbs free energy
thermodynamics Maxwell relation Energy and Entropy 2020

The Gibbs free energy, \(G\), is given by \begin{align*} G = U + pV - TS. \end{align*}
  1. Find the total differential of \(G\). As always, show your work.
  2. Interpret the coefficients of the total differential \(dG\) in order to find a derivative expression for the entropy \(S\).
  3. From the total differential \(dG\), obtain a different thermodynamic derivative that is equal to \[ \left(\frac{\partial {S}}{\partial {p}}\right)_{T} \]
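The bookkeeping in parts 1 and 2 can be checked symbolically. The sketch below treats the differentials \(dU, dS, dV, dp, dT\) as independent symbols and assumes the thermodynamic identity \(dU = T\,dS - p\,dV\); it is a check on the algebra, not a substitute for showing your work.

```python
import sympy as sp

# State variables and their differentials, treated as independent symbols
# purely for bookkeeping purposes.
T, S, p, V = sp.symbols('T S p V')
dU, dS, dV, dp, dT = sp.symbols('dU dS dV dp dT')

# Total differential of G = U + pV - TS via the product rule:
dG = dU + p*dV + V*dp - T*dS - S*dT

# Substitute the thermodynamic identity dU = T dS - p dV:
dG = sp.expand(dG.subs(dU, T*dS - p*dV))

# The surviving terms are V dp - S dT, so the coefficient of dT is -S
# and the coefficient of dp is V.
print(dG)
```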

assignment Homework

Using Gibbs Free Energy
thermodynamics entropy heat capacity internal energy equation of state Energy and Entropy 2021 (2 years)

You are given the following Gibbs free energy: \begin{equation*} G=-k T N \ln \left(\frac{a T^{5 / 2}}{p}\right) \end{equation*} where \(a\) is a constant (whose dimensions make the argument of the logarithm dimensionless).

  1. Compute the entropy.

  2. Work out the heat capacity at constant pressure \(C_p\).

  3. Find the connection among \(V\), \(p\), \(N\), and \(T\), which is called the equation of state (Hint: find the volume as a partial derivative of the Gibbs free energy).

  4. Compute the internal energy \(U\).
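A sympy sketch offers one way to check your hand results for this particular \(G\); it assumes the derivative relations \(S = -(\partial G/\partial T)_p\) and \(V = (\partial G/\partial p)_T\) from the total differential of \(G\).

```python
import sympy as sp

k, N, a, T, p = sp.symbols('k N a T p', positive=True)

# The given Gibbs free energy:
G = -k*T*N*sp.log(a*T**sp.Rational(5, 2)/p)

S = -sp.diff(G, T)                 # entropy: coefficient of -dT in dG
V = sp.diff(G, p)                  # volume: coefficient of dp in dG
Cp = sp.simplify(T*sp.diff(S, T))  # heat capacity, Cp = T (dS/dT)_p
U = sp.simplify(G + T*S - p*V)     # internal energy from U = G + TS - pV

print(V)   # equation of state: V = N k T / p
print(Cp)  # Cp = (5/2) N k
print(U)   # U = (3/2) N k T
```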

assignment Homework

Gibbs entropy is extensive
Gibbs entropy Probability Thermal and Statistical Physics 2020

Consider two noninteracting systems \(A\) and \(B\). We can either treat these systems as separate, or as a single combined system \(AB\). We can enumerate all states of the combined system by enumerating all states of each separate system. The probability of the combined state \((i_A,j_B)\) is given by \(P_{ij}^{AB} = P_i^AP_j^B\). In other words, the probabilities combine in the same way as two dice rolls would, or the probabilities of any other uncorrelated events.

  1. Show that the entropy of the combined system \(S_{AB}\) is the sum of the entropies of the two separate systems considered individually, i.e. \(S_{AB} = S_A+S_B\). This means that entropy is extensive. Use the Gibbs entropy for this computation; no approximation is needed to solve this problem.
  2. Show that if you have \(N\) identical non-interacting systems, their total entropy is \(NS_1\) where \(S_1\) is the entropy of a single system.
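The additivity in part 1 can be sanity-checked numerically. The sketch below uses two made-up normalized probability distributions and sets \(k = 1\); it illustrates the result for one example, whereas the homework asks for a general proof.

```python
import math

def gibbs_entropy(probs, k=1.0):
    """Gibbs entropy S = -k * sum_i P_i ln P_i (a state with P_i = 0 contributes 0)."""
    return -k * sum(P * math.log(P) for P in probs if P > 0)

# Two arbitrary (made-up) normalized probability distributions:
PA = [0.5, 0.3, 0.2]
PB = [0.7, 0.2, 0.1]

# Combined system: P_ij^{AB} = P_i^A * P_j^B for uncorrelated systems.
PAB = [Pi * Pj for Pi in PA for Pj in PB]

S_A, S_B, S_AB = gibbs_entropy(PA), gibbs_entropy(PB), gibbs_entropy(PAB)
print(abs(S_AB - (S_A + S_B)) < 1e-12)  # True: the entropies add
```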

Note
In real materials, we treat properties as being extensive even when there are interactions in the system. In this case, extensivity is a property of large systems, in which surface effects may be neglected.

face Lecture

120 min.

Gibbs entropy approach
Thermal and Statistical Physics 2020

Gibbs entropy information theory probability statistical mechanics

These lecture notes for the first week of Thermal and Statistical Physics include a couple of small group activities in which students work with the Gibbs formulation of the entropy.

face Lecture

120 min.

Chemical potential and Gibbs distribution
Thermal and Statistical Physics 2020

chemical potential Gibbs distribution grand canonical ensemble statistical mechanics

These notes from the fifth week of Thermal and Statistical Physics cover the grand canonical ensemble. They include several small group activities.

face Lecture

5 min.

Energy and Entropy review
Thermal and Statistical Physics 2020 (3 years)

thermodynamics statistical mechanics

This very quick lecture reviews the content taught in Energy and Entropy, and is the first content in Thermal and Statistical Physics.

face Lecture

120 min.

Entropy and Temperature
Thermal and Statistical Physics 2020

paramagnet entropy temperature statistical mechanics

These lecture notes for the second week of Thermal and Statistical Physics involve relating entropy and temperature in the microcanonical ensemble, using a paramagnet as an example. These notes include a few small group activities.

assignment Homework

Boltzmann probabilities
Energy Entropy Boltzmann probabilities Thermal and Statistical Physics 2020 (3 years)

Consider a three-state system with energies \((-\epsilon,0,\epsilon)\).
  1. At infinite temperature, what are the probabilities of the three states being occupied? What is the internal energy \(U\)? What is the entropy \(S\)?
  2. At very low temperature, what are the three probabilities?
  3. What are the three probabilities at zero temperature? What is the internal energy \(U\)? What is the entropy \(S\)?
  4. What happens to the probabilities if you allow the temperature to be negative?
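The limiting cases in this problem can be explored numerically. The sketch below folds \(k_B T\) into a single parameter `kT` and sets \(\epsilon = 1\) in arbitrary units; the large and small `kT` values stand in for the infinite- and low-temperature limits.

```python
import math

def boltzmann_probs(energies, kT):
    """Boltzmann probabilities P_i = exp(-E_i/kT) / Z for a list of energies."""
    weights = [math.exp(-E / kT) for E in energies]
    Z = sum(weights)  # partition function
    return [w / Z for w in weights]

eps = 1.0
energies = [-eps, 0.0, eps]

# Very high temperature: all three states approach equal occupation, P_i -> 1/3.
print(boltzmann_probs(energies, kT=1e6))

# Very low (positive) temperature: the ground state -eps dominates.
print(boltzmann_probs(energies, kT=1e-2))

# Negative temperature: the highest-energy state +eps becomes the most probable.
print(boltzmann_probs(energies, kT=-1e-2))
```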

face Lecture

120 min.

Phase transformations
Thermal and Statistical Physics 2020

phase transformation Clausius-Clapeyron mean field theory thermodynamics

These lecture notes from the ninth week of Thermal and Statistical Physics cover phase transformations, the Clausius-Clapeyron relation, mean field theory and more. They include a number of small group activities.

face Lecture

120 min.

Boltzmann probabilities and Helmholtz
Thermal and Statistical Physics 2020

ideal gas entropy canonical ensemble Boltzmann probability Helmholtz free energy statistical mechanics

These notes, from the third week of Thermal and Statistical Physics, cover the canonical ensemble and Helmholtz free energy. They include a number of small group activities.

face Lecture

120 min.

Ideal Gas
Thermal and Statistical Physics 2020

ideal gas particle in a box grand canonical ensemble chemical potential statistical mechanics

These notes from week 6 of Thermal and Statistical Physics cover the ideal gas from a grand canonical standpoint, starting with the solutions to a particle in a three-dimensional box. They include a number of small group activities.