The Gibbs free energy, \(G\), is given by \begin{align*} G = U + pV - TS. \end{align*}
  1. Find the total differential of \(G\). As always, show your work.
  2. Interpret the coefficients of the total differential \(dG\) in order to find a derivative expression for the entropy \(S\).
  3. From the total differential \(dG\), obtain a different thermodynamic derivative that is equal to \[ \left(\frac{\partial {S}}{\partial {p}}\right)_{T} \]
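For checking your work, here is a hedged sketch of where this leads (assuming the thermodynamic identity \(dU = T\,dS - p\,dV\)): \begin{align*} dG &= dU + p\,dV + V\,dp - T\,dS - S\,dT \\ &= -S\,dT + V\,dp, \end{align*} so \(S = -\left(\frac{\partial G}{\partial T}\right)_p\), and equality of mixed partial derivatives gives \(\left(\frac{\partial S}{\partial p}\right)_T = -\left(\frac{\partial V}{\partial T}\right)_p\).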
  • Found in: Energy and Entropy course(s)

You are given the following Gibbs free energy: \begin{equation*} G=-k T N \ln \left(\frac{a T^{5 / 2}}{p}\right) \end{equation*} where \(a\) is a constant (whose dimensions make the argument of the logarithm dimensionless).

  1. Compute the entropy.

  2. Work out the heat capacity at constant pressure \(C_p\).

  3. Find the connection among \(V\), \(p\), \(N\), and \(T\), which is called the equation of state (Hint: find the volume as a partial derivative of the Gibbs free energy).

  4. Compute the internal energy \(U\).
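For checking your answers, a sketch of the expected results (assuming the standard identities \(S=-\left(\frac{\partial G}{\partial T}\right)_p\) and \(V=\left(\frac{\partial G}{\partial p}\right)_T\)): \begin{align*} S &= Nk\left[\ln\!\left(\frac{aT^{5/2}}{p}\right) + \frac{5}{2}\right], & C_p &= T\left(\frac{\partial S}{\partial T}\right)_p = \frac{5}{2}Nk, \\ V &= \frac{NkT}{p}, & U &= G + TS - pV = \frac{3}{2}NkT. \end{align*}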

  • Found in: Energy and Entropy course(s)

Consider two noninteracting systems \(A\) and \(B\). We can either treat these systems as separate, or as a single combined system \(AB\). We can enumerate all states of the combined system by enumerating all states of each separate system. The probability of the combined state \((i_A,j_B)\) is given by \(P_{ij}^{AB} = P_i^AP_j^B\). In other words, the probabilities combine in the same way as two dice rolls would, or the probabilities of any other uncorrelated events.

  1. Show that the entropy of the combined system \(S_{AB}\) is the sum of entropies of the two separate systems considered individually, i.e. \(S_{AB} = S_A+S_B\). This means that entropy is extensive. Use the Gibbs entropy for this computation. You need make no approximation in solving this problem.
  2. Show that if you have \(N\) identical non-interacting systems, their total entropy is \(NS_1\) where \(S_1\) is the entropy of a single system.
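A sketch of the computation for part 1, using the Gibbs entropy \(S = -k\sum_i P_i \ln P_i\) together with normalization \(\sum_i P_i^A = \sum_j P_j^B = 1\): \begin{align*} S_{AB} &= -k\sum_{ij} P_i^A P_j^B \ln\!\left(P_i^A P_j^B\right) \\ &= -k\sum_{ij} P_i^A P_j^B \left(\ln P_i^A + \ln P_j^B\right) \\ &= -k\sum_i P_i^A \ln P_i^A - k\sum_j P_j^B \ln P_j^B = S_A + S_B. \end{align*}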

Note
In real materials, we treat properties as being extensive even when there are interactions in the system. In this case, extensivity is a property of large systems, in which surface effects may be neglected.

  • Found in: Thermal and Statistical Physics course(s)

Lecture

30 min.

Review of Thermal Physics
These are notes, essentially the equation sheet, from the final review session for Thermal and Statistical Physics.

Suppose that a system of \(N\) atoms of type \(A\) is placed in diffusive contact with a system of \(N\) atoms of type \(B\) at the same temperature and volume.

  1. Show that after diffusive equilibrium is reached the total entropy is increased by \(2Nk\ln 2\). The entropy increase \(2Nk\ln 2\) is known as the entropy of mixing.

  2. If the atoms are identical (\(A=B\)), show that there is no increase in entropy when diffusive contact is established. The difference has been called the Gibbs paradox.

  3. Since the Helmholtz free energy is lower for the mixed \(AB\) than for the separated \(A\) and \(B\), it should be possible to extract work from the mixing process. Construct a process that could extract work as the two gases are mixed at fixed temperature. You will probably need to use walls that are permeable to one gas but not the other.
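One standard starting point for part 1 (a sketch, assuming each gas behaves ideally and expands isothermally from its initial volume \(V\) into the combined volume \(2V\)): each gas gains entropy \begin{equation*} \Delta S = Nk\ln\frac{2V}{V} = Nk\ln 2, \end{equation*} so the total increase for the two gases is \(2Nk\ln 2\).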

Note

This course has not yet covered work, but it was covered in Energy and Entropy, so you may need to stretch your memory to finish part 3.

  • Found in: Thermal and Statistical Physics course(s)

Lecture

120 min.

Gibbs entropy approach
These lecture notes for the first week of Thermal and Statistical Physics include a couple of small group activities in which students work with the Gibbs formulation of the entropy.
These notes from the fifth week of Thermal and Statistical Physics cover the grand canonical ensemble. They include several small group activities.

Lecture

5 min.

Energy and Entropy review
This very quick lecture reviews the content taught in Energy and Entropy, and is the first content in Thermal and Statistical Physics.

Lecture

120 min.

Entropy and Temperature
These lecture notes for the second week of Thermal and Statistical Physics involve relating entropy and temperature in the microcanonical ensemble, using a paramagnet as an example. These notes include a few small group activities.
Consider a three-state system with energies \((-\epsilon,0,\epsilon)\).
  1. At infinite temperature, what are the probabilities of the three states being occupied? What is the internal energy \(U\)? What is the entropy \(S\)?
  2. At very low temperature, what are the three probabilities?
  3. What are the three probabilities at zero temperature? What is the internal energy \(U\)? What is the entropy \(S\)?
  4. What happens to the probabilities if you allow the temperature to be negative?
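As a check on the limiting cases (assuming Boltzmann probabilities \(P_i = e^{-E_i/kT}/Z\) with \(Z = e^{\epsilon/kT} + 1 + e^{-\epsilon/kT}\)): as \(T\to\infty\) each probability approaches \(\frac{1}{3}\), so \(U = \frac{1}{3}(-\epsilon + 0 + \epsilon) = 0\) and \(S = k\ln 3\); as \(T\to 0^+\) the lowest-energy state dominates, so \(P_{-\epsilon}\to 1\), \(U\to -\epsilon\), and \(S\to 0\).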
  • Found in: Energy and Entropy, Thermal and Statistical Physics course(s)

Lecture

120 min.

Phase transformations
These lecture notes from the ninth week of Thermal and Statistical Physics cover phase transformations, the Clausius-Clapeyron relation, mean field theory and more. They include a number of small group activities.
These notes, from the third week of Thermal and Statistical Physics cover the canonical ensemble and Helmholtz free energy. They include a number of small group activities.

Lecture

120 min.

Ideal Gas
These notes from week 6 of Thermal and Statistical Physics cover the ideal gas from a grand canonical standpoint starting with the solutions to a particle in a three-dimensional box. They include a number of small group activities.