Using Gibbs Free Energy

  • thermodynamics entropy heat capacity internal energy equation of state
    • Gibbs free energy

      Homework

      Gibbs free energy
      thermodynamics Maxwell relation Energy and Entropy 2020

      The Gibbs free energy, \(G\), is given by \begin{align*} G = U + pV - TS. \end{align*}
      1. Find the total differential of \(G\). As always, show your work.
      2. Interpret the coefficients of the total differential \(dG\) in order to find a derivative expression for the entropy \(S\).
      3. From the total differential \(dG\), obtain a different thermodynamic derivative that is equal to \[ \left(\frac{\partial {S}}{\partial {p}}\right)_{T} \]
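      One possible route through these steps, sketched using the thermodynamic identity \(dU = T\,dS - p\,dV\): \begin{align*} dG &= dU + p\,dV + V\,dp - T\,dS - S\,dT \\ &= -S\,dT + V\,dp, \end{align*} so the coefficients give \(S = -\left(\frac{\partial G}{\partial T}\right)_{p}\) and \(V = \left(\frac{\partial G}{\partial p}\right)_{T}\), and equality of the mixed partial derivatives of \(G\) yields the Maxwell relation \(\left(\frac{\partial S}{\partial p}\right)_{T} = -\left(\frac{\partial V}{\partial T}\right)_{p}\).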
    • Review of Thermal Physics

      Lecture

      30 min.

      Review of Thermal Physics
      Thermal and Statistical Physics 2020

      thermodynamics statistical mechanics

      These are notes, essentially the equation sheet, from the final review session for Thermal and Statistical Physics.
    • Chemical potential and Gibbs distribution

      Lecture

      120 min.

      Chemical potential and Gibbs distribution
      Thermal and Statistical Physics 2020

      chemical potential Gibbs distribution grand canonical ensemble statistical mechanics

      These notes from the fifth week of Thermal and Statistical Physics cover the grand canonical ensemble. They include several small group activities.
    • Energy and Entropy review

      Lecture

      5 min.

      Energy and Entropy review
      Thermal and Statistical Physics 2020 (3 years)

      thermodynamics statistical mechanics

      This very quick lecture reviews the content taught in Energy and Entropy, and is the first content in Thermal and Statistical Physics.
    • Gibbs entropy is extensive

      Homework

      Gibbs entropy is extensive
      Gibbs entropy Probability Thermal and Statistical Physics 2020

      Consider two noninteracting systems \(A\) and \(B\). We can either treat these systems as separate, or as a single combined system \(AB\). We can enumerate all states of the combined system by enumerating all states of each separate system. The probability of the combined state \((i_A,j_B)\) is given by \(P_{ij}^{AB} = P_i^AP_j^B\). In other words, the probabilities combine in the same way as the probabilities of two dice rolls, or of any other uncorrelated events.

      1. Show that the entropy of the combined system \(S_{AB}\) is the sum of entropies of the two separate systems considered individually, i.e. \(S_{AB} = S_A+S_B\). This means that entropy is extensive. Use the Gibbs entropy for this computation. You need make no approximation in solving this problem.
      2. Show that if you have \(N\) identical non-interacting systems, their total entropy is \(NS_1\) where \(S_1\) is the entropy of a single system.

      Note
      In real materials, we treat properties as being extensive even when there are interactions in the system. In this case, extensivity is a property of large systems, in which surface effects may be neglected.
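      A minimal sketch of the key step in part 1, using the Gibbs entropy \(S = -k\sum_i P_i\ln P_i\) and the normalizations \(\sum_i P_i^A = \sum_j P_j^B = 1\): \begin{align*} S_{AB} &= -k\sum_{ij} P_i^A P_j^B \ln\left(P_i^A P_j^B\right) \\ &= -k\sum_{ij} P_i^A P_j^B \left(\ln P_i^A + \ln P_j^B\right) \\ &= -k\sum_i P_i^A \ln P_i^A - k\sum_j P_j^B \ln P_j^B = S_A + S_B. \end{align*} Part 2 follows by applying the same argument repeatedly to \(N\) identical copies.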

    • Gibbs entropy approach

      Lecture

      120 min.

      Gibbs entropy approach
      Thermal and Statistical Physics 2020

      Gibbs entropy information theory probability statistical mechanics

      These lecture notes for the first week of Thermal and Statistical Physics include a couple of small group activities in which students work with the Gibbs formulation of the entropy.
    • Phase transformations

      Lecture

      120 min.

      Phase transformations
      Thermal and Statistical Physics 2020

      phase transformation Clausius-Clapeyron mean field theory thermodynamics

      These lecture notes from the ninth week of Thermal and Statistical Physics cover phase transformations, the Clausius-Clapeyron relation, mean field theory and more. They include a number of small group activities.
    • Entropy and Temperature

      Lecture

      120 min.

      Entropy and Temperature
      Thermal and Statistical Physics 2020

      paramagnet entropy temperature statistical mechanics

      These lecture notes for the second week of Thermal and Statistical Physics involve relating entropy and temperature in the microcanonical ensemble, using a paramagnet as an example. These notes include a few small group activities.
    • Gibbs sum for a two level system

      Homework

      Gibbs sum for a two level system
      Gibbs sum Microstate Thermal average energy Thermal and Statistical Physics 2020

      1. Consider a system that may be unoccupied with energy zero, or occupied by one particle in either of two states, one of energy zero and one of energy \(\varepsilon\). Find the Gibbs sum \(\mathcal{Z}\) for this system in terms of the activity \(\lambda\equiv e^{\beta\mu}\). Note that the system can hold a maximum of one particle.

      2. Solve for the thermal average occupancy of the system in terms of \(\lambda\).

      3. Show that the thermal average occupancy of the state at energy \(\varepsilon\) is \begin{align} \langle N(\varepsilon)\rangle = \frac{\lambda e^{-\frac{\varepsilon}{kT}}}{\mathcal{Z}} \end{align}

      4. Find an expression for the thermal average energy of the system.

      5. Allow the possibility that the orbitals at \(0\) and at \(\varepsilon\) may each be occupied by one particle at the same time; show that \begin{align} \mathcal{Z} &= 1 + \lambda + \lambda e^{-\frac{\varepsilon}{kT}} + \lambda^2 e^{-\frac{\varepsilon}{kT}} \\ &= (1+\lambda)\left(1+\lambda e^{-\frac{\varepsilon}{kT}}\right) \end{align} Because \(\mathcal{Z}\) can be factored as shown, we have in effect two independent systems.
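      A minimal sketch of parts 1 and 2: summing \(\lambda^N e^{-\varepsilon_s/kT}\) over the three allowed configurations (empty; one particle in the state of energy \(0\); one particle in the state of energy \(\varepsilon\)) gives \begin{align*} \mathcal{Z} = 1 + \lambda + \lambda e^{-\frac{\varepsilon}{kT}}, \qquad \langle N\rangle = \frac{\lambda + \lambda e^{-\frac{\varepsilon}{kT}}}{\mathcal{Z}}. \end{align*}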

    • Entropy of mixing

      Homework

      Entropy of mixing
      Entropy Equilibrium Sackur-Tetrode Thermal and Statistical Physics 2020

      Suppose that a system of \(N\) atoms of type \(A\) is placed in diffusive contact with a system of \(N\) atoms of type \(B\) at the same temperature and volume.

      1. Show that after diffusive equilibrium is reached the total entropy is increased by \(2Nk\ln 2\). The entropy increase \(2Nk\ln 2\) is known as the entropy of mixing.

      2. If the atoms are identical (\(A=B\)), show that there is no increase in entropy when diffusive contact is established. The difference has been called the Gibbs paradox.

      3. Since the Helmholtz free energy is lower for the mixed \(AB\) than for the separated \(A\) and \(B\), it should be possible to extract work from the mixing process. Construct a process that could extract work as the two gases are mixed at fixed temperature. You will probably need to use walls that are permeable to one gas but not the other.

      Note

      This course has not yet covered work, but it was covered in Energy and Entropy, so you may need to stretch your memory to finish part 3.
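      A minimal sketch of part 1: at fixed \(T\) and \(N\), the Sackur-Tetrode entropy of each ideal gas depends on volume only through an \(Nk\ln V\) term, and after mixing each species occupies the full volume \(2V\) instead of \(V\), so \begin{align*} \Delta S = Nk\ln\frac{2V}{V} + Nk\ln\frac{2V}{V} = 2Nk\ln 2. \end{align*}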

  • Energy and Entropy 2021 (2 years)

    You are given the following Gibbs free energy: \begin{equation*} G=-k T N \ln \left(\frac{a T^{5 / 2}}{p}\right) \end{equation*} where \(a\) is a constant (whose dimensions make the argument of the logarithm dimensionless).

    1. Compute the entropy.

    2. Work out the heat capacity at constant pressure \(C_p\).

    3. Find the connection among \(V\), \(p\), \(N\), and \(T\), which is called the equation of state (Hint: find the volume as a partial derivative of the Gibbs free energy).

    4. Compute the internal energy \(U\).
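    A minimal sketch of the derivatives involved, using \(dG = -S\,dT + V\,dp\) at fixed \(N\): \begin{align*} S &= -\left(\frac{\partial G}{\partial T}\right)_{p,N} = Nk\ln\left(\frac{aT^{5/2}}{p}\right) + \frac{5}{2}Nk, & C_p &= T\left(\frac{\partial S}{\partial T}\right)_{p,N} = \frac{5}{2}Nk, \\ V &= \left(\frac{\partial G}{\partial p}\right)_{T,N} = \frac{NkT}{p}, & U &= G + TS - pV = \frac{3}{2}NkT. \end{align*}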