Gibbs entropy is extensive

  • Gibbs entropy, Probability
    • Energy and Entropy review

      Lecture, 5 min.

      Thermal and Statistical Physics 2020 (3 years)

      thermodynamics, statistical mechanics

      This very quick lecture reviews the content taught in Energy and Entropy, and is the first content in Thermal and Statistical Physics.
    • Pressure of thermal radiation

      Homework

      Thermal and Statistical Physics 2020

      Thermal radiation, Pressure

      (modified from K&K 4.6) We discussed in class that \begin{align} p &= -\left(\frac{\partial F}{\partial V}\right)_T \end{align} Use this relationship to show that

      1. \begin{align} p &= -\sum_j \langle n_j\rangle\hbar \left(\frac{d\omega_j}{dV}\right), \end{align} where \(\langle n_j\rangle\) is the number of photons in the mode \(j\);

      2. Find the relationship between the pressure and the internal energy.
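
      A possible starting point for part 1 (a sketch, assuming the photon-gas free energy with the zero-point term dropped, as in K&K): \begin{align} F &= k_BT\sum_j \ln\left(1-e^{-\hbar\omega_j/k_BT}\right), & \langle n_j\rangle &= \frac{1}{e^{\hbar\omega_j/k_BT}-1}, \end{align} so that differentiating \(F\) with respect to \(V\) at fixed \(T\) brings down a factor \(\hbar\,\frac{d\omega_j}{dV}\) multiplied by the Bose-Einstein occupancy.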

    • Chemical potential and Gibbs distribution

      Lecture, 120 min.

      Thermal and Statistical Physics 2020

      chemical potential, Gibbs distribution, grand canonical ensemble, statistical mechanics

      These notes from the fifth week of Thermal and Statistical Physics cover the grand canonical ensemble. They include several small group activities.
    • Boltzmann probabilities and Helmholtz

      Lecture, 120 min.

      Thermal and Statistical Physics 2020

      ideal gas, entropy, canonical ensemble, Boltzmann probability, Helmholtz free energy, statistical mechanics

      These notes, from the third week of Thermal and Statistical Physics, cover the canonical ensemble and the Helmholtz free energy. They include a number of small group activities.
    • Gibbs entropy approach

      Lecture, 120 min.

      Thermal and Statistical Physics 2020

      Gibbs entropy, information theory, probability, statistical mechanics

      These lecture notes for the first week of Thermal and Statistical Physics include a couple of small group activities in which students work with the Gibbs formulation of the entropy.
    • Energy fluctuations

      Homework

      Thermal and Statistical Physics 2020

      energy, Boltzmann factor, statistical mechanics, heat capacity

      Consider a system of fixed volume in thermal contact with a reservoir. Show that the mean square fluctuation in the energy of the system is \begin{equation} \left<\left(\varepsilon-\langle\varepsilon\rangle\right)^2\right> = k_BT^2\left(\frac{\partial U}{\partial T}\right)_{V} \end{equation} Here \(U\) is the conventional symbol for \(\langle\varepsilon\rangle\). Hint: Use the partition function \(Z\) to relate \(\left(\frac{\partial U}{\partial T}\right)_V\) to the mean square fluctuation. Also, multiply out the term \((\cdots)^2\).
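
      As a numerical sanity check (a minimal sketch; the energy levels and temperature below are made-up values, in units where \(k_B=1\)), both sides of the identity can be compared for an arbitrary discrete spectrum:

      import numpy as np

      kB = 1.0
      E = np.array([0.0, 0.3, 0.7, 1.5, 2.0])   # hypothetical energy levels

      def canonical_U_E2(T):
          """Mean energy and mean squared energy at temperature T."""
          w = np.exp(-(E - E.min()) / (kB * T))  # shifted for numerical stability
          P = w / w.sum()
          return P @ E, P @ E**2

      T, dT = 0.8, 1e-5
      U, E2 = canonical_U_E2(T)
      fluct = E2 - U**2                          # <(eps - <eps>)^2>
      dUdT = (canonical_U_E2(T + dT)[0] - canonical_U_E2(T - dT)[0]) / (2 * dT)
      print(fluct, kB * T**2 * dUdT)             # the two numbers should agree
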
    • Energy, Entropy, and Probabilities

      Homework

      Thermal and Statistical Physics 2020

      Energy, Entropy, Probabilities, Thermodynamic identity

      The goal of this problem is to show that once we have maximized the entropy and found the microstate probabilities in terms of a Lagrange multiplier \(\beta\), we can prove that \(\beta=\frac1{kT}\) based on the statistical definitions of energy and entropy and the thermodynamic definition of temperature embodied in the thermodynamic identity.

      The internal energy and entropy are each defined as a weighted average over microstates: \begin{align} U &= \sum_i E_i P_i & S &= -k_B\sum_i P_i \ln P_i \end{align} We saw in class that the probability of each microstate can be given in terms of a Lagrange multiplier \(\beta\) as \begin{align} P_i &= \frac{e^{-\beta E_i}}{Z} & Z &= \sum_i e^{-\beta E_i} \end{align} Put these probabilities into the above weighted averages in order to relate \(U\) and \(S\) to \(\beta\). Then make use of the thermodynamic identity \begin{align} dU = TdS - pdV \end{align} to show that \(\beta = \frac1{kT}\).
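
      One intermediate step (a sketch using only the definitions above): substituting \(\ln P_i = -\beta E_i - \ln Z\) into the entropy sum gives \begin{align} S &= -k_B\sum_i P_i\left(-\beta E_i - \ln Z\right) = k_B\beta U + k_B\ln Z, \end{align} after which holding the volume (and hence the \(E_i\)) fixed and comparing \(dU\) with \(T\,dS\) lets you read off \(\beta\).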

    • Adiabatic Compression

      Homework

      Energy and Entropy 2020

      ideal gas, internal energy, engine

      A diesel engine requires no spark plug. Rather, the air in the cylinder is compressed so highly that the fuel ignites spontaneously when sprayed into the cylinder.

      In this problem, you may treat air as an ideal gas, which satisfies the equation \(pV = Nk_BT\). You may also use the property of an ideal gas that the internal energy depends only on the temperature \(T\), i.e. the internal energy does not change for an isothermal process. For air at the relevant range of temperatures the heat capacity at fixed volume is given by \(C_V=\frac52Nk_B\), which means the internal energy is given by \(U=\frac52Nk_BT\).

      Note: in this problem you are expected to use only the equations given and fundamental physics laws. Looking up the formula in a textbook is not considered a solution at this level.

      1. If the air is initially at room temperature (taken as \(20^\circ\mathrm{C}\)) and is then compressed adiabatically to \(\frac1{15}\) of the original volume, what final temperature is attained (before fuel injection)?

      2. By what factor does the pressure increase?
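
      A minimal numerical sketch of the arithmetic (assuming the adiabatic relations \(TV^{\gamma-1}=\text{const}\) and \(pV^{\gamma}=\text{const}\) with \(\gamma=7/5\), which follow from the equations given; deriving them from fundamental laws is the actual point of the problem):

      T_initial = 293.15               # 20 degrees C in kelvin
      compression = 15.0               # V_initial / V_final
      gamma = 7.0 / 5.0                # Cp/Cv implied by Cv = (5/2) N kB

      T_final = T_initial * compression**(gamma - 1)
      pressure_factor = compression**gamma
      print(T_final, pressure_factor)  # final temperature in K; pressure ratio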

    • Boltzmann probabilities

      Homework

      Thermal and Statistical Physics 2020 (3 years)

      Energy, Entropy, Boltzmann probabilities

      Consider a three-state system with energies \((-\epsilon,0,\epsilon)\).
      1. At infinite temperature, what are the probabilities of the three states being occupied? What is the internal energy \(U\)? What is the entropy \(S\)?
      2. At very low temperature, what are the three probabilities?
      3. What are the three probabilities at zero temperature? What is the internal energy \(U\)? What is the entropy \(S\)?
      4. What happens to the probabilities if you allow the temperature to be negative?
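
      A quick way to check the limiting cases (a sketch, in units where \(k_B=\epsilon=1\); the temperatures are arbitrary stand-ins for the limits):

      import numpy as np

      E = np.array([-1.0, 0.0, 1.0])    # the three energies (-eps, 0, eps)

      def probabilities(T):
          w = np.exp(-E / T)            # Boltzmann factors
          return w / w.sum()

      for T in (1e3, 1.0, 1e-2, -1.0):  # near-infinite, moderate, near-zero, negative
          P = probabilities(T)
          print(T, P, "U =", P @ E)
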
  • Thermal and Statistical Physics 2020

    Consider two noninteracting systems \(A\) and \(B\). We can either treat these systems as separate, or as a single combined system \(AB\). We can enumerate all states of the combined system by enumerating all states of each separate system. The probability of the combined state \((i_A,j_B)\) is given by \(P_{ij}^{AB} = P_i^AP_j^B\). In other words, the probabilities combine in the same way as two dice rolls would, or the probabilities of any other uncorrelated events.

    1. Show that the entropy of the combined system \(S_{AB}\) is the sum of entropies of the two separate systems considered individually, i.e. \(S_{AB} = S_A+S_B\). This means that entropy is extensive. Use the Gibbs entropy for this computation. You need make no approximation in solving this problem.
    2. Show that if you have \(N\) identical non-interacting systems, their total entropy is \(NS_1\), where \(S_1\) is the entropy of a single system. (A numerical sanity check of part 1 is sketched after the note below.)

    Note
    In real materials, we treat properties as being extensive even when there are interactions in the system. In this case, extensivity is a property of large systems, in which surface effects may be neglected.
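
    A numerical sanity check of part 1 (a sketch; the distributions are random and purely illustrative, with \(k_B=1\); the assignment asks for the general algebraic argument):

    import numpy as np

    rng = np.random.default_rng(0)

    def gibbs_entropy(P):
        return -np.sum(P * np.log(P))   # Gibbs entropy with kB = 1

    PA = rng.random(4); PA /= PA.sum()  # arbitrary 4-state system A
    PB = rng.random(6); PB /= PB.sum()  # arbitrary 6-state system B
    PAB = np.outer(PA, PB)              # P_ij^AB = P_i^A * P_j^B

    print(gibbs_entropy(PAB.ravel()), gibbs_entropy(PA) + gibbs_entropy(PB))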