These notes from week 6 of Thermal and Statistical Physics cover the ideal gas from a grand canonical standpoint starting with the solutions to a particle in a three-dimensional box. They include a number of small group activities.
Week 6
Topics are everything through week 4, including week 3 homework, which was due in week 4. Problems should be similar to homework problems, but designed to be completed in class. The exam will be closed notes. You should be able to remember the fundamental equations: \begin{align} dU &= TdS - pdV \\ F &= U - TS \\ dF &= -SdT - pdV \\ P_i &= \frac{e^{-\beta E_i}}{Z} \\ Z &= \sum_i e^{-\beta E_i} \\ U &= \sum_i E_i P_i \\ F &= -kT\ln Z \\ S &= -k\sum_i P_i\ln P_i \\ \end{align} If you need a property of a particular system (the ideal gas, the simple harmonic oscillator), it will be given to you. There is no need, for instance, to remember the Stefan-Boltzmann law or the Planck distribution.
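As a sanity check on how these fundamental equations fit together, here is a minimal Python sketch (my own illustration, not part of the course materials) for a hypothetical two-level system; the energy gap and temperature are arbitrary choices in units where \(k=1\):

```python
import math

# Hypothetical two-level system with energies 0 and eps, in units where k = 1,
# used to check the equation-sheet relations against one another.
eps = 1.0   # energy gap (arbitrary units)
T = 0.5     # temperature (arbitrary, k = 1)
beta = 1.0 / T

energies = [0.0, eps]
Z = sum(math.exp(-beta * E) for E in energies)       # Z = sum_i e^{-beta E_i}
P = [math.exp(-beta * E) / Z for E in energies]      # Boltzmann probabilities
U = sum(E * p for E, p in zip(energies, P))          # U = sum_i E_i P_i
S = -sum(p * math.log(p) for p in P)                 # S = -k sum_i P_i ln P_i
F = -T * math.log(Z)                                 # F = -kT ln Z

# Consistency check: the two routes to F agree, F = U - TS.
assert abs(F - (U - T * S)) < 1e-12
```

The point of the check is that \(F=-kT\ln Z\) and \(F=U-TS\) are not independent facts; either can be derived from the other together with the Boltzmann probabilities.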
You may recall that when we solved for the free energy of an ideal gas, we had a fair amount of work to sum over all possible sets of quantum numbers for each atom, and then to remove the double-counting due to the fact that our atoms were identical. We had a similar issue when dealing with photon modes and blackbody radiation, but in that case one approach was to treat each mode as a separate system, and then just sum over all the modes separately, without ever needing to find the partition function of all the modes taken together.
This week we will be looking at how we can treat each orbital (i.e. possible quantum state for a single non-interacting particle) as a separate system (which may or may not be occupied). This approach only works in the grand canonical ensemble, but it will greatly simplify our understanding of such systems.
Kittel uses the term orbital to refer to an energy eigenstate (or wave function) of a one-particle system. How do things differ when we have more than one particle?
Suppose we have three particles (and ignore spin for a moment). The wave function would be written as \(\Psi(\vec r_1, \vec r_2, \vec r_3,\cdots)\). This function in general has nothing to do with any single-particle orbitals. Orbitals arise when we consider a Hamiltonian in which there are no interactions between particles: \begin{align} \hat H &= \frac{\hat p_1^2}{2m} + V(\vec r_1) + \frac{\hat p_2^2}{2m} + V(\vec r_2) + \cdots \end{align} When our Hamiltonian is separable in this way (i.e. the particles don't interact, and there are no terms that involve both \(\vec r_1\) and \(\vec r_2\)), we can use separation of variables in the solution, and we obtain a wave function that is a product of orbitals: \begin{align} \left|i_1,i_2,i_3,\cdots\right> &\dot= \phi_{i_1}(\vec r_1)\phi_{i_2}(\vec r_2)\phi_{i_3}(\vec r_3)\cdots \end{align} Assuming the potential and mass are the same for every particle, these orbitals are eigenstates of the following single-particle eigenvalue equation: \begin{align} \left(\frac{\hat p^2}{2m} + V(\vec r)\right)\phi_i(\vec r) &= \varepsilon_i \phi_i(\vec r) \end{align} There is a catch, however, which arises if the particles are truly indistinguishable (as is the case for electrons, protons, atoms of the same isotope, etc.). In this case, there is a symmetry which means that permuting the labels of our particles cannot change any probabilities: \begin{align} |\Psi(\vec r_1, \vec r_2, \vec r_3,\cdots)|^2 &= |\Psi(\vec r_2, \vec r_1, \vec r_3,\cdots)|^2 \\ &= |\Psi(\vec r_2, \vec r_3, \vec r_1,\cdots)|^2 \end{align} The simple product we wrote above doesn't have this symmetry, and thus while it is an eigenfunction of our eigenvalue equation, it cannot represent the state of a real system of identical particles. Fortunately, this is pretty easy to resolve: permuting the labels doesn't change the energy, so we have a largish degenerate subspace in which to work.
We are simply required to take a linear combination of these product states which does have the necessary symmetry.
The above equation, while true, does not tell us what happens to the wave function when we do a permutation, only to its magnitude. As it turns out, there are two types of symmetry possible: bosons and fermions.
Fermions are particles with half-integer spin, such as electrons and protons. The wave function of a system of fermions is antisymmetric when we exchange the labels of any two particles. \begin{align} \Psi(\vec r_1, \vec r_2, \vec r_3,\cdots) &= -\Psi(\vec r_2, \vec r_1, \vec r_3,\cdots) \end{align} This antisymmetry is the origin of the Pauli exclusion principle: if two particles were in the same state, the wave function would have to equal its own negative, and hence vanish.
This isn't a quantum class, so I won't say much more, but we do need to connect with the orbitals picture. When we have non-interacting fermions, their energy eigenstates can be written using a Slater determinant, which is just a convenient way to write the proper antisymmetric linear combination of all possible product states with the same set of orbitals: \begin{multline} \Psi_{i_1i_2i_3\cdots}(\vec r_1, \vec r_2, \vec r_3,\cdots) =\\ \frac1{\sqrt{N!}} \left|\begin{array}{cccc} \phi_{i_1}(\vec r_1) & \phi_{i_1}(\vec r_2) & \phi_{i_1}(\vec r_3) & \cdots\\ \phi_{i_2}(\vec r_1) & \phi_{i_2}(\vec r_2) & \phi_{i_2}(\vec r_3) & \cdots\\ \phi_{i_3}(\vec r_1) & \phi_{i_3}(\vec r_2) & \phi_{i_3}(\vec r_3) & \cdots\\ \vdots &\vdots &\vdots &\ddots \end{array}\right| \end{multline} This relies on the properties of a determinant, which changes sign if you swap two rows or two columns. This means that if two of your orbitals are the same, the result will be zero, so the “occupancy” of any orbital is either 0 or 1. Note that the \(\frac1{\sqrt{N!}}\) prefactor is required in order to ensure that the wave function is normalized, provided the orbitals are orthonormal.
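Both properties of the Slater determinant (antisymmetry under exchange, and vanishing when two particles share an orbital) can be checked numerically. The sketch below is my own illustration, using particle-in-a-box orbitals \(\phi_n(x)=\sqrt{2}\sin(n\pi x)\) for two particles in one dimension; the sample coordinates are arbitrary:

```python
import math

# Illustration (not from the notes): two particles in a 1D box of length L = 1,
# with orbitals phi_n(x) = sqrt(2) sin(n pi x). The 2x2 Slater determinant is
# Psi(x1, x2) = [phi_1(x1) phi_2(x2) - phi_1(x2) phi_2(x1)] / sqrt(2!).
def phi(n, x):
    return math.sqrt(2.0) * math.sin(n * math.pi * x)

def slater(x1, x2):
    return (phi(1, x1) * phi(2, x2) - phi(1, x2) * phi(2, x1)) / math.sqrt(2.0)

x1, x2 = 0.3, 0.7  # arbitrary sample points
# Antisymmetry: swapping the particle labels flips the sign.
assert abs(slater(x1, x2) + slater(x2, x1)) < 1e-12
# Exclusion: the amplitude for both particles at the same point vanishes.
assert abs(slater(x1, x1)) < 1e-12
```

For more particles one would build an \(N\times N\) matrix of \(\phi_{i_j}(\vec r_k)\) values and take its determinant; the \(2\times 2\) case keeps the structure visible.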
Bosons have integer spin, and differ from fermions in that their sign does not change when you interchange particles. \begin{align} \Psi(\vec r_1, \vec r_2, \vec r_3,\cdots) &= \Psi(\vec r_2, \vec r_1, \vec r_3,\cdots) \end{align}
The wavefunction for noninteracting bosons looks very much like the Slater determinant above, only with a special version of the determinant (known as a permanent) in which every term enters with a \(+\) sign. Any number of bosons can occupy a given orbital. In the limiting case where all particles are in the same orbital, a single product of orbitals satisfies the required symmetry.
Let us now consider a set of non-interacting fermions. These fermions have a Hamiltonian with a set of single-particle energy eigenvalues given by \(\varepsilon_i\). How do we find the probability of any given many-body microstate? As always, the probability of any given microstate is given by the Boltzmann distribution, but given that our particles are non-interacting, we'd prefer to deal with just one at a time. As it turns out, dealing with one particle at a time is not really possible, but in a grand canonical ensemble we can deal with a single orbital at a time with much greater ease. We can think of each orbital as a separate system, and ask how many particles it has! Particles can now be exchanged between orbitals just like they were between systems last week.
Note that the same statistics would apply to a state of a classical particle if an infinite energy were required to have two particles in the same state. The physics here is a system that can hold either zero or one particle, and there are various ways you could imagine that happening.
The average occupancy of an orbital is called the Fermi-Dirac function, and is normally written as: \begin{align} f(\varepsilon) &= \frac{1}{e^{\beta(\varepsilon-\mu)}+1} \end{align} Whenever you are looking at non-interacting fermions, \(f(\varepsilon)\) will be very helpful.
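The orbital-as-system picture makes this easy to verify directly. Here is a sketch (my own illustration, with arbitrary values of \(\varepsilon\), \(\mu\), and \(T\) in units where \(k=1\)) showing that the two-term Gibbs sum for a single fermionic orbital reproduces \(f(\varepsilon)\):

```python
import math

# Treat one fermionic orbital of energy eps as its own grand canonical system.
# Its Gibbs sum has only two terms (occupancy 0 or 1), and the average
# occupancy reproduces the Fermi-Dirac function. The values of eps, mu, T
# below are arbitrary choices for illustration (units with k = 1).
def fermi_dirac(eps, mu, T):
    return 1.0 / (math.exp((eps - mu) / T) + 1.0)

def occupancy_from_gibbs_sum(eps, mu, T):
    beta = 1.0 / T
    z = 1.0 + math.exp(-beta * (eps - mu))   # Gibbs sum: N = 0 and N = 1 terms
    return math.exp(-beta * (eps - mu)) / z  # <N> = sum_N N P(N)

eps, mu, T = 1.3, 0.5, 0.2
assert abs(fermi_dirac(eps, mu, T) - occupancy_from_gibbs_sum(eps, mu, T)) < 1e-12
```

Dividing numerator and denominator of the Gibbs-sum expression by \(e^{-\beta(\varepsilon-\mu)}\) gives the standard form of \(f(\varepsilon)\) by hand as well.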
When talking about electrons, we often refer to the chemical potential \(\mu\) as the Fermi level. Kittel also defines the Fermi energy \(\varepsilon_F\) as the Fermi level when the temperature is zero, i.e. \begin{align} \varepsilon_F \equiv \mu(T=0) \end{align} At zero temperature, all the orbitals with energy less than \(\varepsilon_F\) are occupied, while all the orbitals with higher energy are unoccupied.
You might (or might not) be wondering how we can talk about electrons as non-interacting particles. After all, they are charged particles, which naturally repel each other rather strongly. Indeed, a Slater determinant is a terrible approximation for an energy eigenstate for any many-electron system. So why are we bothering talking about orbitals and the Fermi-Dirac distribution that relies on orbitals being an actual thing?
I'm not going to thoroughly explain this, but rather just give a few hints about why what we're doing might be reasonable. The key idea is that what we are really interested in is the behavior of excited states of our many-body system. (The ground state is also very interesting, e.g. if you want to study vibrations or phonons, but not in terms of the thermal behavior of the electrons themselves.) Fortunately, even though the electrons really do interact with one another very strongly, it is possible to construct a picture of elementary excitations that treats these excitations as not interacting with one another. In this kind of a picture, what we are talking about are called quasiparticles. These represent an excitation of the many-body state. And it turns out that in many cases (particularly for solids) we can represent a given excited state of the many-body system as a sum of the energy of a bunch of non-interacting quasiparticles. When this breaks down, we invent new names like exciton to represent an excitation in which more than one quasiparticle are interacting.
The same ideas apply to bosons as to fermions: we can treat each orbital as a separate system in the grand canonical ensemble. In this case, however, the occupancy \(N\) can have any (non-negative) value.
Carrying out the same grand canonical computation for a single bosonic orbital gives the average occupancy \begin{align} f(\varepsilon) &= \frac{1}{e^{\beta(\varepsilon-\mu)}-1} \end{align} This turns out to be just the Planck distribution we already saw, only with the chemical potential as a reference energy. Why does this bosonic system look like a simple harmonic oscillator? Since the particles are non-interacting, each added particle costs the same energy, which is to say the single orbital has an equally spaced series of energy levels. Conversely, this is why we can describe excitations of the simple harmonic oscillator as bosonic phonons.
This expression, the Bose-Einstein distribution, tells us that at low temperatures we could end up seeing a lot of particles in low energy states (those with energies just above \(\mu\)), in contrast to the Fermi-Dirac distribution, which never gives more than one particle per state.
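For a bosonic orbital, the Gibbs sum is a geometric series over occupancies \(N = 0, 1, 2, \ldots\), and truncating it at large \(N\) should reproduce the Bose-Einstein distribution. A sketch (my own illustration, with arbitrary values in units where \(k=1\); convergence requires \(\varepsilon > \mu\)):

```python
import math

# Single bosonic orbital as a grand canonical system. Summing the geometric
# Gibbs series numerically and averaging N should match the Bose-Einstein
# distribution. eps > mu is required for the series to converge; the numbers
# below are arbitrary illustrations (units with k = 1).
def bose_einstein(eps, mu, T):
    return 1.0 / (math.exp((eps - mu) / T) - 1.0)

def occupancy_from_gibbs_sum(eps, mu, T, n_max=1000):
    beta = 1.0 / T
    weights = [math.exp(-beta * n * (eps - mu)) for n in range(n_max + 1)]
    z = sum(weights)                                    # truncated Gibbs sum
    return sum(n * w for n, w in enumerate(weights)) / z  # <N>

eps, mu, T = 1.0, 0.0, 0.5
assert abs(bose_einstein(eps, mu, T) - occupancy_from_gibbs_sum(eps, mu, T)) < 1e-9
```

The equally spaced weights \(e^{-\beta n(\varepsilon-\mu)}\) are exactly the Boltzmann factors of a harmonic-oscillator-like ladder, which is the connection to the Planck distribution noted above.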
We are now prepared to talk about a gas in the classical limit. In the classical limit, there is no difference in behavior between fermions and bosons. This happens when the probability of finding a particle in a particular orbital is \(\ll 1\). And this happens when \(\beta(\varepsilon-\mu)\gg 1\) for all orbitals, i.e. when \(\mu\) is very negative. When this is the case, both the Fermi-Dirac distribution and the Bose-Einstein distribution become identical. \begin{align} f_{FD}(\varepsilon) = \frac1{e^{\beta(\varepsilon-\mu)}+1} &\approx e^{-\beta(\varepsilon-\mu)} \\ f_{BE}(\varepsilon) = \frac1{e^{\beta(\varepsilon-\mu)}-1} &\approx e^{-\beta(\varepsilon-\mu)} \end{align} In this limit (which is the low-density limit), the system will behave as a classical ideal gas.
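A quick numerical illustration of the classical limit (my own sketch; the value of \(x=\beta(\varepsilon-\mu)\) is an arbitrary choice in the classical regime):

```python
import math

# When x = beta*(eps - mu) >> 1, both quantum distributions collapse onto the
# classical Boltzmann factor. The deviations are of order e^{-2x}.
def f_fd(x):
    return 1.0 / (math.exp(x) + 1.0)

def f_be(x):
    return 1.0 / (math.exp(x) - 1.0)

def f_classical(x):
    return math.exp(-x)

x = 10.0  # strongly classical regime (arbitrary choice)
assert abs(f_fd(x) - f_classical(x)) < 1e-8
assert abs(f_be(x) - f_classical(x)) < 1e-8
# The ordering always holds: f_FD < e^{-x} < f_BE.
assert f_fd(x) < f_classical(x) < f_be(x)
```

The last assertion reflects the general pattern: fermions are always slightly less likely to occupy an orbital than a classical particle, and bosons slightly more.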
A reasonable question is, “What is the chemical potential?” We already answered this, but can now look at the answer in terms of orbitals and the classical distribution function. (Note: classical distribution function is a bit of a misnomer in this context, as it tells us how many particles are in a given quantum mechanical orbital.) \begin{align} \langle N\rangle &= \sum_{i}^{\text{orbitals}} f(\varepsilon_i) \\ &= \sum_i^{\text{orbitals}} e^{-\beta(\varepsilon_i-\mu)} \\ &= e^{\beta\mu} \sum_i^{\text{orbitals}} e^{-\beta\varepsilon_i} \\ &= e^{\beta\mu} Z_1 \\ N &= e^{\beta\mu} n_QV \end{align} where \(Z_1\) is the partition function for a single particle in a box, which we derived a few weeks ago to be \(n_QV\) with \(n_Q\equiv \left(\frac{mkT}{2\pi \hbar^2}\right)^{\frac32}\). Thus we once again find the expression we found last week, \begin{align} e^{\beta\mu} &= \frac1{n_Q}\frac{N}{V} = \frac{n}{n_Q} \end{align} We can solve for the chemical potential \begin{align} \mu &= kT\left(\ln N - \ln V -\tfrac32 \ln(kT) + \tfrac32 \ln\left(2\pi\hbar^2/m\right)\right) \end{align} Thus it decreases as the volume increases or as the temperature increases. We can further find the free energy by integrating the chemical potential. This is redundant with the approach we used before, but serves as a useful consistency check. Remember that \begin{align} dF &= -SdT -pdV + \mu dN \\ \mu &= \left(\frac{\partial F}{\partial N}\right)_{V,T} \end{align} Note that the integral must be taken at fixed \(V\) and \(T\): \begin{align} F &= \int_0^N \mu dN \\ &= \int_0^N kT\left(\ln N - \ln V - \ln n_Q\right) dN \\ &= kT\left(N\ln N-N -N\ln V -N\ln n_Q\right) \\ &= NkT\left(\ln\left(\frac{n}{n_Q}\right) -1\right) \end{align}
Taking the derivative \(S = -\left(\frac{\partial F}{\partial T}\right)_{V,N}\) of this free energy (and remembering that \(n_Q\propto T^{\frac32}\)) gives the entropy \begin{align} S &= Nk\left(\ln\left(\frac{n_Q}{n}\right) + \frac52\right) \end{align} This expression for the entropy is known as the Sackur-Tetrode equation.
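The free energy and entropy expressions can be cross-checked numerically. Below is a sketch of my own, using helium at roughly room temperature and atmospheric density as an arbitrary concrete example (SI units), comparing the Sackur-Tetrode entropy against a numerical derivative \(-\partial F/\partial T\):

```python
import math

# Check the Sackur-Tetrode entropy S = Nk(ln(n_Q/n) + 5/2) against
# S = -(dF/dT)_{V,N} computed numerically from F = NkT(ln(n/n_Q) - 1).
# Helium-4 near room conditions; the particular N, V, T are arbitrary.
k = 1.380649e-23        # Boltzmann constant, J/K
hbar = 1.054571817e-34  # reduced Planck constant, J s
m = 6.6464731e-27       # mass of a helium-4 atom, kg
N = 1.0e22              # number of atoms (arbitrary)
V = 1.0e-3              # volume, m^3 (one liter)

def n_Q(T):
    return (m * k * T / (2.0 * math.pi * hbar**2)) ** 1.5

def F(T):
    n = N / V
    return N * k * T * (math.log(n / n_Q(T)) - 1.0)

def sackur_tetrode(T):
    n = N / V
    return N * k * (math.log(n_Q(T) / n) + 2.5)

T, dT = 300.0, 1e-3
S_numeric = -(F(T + dT) - F(T - dT)) / (2.0 * dT)  # central difference
assert abs(S_numeric - sackur_tetrode(T)) / sackur_tetrode(T) < 1e-6

# The classical treatment is self-consistent here: occupancies are tiny.
assert (N / V) / n_Q(T) < 1e-5
```

The last line confirms that \(n/n_Q\ll 1\) for this gas, so the classical-limit distribution used in the derivation was justified.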
This has been a series of practice computations involving the ideal gas. The results are useful for some of your homework, and the process of finding these properties is something you will need to know for the final exam. Ultimately, pretty much everything comes down to summing and integrating to find partition functions, and then taking derivatives (and occasional integrals) to find everything else.
assignment Homework
Let us imagine a new mechanics in which the allowed occupancies of an orbital are 0, 1, and 2. The values of the energy associated with these occupancies are assumed to be \(0\), \(\varepsilon\), and \(2\varepsilon\), respectively.
Derive an expression for the ensemble average occupancy \(\langle N\rangle\), when the system composed of this orbital is in thermal and diffusive contact with a reservoir at temperature \(T\) and chemical potential \(\mu\).
Return now to the usual quantum mechanics, and derive an expression for the ensemble average occupancy of an energy level which is doubly degenerate; that is, two orbitals have the identical energy \(\varepsilon\). If both orbitals are occupied the total energy is \(2\varepsilon\). How does this differ from part (a)?
face Lecture
120 min.
Fermi level fermion boson Bose gas Bose-Einstein condensate ideal gas statistical mechanics phase transition
These lecture notes from week 7 of Thermal and Statistical Physics apply the grand canonical ensemble to fermion and boson ideal gases. They include a few small group activities.
assignment Homework
Show that a Fermi electron gas in the ground state exerts a pressure \begin{align} p = \frac{\left(3\pi^2\right)^{\frac23}}{5} \frac{\hbar^2}{m}\left(\frac{N}{V}\right)^{\frac53} \end{align} In a uniform decrease of the volume of a cube every orbital has its energy raised: The energy of each orbital is proportional to \(\frac1{L^2}\) or to \(\frac1{V^{\frac23}}\).
Find an expression for the entropy of a Fermi electron gas in the region \(kT\ll \varepsilon_F\). Notice that \(S\rightarrow 0\) as \(T\rightarrow 0\).
face Lecture
30 min.
thermodynamics statistical mechanics
These are notes, essentially the equation sheet, from the final review session for Thermal and Statistical Physics.
assignment Homework
Find the chemical potential of an ideal monatomic gas in two dimensions, with \(N\) atoms confined to a square of area \(A=L^2\). The spin is zero.
Find an expression for the energy \(U\) of the gas.
Find an expression for the entropy \(\sigma\). The temperature is \(\tau = kT\).
assignment Homework
Consider a system that may be unoccupied with energy zero, or occupied by one particle in either of two states, one of energy zero and one of energy \(\varepsilon\). Find the Gibbs sum for this system in terms of the activity \(\lambda\equiv e^{\beta\mu}\). Note that the system can hold a maximum of one particle.
Solve for the thermal average occupancy of the system in terms of \(\lambda\).
Show that the thermal average occupancy of the state at energy \(\varepsilon\) is \begin{align} \langle N(\varepsilon)\rangle = \frac{\lambda e^{-\frac{\varepsilon}{kT}}}{\mathcal{Z}} \end{align}
Find an expression for the thermal average energy of the system.
Allow the possibility that the orbitals at \(0\) and at \(\varepsilon\) may each be occupied by one particle at the same time; show that \begin{align} \mathcal{Z} &= 1 + \lambda + \lambda e^{-\frac{\varepsilon}{kT}} + \lambda^2 e^{-\frac{\varepsilon}{kT}} \\ &= (1+\lambda)\left(1+\lambda e^{-\frac{\varepsilon}{kT}}\right) \end{align} Because \(\mathcal{Z}\) can be factored as shown, we have in effect two independent systems.
assignment Homework
Einstein condensation temperature Starting from the density of free particle orbitals per unit energy range \begin{align} \mathcal{D}(\varepsilon) = \frac{V}{4\pi^2}\left(\frac{2M}{\hbar^2}\right)^{\frac32}\varepsilon^{\frac12} \end{align} show that the lowest temperature at which the total number of atoms in excited states is equal to the total number of atoms is \begin{align} T_E &= \frac1{k_B} \frac{\hbar^2}{2M} \left( \frac{N}{V} \frac{4\pi^2}{\int_0^\infty\frac{\sqrt{\xi}}{e^\xi-1}d\xi} \right)^{\frac23} \end{align} The integral may be evaluated in terms of the infinite sum \(\sum_{n=1}^\infty n^{-\frac32}\), which is numerically equal to 2.612. Note that the number of atoms in excited states is found by integrating over the density of states, since the density of states includes all the states except the ground state.
Note: This problem is solved in the text itself. I intend to discuss Bose-Einstein condensation in class, but will not derive this result.
assignment Homework
Nuclei of a particular isotope species contained in a crystal have spin \(I=1\), and thus, \(m = \{+1,0,-1\}\). The interaction between the nuclear quadrupole moment and the gradient of the crystalline electric field produces a situation where the nucleus has the same energy, \(E=\varepsilon\), in the state \(m=+1\) and the state \(m=-1\), compared with an energy \(E=0\) in the state \(m=0\), i.e. each nucleus can be in one of 3 states, two of which have energy \(E=\varepsilon\) and one has energy \(E=0\).
Find the Helmholtz free energy \(F = U-TS\) for a crystal containing \(N\) nuclei which do not interact with each other.
Find an expression for the entropy as a function of temperature for this system. (Hint: use results of part a.)