Activity: Ideal Gas

Thermal and Statistical Physics 2020
These notes from week 6 of Thermal and Statistical Physics cover the ideal gas from a grand canonical standpoint starting with the solutions to a particle in a three-dimensional box. They include a number of small group activities.

Week 6

Week 6: Ideal gas (K&K 6, Schroeder 6.7)

Midterm on Monday

Topics are everything through week 4, including week 3 homework, which was due in week 4. Problems should be similar to homework problems, but designed to be completed in class. The exam will be closed notes. You should be able to remember the fundamental equations: \begin{align} dU &= TdS - pdV \\ F &= U - TS \\ dF &= -SdT - pdV \\ P_i &= \frac{e^{-\beta E_i}}{Z} \\ Z &= \sum_i e^{-\beta E_i} \\ U &= \sum_i E_i P_i \\ F &= -kT\ln Z \\ S &= -k\sum_i P_i\ln P_i \\ \end{align} If you need a property of a particular system (the ideal gas, the simple harmonic oscillator), it will be given to you. There is no need, for instance, to remember the Stefan-Boltzmann law or the Planck distribution.

Motivation

You may recall that when we solved for the free energy of an ideal gas, it took a fair amount of work to sum over all possible sets of quantum numbers for each atom, and then to remove the double counting due to the fact that our atoms were identical. We had a similar issue when dealing with photon modes and blackbody radiation, but in that case one approach was to treat each mode as a separate system, and then just sum over all the modes separately, without ever needing to find the partition function of all the modes taken together.

This week we will be looking at how we can treat each orbital (i.e. possible quantum state for a single non-interacting particle) as a separate system (which may or may not be occupied). This can only work when we work in the grand canonical ensemble, but will greatly simplify our understanding of such systems.

Quantum mechanics and orbitals

Kittel uses the term orbital to refer to an energy eigenstate (or wave function) of a one-particle system. How do things differ when we have more than one particle?

Suppose we have three particles (and ignore spin for a moment). The wave function would be written as $\Psi(\vec r_1, \vec r_2, \vec r_3,\cdots)$. This function in general has nothing to do with any single-particle orbitals. Orbitals arise when we consider a Hamiltonian in which there are no interactions between particles: \begin{align} \hat H &= \frac{\hat p_1^2}{2m} + V(\vec r_1) + \frac{\hat p_2^2}{2m} + V(\vec r_2) + \cdots \end{align} When our Hamiltonian is separable in this way (i.e. the particles don't interact, and there are no terms that involve both $\vec r_1$ and $\vec r_2$), we can use separation of variables in the solution, and we obtain a wave function that is a product of orbitals: \begin{align} \left|i_1,i_2,i_3,\cdots\right> &\dot= \phi_{i_1}(\vec r_1)\phi_{i_2}(\vec r_2)\phi_{i_3}(\vec r_3)\cdots \end{align} Assuming the potential and mass are the same for every particle, these orbitals are eigenstates of the following single-particle eigenvalue equation: \begin{align} \left(\frac{\hat p^2}{2m} + V(\vec r)\right)\phi_i(\vec r) &= \varepsilon_i \phi_i(\vec r) \end{align} There is a catch, however, which arises if the particles are truly indistinguishable (as is the case for electrons, protons, atoms of the same isotope, etc.). In this case, there is a symmetry which means that permuting the labels of our particles cannot change any probabilities: \begin{align} |\Psi(\vec r_1, \vec r_2, \vec r_3,\cdots)|^2 &= |\Psi(\vec r_2, \vec r_1, \vec r_3,\cdots)|^2 \\ &= |\Psi(\vec r_2, \vec r_3, \vec r_1,\cdots)|^2 \end{align} The simple product we wrote above doesn't have this symmetry, and thus while it is an eigenfunction of our eigenvalue equation, it cannot represent the state of a real system of identical particles. Fortunately, this is pretty easy to resolve: permuting the labels doesn't change the energy, so we have a largish degenerate subspace in which to work.
We are simply required to take a linear combination of these product states which does have the necessary symmetry.

The above equation, while true, does not tell us what happens to the wave function when we do a permutation, only to its magnitude. As it turns out, there are two types of symmetry possible: bosons and fermions.

Fermions

Fermions are particles with half-integer spin, such as electrons and protons. Fermion wave functions are antisymmetric when we exchange the labels of any two particles. \begin{align} \Psi(\vec r_1, \vec r_2, \vec r_3,\cdots) &= -\Psi(\vec r_2, \vec r_1, \vec r_3,\cdots) \end{align} This antisymmetry is the origin of the Pauli exclusion principle.

This isn't a quantum class, so I won't say much more, but we do need to connect with the orbitals picture. When we have non-interacting fermions, their energy eigenstates can be written using a Slater determinant, which is just a convenient way to write the proper antisymmetric linear combination of all possible product states with the same set of orbitals: \begin{multline} \Psi_{i_1i_2i_3\cdots}(\vec r_1, \vec r_2, \vec r_3,\cdots) =\\ \frac1{\sqrt{N!}} \left|\begin{array}{cccc} \phi_{i_1}(\vec r_1) & \phi_{i_1}(\vec r_2) & \phi_{i_1}(\vec r_3) & \cdots\\ \phi_{i_2}(\vec r_1) & \phi_{i_2}(\vec r_2) & \phi_{i_2}(\vec r_3) & \cdots\\ \phi_{i_3}(\vec r_1) & \phi_{i_3}(\vec r_2) & \phi_{i_3}(\vec r_3) & \cdots\\ \vdots &\vdots &\vdots &\ddots \end{array}\right| \end{multline} This relies on the properties of a determinant, which changes sign if you swap two rows or two columns. This means that if two of your orbitals are the same, the result will be zero, so the “occupancy” of any orbital is either 0 or 1. Note that the $\frac1{\sqrt{N!}}$ is required in order to ensure that the wave function is normalized, provided the orbitals are orthonormal.

Bosons

Bosons have integer spin, and differ from fermions in that the sign of the wave function does not change when you interchange particles. \begin{align} \Psi(\vec r_1, \vec r_2, \vec r_3,\cdots) &= \Psi(\vec r_2, \vec r_1, \vec r_3,\cdots) \end{align}

The wave function for non-interacting bosons looks very much like the Slater determinant above, only with a special version of the determinant that has all $+$ signs (called a permanent). Any number of bosons may occupy a given orbital. In the limiting case where all particles are in the same orbital, a single product of orbitals satisfies the required symmetry.

Fermi-Dirac distribution

Let us now consider a set of non-interacting fermions. These fermions have a Hamiltonian with a set of single-particle energy eigenvalues given by $\varepsilon_i$. How do we find the probability of any given many-body microstate? As always, the probability of any given microstate is given by the Boltzmann distribution, but given that our particles are non-interacting, we'd prefer to deal with just one at a time. As it turns out, dealing with one particle at a time is not really possible, but in a grand canonical ensemble we can deal with a single orbital at a time with much greater ease. We can think of each orbital as a separate system, and ask how many particles it has! Particles can now be exchanged between orbitals just like they were between systems last week.

Small groups
Work out the grand partition function for a single orbital with energy $\varepsilon$ that may be occupied by a fermion.
Now that we are thinking of an orbital as a system, we can pretty easily write down all the possible states of that system: it is either occupied or unoccupied. The latter case has $0$ energy and $N=0$, while the former case has energy $\varepsilon$ and $N=1$. Summing over these gives the Gibbs sum \begin{align} \mathcal{Z} &= \sum_i^{\text{all }\mu\text{states}} e^{-\beta(E_i - \mu N_i)} \\ &= 1 + e^{-\beta(\varepsilon - \mu)} \end{align}

Note that the same statistics would apply to a state for a classical particle if there were an infinite energy required to have two particles in the same state. The physics here is a system that can hold either zero or one particles, and there are various ways you could imagine that happening.

Small groups
Find the energy and the average occupancy ($\langle N\rangle$) of the orbital
If we want to find $\langle N\rangle$ of the system, we can do that in the usual way: \begin{align} \langle N\rangle &= \sum_i N_iP_i \\ &= \frac{0 + e^{-\beta(\varepsilon-\mu)}}{\mathcal{Z}} \\ &= \frac{e^{-\beta(\varepsilon-\mu)}}{ 1 + e^{-\beta(\varepsilon - \mu)}} \\ &= \frac{1}{ 1 + e^{\beta(\varepsilon - \mu)}} \end{align} Finding the energy is basically the same, since the energy is proportional to the occupancy: \begin{align} \langle E\rangle &= \sum_i E_iP_i \\ &= \frac{0 + \varepsilon e^{-\beta(\varepsilon-\mu)}}{\mathcal{Z}} \\ &= \varepsilon \langle N\rangle \end{align}

The average occupancy of an orbital is called the Fermi-Dirac function, and is normally written as: \begin{align} f(\varepsilon) &= \frac{1}{e^{\beta(\varepsilon-\mu)}+1} \end{align} Whenever you are looking at non-interacting fermions, $f(\varepsilon)$ will be very helpful.

Small groups
Sketch the Fermi-Dirac function.
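As a numerical companion to the sketch (this snippet is a sketch of mine, not part of the original notes; the energies and temperatures are arbitrary choices), we can tabulate $f(\varepsilon)$ at a few temperatures and watch the step at $\varepsilon=\mu$ sharpen as $kT\to0$:

```python
import math

def f_FD(eps, mu, kT):
    """Fermi-Dirac occupancy of an orbital with energy eps."""
    return 1.0 / (math.exp((eps - mu) / kT) + 1.0)

mu = 0.0
for kT in (0.5, 0.1, 0.01):
    # occupancy is ~1 well below mu, exactly 1/2 at mu, and ~0 well above mu;
    # lowering kT sharpens the step toward the zero-temperature limit
    row = [f_FD(eps, mu, kT) for eps in (-1.0, -0.1, 0.0, 0.1, 1.0)]
    print(kT, [round(f, 3) for f in row])
```

The rounded rows make the point of the sketch: at $kT=0.5$ the function falls off gradually, while at $kT=0.01$ it is essentially a step function at $\mu$.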

When talking about electrons, we often refer to the chemical potential $\mu$ as the Fermi level. Kittel also defines the Fermi energy $\varepsilon_F$ as the Fermi level when the temperature is zero, i.e. \begin{align} \varepsilon_F \equiv \mu(T=0) \end{align} At zero temperature, all the orbitals with energy less than $\varepsilon_F$ are occupied, while all the orbitals with higher energy are unoccupied.

Actual electrons

You might (or might not) be wondering how we can talk about electrons as non-interacting particles. After all, they are charged particles, which naturally repel each other rather strongly. Indeed, a Slater determinant is a terrible approximation for an energy eigenstate for any many-electron system. So why are we bothering talking about orbitals and the Fermi-Dirac distribution that relies on orbitals being an actual thing?

I'm not going to thoroughly explain this, but rather just give a few hints about why what we're doing might be reasonable. The key idea is that what we are really interested in is the behavior of excited states of our many-body system. (The ground state is also very interesting, e.g. if you want to study vibrations or phonons, but not in terms of the thermal behavior of the electrons themselves.) Fortunately, even though the electrons really do interact with one another very strongly, it is possible to construct a picture of elementary excitations that treats these excitations as not interacting with one another. In this kind of a picture, what we are talking about are called quasiparticles. These represent an excitation of the many-body state. And it turns out that in many cases (particularly for solids) we can represent a given excited state of the many-body system as a sum of the energy of a bunch of non-interacting quasiparticles. When this breaks down, we invent new names like exciton to represent an excitation in which multiple quasiparticles interact.

Bose-Einstein distribution

The same ideas apply to bosons as to fermions: we can treat each orbital as a separate system in the grand canonical ensemble. In this case, however, the occupancy $N$ can have any (non-negative) value.

Small groups
Solve for the Gibbs sum for an orbital with energy $\varepsilon$, and for the average occupancy $\langle N\rangle$ of a single orbital occupied by bosons.
The Gibbs sum will be \begin{align} \mathcal{Z} &= \sum_{N=0}^{\infty} e^{-\beta(N\varepsilon-\mu N)} \\ &= \sum_{N=0}^{\infty} \left(e^{-\beta(\varepsilon-\mu)}\right)^N \end{align} This looks suspiciously like a simple harmonic oscillator. The same geometric summation trick applies, and we see that \begin{align} \mathcal{Z} &= 1 + e^{-\beta(\varepsilon-\mu)} + \left(e^{-\beta(\varepsilon-\mu)}\right)^2 +\cdots \\ e^{-\beta(\varepsilon-\mu)}\mathcal{Z} &= e^{-\beta(\varepsilon-\mu)} + \left(e^{-\beta(\varepsilon-\mu)}\right)^2 +\cdots \end{align} Subtracting the two gives \begin{align} \left(1 - e^{-\beta(\varepsilon-\mu)}\right)\mathcal{Z} &= 1 \\ \mathcal{Z} &= \frac1{1 - e^{-\beta(\varepsilon-\mu)}} \end{align} Solving for the average occupancy $\langle N\rangle$ is more tedious than for a fermion: \begin{align} \langle N\rangle &= \sum_i N_i P_i \\ &= \frac1{\mathcal{Z}}\sum_{N=0}^\infty N e^{-\beta(\varepsilon-\mu)N} \\ &= \frac1{\mathcal{Z}}\frac{\partial \mathcal{Z}}{\partial \mu}\left(\frac{1}{\beta}\right) \\ &= -\frac{\bcancel{1 - e^{-\beta(\varepsilon-\mu)}}}{ \left(1 - e^{-\beta(\varepsilon-\mu)}\right)^{\bcancel{2}} } \left(-e^{-\beta(\varepsilon-\mu)}\right) \cancel{\beta}\left(\frac{1}{\cancel{\beta}}\right) \\ &= \frac{ e^{-\beta(\varepsilon-\mu)} }{ 1-e^{-\beta(\varepsilon-\mu)} } \\ f(\varepsilon) &= \frac{ 1 }{ e^{\beta(\varepsilon-\mu)}-1 } \end{align}
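We can sanity-check the geometric series numerically (a sketch of mine, not from the notes; the values of $\varepsilon$, $\mu$, and $kT$ are arbitrary, chosen with $\varepsilon>\mu$ so the sum converges):

```python
import math

def z_truncated(eps, mu, kT, nmax=500):
    """Gibbs sum for one bosonic orbital, truncated at N = nmax."""
    x = math.exp(-(eps - mu) / kT)
    return sum(x**N for N in range(nmax + 1))

def z_closed(eps, mu, kT):
    """Closed form 1/(1 - e^{-beta(eps-mu)}) from summing the geometric series."""
    return 1.0 / (1.0 - math.exp(-(eps - mu) / kT))

def n_BE(eps, mu, kT):
    """Bose-Einstein occupancy 1/(e^{beta(eps-mu)} - 1)."""
    return 1.0 / (math.exp((eps - mu) / kT) - 1.0)

eps, mu, kT = 1.0, 0.2, 0.3
z = z_truncated(eps, mu, kT)
# average occupancy computed directly from the truncated Gibbs sum
avg = sum(N * math.exp(-N * (eps - mu) / kT) for N in range(501)) / z
print(z, z_closed(eps, mu, kT))   # truncated sum matches the closed form
print(avg, n_BE(eps, mu, kT))     # and <N> matches the Bose-Einstein result
```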

This turns out to be just the Planck distribution we already saw, only with the chemical potential as a reference energy. Why does this bosonic system look like a simple harmonic oscillator? Since the particles are non-interacting, the orbital's total energy takes the same equally spaced set of values $N\varepsilon$ as the oscillator's energy eigenvalues. Conversely, this is why we can describe the excitations of a simple harmonic oscillator as bosonic phonons.

Small groups
Sketch the Bose-Einstein distribution function.

This expression, the Bose-Einstein distribution, tells us that at low temperatures, we could end up seeing a lot of particles in the lowest energy states (when $\mu$ approaches the lowest orbital energy from below), in contrast to the Fermi-Dirac distribution, which never allows more than one particle per orbital.

Entropy

Small groups
Find the entropy of a single orbital that may hold a fermion.
We begin with the probabilities of the two microstates: \begin{align} P_0 &= \frac1{\mathcal{Z}} & P_1 &= \frac{e^{-\beta(\varepsilon-\mu)}}{\mathcal{Z}} \end{align} where \begin{align} \mathcal{Z} &= 1 + e^{-\beta(\varepsilon-\mu)} \end{align} Now we just find the entropy using the Gibbs formula $S = -k\sum_i P_i\ln P_i$: \begin{align} S &= -k\left(P_0\ln P_0 + P_1\ln P_1\right) \end{align} Writing $f$ for the occupancy (so $P_1 = f$ and $P_0 = 1-f$), this becomes \begin{align} S &= -k\Big((1-f)\ln(1-f) + f\ln f\Big) \end{align} which vanishes when $f=0$ or $f=1$, and reaches its maximum value $k\ln 2$ when $\varepsilon=\mu$ and $f=\frac12$.
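The probabilities above translate directly into a numerical check (a sketch of mine, using the Gibbs entropy formula $S=-k\sum_i P_i\ln P_i$ from the equation sheet, in units where $k=1$):

```python
import math

def orbital_entropy(eps, mu, kT):
    """Entropy (in units of k) of one fermionic orbital: S = -sum_i P_i ln P_i."""
    z = 1.0 + math.exp(-(eps - mu) / kT)
    p0, p1 = 1.0 / z, math.exp(-(eps - mu) / kT) / z
    return -(p0 * math.log(p0) + p1 * math.log(p1))

# The same quantity written in terms of the occupancy f = 1/(e^{(eps-mu)/kT} + 1):
def orbital_entropy_from_f(eps, mu, kT):
    f = 1.0 / (math.exp((eps - mu) / kT) + 1.0)
    return -((1.0 - f) * math.log(1.0 - f) + f * math.log(f))

print(orbital_entropy(0.0, 0.0, 1.0), math.log(2))  # maximum entropy at eps = mu
print(orbital_entropy(5.0, 0.0, 1.0))               # nearly zero far above mu
```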

Classical ideal gas

We are now prepared to talk about a gas in the classical limit. In the classical limit, there is no difference in behavior between fermions and bosons. This happens when the probability of finding a particle in a particular orbital is $\ll 1$, which requires $\beta(\varepsilon-\mu)\gg 1$ for all orbitals, i.e. $\mu$ must be very negative. When this is the case, the Fermi-Dirac and Bose-Einstein distributions become identical. \begin{align} f_{FD}(\varepsilon) = \frac1{e^{\beta(\varepsilon-\mu)}+1} &\approx e^{-\beta(\varepsilon-\mu)} \\ f_{BE}(\varepsilon) = \frac1{e^{\beta(\varepsilon-\mu)}-1} &\approx e^{-\beta(\varepsilon-\mu)} \end{align} In this limit (which is the low-density limit), the system behaves as a classical ideal gas.
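A quick numerical comparison (a sketch of mine, in terms of the dimensionless variable $x \equiv \beta(\varepsilon-\mu)$) shows how fast the three distributions merge:

```python
import math

def f_FD(x):  # Fermi-Dirac occupancy, x = beta*(eps - mu)
    return 1.0 / (math.exp(x) + 1.0)

def f_BE(x):  # Bose-Einstein occupancy
    return 1.0 / (math.exp(x) - 1.0)

def f_cl(x):  # classical (Boltzmann) limit
    return math.exp(-x)

for x in (1.0, 3.0, 10.0):
    print(x, f_FD(x), f_BE(x), f_cl(x))
# by x = 10 all three agree to better than 0.01%
```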

A reasonable question is, “what is the chemical potential?” We already handled this, but can now look at the answer in terms of orbitals and the classical distribution function. (Note: classical distribution function is a bit of a misnomer in this context, as it tells us how many particles occupy a given quantum mechanical orbital.) \begin{align} \langle N\rangle &= \sum_{i}^{\text{orbitals}} f(\varepsilon_i) \\ &= \sum_i^{\text{orbitals}} e^{-\beta(\varepsilon_i-\mu)} \\ &= e^{\beta\mu} \sum_i^{\text{orbitals}} e^{-\beta\varepsilon_i} \\ &= e^{\beta\mu} Z_1 \\ N &= e^{\beta\mu} n_QV \end{align} where $Z_1$ is the partition function for a single particle in a box, which we derived a few weeks ago to be $n_QV$ with $n_Q\equiv \left(\frac{mkT}{2\pi \hbar^2}\right)^{\frac32}$. Thus we once again find the expression from last week, \begin{align} e^{\beta\mu} &= \frac1{n_Q}\frac{N}{V} = \frac{n}{n_Q} \end{align} We can solve for the chemical potential \begin{align} \mu &= kT\left(\ln N - \ln V -\tfrac32 \ln(kT) + \tfrac32 \ln\left(2\pi\hbar^2/m\right)\right) \end{align} Thus it decreases as the volume increases or as the temperature increases. We can further find the free energy by integrating the chemical potential. This duplicates a result we derived earlier, but it is a useful check. Remember that \begin{align} dF &= -SdT -pdV + \mu dN \\ \mu &= \left(\frac{\partial F}{\partial N}\right)_{V,T} \end{align} Note that this must be an integral at fixed $V$ and $T$: \begin{align} F &= \int_0^N \mu dN \\ &= \int_0^N kT\left(\ln N - \ln V - \ln n_Q\right) dN \\ &= kT\left(N\ln N-N -N\ln V -N\ln n_Q\right) \\ &= NkT\left(\ln\left(\frac{n}{n_Q}\right) -1\right) \end{align}
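The consistency of the integration can be checked by differentiating numerically (a sketch of mine, in arbitrary units where $k=1$ and $mk/(2\pi\hbar^2)=1$, so $n_Q = T^{3/2}$; the specific values of $N$, $V$, $T$ are arbitrary):

```python
import math

def n_Q(T):
    # quantum concentration in units where m*k/(2*pi*hbar^2) = 1
    return T ** 1.5

def F(N, V, T):
    """Free energy N k T (ln(n/n_Q) - 1), with k = 1."""
    return N * T * (math.log((N / V) / n_Q(T)) - 1.0)

def mu(N, V, T):
    """Chemical potential k T ln(n/n_Q)."""
    return T * math.log((N / V) / n_Q(T))

# dF/dN at fixed V and T should reproduce mu:
N, V, T, h = 100.0, 50.0, 2.0, 1e-6
mu_numeric = (F(N + h, V, T) - F(N - h, V, T)) / (2 * h)
print(mu_numeric, mu(N, V, T))
```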

Small groups
Solve for the entropy of the ideal gas (from this free energy).
\begin{align} -S &= \left(\frac{\partial F}{\partial T}\right)_{V,N} \\ &= Nk\left(\ln\left(\frac{n}{n_Q}\right) -1\right) - \frac{NkT}{n_Q}\frac{dn_Q}{dT} \\ &= Nk\left(\ln\left(\frac{n}{n_Q}\right) -1\right) - \frac{Nk\bcancel{T}}{\cancel{n_Q}}\frac32 \frac{\cancel{n_Q}}{\bcancel{T}} \\ -S &= Nk\left(\ln\left(\frac{n}{n_Q}\right) - \frac52\right) \\ S&= Nk\left(\ln\left(\frac{n_Q}{n}\right) + \frac52\right) \end{align}

This expression for the entropy is known as the Sackur-Tetrode equation.

Small groups
Solve for the pressure of the ideal gas (from the free energy)
\begin{align} p &= -\left(\frac{\partial F}{\partial V}\right)_{T,N} \\ &= \frac{NkT}{V} \end{align} That was pretty easy, once we saw that $n_Q$ was independent of volume. This expression is known as the ideal gas law.
Small groups
Solve for the internal energy of the ideal gas
\begin{align} U &= F+TS \\ &= \frac32 NkT \end{align} Also pretty familiar.
Small groups
Solve for the heat capacity at constant volume of the ideal gas
\begin{align} C_V &= \left(\frac{\partial U}{\partial T}\right)_{V,N} \\ &= T\left(\frac{\partial S}{\partial T}\right)_{V,N} \\ &= \frac32 Nk \end{align} This one is relatively easy.
Small groups
Solve for the heat capacity at constant pressure of the ideal gas
\begin{align} C_p &= T\left(\frac{\partial S}{\partial T}\right)_{p,N} \\ \end{align} This one requires one (small) step more. We have to convert the volume into a pressure in the free energy expression. \begin{align} C_p &= T\left(\frac{\partial Nk\left(\ln\left(\frac{n_Q}{n}\right) + \frac52\right)}{\partial T}\right)_{p,N} \\ &= NkT\left(\frac{\partial \left(\ln\left(\frac{n_Q}{n}\right) + \frac52\right)}{\partial T}\right)_{p,N} \\ &= NkT\left(\frac{\partial \ln\left(\frac{Vn_Q}{N}\right)}{\partial T}\right)_{p,N} \\ &= NkT\left(\frac{\partial \ln\left(\frac{NkT}{p}\frac{n_Q}{N}\right)}{\partial T}\right)_{p,N} \end{align} At this point we peek inside and see that $n_Q \propto T^{\frac32}$ and can complete the derivative \begin{align} C_p &= \frac52 Nk \end{align}
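All of these results can be verified at once by differentiating $F$ numerically (a sketch of mine, in the same arbitrary units as above: $k=1$ and $n_Q=T^{3/2}$; the values of $N$, $V$, $T$ are arbitrary, chosen in the low-density regime):

```python
import math

def n_Q(T):
    return T ** 1.5  # units where m*k/(2*pi*hbar^2) = 1

def F(N, V, T):
    return N * T * (math.log((N / V) / n_Q(T)) - 1.0)  # k = 1

def S(N, V, T, h=1e-5):
    # entropy as -dF/dT at fixed V, N (central finite difference)
    return -(F(N, V, T + h) - F(N, V, T - h)) / (2 * h)

N, V, T, h = 1.0, 1000.0, 2.0, 1e-4
p = -(F(N, V + h, T) - F(N, V - h, T)) / (2 * h)   # pressure: -dF/dV
U = F(N, V, T) + T * S(N, V, T)                    # internal energy: F + TS
C_V = T * (S(N, V, T + h) - S(N, V, T - h)) / (2 * h)
p0 = N * T / V
# at fixed pressure the volume varies with T as V = N*T/p0
C_p = T * (S(N, N * (T + h) / p0, T + h) - S(N, N * (T - h) / p0, T - h)) / (2 * h)

print(S(N, V, T), N * (math.log(n_Q(T) * V / N) + 2.5))  # Sackur-Tetrode
print(p, N * T / V)        # ideal gas law
print(U, 1.5 * N * T)      # U = (3/2) N k T
print(C_V, C_p)            # 3/2 and 5/2
```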

This has been a series of practice computations involving the ideal gas. The results are useful for some of your homework, and the process of finding these properties is something you will need to know for the final exam. Ultimately, pretty much everything comes down to summing and integrating to find partition functions, and then taking derivatives (and occasional integrals) to find everything else.

Homework

Distribution function for double occupancy statistics
Thermal and Statistical Physics 2020

Let us imagine a new mechanics in which the allowed occupancies of an orbital are 0, 1, and 2. The values of the energy associated with these occupancies are assumed to be $0$, $\varepsilon$, and $2\varepsilon$, respectively.

1. Derive an expression for the ensemble average occupancy $\langle N\rangle$, when the system composed of this orbital is in thermal and diffusive contact with a reservoir at temperature $T$ and chemical potential $\mu$.

2. Return now to the usual quantum mechanics, and derive an expression for the ensemble average occupancy of an energy level which is doubly degenerate; that is, two orbitals have the identical energy $\varepsilon$. If both orbitals are occupied the total energy is $2\varepsilon$. How does this differ from part (1)?

Lecture

120 min.

Fermi and Bose gases
Thermal and Statistical Physics 2020

These lecture notes from week 7 of Thermal and Statistical Physics apply the grand canonical ensemble to fermion and boson ideal gases. They include a few small group activities.
Homework

Pressure and entropy of a degenerate Fermi gas
Thermal and Statistical Physics 2020
1. Show that a Fermi electron gas in the ground state exerts a pressure \begin{align} p = \frac{\left(3\pi^2\right)^{\frac23}}{5} \frac{\hbar^2}{m}\left(\frac{N}{V}\right)^{\frac53} \end{align} In a uniform decrease of the volume of a cube every orbital has its energy raised: The energy of each orbital is proportional to $\frac1{L^2}$ or to $\frac1{V^{\frac23}}$.

2. Find an expression for the entropy of a Fermi electron gas in the region $kT\ll \varepsilon_F$. Notice that $S\rightarrow 0$ as $T\rightarrow 0$.

Lecture

30 min.

Review of Thermal Physics
Thermal and Statistical Physics 2020

These are notes, essentially the equation sheet, from the final review session for Thermal and Statistical Physics.
Homework

Ideal gas in two dimensions
Thermal and Statistical Physics 2020
1. Find the chemical potential of an ideal monatomic gas in two dimensions, with $N$ atoms confined to a square of area $A=L^2$. The spin is zero.

2. Find an expression for the energy $U$ of the gas.

3. Find an expression for the entropy $\sigma = S/k$. The temperature is $\tau = kT$.

Homework

Gibbs sum for a two level system
Thermal and Statistical Physics 2020
1. Consider a system that may be unoccupied with energy zero, or occupied by one particle in either of two states, one of energy zero and one of energy $\varepsilon$. Find the Gibbs sum for this system in terms of the activity $\lambda\equiv e^{\beta\mu}$. Note that the system can hold a maximum of one particle.

2. Solve for the thermal average occupancy of the system in terms of $\lambda$.

3. Show that the thermal average occupancy of the state at energy $\varepsilon$ is \begin{align} \langle N(\varepsilon)\rangle = \frac{\lambda e^{-\frac{\varepsilon}{kT}}}{\mathcal{Z}} \end{align}

4. Find an expression for the thermal average energy of the system.

5. Allow the possibility that the orbitals at $0$ and at $\varepsilon$ may each be occupied by one particle at the same time. Show that \begin{align} \mathcal{Z} &= 1 + \lambda + \lambda e^{-\frac{\varepsilon}{kT}} + \lambda^2 e^{-\frac{\varepsilon}{kT}} \\ &= (1+\lambda)\left(1+e^{-\frac{\varepsilon}{kT}}\right) \end{align} Because $\mathcal{Z}$ can be factored as shown, we have in effect two independent systems.

Homework

Fluctuations in a Fermi gas
Thermal and Statistical Physics 2020

(K&K 7.11) Show for a single orbital of a fermion system that \begin{align} \left<(\Delta N)^2\right> = \left<N\right>(1-\left<N\right>) \end{align} if $\left<N\right>$ is the average number of fermions in that orbital. Notice that the fluctuation vanishes for orbitals with energies far enough from the chemical potential $\mu$ so that $\left<N\right>=1$ or $\left<N\right>=0$.
Homework

Quantum concentration
Thermal and Statistical Physics 2020

Consider one particle confined to a cube of side $L$; the concentration in effect is $n=L^{-3}$. Find the kinetic energy of the particle when in the ground state. There will be a value of the concentration for which this zero-point quantum kinetic energy is equal to the temperature $kT$. (At this concentration the occupancy of the lowest orbital is of the order of unity; the lowest orbital always has a higher occupancy than any other orbital.) Show that the concentration $n_0$ thus defined is equal to the quantum concentration $n_Q$ defined by (63): $$n_Q \equiv \left(\frac{MkT}{2\pi\hbar^2}\right)^{\frac32}$$ within a factor of the order of unity.
Homework

Einstein condensation temperature
Thermal and Statistical Physics 2020

Starting from the density of free particle orbitals per unit energy range \begin{align} \mathcal{D}(\varepsilon) = \frac{V}{4\pi^2}\left(\frac{2M}{\hbar^2}\right)^{\frac32}\varepsilon^{\frac12} \end{align} show that the lowest temperature at which the total number of atoms in excited states is equal to the total number of atoms is \begin{align} T_E &= \frac1{k_B} \frac{\hbar^2}{2M} \left( \frac{N}{V} \frac{4\pi^2}{\int_0^\infty\frac{\sqrt{\xi}}{e^\xi-1}d\xi} \right)^{\frac23} \end{align} The integral may be evaluated in terms of the infinite sum $\sum_{n=1}^\infty n^{-\frac32} = \zeta\left(\tfrac32\right) \approx 2.612$. Note that the number of atoms here is derived by integrating over the density of states, which includes all states except the ground state.

Note: This problem is solved in the text itself. I intend to discuss Bose-Einstein condensation in class, but will not derive this result.

Lecture

120 min.

Boltzmann probabilities and Helmholtz
Thermal and Statistical Physics 2020

These notes, from the third week of Thermal and Statistical Physics cover the canonical ensemble and Helmholtz free energy. They include a number of small group activities.
