Energy and Entropy: Winter-2026
HW 6 (SOLUTION): Due Day 29 W10 D3

  1. Boltzmann probabilities Consider a three-state system with energies \((-\epsilon,0,\epsilon)\).
    1. At infinite temperature, what are the probabilities of the three states being occupied? What is the internal energy \(U\)? What is the entropy \(S\)?

      At infinite temperature \(\beta=0\), which makes computing probabilities easy: they are all equal. Thus the probabilities are each \(1/3\).

      The internal energy is given by the average of the energies of the microstates, which in this case gives us zero. \begin{align} U&=\sum_i E_iP_i \\ &= -\epsilon\frac13 + 0 \cdot \frac13 + \epsilon\frac13 \\ &= 0 \end{align}

      The entropy is given by the Gibbs expression \begin{align} S&=-k\sum_i P_i \ln P_i \\ &= -k\left(\frac13\ln(1/3) + \frac13\ln(1/3) + \frac13\ln(1/3)\right) \\ &= k\ln 3 \end{align}
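      These results are easy to verify numerically. Here is a quick Python sanity check (not part of the solution itself) of the infinite-temperature energy and entropy:

```python
from math import log

# Three equally probable states at infinite temperature (beta = 0)
probs = [1 / 3, 1 / 3, 1 / 3]
energies = [-1, 0, 1]  # in units of epsilon

# Gibbs entropy in units of k: S/k = -sum P ln P
S_over_k = -sum(p * log(p) for p in probs)

# Internal energy in units of epsilon: U = sum E_i P_i
U_over_eps = sum(E * p for E, p in zip(energies, probs))

print(S_over_k)    # equals ln 3 = 1.0986...
print(U_over_eps)  # equals 0
```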

    2. At very low temperature, what are the three probabilities?

      At very low temperatures \(\beta\epsilon\gg1\). Remember that the probabilities are given by \begin{align} P_i &= \frac{e^{-\beta E_i}}{Z} \\ Z &= e^{\beta\epsilon} + 1 + e^{-\beta\epsilon} \end{align} We can see that our "small quantity" for a power series should be \(e^{-\beta\epsilon}\), since that is the small thing in the partition function. We can start with the ground state, which we expect to be overwhelmingly occupied: \begin{align} P_{0} &= \frac{e^{\beta\epsilon}}{e^{\beta\epsilon} + 1 + e^{-\beta\epsilon}} \\ &= \frac{1}{1 + e^{-\beta\epsilon} + e^{-2\beta\epsilon}} \\ &\approx 1 - \left(e^{-\beta\epsilon} + e^{-2\beta\epsilon}\right) + \left(e^{-\beta\epsilon} + \cancel{e^{-2\beta\epsilon}}\right)^2 + \cdots \end{align} At the last step, we used the power series for \(1/(1+z)\) with \(z = e^{-\beta\epsilon} + e^{-2\beta\epsilon}\); inside the squared term we may drop the \(e^{-2\beta\epsilon}\), since it only contributes at higher order. We now need to gather terms so that we keep all terms to the same order. In this case the best option is to keep all terms up to \(e^{-2\beta\epsilon}\), since that way we will be able to account for the occupation of the highest energy state. Keeping these terms gives \begin{align} P_0 \approx 1 - e^{-\beta\epsilon} + \mathcal{O}\left(e^{-3\beta\epsilon}\right) \end{align} because the \(e^{-2\beta\epsilon}\) terms cancel each other out. Thus the ground state will be almost 100% occupied. For the other two states we will get exponentially smaller probabilities: \begin{align} P_{1} &= \frac{1}{e^{\beta\epsilon} + 1 + e^{-\beta\epsilon}} \\ &= e^{-\beta\epsilon}\frac{1}{1 + e^{-\beta\epsilon} + e^{-2\beta\epsilon}} \\ &= e^{-\beta\epsilon} P_0 \\ &\approx e^{-\beta\epsilon}\left(1 - e^{-\beta\epsilon}\right) \\ &= e^{-\beta\epsilon} - e^{-2\beta\epsilon} \end{align} The middle state with zero energy is less occupied than the ground state by precisely a factor of \(e^{-\beta\epsilon}\). We could have predicted this from the Boltzmann ratio.
\begin{align} P_{2} &= \frac{e^{-\beta\epsilon}}{e^{\beta\epsilon} + 1 + e^{-\beta\epsilon}} \\ &= e^{-\beta\epsilon}P_1 \\ &\approx e^{-2\beta\epsilon} \end{align} And the high energy state is hardly occupied at all, the same factor smaller than the previous state.

      This solution kept all terms that were at least order \(e^{-2\beta\epsilon}\) for each probability, which resulted in a set of probabilities that add up to one. It would also have been reasonable to answer that \(P_0\approx 1\) and \(P_1\approx e^{-\beta\epsilon}\), and then discuss that actually the probability of being in the ground state is not precisely 1.

      I could understand saying that \(P_2\approx 0\), but ideally you should give a nonzero answer for each probability when asked about very low temperatures, because none of them are exactly zero. If you have an experiment that measures \(P_2\) (perhaps state 2 has a distinctive property you can observe), then you will not find it to be zero at any temperature (unless you have poor resolution), and it is best to show how it scales.
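      We can check these low-temperature expansions against the exact probabilities with a short Python computation (the value \(\beta\epsilon=5\) is just an illustrative choice of "very low" temperature):

```python
from math import exp

x = 5.0  # beta * epsilon, chosen large so the expansions should be good
Z = exp(x) + 1 + exp(-x)
P0, P1, P2 = exp(x) / Z, 1 / Z, exp(-x) / Z

# The expansions derived above, keeping terms through exp(-2 beta epsilon)
P0_approx = 1 - exp(-x)
P1_approx = exp(-x) - exp(-2 * x)
P2_approx = exp(-2 * x)

print(P0, P0_approx)  # agree up to order exp(-3 beta epsilon)
print(P1, P1_approx)
print(P2, P2_approx)
print(P0 + P1 + P2)   # exact probabilities sum to 1
```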

    3. What are the three probabilities at zero temperature? What is the internal energy \(U\)? What is the entropy \(S\)?

      At zero temperature, the above answers simplify. The lowest energy state is occupied with 100% probability, and there is zero probability of being in either of the two higher energy states.

      The internal energy at zero temperature is thus \(-\epsilon\), since the system is definitely in the ground state.

      The entropy is maybe a bit tricky, depending on whether you remember the value of zero times the log of zero (which is tricky because it is zero times negative infinity). I'll go through this in detail here, but you could give a shorter answer (so long as it's clear). \begin{align} S &= -k\sum_i P_i\ln P_i \\ &= -k\left(\cancelto{0}{1\ln 1} + \cancelto{0}{0\ln 0} + \cancelto{0}{0\ln 0}\right) \\ &= 0 \end{align} The latter two terms could be confusing, since \(\ln0 =-\infty\). We resolve this by using L'Hopital's rule. \begin{align} \lim_{P\rightarrow0}P\ln P &= \lim_{P\rightarrow0}\frac{\ln P}{\frac{1}{P}} = \frac{-\infty}{\infty} \\ &= \lim_{P\rightarrow0}\frac{\frac{d}{dP}\ln P}{\frac{d}{dP}\frac{1}{P}} \\ &= \lim_{P\rightarrow0}\frac{\frac{1}{P}}{-\frac{1}{P^2}} \\ &= \lim_{P\rightarrow0}\left(-P\right) \\ &= 0 \end{align} You don't need to do this more than once yourself, but you do need to remember this result.
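      If you don't trust L'Hopital, you can also watch \(P\ln P\) approach zero numerically:

```python
from math import log

# P ln P for a sequence of probabilities approaching zero
values = [1e-2, 1e-4, 1e-8, 1e-16]
terms = [p * log(p) for p in values]
print(terms)  # each term is closer to zero than the last
```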

      So the point is that when the temperature is zero, the entropy is also zero. This will always happen if the ground state is not degenerate.

    4. What happens to the probabilities if you allow the temperature to be negative?

      If we allow the temperature to be negative, then higher energy states will be more probable than lower energy states. If the temperature is small in magnitude and negative (which was not specified in the question), then the system will almost always be in the \(+\epsilon\) energy state.

      Another feature of negative temperatures for this system is that \(U>0\). For positive temperatures, the internal energy only approaches zero as the temperature gets very high. If the temperature becomes negative, the energy can exceed zero. For other systems, of course, this will not be the case, but it will be true for any system in which the energy states are symmetrically arranged around zero.
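      A quick numerical illustration (the value \(\beta\epsilon=-2\) is an arbitrary choice of negative temperature):

```python
from math import exp

x = -2.0  # beta * epsilon < 0, i.e. a negative temperature
Z = exp(x) + 1 + exp(-x)
# Probabilities for the states with energies -eps, 0, +eps
P = [exp(x) / Z, 1 / Z, exp(-x) / Z]
# Internal energy in units of epsilon
U_over_eps = -P[0] + P[2]

print(P)           # the +eps state is now the most probable
print(U_over_eps)  # U > 0
```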

  2. Diatomic hydrogen

    At low temperatures, a diatomic molecule can be well described as a rigid rotor. The Hamiltonian of such a system is simply proportional to the square of the angular momentum \begin{align} H &= \frac{1}{2I}L^2 \end{align} and the energy eigenvalues are \begin{align} E_{\ell m} &= \hbar^2 \frac{\ell(\ell+1)}{2I} \end{align}

    1. What is the energy of the ground state and the first and second excited states of the \(H_2\) molecule? i.e. the lowest three distinct energy eigenvalues.

      \begin{align} E_{\ell m} &= \hbar^2 \frac{\ell(\ell+1)}{2I} \\ I &= \sum_i m_i r_i^2 \\ &= 2 m_H \left(\frac{d_{H-H}}{2}\right)^2 \\ E_{\ell m} &= \frac{\hbar^2}{m_H d_{H-H}^2} \ell(\ell+1) \end{align} At this point we need to put in some constants. The bond length of molecular hydrogen is 0.74 angstrom. The atomic weight of hydrogen is essentially 1, which puts its mass at \(1.66\times 10^{-27}\) kg. \(\hbar\), of course, is \(1.05\times 10^{-34}\) J s. Taken together, this gives us: \begin{align} E_{\ell m} &= \ell(\ell+1) 1.21\times 10^{-21}\text{~J} \\ &= \ell(\ell+1) 7.6 \text{~meV} \end{align} I like using millielectronvolts as a unit, since I remember that room temperature is 25 meV and visible light is around 2 eV.

      \(\ell\)   \(E_{\ell m}\)
      0   \(0\)
      1   \(7.6\text{~meV} \times 2 = 15.2\text{~meV}\)
      2   \(7.6\text{~meV} \times 6 = 45.6\text{~meV}\)
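      Plugging in constants is easy to get wrong, so here is the arithmetic in Python (using the same constants quoted above):

```python
hbar = 1.05e-34  # J s
m_H = 1.66e-27   # kg, mass of one hydrogen atom
d = 0.74e-10     # m, H-H bond length
eV = 1.6e-19     # J

# E_{l m} = hbar^2 l(l+1) / (m_H d^2), using I = m_H d^2 / 2
E_unit_meV = hbar**2 / (m_H * d**2) / eV * 1000

for l in range(3):
    print(l, l * (l + 1) * E_unit_meV, "meV")  # 0, ~15.2, ~45.5 meV
```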

    2. At room temperature, what is the relative probability of finding a hydrogen molecule in the \(\ell=0\) state versus finding it in any one of the \(\ell=1\) states?
      i.e. what is \(P_{\ell=0,m=0}/\left(P_{\ell=1,m=-1} + P_{\ell=1,m=0} + P_{\ell=1,m=1}\right)\)

      To solve this, we need to first find the Boltzmann factor for these two energies. Boltzmann's constant is \(8.62\times 10^{-5}\) eV/K, so \begin{align} \frac{P_{00}}{P_{10}} &= e^{-\frac{E_{00}-E_{10}}{k_BT}} \\ &= e^{\frac{0.0152\text{~eV}}{8.62\times10^{-5} \text{ eV/K} 300 \text{~K}}} \\ &= e^{\frac{15.2\text{~meV}}{25.6 \text{~meV}}} \\ &= 1.81 \end{align} At this point we have to read the question carefully. The question asks about the probability of finding the molecule in any of the \(\ell=1\) states, and there are three such states. Thus (since they have the same energy) the probability of finding it in any of those three states is three times the probability of finding it in any given one of those states. \begin{align} P_{\ell=1} &= P_{10} + P_{11} + P_{1-1} \\ &= 3P_{10} \end{align} This tells us how to find the final solution based on our previous answer: \begin{align} \frac{P_{\ell=0}}{P_{\ell=1}} &= \frac13 \frac{P_{00}}{P_{10}} \\ &= 0.603 \end{align} so this tells us that at room temperature, hydrogen molecules are about 1.7 times as likely to be in an \(\ell=1\) state as they are to be in an \(\ell=0\) state.
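      Checking the arithmetic in Python:

```python
from math import exp

kB = 8.62e-5  # eV/K
T = 300.0     # K
dE = 15.2e-3  # eV, the l=1 to l=0 energy difference

P00_over_P10 = exp(dE / (kB * T))  # single-state Boltzmann ratio
Pl0_over_Pl1 = P00_over_P10 / 3    # divide by the l=1 degeneracy of 3

print(P00_over_P10)  # about 1.8
print(Pl0_over_Pl1)  # about 0.6
```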

    3. At what temperature is the value of this ratio 1?

      Our formula is: \begin{align} \frac{P_{\ell=0}}{P_{\ell=1}} &= \frac13 e^{\frac{\Delta E}{k_BT}} \\ &= \frac13 e^{\frac{15.2\text{~meV}}{0.0862\text{~meV/K~}T}} \\ 1 &= \frac13 e^{\frac{176\text{~K}}{T}} \\ 3 &= e^{\frac{176\text{~K}}{T}} \\ \ln 3 &= \frac{176\text{~K}}{T} \\ T &= \frac{176\text{~K}}{\ln 3} \\ &= 160\text{~K} \end{align} So if you want equal amounts of \(\ell=0\) hydrogen (known as para hydrogen) and \(\ell=1\) hydrogen (known as ortho hydrogen), you have to cool it down to 160 K.
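      The same solve in Python:

```python
from math import log

kB = 8.62e-5  # eV/K
dE = 15.2e-3  # eV, the l=1 to l=0 energy difference

# Setting (1/3) exp(dE / (kB T)) = 1 and solving for T
T_equal = dE / (kB * log(3))
print(T_equal, "K")  # about 160 K
```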

    4. At room temperature, what is the probability of finding a hydrogen molecule in any one of the \(\ell=2\) states versus that of finding it in the ground state?
      i.e. what is \(P_{\ell=0,m=0}/\left(P_{\ell=2,m=-2} + P_{\ell=2,m=-1} + \cdots + P_{\ell=2,m=2}\right)\)

      Now we want to find the ratio \begin{align} \frac{P_{\ell=2}}{P_{\ell=0}} &= 5 e^{-\frac{E_{20}-E_{00}}{k_BT}} \\ &= 5 e^{-\frac{\frac{\hbar^2}{m_H d_{H-H}^2} 6}{k_BT}} \\ &= 5 \left(e^{-\frac{\frac{\hbar^2}{m_H d_{H-H}^2}}{k_BT}}\right)^6 \\ &= 5\left(\frac{P_{10}}{P_{00}}\right)^3 \\ &= 5 e^{-\frac{45.6\text{~meV}}{25.6\text{~meV}}} \\ &\approx 0.84 \end{align} (Note the cube, not the sixth power: the \(\ell=1\) to \(\ell=0\) energy difference is \(2\times7.6\) meV, so cubing \(P_{10}/P_{00}\) gives the required \(6\times7.6\) meV in the exponent.) So even \(\ell=2\) is somewhat occupied relative to the ground state.
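      We can check this ratio numerically, using the same rounded values for the energy difference and \(k_BT\) as above:

```python
from math import exp

dE = 45.6  # meV, E_{l=2} - E_{l=0} = 6 x 7.6 meV
kT = 25.6  # meV, room temperature as used above
g = 5      # degeneracy of l = 2 (2l + 1 states)

ratio = g * exp(-dE / kT)  # P_{l=2} / P_{l=0}
print(ratio)  # about 0.84
```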

      There is, however, an interesting interaction which makes \(\ell=0\) and \(\ell=1\) much more important. Because of the Pauli exclusion principle, the wave function of the protons (the hydrogen nuclei) must be antisymmetric. Since the odd-\(\ell\) spatial states are antisymmetric under exchange of the two nuclei, they must correspond to a symmetric nuclear spin state (i.e. the triplet). By contrast, the even-\(\ell\) states are symmetric under the exchange of the two nuclei, which means that the spin state must be antisymmetric (i.e. the singlet).

      The gist of all this is that flipping the nuclear spin state is tricky: the strongest intermolecular interactions don't affect the spin. This means that it takes a long time for the nuclear spin states to flip, which then means that switching from even-\(l\) states to odd-\(l\) states takes a long time. Thus as you cool hydrogen, it quickly equilibrates to the point where it is (almost) all in the \(\ell=0\) (para hydrogen) and \(\ell=1\) states (ortho hydrogen), but it can take days to fully equilibrate... and energy is released during the process!

      If you're interested in reading more about this fascinating system, Wikipedia has a well-written article.

  3. Gas in the atmosphere Let's consider our atmosphere. In this problem we will make the inaccurate assumption that the entire atmosphere is at room temperature.
    1. What is the relative probability of a nitrogen molecule being in a particular eigenstate outside the influence of the Earth's gravity, relative to being in an eigenstate at the Earth's surface? You may assume the energy difference of the two states is the gravitational potential energy difference. How about an oxygen molecule?

      We begin by writing down the ratio of Boltzmann probabilities. \begin{align} \frac{P_{\text{outer space}}}{P_{\text{Corvallis}}} &= \frac{ \frac{e^{-\beta E_{\text{outer space}}}}{Z} }{ \frac{e^{-\beta E_{\text{Corvallis}}}}{Z} }\\ E_{\text{Corvallis}} &= -\frac{GMm}{R} \\ E_{\text{Outer space}} &= 0\\ \frac{P_{\text{outer space}}}{P_{\text{Corvallis}}} &= e^{-\beta \frac{GMm}{R}} \end{align} where \(G\) is Newton's gravitational constant, \(R\) is the radius of the earth, \(M\) is the mass of the earth, and \(m\) is the mass of the molecule. At this point we just have to figure out the mass of a molecule of nitrogen and put in numbers. The atomic weight of nitrogen is 14, and thus its molecular weight is 28 g/mol. You can also just google for "nitrogen atomic mass in kg" and then multiply by two to find \(m=4.6\times 10^{-26}\text{ kg}\). \begin{align} \frac{P_{\text{outer space}}}{P_{\text{Corvallis}}} &= e^{-\frac{6.7\times 10^{-11}\text{ J}\,\text{m}\,\text{kg}^{-2}\cdot 6\times 10^{24}\text{ kg}\cdot 4.6\times 10^{-26}\text{ kg} }{ 6.4\times 10^{6}\text{ m} \cdot 1.4\times 10^{-23}\text{ J}/\text{K}\cdot 300\text{ K} }} \\ &\approx e^{-690} \end{align} which is pretty darn close to zero.

    2. How does your answer change if you consider a helium atom?
      The probability of being in space gets larger by a lot, but it's still too small for my computer to handle.
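      To see just how overwhelmingly small these probabilities are, we can compute the exponent \(GMm/(Rk_BT)\) itself (the helium mass \(6.6\times10^{-27}\) kg follows from its atomic weight of 4):

```python
G = 6.7e-11   # N m^2 / kg^2, Newton's gravitational constant
M = 6e24      # kg, mass of the Earth
R = 6.4e6     # m, radius of the Earth
kB = 1.4e-23  # J/K
T = 300.0     # K

def exponent(m):
    """Magnitude of the Boltzmann exponent, G M m / (R kB T)."""
    return G * M * m / (R * kB * T)

m_N2 = 4.6e-26  # kg, one N2 molecule
m_He = 6.6e-27  # kg, one He atom

print(exponent(m_N2))  # about 690, so the ratio is e^-690
print(exponent(m_He))  # about 100: a much larger ratio, but e^-100 is still tiny
```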
  4. Nucleus in a Magnetic Field

    Nuclei of a particular isotope species contained in a crystal have spin \(I=1\), and thus, \(m = \{+1,0,-1\}\). The interaction between the nuclear quadrupole moment and the gradient of the crystalline electric field produces a situation where the nucleus has the same energy, \(E=\varepsilon\), in the state \(m=+1\) and the state \(m=-1\), compared with an energy \(E=0\) in the state \(m=0\), i.e. each nucleus can be in one of 3 states, two of which have energy \(E=\varepsilon\) and one has energy \(E=0\).

    1. Find the Helmholtz free energy \(F = U-TS\) for a crystal containing \(N\) nuclei which do not interact with each other.

      We can find the free energy for a single spin from \begin{align} F_1 &= -k_BT \ln Z \\ &= -k_BT \ln \left(1 + 2e^{-\beta\varepsilon} \right) \end{align} Since we have \(N\) nuclei and the free energy is extensive, \begin{align} F &= -Nk_BT \ln \left(1 + 2e^{-\beta\varepsilon} \right) \end{align}

    2. Find an expression for the entropy as a function of temperature for this system. (Hint: use results of part a.)

      We can find the entropy from \begin{align} dF &= -SdT -pdV \\ S &= -\left(\frac{\partial {F}}{\partial {T}}\right)_{V} \\ &= Nk_B \ln \left(1 + 2e^{-\beta\varepsilon}\right) +Nk_BT \frac{-2\varepsilon e^{-\beta\varepsilon} \frac{d\beta}{dT}}{1 + 2e^{-\beta\varepsilon}} \\ &= Nk_B \ln \left(1 + 2e^{-\beta\varepsilon}\right) +Nk_BT \frac{2\varepsilon e^{-\beta\varepsilon}\frac{1}{k_BT^2} }{1 + 2e^{-\beta\varepsilon}} \\ S &= Nk_B \ln \left(1 + 2e^{-\beta\varepsilon}\right) +N\frac{\varepsilon}{T} \frac{ 2e^{-\beta\varepsilon} }{1 + 2e^{-\beta\varepsilon}} \end{align}
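      A good habit is to check a derivative like this numerically. Here we compare the entropy formula against a finite-difference derivative of \(F\), working in units where \(k_B=\varepsilon=N=1\) (the temperature 0.7 is an arbitrary test point):

```python
from math import exp, log

# Units: kB = 1, epsilon = 1, N = 1
def F(T):
    return -T * log(1 + 2 * exp(-1 / T))

def S_analytic(T):
    boltz = exp(-1 / T)
    return log(1 + 2 * boltz) + (1 / T) * 2 * boltz / (1 + 2 * boltz)

T, h = 0.7, 1e-6
S_numeric = -(F(T + h) - F(T - h)) / (2 * h)  # S = -dF/dT
print(S_numeric, S_analytic(T))  # these should agree
```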

    3. Indicate what your results predict for the entropy at the extremes of very high temperature and very low temperature.

      At low temperatures \(\beta\varepsilon \gg 1\), and we find the entropy is: \begin{align} S &\approx Nk_B 2e^{-\beta\varepsilon} + 2N \frac{\varepsilon}{T}e^{-\beta\varepsilon} \\ &= 2N\left(k_B + \frac{\varepsilon}{T}\right)e^{-\beta\varepsilon} \end{align} This is very small: as \(T\rightarrow0\), the exponential \(e^{-\beta\varepsilon}\) vanishes far faster than the prefactor \(\varepsilon/T\) diverges, so the entropy goes to zero at zero temperature.

      At high temperatures \(\beta\varepsilon \ll 1\), which means we can Taylor expand the exponentials. We need to keep terms through \((\beta\varepsilon)^2\), because the first-order terms will cancel: \begin{align} S &\approx Nk_B \ln\left(1 + 2\left(1-\beta\varepsilon+\tfrac12(\beta\varepsilon)^2\right)\right) + 2N \frac{\varepsilon}{T}\, \frac{1 - \beta\varepsilon}{3 - 2\beta\varepsilon} \\ &= Nk_B\ln 3 + Nk_B\ln\left(1-\tfrac23\beta\varepsilon+\tfrac13(\beta\varepsilon)^2\right) + \tfrac23 Nk_B\beta\varepsilon\,\frac{1-\beta\varepsilon}{1-\tfrac23\beta\varepsilon} \\ &\approx Nk_B\ln 3 + Nk_B\left(-\tfrac23\beta\varepsilon + \tfrac19(\beta\varepsilon)^2\right) + Nk_B\left(\tfrac23\beta\varepsilon - \tfrac29(\beta\varepsilon)^2\right) \\ &= Nk_B\ln 3 - \tfrac19 Nk_B(\beta\varepsilon)^2 \end{align} So at high temperatures, the entropy approaches \(k_B\ln 3\) per nucleus, as we should expect. It is also interesting to note that the entropy approaches this value from below: it is always less than \(Nk_B\ln 3\).
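      We can confirm this limiting behavior numerically, per nucleus and in units where \(k_B=1\):

```python
from math import exp, log

def S_per_nucleus(x):
    """Entropy per nucleus in units of kB, with x = beta * epsilon."""
    z = 1 + 2 * exp(-x)
    return log(z) + x * 2 * exp(-x) / z

for x in [1.0, 0.1, 0.01]:
    # Deviation from ln 3; positive, and shrinking as T grows
    print(x, log(3) - S_per_nucleus(x))
```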

  5. Heat capacity for particle in a box Consider a particle in a box. The energy eigenvalues are given by \begin{align} E_n &= E_1 n^2 &\text{where }n=1,2,3,\cdots \end{align}
    1. Solve for the entropy of this system at temperature \(T\) where \(k_BT\gg E_1\). Your analytic solution should not involve a summation or an integral.

      We begin by writing down the partition function \begin{align} Z &= \sum_{n=1}^{\infty} e^{-\beta E_1 n^2} \end{align} There isn't much we can do with this right off the bat. So we need to make use of the fact that \(\beta E_1\ll 1\). You might be tempted to do a power series expansion of the exponential, but since \(n^2\) becomes infinite that doesn't really work. Instead we want to observe that when \(n\) changes by 1, the thing in the sum hardly changes. So we can safely turn our sum into an integral. \begin{align} Z &\approx \int_{0}^{\infty} e^{-\beta E_1 n^2}dn \\ &= \frac12\int_{-\infty}^{\infty}e^{-\beta E_1 n^2}dn \\ u &= \sqrt{\beta E_1}n \quad du = \sqrt{\beta E_1} dn \\ Z &= \frac12 \sqrt{\frac{kT}{E_1}}\int_{-\infty}^{\infty}e^{-u^2}du \\ &= \sqrt{\frac{\pi kT}{4E_1}} \end{align} where at the last stage we used the definite integral of a Gaussian, which you can derive yourself (square it and write the result as a two-dimensional integral that you can evaluate in polar coordinates), but I don't expect you to. If this were an exam, I'd give you a few definite integrals including the one you need.
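      We can verify that replacing the sum with an integral is justified by comparing the two in Python (the cutoff of 5000 terms is more than enough, since the summand is negligible once \(\beta E_1 n^2 \gg 1\)):

```python
from math import exp, pi, sqrt

def Z_sum(beta_E1, nmax=5000):
    # The partition function, summed directly
    return sum(exp(-beta_E1 * n**2) for n in range(1, nmax))

def Z_gauss(beta_E1):
    # The Gaussian-integral approximation sqrt(pi kT / 4 E1)
    return sqrt(pi / (4 * beta_E1))

for bE1 in [1e-2, 1e-4]:
    print(bE1, Z_sum(bE1), Z_gauss(bE1))  # agreement improves as kT/E1 grows
```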

      So now we want to solve for the Helmholtz free energy, and then on for the entropy. \begin{align} F &= -kT\ln Z \\ &= -kT \ln\sqrt{\frac{\pi kT}{4E_1}} \\ &= -\frac{kT}{2}\ln\left(\frac{\pi kT}{4E_1}\right) \end{align} Now we recall that \(dF=-SdT-pdV\) and from that we find: \begin{align} S &= -\left(\frac{\partial {F}}{\partial {T}}\right)_{V} \\ &= \frac12k\ln\left(\frac{\pi kT}{4E_1}\right) + \frac{k}{2} \end{align} I can do the derivative of the logarithm in my head by thinking of the logarithm as \begin{align} \ln\left(\frac{\pi kT}{4E_1}\right) &= \ln\pi+\ln k+\ln T - \ln 4 - \ln E_1 \end{align} and since only one of those terms depends on \(T\), the rest vanish when I take a derivative, and I just have to worry about the derivative of \(\ln T\), which isn't so hard.

      I feel good about this answer because it looks positive.

    2. When \(kT\gg E_1\), solve for the heat capacity \(C_V\) of this system, given by \begin{align} C_V&=\left(\frac{\partial {U}}{\partial {T}}\right)_{V} \\ &= T\left(\frac{\partial {S}}{\partial {T}}\right)_{V} \end{align}
      \begin{align} C_V &= T\left(\frac{\partial {S}}{\partial {T}}\right)_{V} \\ &= T\left(\frac12 k \frac{1}{T}\right) \\ &= \frac12 k \end{align} This tells me the heat capacity is independent of temperature and is half a Boltzmann constant. I feel very good about this, because it matches with the prediction from the equipartition theorem.
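      As a final check (not required), we can compute \(C_V\) directly from the exact sums by numerically differentiating \(U\), in units where \(k=E_1=1\):

```python
from math import exp

def U(T, nmax=2000):
    """Internal energy from the exact sums, with k = E_1 = 1."""
    beta = 1 / T
    Z = sum(exp(-beta * n**2) for n in range(1, nmax))
    return sum(n**2 * exp(-beta * n**2) for n in range(1, nmax)) / Z

T, h = 100.0, 0.01  # kT >> E_1
C_V = (U(T + h) - U(T - h)) / (2 * h)  # C_V = dU/dT
print(C_V)  # close to 1/2, as the equipartition theorem predicts
```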
    Hint: You will need to approximate a sum as an integral in this problem. You will also need to explain in words why this approximation is justified.