Chapter 7. Statistical Mechanics

When one is faced with a condensed-phase system, usually containing many molecules, that is at or near thermal equilibrium, it is not necessary or even wise to try to describe it in terms of quantum wave functions or even classical trajectories of all of the constituent molecules. Instead, the powerful tools of statistical mechanics allow one to focus on quantities that describe the most important features of the many-molecule system. In this Chapter, you will learn about these tools and see some important examples of their application.

I. Collections of Molecules at or Near Equilibrium

As noted in Chapter 5, the approach one takes in studying a system composed of a very large number of molecules at or near thermal equilibrium can be quite different from how one studies systems containing a few isolated molecules. In principle, it is possible to conceive of computing the quantum energy levels and wave functions of a collection of many molecules, but doing so becomes impractical once the number of atoms in the system reaches a few thousand or if the molecules have significant intermolecular interactions. Also, as noted in Chapter 5, following the time evolution of such a large number of molecules can be "confusing" if one focuses on the short-time behavior of any single molecule (e.g., one sees "jerky" changes in its energy, momentum, and angular momentum). By examining, instead, the long-time average behavior of each molecule or, alternatively, the average properties of a significantly large number of
molecules, one is often better able to understand, interpret, and simulate such condensed-media systems. This is where the power of statistical mechanics comes into play.

A. The Distribution of Energy Among Levels

One of the most important concepts of statistical mechanics involves how a specified amount of total energy E can be shared among a collection of molecules and among the internal (translational, rotational, vibrational, electronic) degrees of freedom of these molecules. The primary outcome of asking what is the most probable distribution of energy among a large number N of molecules within a container of volume V that is maintained in equilibrium at a specified temperature T is the most important equation in statistical mechanics, the Boltzmann population formula:

P_j = Ω_j exp(-E_j/kT)/Q.

This equation expresses the probability P_j of finding the system (which, in the case introduced above, is the whole collection of N interacting molecules) in its jth quantum state, where E_j is the energy of this quantum state, T is the temperature in K, Ω_j is the degeneracy of the jth state, and the denominator Q is the so-called partition function:

Q = Σ_j Ω_j exp(-E_j/kT).

The classical mechanical equivalent of the above quantum Boltzmann population formula for a system with M coordinates (collectively denoted q) and M momenta (denoted p) is:
P(q,p) = h^(-M) exp(-H(q,p)/kT)/Q,

where H is the classical Hamiltonian, h is Planck's constant, and the classical partition function Q is

Q = h^(-M) ∫ exp(-H(q,p)/kT) dq dp.

Notice that the Boltzmann formula does not say that only those states of a given energy can be populated; it gives non-zero probabilities for populating all states from the lowest to the highest. However, it does say that states of higher energy E_j are disfavored by the exp(-E_j/kT) factor, but if states of higher energy have larger degeneracies Ω_j (which they usually do), the overall population of such states may not be low. That is, there is a competition between the state degeneracy Ω_j, which tends to grow as the state's energy grows, and exp(-E_j/kT), which decreases with increasing energy. If the number of particles N is huge, the degeneracy Ω grows as a high power (let's denote this power as K) of E because the degeneracy is related to the number of ways the energy can be distributed among the N molecules. In fact, K grows at least as fast as N. As a result of Ω growing as E^K, the product function P(E) = E^K exp(-E/kT) has the form shown in Fig. 7.1 (for K = 10).
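The competition between a growing degeneracy and the falling Boltzmann factor is easy to see numerically. The short sketch below evaluates the Boltzmann populations for a hypothetical three-level system (the energies and degeneracies are invented for illustration, not taken from the text); it shows how a higher-energy level can end up more populated than the ground level once its degeneracy is large enough.

```python
import math

# Boltzmann populations P_j = Omega_j * exp(-E_j / kT) / Q for a
# hypothetical three-level system; energies are in units of kT.
kT = 1.0
energies = [0.0, 1.0, 2.0]   # E_j (illustrative values)
degens = [1, 3, 5]           # Omega_j (illustrative degeneracies)

weights = [g * math.exp(-E / kT) for g, E in zip(degens, energies)]
Q = sum(weights)             # partition function Q = sum_j Omega_j exp(-E_j/kT)
P = [w / Q for w in weights]

# Despite its higher energy, the j = 1 level is more populated than the
# ground level, because its degeneracy (3) outweighs the factor exp(-1).
```

Here the degeneracy of the second level more than compensates for its Boltzmann penalty, which is exactly the trade-off discussed above.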
Figure 7.1 Probability Weighting Factor P(E) as a Function of E for K = 10.

By taking the derivative of this function P(E) with respect to E, and finding the energy at which this derivative vanishes, one can show that this probability function has a peak at E* = K kT, and that at this energy value

P(E*) = (KkT)^K exp(-K).

By then asking at what energy E' the function P(E) drops to exp(-1) of this maximum value P(E*):
P(E') = exp(-1) P(E*),

one finds

E' = K kT (1 + (2/K)^(1/2)).

So the width of the P(E) graph, measured as the change in energy needed to cause P(E) to drop to exp(-1) of its maximum value divided by the value of the energy at which P(E) assumes this maximum value, is

(E' - E*)/E* = (2/K)^(1/2).

This width gets smaller and smaller as K increases. The primary conclusion is that as the number N of molecules in the sample grows, which, as discussed earlier, causes K to grow, the energy probability function becomes more and more sharply peaked about the most probable energy E*. This, in turn, suggests that we may be able to model, aside from infrequent fluctuations, the behavior of systems with many molecules by focusing on the most probable situation (i.e., having the energy E*) and ignoring deviations from this case.

It is for the reasons just shown that for so-called macroscopic systems near equilibrium, in which N (and hence K) is extremely large (e.g., N ~ 10^10 to 10^24), only the most probable distribution of the total energy among the N molecules need be considered. This is the situation in which the equations of statistical mechanics are so useful.
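The sharpening of P(E) with K can be verified numerically. The sketch below (a minimal illustration, working in units where kT = 1 so that the peak sits at E* = K) locates E' by bisection on log P and compares the fractional width (E' - E*)/E* with the (2/K)^(1/2) estimate; the helper name frac_width is invented for this example. The estimate comes from a Gaussian expansion about the peak, so the agreement improves as K grows.

```python
import math

# Fractional width (E' - E*)/E* of P(E) = E^K exp(-E/kT), with kT = 1 so
# that E* = K. E' is found by bisection on log P(E) = K ln E - E, seeking
# the point to the right of the peak where log P has dropped by exactly 1
# (i.e., where P has fallen to exp(-1) of its maximum).
def frac_width(K):
    log_p = lambda E: K * math.log(E) - E
    target = log_p(K) - 1.0          # log P at exp(-1) of the maximum
    lo, hi = float(K), 10.0 * K      # E' lies to the right of E* = K
    for _ in range(100):             # bisection: log_p is monotone here
        mid = 0.5 * (lo + hi)
        if log_p(mid) > target:
            lo = mid
        else:
            hi = mid
    return (lo - K) / K

for K in (10, 100, 1000):
    print(K, frac_width(K), math.sqrt(2.0 / K))
```

The printed widths shrink as K grows and approach (2/K)^(1/2), in line with the conclusion that the energy distribution becomes ever more sharply peaked for macroscopic N.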