Statistical Mechanics – Thermodynamic Beta

In the last post, we learnt that the Gibbs entropy is additive – the Gibbs entropy of a system comprising multiple subsystems is the sum of the Gibbs entropies of those constituent subsystems. To prove this, we considered two systems in loose thermal contact. Two such systems are allowed to exchange energy, but their state occupation probabilities are uncorrelated.

In this post, I’ll finally make good on my promise to interpret the Lagrange multiplier \beta. To do so, we’ll consider again a pair of closed systems in loose thermal contact, and see how the Gibbs entropy of the pair changes in response to energy exchange between the two systems.

To avoid complications, we’ll demand that

  1. no work is done on either subsystem, so their energy content can change only via the transfer of heat
  2. the two subsystems are otherwise completely isolated, such that no heat can flow to or from their environment

Let’s see how the systems interact.

[Figure: two closed systems in loose thermal contact, able to exchange heat with one another but isolated from their surroundings]

The Gibbs entropy S_G of the joint system is related to the Gibbs entropy of the subsystems one and two by

S_G=S_G^1+S_G^2

as we established in the last post. Hence a small change in the system’s Gibbs entropy dS_G is given by

dS_G=dS_G^1+dS_G^2

Now we use the expression for the change in Gibbs entropy we derived earlier:

dS_G=\beta_1 (dU_1-\langle dE\rangle_1)+\beta_2 (dU_2-\langle dE\rangle_2)

Recall that dU_i is the change in energy of system i, and \langle dE\rangle_i is the mean change in the ith system’s energy levels. The latter occurs only when work is done on the system. But our first demand was that no work is done on either system! So the \langle dE\rangle_i terms both vanish, leaving

dS_G=\beta_1 dU_1+\beta_2 dU_2

Now we cite the second demand. The total energy of the system U is given by

U=U_1+U_2

That is, the energy of the system is the sum of the energies of the two subsystems. Hence

dU=dU_1+dU_2

But according to the second demand, no heat is allowed to flow into the system from the outside. This, coupled with the constraint that no work is done on either system, means that the total energy of the system is conserved:

dU=0

Hence

dU_1+dU_2=0

dU_1=-dU_2

This equation says that any energy lost by the first system is picked up by the second system, and vice versa. Replacing dU_2 in the equation for dS_G above gives

dS_G=(\beta_1-\beta_2)dU_1


We now invoke our fundamental precept: the Gibbs entropy of the system is maximised in equilibrium. For the two systems to come into equilibrium, the Gibbs entropy must increase; if there exists a path which leads to an increase in entropy, the system will realise that path. This is how we calculated the set of probabilities \{p_\alpha\} in the first place – by demanding that they make the Gibbs entropy as big as possible. Only when the Gibbs entropy of the system is at a maximum will it be satisfied, and come to rest. So

dS_G\ge 0

Note that though the Gibbs entropy of each subsystem may be individually maximised before the two are brought into contact, the entropy of the joint system as a whole need not be. The equation above shows that the Gibbs entropy of the joint system can increase via an energy flux, and will do so provided

(\beta_1-\beta_2)dU_1\ge 0

Note the following behaviour:

  • if \beta_1>\beta_2 then dU_1\ge 0
  • if \beta_1<\beta_2 then dU_1\le 0
  • if \beta_1=\beta_2 then dU_1=0

The last statement is the killer. If the values of \beta associated with two systems in loose thermal contact are equal, there will be no energy transfer between the two, and the Gibbs entropy of the joint system S_G is maximised. Thermodynamic \beta is the quantity that equalises between two systems when they reach equilibrium. To this property we would ordinarily give the name ‘temperature’.
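
To make this concrete, here is a minimal numerical sketch in Python. Everything in it is an assumption chosen for illustration (a two-level system with levels 0 and \epsilon=1, arbitrary mean energies, a small energy step); its only purpose is to check dS_G=(\beta_1-\beta_2)dU_1 against an exact entropy difference:

```python
import numpy as np

EPS = 1.0  # assumed level spacing of the toy two-level system

def entropy(U):
    # Gibbs entropy of a two-level system (levels 0 and EPS) whose
    # mean energy U fixes the excited-state probability p = U / EPS
    p = U / EPS
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

def beta(U):
    # beta = dS/dU = (1/EPS) * ln((1 - p) / p) for this system
    p = U / EPS
    return np.log((1 - p) / p) / EPS

U1, U2 = 0.40, 0.10          # system one is hotter: beta(U1) < beta(U2)
d = 1e-4                     # a little energy leaves system one, dU_1 = -d

dS_exact = entropy(U1 - d) + entropy(U2 + d) - entropy(U1) - entropy(U2)
dS_formula = (beta(U1) - beta(U2)) * (-d)

print(dS_exact, dS_formula)  # both positive, and in close agreement
```

Flipping the sign of d, so that heat is forced from the colder system into the hotter one, makes both numbers negative, consistent with the inequalities above.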

But it’s not quite temperature in the sense with which most people are familiar. Take another look at the inequalities above. If \beta_1 is greater than \beta_2, energy will flow into system one. Conversely, if \beta_1 is less than \beta_2, energy flows out of system one. Therefore \beta does not measure ‘hotness’ as temperature does but ‘coldness’; \beta=0 corresponds not to absolute zero, but to a very ‘hot’ temperature. Similarly, \beta=\infty for an object at absolute zero: the object is ‘infinitely cold’.

In fact, thermodynamic \beta and traditional thermodynamic temperature T are related by an inverse proportion:

\displaystyle \beta\propto\frac{1}{T}

Why is it that we’ve come to use T to measure the ‘temperature’ of objects rather than \beta? One good reason is that most of the objects that we encounter in everyday life are ‘cold’. Therefore if \beta were the preferred means of measuring temperature, we would have to deal with big numbers all the time. It is much more natural to use a measure of temperature which is 0 for absolutely ‘cold’ objects and increases as things get hotter. I’ll most likely write a separate post about this, since it’s quite interesting to think about!


To supplement this equilibrium interpretation, we can consider the relation between thermodynamic \beta and energy U. Recall the mean energy of the system is defined by

\displaystyle U=\sum_\alpha p_\alpha E_\alpha

where p_\alpha is the probability that the system has energy E_\alpha. Let’s calculate the rate at which the system’s energy changes with respect to \beta:

\displaystyle \frac{\partial U}{\partial \beta} = \frac{\partial}{\partial \beta}\sum_\alpha p_\alpha E_\alpha

\displaystyle \frac{\partial U}{\partial \beta} = \frac{\partial}{\partial \beta}\frac{\sum_\alpha E_\alpha e^{-\beta E_\alpha}}{\sum_\alpha e^{-\beta E_\alpha}}

We use the quotient rule to carry out the differentiation:

\displaystyle \frac{\partial U}{\partial \beta}=\frac{-(\sum_\alpha e^{-\beta E_\alpha})(\sum_\alpha E_\alpha^2 e^{-\beta E_\alpha})+(\sum_\alpha E_\alpha e^{-\beta E_\alpha})(\sum_\alpha E_\alpha e^{-\beta E_\alpha})}{(\sum_\alpha e^{-\beta E_\alpha})^2}

Dividing each factor in the numerator by \sum_\alpha e^{-\beta E_\alpha} and recognising the probabilities p_\alpha=e^{-\beta E_\alpha}/\sum_\alpha e^{-\beta E_\alpha}, this becomes

\displaystyle \frac{\partial U}{\partial \beta}=-\Big(\sum_\alpha p_\alpha\Big)\Big(\sum_\alpha p_\alpha E_\alpha^2\Big)+\Big(\sum_\alpha p_\alpha E_\alpha\Big)\Big(\sum_\alpha p_\alpha E_\alpha\Big)

Since \sum_\alpha p_\alpha=1,

\displaystyle \frac{\partial U}{\partial \beta}=-\langle E^2\rangle+\langle E\rangle^2

If you’re familiar with statistics, you’ll recognise this as the negative of the variance of the system’s energy. The variance of the system’s energy is defined by

\text{Var}(E)=\langle(E-\langle E\rangle)^2\rangle

That is, it is the mean value of the squared difference between the system’s energy and its mean energy \langle E\rangle=U. Notice that, because it is the expectation value of a squared quantity, it is never negative – and it is strictly positive unless the system’s energy is certain. Expanding the square and using the linearity of the mean shows that the variance is equal to

\text{Var}(E)=\langle E^2\rangle-\langle E\rangle^2

Hence

\displaystyle \frac{\partial U}{\partial \beta}=-\text{Var}(E)

and so

\displaystyle \frac{\partial U}{\partial \beta}<0

Hence the system’s energy decreases monotonically with increasing \beta. This lends credence to the idea that \beta measures the ‘coldness’ of the system.
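
As a quick check on this result, we can compare a finite-difference estimate of \partial U/\partial\beta with -\text{Var}(E). The spectrum and the value of \beta below are arbitrary placeholders, not taken from any particular system:

```python
import numpy as np

E = np.array([0.0, 0.5, 1.3, 2.0])   # an arbitrary example spectrum

def probs(beta):
    # Boltzmann distribution p_alpha = exp(-beta * E_alpha) / Z
    w = np.exp(-beta * E)
    return w / w.sum()

def mean_energy(beta):
    return probs(beta) @ E

beta0, h = 0.7, 1e-6
dU_dbeta = (mean_energy(beta0 + h) - mean_energy(beta0 - h)) / (2 * h)

p = probs(beta0)
var_E = p @ E**2 - (p @ E) ** 2

print(dU_dbeta, -var_E)   # agree to numerical precision
```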


Here’s a visual aid:

[Animation: occupation probabilities p(E) shifting toward lower energies as \beta increases]

The animation above shows the probability assigned to microstates of a given energy in a closed system, according to the distribution

\displaystyle p(E)=\frac{e^{-\beta E}}{\sum_E e^{-\beta E}}

The animation shows that as \beta (the system’s chilliness) increases, lower-energy microstates become more probable and higher-energy states less so. If we let \beta increase without limit, the probability distribution would become a sharp, narrow peak at the lowest energy state – the system becomes so cold that access to all high energy states is denied it.

Conversely, as the coldness of the system goes to 0, all microstates become equiprobable. The system becomes so hot that it can explore the full range of its microstates during its thermal fluctuations.
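
A static version of the animation is easy to tabulate; the evenly spaced spectrum E_\alpha=\alpha below is an assumption made purely for illustration:

```python
import numpy as np

E = np.arange(5)                     # assumed spectrum E_alpha = alpha
for b in [0.0, 0.5, 2.0, 10.0]:
    w = np.exp(-b * E)
    print(f"beta={b:5.1f}  p={np.round(w / w.sum(), 3)}")
# beta = 0 gives the uniform distribution; as beta grows, nearly all
# of the probability piles up on the ground state.
```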

Interestingly, there is never a point at which higher energy levels are more probable than lower energy ones – though I am being a little disingenuous here, since the animation only shows \beta\ge 0. This possibility will be explored in a later post for a particular system.


Another important result has come out of this discussion.

Suppose \beta_1<\beta_2. This means system two is colder than system one, hence energy will flow into system two. So

dU_1 < 0

Since no work is done on the first system during the interaction (our first demand),

dS_G^1 = \beta_1 dU_1

So assuming \beta_1>0,

dS_G^1 < 0

That is, the Gibbs entropy of the hotter system decreases. This seems to contradict the premise that the Gibbs entropy of a system is always maximal in equilibrium. So what went wrong? Nothing! When we maximised the Gibbs entropy of a closed system, we did so subject to the constraint that the mean energy of the system was fixed. But when two closed systems are brought into contact, their mean energies vary. What we do know is that the mean energy of the two systems together is fixed – it is the entropy of the joint system that is maximised.

This is an important point. The Gibbs entropy does not necessarily increase everywhere to reach equilibrium. It increases globally, but to do so, it may decrease locally in some parts of the system.
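
The two-level sketch from earlier shows this directly. Reusing entropy, U1 and d from that snippet:

```python
# The hotter system's own entropy falls even as the joint entropy rises
print(entropy(U1 - d) - entropy(U1))   # negative: dS_G^1 = beta_1 dU_1 < 0
```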


We’re nearly ready to look at some real systems. The next post will constitute the last in the series of introductory posts, and will tackle a difficult subject – the interpretation of the Gibbs entropy.
