In the last post, we learnt that the Gibbs entropy is *additive* – the Gibbs entropy of a system comprising multiple subsystems is the *sum* of the Gibbs entropies of those constituent subsystems. To prove this, we considered two systems in *loose thermal contact*. Two such systems are allowed to exchange energy, but their state occupation probabilities are uncorrelated.

In this post, I’ll finally make good on my promise to interpret the Lagrange multiplier $\beta$. To do so, we’ll consider again a pair of closed systems in loose thermal contact, and see how the Gibbs entropy of the pair changes in response to energy exchange between the two systems.

To avoid complications, we’ll demand that

- no work is done on either subsystem, so their energy content can change only via the transfer of heat
- the two subsystems are otherwise completely isolated, such that no heat can flow to or from their environment

Let’s see how the systems interact.

The Gibbs entropy of the joint system is related to the Gibbs entropies of subsystems one and two by

$$S = S_1 + S_2$$

as we established in the last post. Hence a small change in the joint system’s Gibbs entropy is given by

$$dS = dS_1 + dS_2$$
Now we use the expression for the change in Gibbs entropy we derived earlier:

$$dS = k\beta_1 \left( dU_1 - \langle dE \rangle_1 \right) + k\beta_2 \left( dU_2 - \langle dE \rangle_2 \right)$$

Recall $dU_i$ is the change in energy of system $i$ and $\langle dE \rangle_i$ is the mean change in the $i^{\text{th}}$ system’s energy levels. The latter process is caused by work being done on this system. But our first demand was that *no work* is done on either system! So the terms $\langle dE \rangle_1$ and $\langle dE \rangle_2$ both go to 0, leaving

$$dS = k\beta_1\, dU_1 + k\beta_2\, dU_2$$
Now we cite the second demand. The total energy of the system is given by

$$U = U_1 + U_2$$

That is, the energy of the system is the sum of the energies of the two subsystems. Hence

$$dU = dU_1 + dU_2$$

*But* according to the second demand, no heat is allowed to flow into the system from the outside. This, coupled with the constraint that no work is done on either system, means that the total energy of the system is conserved:

$$dU = 0$$

Hence

$$dU_2 = -dU_1$$

This equation says that any energy lost by the first system is picked up by the second system, and vice versa. Replacing $dU_2$ in the equation for $dS$ above gives

$$dS = k \left( \beta_1 - \beta_2 \right) dU_1$$
We now invoke our fundamental precept: *the Gibbs entropy of the system is maximised in equilibrium*. For the two systems to come into equilibrium, the Gibbs entropy must increase; if there exists a path which will lead to an increase in entropy, *the system will realise that path*. This is how we calculated the set of probabilities $p_i$ in the first place – by demanding that they make the Gibbs entropy as big as possible. Only when the Gibbs entropy of the system is at a maximum will it be satisfied, and come to rest. So

$$dS \geq 0$$

Note that though the Gibbs entropy of each subsystem may be locally maximised before the two are brought into contact, the entropy of the joint system *as a whole* is *not* maximised. The equation above shows that the Gibbs entropy of the joint system can increase via an energy flux, and will do so provided

$$\left( \beta_1 - \beta_2 \right) dU_1 > 0$$
Note the following behaviour:

- if $\beta_1 > \beta_2$ then $dU_1 > 0$
- if $\beta_1 < \beta_2$ then $dU_1 < 0$
- if $\beta_1 = \beta_2$ then $dU_1 = 0$
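These three cases can be checked numerically. A minimal sketch (the function name `dS` and the sample values of $\beta$ and $dU_1$ are illustrative, not from the post):

```python
# Sign of dS = k * (beta1 - beta2) * dU1 for a small energy transfer dU1.
# The joint Gibbs entropy can only increase, so energy flows in whichever
# direction makes (beta1 - beta2) * dU1 positive.
k = 1.380649e-23  # Boltzmann constant, J/K

def dS(beta1, beta2, dU1):
    """Change in the joint Gibbs entropy when energy dU1 enters system one."""
    return k * (beta1 - beta2) * dU1

# beta1 > beta2: system one is 'colder', so it gains energy (dU1 > 0)
assert dS(2.0, 1.0, +1e-21) > 0
# beta1 < beta2: system one is 'hotter', so it loses energy (dU1 < 0)
assert dS(1.0, 2.0, -1e-21) > 0
# beta1 == beta2: equilibrium -- no transfer increases the entropy
assert dS(1.5, 1.5, +1e-21) == 0
```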

The last statement is the killer. If the values of $\beta$ associated with two systems in loose thermal contact are equal, there will be no energy transfer between the two, and the Gibbs entropy of the joint system is maximised. *Thermodynamic $\beta$ is that which equalises between two systems when they reach equilibrium.* To this property we would ordinarily give the name ‘temperature’.

But it’s not quite temperature in the sense with which most people are familiar. Take another look at the inequalities above. If $\beta_1$ is greater than $\beta_2$, energy will flow *into* system one. Conversely, if $\beta_1$ is less than $\beta_2$, energy flows *out* of system one. Therefore $\beta$ does not measure ‘hotness’ as temperature does but ‘coldness’; $\beta = 0$ corresponds not to absolute zero, but to a very ‘hot’ temperature. Similarly, $\beta \to \infty$ for an object at absolute zero: the object is ‘infinitely cold’.

In fact, thermodynamic $\beta$ and traditional thermodynamic temperature $T$ are related by an inverse proportion:

$$\beta = \frac{1}{kT}$$

where $k$ is Boltzmann’s constant.
Why is it that we’ve come to use $T$ to measure the ‘temperature’ of objects rather than $\beta$? One good reason is that most of the objects that we encounter in everyday life are ‘cold’. Therefore if $\beta$ were the preferred means of measuring temperature, we would have to deal with big numbers all the time. It is much more natural to use a measure of temperature which is 0 for absolutely ‘cold’ objects and increases as things get hotter. I’ll most likely write a separate post about this, since it’s quite interesting to think about!
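To get a feel for the numbers, here’s a quick sketch of $\beta = 1/kT$ (the helper name `beta` and the sample temperatures are illustrative):

```python
# beta = 1/(kT): 'coldness', measured in inverse joules.
k = 1.380649e-23  # Boltzmann constant, J/K

def beta(T):
    """Thermodynamic beta for an absolute temperature T in kelvin."""
    return 1.0 / (k * T)

# Room temperature (~300 K) already gives a beta of order 10^20 J^-1 --
# on this scale, everyday objects really are very 'cold'.
assert 2e20 < beta(300.0) < 3e20
# Hotter objects have a smaller beta
assert beta(6000.0) < beta(300.0)
```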

To supplement this equilibrium interpretation, we can consider the relation between thermodynamic $\beta$ and energy $U$. Recall the mean energy of the system is defined by

$$U = \sum_i p_i E_i$$
where $p_i$ is the probability that the system has energy $E_i$. Let’s calculate the rate at which the system’s energy changes with respect to $\beta$:

$$\frac{dU}{d\beta} = \frac{d}{d\beta} \sum_i \frac{E_i e^{-\beta E_i}}{Z}$$

We use the quotient rule to carry out the differentiation:

$$\frac{dU}{d\beta} = \sum_i \frac{-E_i^2 e^{-\beta E_i}\, Z - E_i e^{-\beta E_i} \frac{dZ}{d\beta}}{Z^2}$$

and since $\frac{dZ}{d\beta} = -\sum_j E_j e^{-\beta E_j} = -ZU$, this becomes

$$\frac{dU}{d\beta} = -\sum_i \frac{E_i^2 e^{-\beta E_i}}{Z} + U \sum_i \frac{E_i e^{-\beta E_i}}{Z} = -\left( \langle E^2 \rangle - U^2 \right)$$
If you’re familiar with statistics, you’ll recognise this value as the negative *variance* of the system’s energy. The variance of the system’s energy is defined by

$$\sigma_E^2 = \left\langle \left( E - U \right)^2 \right\rangle$$

That is, it is the mean value of the squared difference between the system’s energy and its mean energy $U$. Notice that, because it is the expectation value of a squared quantity, it is necessarily *positive*. It can be shown that the variance is equal to

$$\sigma_E^2 = \langle E^2 \rangle - U^2$$
Hence

$$\frac{dU}{d\beta} = -\sigma_E^2$$

and so

$$\frac{dU}{d\beta} < 0$$
Hence *the system’s energy decreases monotonically with increasing $\beta$*. This lends credence to the idea that $\beta$ measures the ‘coldness’ of the system.
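The relation $dU/d\beta = -\sigma_E^2$ can be checked numerically for a toy system. A sketch, with three illustrative energy levels and units chosen so that $k = 1$ (none of these values are from the post):

```python
import math

# Verify numerically that dU/dbeta = -Var(E) for canonical probabilities
# p_i = exp(-beta * E_i) / Z, using a central finite difference.
E = [0.0, 1.0, 2.5]  # illustrative energy levels

def mean_energy(beta):
    w = [math.exp(-beta * e) for e in E]
    Z = sum(w)
    return sum(e * wi for e, wi in zip(E, w)) / Z

def variance(beta):
    w = [math.exp(-beta * e) for e in E]
    Z = sum(w)
    U = sum(e * wi for e, wi in zip(E, w)) / Z
    return sum((e - U) ** 2 * wi for e, wi in zip(E, w)) / Z

beta0, h = 0.7, 1e-6
dU_dbeta = (mean_energy(beta0 + h) - mean_energy(beta0 - h)) / (2 * h)
assert abs(dU_dbeta + variance(beta0)) < 1e-6  # dU/dbeta = -sigma^2
assert dU_dbeta < 0                            # energy falls as beta rises
```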

Here’s a visual aid:

The animation above shows the distribution of probability for microstates of a particular energy according to the equation for the closed system

$$p_i = \frac{e^{-\beta E_i}}{Z}$$
The animation shows that as $\beta$ (the system’s chilliness) increases, lower-energy microstates become more probable and higher-energy states less so. If we let $\beta$ increase without limit, the probability distribution would become a sharp, narrow peak at the lowest energy state – the system becomes so cold that access to all high energy states is denied it.

Conversely, as the coldness $\beta$ of the system goes to 0, all microstates become equiprobable. The system becomes so hot that it can explore the full range of its microstates during its thermal fluctuations.
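Both limits follow directly from $p_i = e^{-\beta E_i}/Z$. A sketch with illustrative energy levels:

```python
import math

# Limits of the Boltzmann distribution p_i = exp(-beta * E_i) / Z
# for a toy set of energy levels (illustrative values).
E = [0.0, 1.0, 2.0, 3.0]

def probs(beta):
    w = [math.exp(-beta * e) for e in E]
    Z = sum(w)
    return [wi / Z for wi in w]

# beta -> 0: every microstate becomes equally probable
hot = probs(0.0)
assert all(abs(p - 0.25) < 1e-12 for p in hot)

# large beta: nearly all probability piles onto the lowest level
cold = probs(50.0)
assert cold[0] > 0.999
assert cold[0] == max(cold)
```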

Interestingly, there is never a point at which higher energy levels are *more* probable than lower energy states (because I am being disingenuous). This possibility will be explored in a later post for a particular system.

Another important result has come out of this discussion.

Suppose $\beta_2 > \beta_1$. This means system two is colder than system one, hence energy will flow into system two. So

$$dU_1 < 0$$

If no work is done on the first system during the interaction,

$$dS_1 = k\beta_1\, dU_1$$

So assuming $\beta_1 > 0$,

$$dS_1 < 0$$
That is, the Gibbs entropy of the hotter system *decreases*. This seems to contradict the premise that the Gibbs entropy of a system is always maximal in equilibrium. So what went wrong? Nothing! When we maximised the Gibbs entropy of a closed system, we did so subject to the constraint that the mean energy of the system was fixed. But when two closed systems are brought into contact, their mean energies vary. What we do know is that the mean energy of the two systems *together* is fixed – it is the entropy of the joint system that is maximised.

This is an important point. The Gibbs entropy does not necessarily increase everywhere to reach equilibrium. It increases *globally*, but to do so, it may decrease *locally* in some parts of the system.
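A toy calculation makes the point concrete (units with $k = 1$; the values of $\beta_1$, $\beta_2$ and $dU_1$ are illustrative):

```python
# Local decrease, global increase: beta2 > beta1 means system two is
# colder, so a small amount of energy dU1 < 0 leaves system one.
beta1, beta2 = 1.0, 2.0   # system one is the hotter of the pair
dU1 = -0.01               # energy leaving system one
dU2 = -dU1                # total energy is conserved

dS1 = beta1 * dU1         # dS_i = k * beta_i * dU_i, with k = 1 (no work)
dS2 = beta2 * dU2
assert dS1 < 0            # the hotter system's Gibbs entropy falls...
assert dS1 + dS2 > 0      # ...while the joint entropy still rises
```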

We’re nearly ready to look at some real systems. The next post will constitute the last in the series of introductory posts, and will tackle a difficult subject – the interpretation of the Gibbs entropy.
