As a gentle introduction to statistical mechanics, let’s consider the simplest possible set of microstates: a pair, with equal and opposite energies:

$$\varepsilon_\uparrow = +\varepsilon, \qquad \varepsilon_\downarrow = -\varepsilon.$$

We’ll be using the equations describing a closed system of non-interacting, distinguishable particles, whose partition function is simply the single-particle partition function raised to the power of the number of particles in the system. We will calculate how the occupation numbers $N_\uparrow$ and $N_\downarrow$, mean energy $\langle E \rangle$ and entropy $S$ of the system vary with respect to changes in temperature.

Imagine an array of particles, trapped in a Cartesian spider web, which can point either *up* or *down*. When pointing upwards, a particle has energy $+\varepsilon$, and when pointing downwards has energy $-\varepsilon$. The terms up and down are just labels, used to give a pictorial means of telling one state from another; the mechanism by which the particles acquire these energies needn’t be specified.

The single-particle partition function describing each particle in the web is

$$Z_1 = \sum_i e^{-\beta\varepsilon_i} = e^{-\beta\varepsilon} + e^{+\beta\varepsilon} = 2\cosh(\beta\varepsilon),$$

meaning the partition function governing the entire array of $N$ particles is

$$Z = Z_1^N = \left[2\cosh(\beta\varepsilon)\right]^N,$$

an assertion we established earlier, with the proviso that the particles are distinguishable. What does this tell us about the macrophysical behaviour of the array?

We can first calculate the average number of particles in each state. According to the equations describing a closed system, the probability that a particle in the array occupies a state with energy $\varepsilon_i$ is

$$p_i = \frac{e^{-\beta\varepsilon_i}}{Z_1}, \qquad \text{so} \qquad p_\uparrow = \frac{e^{-\beta\varepsilon}}{2\cosh(\beta\varepsilon)}, \quad p_\downarrow = \frac{e^{+\beta\varepsilon}}{2\cosh(\beta\varepsilon)}.$$

The total number of particles in either state is simply the product of these probabilities with the total number of particles $N$, giving $N_\uparrow = Np_\uparrow$ and $N_\downarrow = Np_\downarrow$, allowing us to make the following sketch for the occupation numbers:
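These occupation numbers are easy to check numerically. Here is a minimal Python sketch; the particle number $N = 1000$ and energy scale $\varepsilon = 1$ are arbitrary illustrative values, with temperature expressed as $k_B T$ in units of $\varepsilon$:

```python
import math

def occupations(N, eps, kT):
    """Occupation numbers of the two-state array at temperature kT.

    N   -- total number of particles
    eps -- energy scale: up state has energy +eps, down state -eps
    kT  -- k_B * T, in the same units as eps
    """
    beta = 1.0 / kT
    Z1 = math.exp(-beta * eps) + math.exp(beta * eps)  # 2*cosh(beta*eps)
    p_up = math.exp(-beta * eps) / Z1
    p_down = math.exp(beta * eps) / Z1
    return N * p_up, N * p_down

# At low temperature almost every particle sits in the down state;
# at high temperature the two occupation numbers approach N/2.
for kT in (0.1, 1.0, 10.0):
    n_up, n_down = occupations(N=1000, eps=1.0, kT=kT)
    print(f"kT = {kT:5.1f}:  N_up = {n_up:10.6f},  N_down = {n_down:10.6f}")
```

Notice that $N_\uparrow$ climbs towards, but never reaches, $N/2$ as the temperature grows.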

How does the array respond to changes in temperature? Remember thermodynamic beta, $\beta = 1/k_B T$, measures the ‘coldness’ of the system, since it is inversely proportional to temperature. We see that, for all positive temperatures, there are *always* more particles in the down state than in the up state, with the probability of occupying the up state increasing with increasing temperature.

In the limit that $\beta \to \infty$ (as the system tends towards *absolute zero*), all of the particles cascade into the lower energy level. As the web is drained of all heat, its occupants obligingly settle into the lower energy state, forgoing access to the up state.

But, as $\beta \to 0$ (that is, as $T \to \infty$), the two occupation numbers become equal. That is, there are, on average, as many particles in the upper state as in the lower state, because the occupation probabilities are equal. The provision of heat ‘unlocks’ the upper energy state. Note that there are *never* more particles in the upper state than in the lower state for any finite, positive temperature. The two occupation probabilities tend asymptotically towards one another as the temperature tends to infinity, but that of the up state never overtakes that of the down state.

What else can we work out? Statistical mechanics tells us the array’s mean energy is

$$\langle E \rangle = -\frac{\partial \ln Z}{\partial \beta} = -N\varepsilon\tanh(\beta\varepsilon).$$

What can be said about this result? First we note the system’s mean energy is directly proportional to $N$. So, double the number of particles in the web, double the energy of the array. This rather unexciting observation is at least (and indeed at most) reassuring, since it shows that our partition function predicts individual particle energies are additive.

More importantly, how does the system’s energy vary with $T$? Let’s first imagine the system at absolute zero. Using the expression above,

$$\lim_{T \to 0} \langle E \rangle = -N\varepsilon \lim_{\beta \to \infty} \tanh(\beta\varepsilon) = -N\varepsilon.$$

Physically, this means that every particle in the array must occupy its lower energy state. This is the same conclusion as that reached above. At absolute zero, the system assumes its overall lowest energy state: the one in which all of its constituents assume *their* lowest energy state. What about the opposite limit? How do the resident particles behave as their web is superheated? In the limit that $T \to \infty$,

$$\lim_{T \to \infty} \langle E \rangle = -N\varepsilon \lim_{\beta \to 0} \tanh(\beta\varepsilon) = 0.$$

We could have predicted this result from the occupation numbers $N_\uparrow$ and $N_\downarrow$, but it is reassuring to see the prediction confirmed here. That the mean energy of the system is 0 means there are, on average, as many particles in the upper state as in the lower state. Again, it is interesting, and perhaps surprising, that as thermodynamic temperature tends to infinity, there are never more particles in the upper state than in the lower state. Even though the energy of each individual particle can be positive, the total energy of the system they constitute is *never* positive for finite, positive $T$.
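As a sanity check on the closed form, we can compare $-N\varepsilon\tanh(\beta\varepsilon)$ against the mean energy computed directly from the occupation probabilities (a sketch with the same illustrative values as before; the two should agree to machine precision):

```python
import math

def mean_energy_closed_form(N, eps, kT):
    # <E> = -N * eps * tanh(beta * eps)
    return -N * eps * math.tanh(eps / kT)

def mean_energy_from_probs(N, eps, kT):
    # Direct average: energy of each state weighted by its occupation probability
    beta = 1.0 / kT
    Z1 = 2.0 * math.cosh(beta * eps)
    p_up = math.exp(-beta * eps) / Z1    # energy +eps
    p_down = math.exp(beta * eps) / Z1   # energy -eps
    return N * (eps * p_up + (-eps) * p_down)

for kT in (0.01, 1.0, 100.0):
    u1 = mean_energy_closed_form(N=1000, eps=1.0, kT=kT)
    u2 = mean_energy_from_probs(N=1000, eps=1.0, kT=kT)
    print(f"kT = {kT:6.2f}:  closed form {u1:11.4f},  direct {u2:11.4f}")
```

The output shows the energy climbing from $-N\varepsilon$ towards zero as the temperature grows, but never turning positive.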

In-between these two limits, the array’s energy increases with decreasing $\beta$ (that is, with increasing temperature), because the probability of finding a particle in the upper state increases.

So much for the system’s energy. What about its entropy? Again, it can be determined explicitly using the partition function:

$$S = \frac{\partial}{\partial T}\left(k_B T \ln Z\right) = Nk_B\left[\ln\left(2\cosh(\beta\varepsilon)\right) - \beta\varepsilon\tanh(\beta\varepsilon)\right].$$

As with the mean energy, we find the system’s Gibbs entropy is directly proportional to $N$, since every term in the expression carries a factor of $N$. The graph shows that the entropy increases monotonically with increasing temperature.

Let’s look at the limiting cases again. At absolute zero,

$$\lim_{T \to 0} S = Nk_B \lim_{\beta \to \infty}\left[\ln\left(2\cosh(\beta\varepsilon)\right) - \beta\varepsilon\tanh(\beta\varepsilon)\right] = 0,$$

since $\ln\left(2\cosh(\beta\varepsilon)\right) \to \beta\varepsilon$ and $\tanh(\beta\varepsilon) \to 1$ in this limit.

What does it mean for a system’s entropy to be 0? We have established that, at absolute zero, every single particle must be in the lower energy state. This means we can specify *exactly* which microstate the array is in, since we know *exactly* which microstates its resident particles are in.

How does this tie in with ‘entropy as ignorance’? We found that the (Gibbs) entropy measures our lack of knowledge of the system’s microstate. Since at absolute zero we know the microstate of the array exactly, we are minimally ignorant of its microphysical behaviour. Therefore we should expect the entropy to assume its lowest value at absolute zero. The Gibbs entropy is defined as

$$S = -k_B\sum_i p_i \ln p_i,$$

and is therefore a sum of non-negative terms: each probability satisfies $0 \leq p_i \leq 1$, so $-p_i \ln p_i \geq 0$. So the smallest value the entropy can take is 0 – this is consistent with the limit $S \to 0$ found above.
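The definition above is straightforward to evaluate numerically per particle (a sketch; the energy scale $\varepsilon = 1$ is an arbitrary illustrative value), confirming both limits:

```python
import math

def gibbs_entropy_per_particle(eps, kT):
    """S / (N k_B) = -sum_i p_i ln p_i over the two single-particle states."""
    beta = 1.0 / kT
    Z1 = 2.0 * math.cosh(beta * eps)
    probs = [math.exp(-beta * eps) / Z1, math.exp(beta * eps) / Z1]
    return -sum(p * math.log(p) for p in probs)

# Near absolute zero the entropy vanishes; at high temperature it
# approaches ln 2 per particle (in units of k_B).
for kT in (0.05, 1.0, 50.0):
    s = gibbs_entropy_per_particle(1.0, kT)
    print(f"kT = {kT:5.2f}:  S/(N k_B) = {s:.6f}   (ln 2 = {math.log(2):.6f})")
```

Because the particles are independent and distinguishable, the total entropy is just $N$ times this per-particle value.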

How about the entropy of an extremely hot web?

$$\lim_{T \to \infty} S = Nk_B \lim_{\beta \to 0}\left[\ln\left(2\cosh(\beta\varepsilon)\right) - \beta\varepsilon\tanh(\beta\varepsilon)\right] = Nk_B\ln 2.$$

We see from the graph that this is the maximum value the entropy can take. Why should the array’s entropy be maximal as its temperature tends to infinity? We found earlier that, in this limit, there are as many particles in the upper state as in the lower state.

Looking at ‘entropy as ignorance’: each particle can be in the upper or lower state with equal probability, therefore we could not be more uncertain of its state! Consequently we are maximally ignorant of the array’s microstate, since its occupants are so unpredictable. Since our ignorance is maximal, so too is the entropy.

Incidentally, we have seen these two results before. The maximum value of the (Gibbs) entropy, a value which we expect it to assume in the limit that $T \to \infty$, is

$$S_{\max} = k_B\ln\Omega,$$

where $\Omega$ is the number of microstates available to the system. The total number of microstates available to this system is $\Omega = 2^N$, since the first particle can occupy 2 states, the second particle can (independently) occupy 2 states, the third particle … and so on. So $S_{\max} = k_B\ln 2^N = Nk_B\ln 2$, which agrees with the result found above.
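For a small array we can even count the microstates by brute force (a sketch using $N = 10$ for illustration):

```python
import math
from itertools import product

# Count microstates of a small array explicitly: each of N particles is
# 'up' or 'down' independently, so there should be 2**N microstates.
N = 10
microstates = list(product(("up", "down"), repeat=N))
omega = len(microstates)

print(omega)             # should equal 2**N
print(math.log(omega))   # maximum entropy in units of k_B ...
print(N * math.log(2))   # ... which is N ln 2, as claimed
```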

We also found before that the minimum value it can assume for a closed system is zero:

$$S_{\min} = 0.$$

The behaviour of this simple array seems to confirm our interpretations of $\langle E \rangle$ and $S$. As $T \to 0$, the system has minimal energy and minimal entropy. Every particle sits in the lower energy state, aligned with its neighbours. As heat is added to the system, the energy of the system increases, as does its entropy. Its entropy increases because the probability that any one of the particles is in the upper state increases, so that we are more uncertain of the system’s microstate. In the limit that $T \to \infty$, the energy and entropy tend towards their maximum values. This is because access to the upper energy levels is ‘unlocked’ by the provision of heat, to such an extent that there are typically as many particles in the upper as in the lower state. In this limit, we become most uncertain of the microstate of the system.

There do exist real systems for which this model is reasonable. Electrons, say, possess an intrinsic magnetic moment by virtue of a mysterious property called *spin*. If an electron is placed in a magnetic field, its magnetic moment may become either aligned or anti-aligned with said field. The relative directions of the moment and the field determine their interaction energy, given by

$$E = \pm\mu_B B,$$

where $\mu_B$ is the Bohr magneton and $B$ is the magnitude of the field. That the energies are discretised is predicted by quantum mechanics; we won’t worry about the details here. Nevertheless, we’ll generalise the results found in this post later by looking at a system of particles which can assume more than two different alignments with the externally applied field.
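To get a feel for the numbers involved, here is a quick calculation with illustrative values (a 1 T field at 300 K, using the CODATA values for $\mu_B$ and $k_B$):

```python
import math

mu_B = 9.2740100783e-24   # Bohr magneton, J/T (CODATA)
k_B = 1.380649e-23        # Boltzmann constant, J/K (CODATA)

B = 1.0      # field strength in tesla (assumed for illustration)
T = 300.0    # room temperature in kelvin

eps = mu_B * B            # the energy splitting is +/- eps
x = eps / (k_B * T)       # beta * eps, dimensionless

# Fraction of moments in the lower-energy (aligned, -eps) state:
p_aligned = math.exp(x) / (2.0 * math.cosh(x))
print(f"beta*eps = {x:.5f}")
print(f"fraction aligned with the field: {p_aligned:.5f}")
```

At room temperature $\beta\varepsilon \ll 1$, so the populations are nearly equal: a laboratory-scale field only slightly biases the moments towards alignment.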

There is an interesting point, to which I have alluded before, concerning the range of values that $T$ may take. We learnt that even in the limit that the thermodynamic temperature of the system tends to $+\infty$, the energy of the array of particles can never be greater than zero. Physically, this means that there are never, on average, more particles in the upper state than in the lower state. Surely this can’t be right? If the system were close to absolute zero, and we kept forcing it to accommodate more and more units of energy, would it at some point reject an additional unit? The answer is no, because we have ignored the possibility that $T < 0$; this system can have a negative temperature.