We now know how to express our knowledge of the system. This allows us to formulate a new set of occupation probabilities, derived in light of information available to us. All that’s left to ask is: what can we reasonably expect to know about a system?

One of the most fundamental properties of any entity is its energy. A thermodynamic system might have energy by virtue of the jostling of its constituent particles, or of their mutual interparticle attraction, or of their interaction with some external potential field. In any case, a system’s energy is a numerical property which can in principle be measured.

For a system’s mean energy to be fixed, it must be in equilibrium with its environment. This does not mean that the system’s energy is precisely fixed; since the microstate of the system is not known exactly, the system’s energy is essentially a random variable with some variance. But the mean energy about which the system’s energy fluctuates *can* be known. It’s this mean value of the system’s energy we measure as a macroscopic variable.
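To make the distinction concrete, here is a minimal Python sketch (the energies and occupation probabilities are toy numbers invented for illustration) computing the mean energy as the expectation $\sum_i p_i E_i$, along with the variance describing the fluctuations about it:

```python
# Hypothetical toy system: four microstates with energies E_i (arbitrary units)
# and occupation probabilities p_i summing to one.
energies = [0.0, 1.0, 2.0, 3.0]
probs = [0.4, 0.3, 0.2, 0.1]

# The measurable macroscopic quantity: the mean energy <E> = sum_i p_i E_i.
mean_E = sum(p * E for p, E in zip(probs, energies))

# The energy fluctuates about this mean; the variance quantifies the spread.
var_E = sum(p * (E - mean_E) ** 2 for p, E in zip(probs, energies))

print(mean_E, var_E)
```

The precise microstate, and hence the instantaneous energy, remains unknown; only the expectation value plays the role of a macroscopic variable.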

So we choose to characterise the system’s microstates by their associated energies *and nothing more*. This allows us to place a single, second constraint on the occupation probabilities. The consequences are explored below.

It is now asserted that something is known of the system: its mean energy $\langle E \rangle$. Then *two* things are known of the probabilities $p_i$: they satisfy the equations

$$\sum_i p_i = 1, \qquad \sum_i p_i E_i = \langle E \rangle.$$

We appeal to our old dogma: *maximise the Gibbs entropy*. Using the method of Lagrange multipliers, it is necessary to extremise the function

$$L = -\sum_i p_i \ln p_i - \lambda \left( \sum_i p_i - 1 \right) - \beta \left( \sum_i p_i E_i - \langle E \rangle \right).$$

The quantity $\beta$ is our brand-new Lagrange multiplier. Taking the differential of the above yields

$$\mathrm{d}L = -\sum_i \left( \ln p_i + 1 + \lambda + \beta E_i \right) \mathrm{d}p_i - \left( \sum_i p_i - 1 \right) \mathrm{d}\lambda - \left( \sum_i p_i E_i - \langle E \rangle \right) \mathrm{d}\beta.$$

Since the $\mathrm{d}p_i$, $\mathrm{d}\lambda$ and $\mathrm{d}\beta$ are independent, we must demand that

$$\ln p_i + 1 + \lambda + \beta E_i = 0, \qquad \sum_i p_i = 1, \qquad \sum_i p_i E_i = \langle E \rangle.$$

Rearranging the first equation yields

$$p_i = e^{-(1 + \lambda)} e^{-\beta E_i}.$$

Substitution into the second equation gives

$$e^{-(1 + \lambda)} \sum_i e^{-\beta E_i} = 1.$$

We eliminate $\lambda$ between the two equations to give

$$p_i = \frac{e^{-\beta E_i}}{\sum_j e^{-\beta E_j}}.$$
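As a quick numerical illustration, here is a sketch of this distribution (the value of $\beta$ and the energy levels are arbitrary choices, not taken from the text) which builds the probabilities and confirms they are normalised:

```python
import math

# Canonical probabilities p_i = exp(-beta * E_i) / Z for a hypothetical
# set of microstate energies; beta = 1 is an illustrative choice.
beta = 1.0
energies = [0.0, 1.0, 2.0, 3.0]

weights = [math.exp(-beta * E) for E in energies]
Z = sum(weights)                # normalising sum over all microstates
probs = [w / Z for w in weights]

print(probs)
print(sum(probs))               # the probabilities sum to one
```

Note how the lowest-energy state carries the largest probability, with each successive level suppressed by a factor of $e^{-\beta}$.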

How does this probability distribution differ from that of the isolated system? The equation for $p_i$ shows that some states are more probable than others, namely those with a lower energy. The probability of a system being in a given state decreases exponentially with its associated energy. Were we to play the game with the screens again, the monitors would be dominated by pictures of low-energy states. The name for this imaginary set of virtual copies of our system is the *canonical ensemble*. Here’s a graph of the probability distribution of microstate energies:

There is nothing inherently ‘special’ about this particular ensemble, the one for which we profess to know only the mean internal energy. We could have asserted that additional properties of the system were known, and derived a different probability distribution. But it turns out that this set of probabilities is enough to reconstruct the thermodynamics of a variety of systems.
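One hint of that reconstructive power: the single parameter $\beta$ is already pinned down implicitly by the known mean energy, since $\langle E \rangle = \sum_i E_i e^{-\beta E_i} / \sum_j e^{-\beta E_j}$ decreases monotonically with $\beta$. Here is a sketch inverting that relation by bisection for a hypothetical two-level system (the energy levels and the target mean energy are invented for illustration):

```python
import math

# Toy two-level system; the target mean energy must lie between the ground
# level and the beta = 0 (equal-occupation) average of 0.5.
energies = [0.0, 1.0]
target_mean = 0.25

def mean_energy(beta):
    """Canonical mean energy sum_i E_i exp(-beta E_i) / Z at the given beta."""
    weights = [math.exp(-beta * E) for E in energies]
    Z = sum(weights)
    return sum(E * w for E, w in zip(energies, weights)) / Z

# Bisect: the mean energy falls monotonically as beta grows, so if the mean
# at the midpoint is still too large, beta must be larger than the midpoint.
lo, hi = 0.0, 50.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_energy(mid) > target_mean:
        lo = mid
    else:
        hi = mid

beta = 0.5 * (lo + hi)
print(beta)
```

For this two-level case the relation can also be inverted by hand: $\langle E \rangle = 1/(e^{\beta} + 1) = 0.25$ gives $\beta = \ln 3$, which the bisection reproduces.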

The system about which nothing was known we called the ‘isolated system’. What shall we name this one? The kind of system populating the canonical ensemble is known as a *closed system*. Why is this? The system is not ‘closed’ to the same extent that an isolated system is: we assumed our system had come to equilibrium with its environment, meaning energy must be allowed to flow across the system’s boundary. The system is really called closed to distinguish it from another kind of system, the *open system*. This kind of system is characterised by a different set of probabilities, which may be explored in the distant future!

The next item on our agenda is to construct the thermodynamics of the closed system. In doing so, we’ll supply the mysterious parameter $\beta$ with physical significance. In the next post we define an important quantity known as the *partition function*.