updated 24-MAR-1993 by SIC
original by Scott I. Chase
Answer: Absolutely. :-)
Under certain conditions, a closed system can be described by a negative temperature, and, surprisingly, be hotter than the same system at any positive temperature. This article describes how it all works.
Our intuitive notion is that two systems in thermal contact should exchange no heat, on average, if and only if they are at the same temperature. Let's call the two systems S1 and S2. The combined system, treating S1 and S2 together, will be called S3. The important question, consideration of which will lead us to a useful quantitative definition of temperature, is "How will the energy of S3 be distributed between S1 and S2?" I will briefly explain this below, but I recommend that you read K&K, referenced below, for a careful, simple, and thorough explanation of this important and fundamental result.
With a total energy E, S3 has many possible internal states (microstates). The atoms of S3 can share the total energy in many ways. Let's say there are N different states. Each state corresponds to a particular division of the total energy between the two subsystems S1 and S2. Many microstates can correspond to the same division, E1 in S1 and E2 in S2. A simple counting argument tells you that only one particular division of the energy will occur with any significant probability. It's the one with the overwhelmingly largest number of microstates for the total system S3. That number, N(E1,E2), is just the product of the number of states allowed in each subsystem, N(E1,E2) = N1(E1)*N2(E2), and, since E1 + E2 = E, N(E1,E2) reaches a maximum when N1*N2 is stationary with respect to variations of E1 and E2 subject to the total energy constraint.
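To make the counting argument concrete, here is a small numerical sketch of my own (not part of the original argument). It models S1 and S2 as collections of two-state spins, so that each Ni(Ei) is simply a binomial coefficient; the subsystem sizes and the total number of energy quanta are arbitrary choices for the illustration.

    # Illustrative sketch only: S1 and S2 modeled as sets of two-state spins,
    # so Ni(Ei) is a binomial coefficient (Ei = number of excited spins).
    # Sizes and total energy are arbitrary choices. Requires Python 3.8+.
    from math import comb

    n1, n2 = 1000, 1000      # number of spins in S1 and S2
    quanta = 600             # total energy E, in units of one quantum

    counts = [(q, comb(n1, q) * comb(n2, quanta - q)) for q in range(quanta + 1)]
    total = sum(c for _, c in counts)
    q_best = max(counts, key=lambda t: t[1])[0]
    near_peak = sum(c for q, c in counts if abs(q - q_best) <= 30)

    print("division with the most microstates: E1 =", q_best, "quanta")
    print("fraction of ALL microstates within 5%% of that division: %.4f"
          % (near_peak / total))

The distribution of divisions is sharply peaked, and the peak becomes narrower (relative to the total energy) as the subsystems grow; that is the sense in which only one division occurs "with any significant probability."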
For convenience, physicists prefer to frame the question in terms of the logarithm of the number of microstates N, and call this the entropy, S. You can easily see from the above analysis that two systems are in equilibrium with one another when (dS/dE)_1 = (dS/dE)_2, i.e., the rate of change of entropy, S, per unit change in energy, E, must be the same for both systems. Otherwise, energy will tend to flow from one subsystem to another as S3 bounces randomly from one microstate to another, the total energy E3 being constant, as the combined system moves towards a state of maximal total entropy. We define the temperature, T, by 1/T = dS/dE, so that the equilibrium condition becomes the very simple T_1 = T_2.
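Continuing the sketch above (same caveats: the two-state-spin model and the sizes are my own choices), you can check numerically that at the most probable division the slope dS/dE, estimated by a finite difference of log N, is the same for both subsystems:

    # At the most probable division of the energy, d(ln N1)/dE1 equals
    # d(ln N2)/dE2, i.e. the two subsystems have the same temperature.
    from math import comb, log

    def dS_dE(n_spins, q):
        # centered finite difference of S(q) = ln C(n_spins, q),
        # with one quantum as the unit of energy
        return (log(comb(n_spins, q + 1)) - log(comb(n_spins, q - 1))) / 2.0

    n1, n2, quanta, q_best = 1000, 1000, 600, 300   # values from the sketch above
    print(dS_dE(n1, q_best), dS_dE(n2, quanta - q_best))   # (nearly) equal

For unequal subsystems the two slopes still agree at the peak, up to the coarseness of the one-quantum finite difference; that agreement is exactly the equilibrium condition T_1 = T_2.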
This statistical mechanical definition of temperature does in fact correspond to your intuitive notion of temperature for most systems. So long as dS/dE is always positive, T is always positive. For common situations, like a collection of free particles, or particles in a harmonic oscillator potential, adding energy always increases the number of available microstates, increasingly faster with increasing total energy. So temperature increases with increasing energy, from zero, asymptotically approaching positive infinity as the energy increases.
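As a concrete example of that behavior (my own addition, using the standard ideal-gas result rather than anything derived in this article): for a monatomic ideal gas the number of accessible microstates grows roughly as E^(3N/2), so S ~ (3N/2) log E + constant, and 1/T = dS/dE = (3N/2)/E, i.e. T = 2E/(3N). Adding energy always raises T, which climbs from zero toward infinity but never becomes negative.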
Not all systems behave this way, however. Consider an idealized system of N atoms, each with spin 1/2 and magnetic moment u, fixed in place in a magnetic field B, so that each spin can only point up (energy +uB) or down (energy -uB). The lowest possible energy state, all the spins pointing down, gives the system a total energy of -NuB, and temperature of absolute zero. There is only one configuration of the system at this energy, i.e., all the spins must point down. The entropy is the log of the number of microstates, so in this case is log(1) = 0. If we now add a quantum of energy, of size 2uB, to the system, one spin is allowed to flip up. There are N possibilities, so the entropy is log(N). If we add another quantum of energy, there are a total of N(N-1)/2 allowable configurations with two spins up. The entropy is increasing quickly, and the temperature is rising as well.
However, for this system, the entropy does not go on increasing forever. There is a maximum energy, +NuB, with all spins up. At this maximal energy, there is again only one microstate, and the entropy is again zero. If we remove one quantum of energy from the system, we allow one spin down. At this energy there are N available microstates. The entropy goes on increasing as the energy is lowered. In fact the maximal entropy occurs for total energy zero, i.e., half of the spins up, half down.
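The multiplicities quoted above are just binomial coefficients: with n_up spins up out of N there are C(N, n_up) configurations, and the total energy is (2*n_up - N)uB. A short sketch (illustrative only; the value of N is an arbitrary choice) tabulates the entropy across the whole energy range and shows it rising from zero at -NuB, peaking at E = 0, and falling back to zero at +NuB:

    # Entropy of the N-spin system versus energy. Multiplicity = C(N, n_up),
    # S = ln C(N, n_up); energy = (2*n_up - N) in units of uB.
    from math import comb, log

    N = 50                         # arbitrary choice for the illustration
    for n_up in range(N + 1):
        E = 2 * n_up - N           # in units of uB
        S = log(comb(N, n_up))
        print("E = %+4d uB   S = %6.2f" % (E, S))

The table is symmetric about E = 0, which is where the entropy, and hence the number of microstates, is largest.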
So we have created a system where, as we add more and more energy, the temperature starts off positive and rises toward positive infinity as maximum entropy is approached, with half of all spins up. Past that point, the temperature becomes negative infinite, coming down in magnitude toward zero, but always negative, as the energy increases toward its maximum. When the system has negative temperature, it is hotter than when it has a positive temperature. If you take two copies of the system, one with positive and one with negative temperature, and put them in thermal contact, heat will flow from the negative-temperature system into the positive-temperature system.
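To see the direction of heat flow in that last thought experiment, here is one more sketch (same idealized spin model, with arbitrarily chosen occupations): one copy is prepared above zero energy (negative temperature, most spins up) and the other below (positive temperature, most spins down), and moving a quantum of energy from the negative-temperature copy to the positive-temperature copy increases the total entropy, which is why heat flows that way.

    # Two copies of the N-spin system in thermal contact. Moving one quantum
    # of energy from the negative-temperature copy (most spins up) to the
    # positive-temperature copy (most spins down) increases the total entropy.
    from math import comb, log

    N = 100
    up_neg_T, up_pos_T = 80, 20    # spins up in each copy (arbitrary choices)

    def S(n_up):                   # entropy = ln(multiplicity)
        return log(comb(N, n_up))

    before = S(up_neg_T) + S(up_pos_T)
    after  = S(up_neg_T - 1) + S(up_pos_T + 1)   # one quantum moves neg-T -> pos-T
    print("total entropy increases:", after > before)   # True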
Nuclear and electron spin systems can be promoted to negative temperatures by suitable radio frequency techniques. Various experiments in the calorimetry of negative temperatures, as well as applications of negative temperature systems as RF amplifiers, etc., can be found in the articles listed below, and the references therein.