As I said yesterday, I'm going to blow through another entire subfield of physics in a single equation, as our march toward Newton's Birthday continues. Today, it's statistical mechanics, a very rich field of study that we're boiling down to a single equation:
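S = k_B ln N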
This is Boltzmann's formula for the entropy of a macroscopic system of particles, which says that the entropy S is proportional to the logarithm of the number N of microscopic states consistent with the system's macroscopic state. The constant k_B is there to get the units right.
Why does this get to stand in for the whole field of statistical mechanics?
When thinking about this, I was tempted to go with the definition of the Boltzmann distribution, which my angry quantum prof in grad school declared to be the only useful thing to come out of stat mech, on the grounds that there was no reason to spend a whole semester on the subject. This formula seems to get more to the heart of the matter, though.
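(For reference, the Boltzmann distribution is the rule that the probability of finding a system in a state with energy E at temperature T is proportional to exp(-E/k_B T).)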
The field of statistical physics is basically dedicated to explaining how the world we see around us arises from the interactions of uncountable numbers of microscopic particles (atoms, molecules, electrons, whatever). The key realization that makes it possible to extract predictions without needing to know the state of all 10^27 atoms making up some object is that such huge systems can be described statistically-- that there are many possible equivalent states for all the particles making up a system, and which precise one you find yourself in is just a matter of probability.
This equation, then, really captures the central notion of the entire field. The macroscopic property known as entropy is just a measure of how many microscopic arrangements of particles and states could give you the same collection of properties for the macroscopic system. If there are lots of ways to get basically the same situation, the entropy is high; if there are only a few ways of getting a particular set of properties, the system has low entropy.
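To make that counting concrete, here's a quick toy sketch-- my own illustration, not anything from the original post-- treating a system as a bunch of two-state "spins," labeling each macrostate by the number of up-spins, and computing the entropy straight from the microstate count.

```python
# A toy illustration (mine, not from the post): count the microstates of a
# system of two-state "spins" and turn that count into a Boltzmann entropy.
from math import comb, log

k_B = 1.380649e-23  # Boltzmann's constant, in J/K

def boltzmann_entropy(n_spins, n_up):
    """Entropy of the macrostate 'n_up spins up out of n_spins', via S = k_B ln(count)."""
    count = comb(n_spins, n_up)  # number of microscopic arrangements giving this macrostate
    return k_B * log(count)

n_spins = 100
for n_up in (0, 10, 50):
    print(f"{n_up:3d} up-spins: {comb(n_spins, n_up):.3e} microstates, "
          f"S = {boltzmann_entropy(n_spins, n_up):.3e} J/K")
# The 50/50 macrostate has about 1e29 microstates; the all-down macrostate
# has exactly one, and therefore zero entropy.
```

The absolute numbers come out absurdly tiny in SI units (that's k_B doing its unit-fixing job); the point is just how thoroughly the "typical" macrostate dominates the count.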
This provides a very simple foundation for thermodynamics-- the observed fact that entropy always increases (the infamous Second Law of Thermodynamics) is really just a consequence of probability. A system that starts in a low-entropy state has lots of ways to move to a state of higher entropy, but only a few ways to move to a state with the same or lower entropy. Thus, you are more likely to see a system move from low to high entropy than vice versa, and when you're talking about macroscopic objects involving 10^20-odd atoms, the probability of seeing entropy spontaneously decrease quickly moves into monkeys-writing-Shakespeare territory.
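If you want to watch that probability argument play out, here's another little sketch-- again just my own toy, the standard Ehrenfest two-box model, not something from the post: start with every particle in one box, let a randomly chosen particle hop to the other box at each step, and see where things end up.

```python
# Toy "second law as probability" demo (not from the post): the Ehrenfest
# two-box model. At each step one randomly chosen particle hops to the other
# box. The all-in-one-box start is a low-entropy macrostate; the 50/50 split
# has overwhelmingly more microstates, so that's where the system ends up.
import random

def ehrenfest(n_particles=1000, n_steps=5000, seed=1):
    random.seed(seed)
    in_box_a = n_particles              # low-entropy start: everything in box A
    history = [in_box_a]
    for _ in range(n_steps):
        # A uniformly random particle is in box A with probability in_box_a/n_particles
        if random.randrange(n_particles) < in_box_a:
            in_box_a -= 1               # it was in A, so it hops to B
        else:
            in_box_a += 1               # it was in B, so it hops to A
        history.append(in_box_a)
    return history

counts = ehrenfest()
print("start:", counts[0], "end:", counts[-1])   # ends near 500, not 1000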
This doesn't have as many practical consequences as some of the other equations we've dealt with, but it's a tremendously powerful idea in a philosophical sort of way. It provides a framework for talking about the origin and history of the entire universe, including its eventual fate, and you don't get any deeper than that.
So, take a moment to admire the simplicity of the idea at the core of statistical mechanics, and come back tomorrow to see the next equation of the season.
OK. Technical note: "log" here is presumed to be the natural log, often written "ln", not the common log_10, which is not mathematically "natural" but an artifact of our fingers.
It's always astounded me that Boltzmann managed to get this before quantum mechanics, which makes it much easier to understand what is meant by counting microstates.
"the observed fact that entropy always increases (the infamous Second Law of Thermodynamics)"
Well, being picky here, what you mean is that the entropy of an isolated system (one that cannot exchange energy or matter with its environment) always increases. Or perhaps you were referring to the fact that the change in entropy of the system plus surroundings is positive for spontaneous processes.
Crystalline snowflakes form spontaneously out of the vapor phase. The entropy of the system (the H2O) *decreases* in this phase change, but at sufficiently low temperature the process nonetheless happens, thanks to the large decrease in internal energy. The entropy of the environment increases, and the sum of the two entropy changes is a net positive.
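(In symbols: ΔS_total = ΔS_system + ΔS_surroundings > 0. At constant temperature and pressure the surroundings pick up ΔS_surroundings = -ΔH_system/T, and at low enough T that gain outweighs the entropy the H2O loses by crystallizing.)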
"This doesn't have as many practical consequences as some of the other equations we've dealt with"
Without statistical mechanics as a bridge between micro- and macro-scale phenomena, it would be impossible to understand heat, mass, and charge transfer in most engineering systems, especially microchips. The macroscopic forms of the conservation laws always need equations of state or constitutive relations to be useful. I don't really understand why you would say this. (Do you think that graduate quantum prof's bias rubbed off?)
Can entropy be explained in Legos? If I have a bucket of Legos, there are specific Legos in specific positions in the bucket. If I then assemble those Legos into a tower, there are specific Legos in specific positions in the tower. Why is the tower lower entropy?
Is it because in the Lego jumble in the bucket, I don't really care if you move one Lego a little bit, or swap two Legos? A jumble is a jumble is a jumble? It seems weird that the laws of physics would take note of how I feel about a bucket of Legos.
It's because none of the macroscopic properties of the pile would change if you swapped a few of them around. And what you measure about the pile are its macroscopic properties.
If you were keeping track of the pile to such an extent that you would notice two bricks being swapped, that'd be a different matter, but that's not the situation you're in with statistical mechanics. I also suspect that the price of apparently lowering the number of indistinguishable states of your Lego pile would be a huge increase in the entropy of the apparatus required to make that measurement, but I'm less sure about how that would play out.
Really nice description of entropy. Much tastier than most low quality chocolates in German advent calendars. =)
Neil @1:
Absolutely, because log without a subscript always implies the subscript is the "natural" one -- e. This basic fact of mathematics has become perverted by calculators, whose designers couldn't find room to fit the "10" on the key. Using log to denote the log base e (the natural logarithm) remains standard in programming languages, so it is good for students to learn this basic bit of mathematical communication.
Dear @CCPhysicist,
I think the log base conventions vary among fields: mathematics, physics, information theory, chemistry. As a matter of fact, one can formulate statistical physics with any base of the logarithm, as well as without the Boltzmann constant k, which is not a universal constant but a conversion factor (resulting from the definition of absolute temperature through the triple point of water).
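(Concretely: since log_b N = ln N / ln b, changing the base just rescales the constant out front-- S = k ln N = (k ln 2) log_2 N, and so on-- so the physics doesn't care which base you use.)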
Here's a very good book about statistical physics developed using the Shannon information entropy:
Arieh Ben-Naim, "A Farewell to Entropy: Statistical Thermodynamics Based on Information" (2008).
CCPhysicist, the convention that "log", without a subscript, implies log_10 has nothing to do with the limitations of calculators. Ages before calculators were invented, people who wanted to carry out precise calculations used log tables for multiplication and division, and these were always "common" or "Briggs" logarithms, not "natural" logarithms. See, for example, any edition of the CRC "Handbook of Chemistry and Physics" prior to ~1975. For approximate calculations you used a slide rule (which is basically a mechanized log table), but for high-precision work you used log tables.