Why Cesium?

The Zeitgeist for today highlights a little New York Times Q & A piece on atomic clocks, answering the question "Why is cesium used in atomic clocks?"

The striking thing about this, to me, is that they don't really answer the question. I mean, they talk about how atomic clocks work in very vague terms (I explained atomic clocks last August, in more detail), but the only thing they have on the "why cesium?" question is this:

It so happens, the United States Naval Observatory explains, that cesium 133 atoms have their 55 electrons distributed in an ideal manner for this purpose. All the electrons but the outermost one are confined to orbits in stable shells. The outermost electron is not disturbed much by the others, so how it reacts to microwaves can be accurately determined.

That's true as far as it goes, but it's still not really an answer. It doesn't distinguish between cesium and any of the other alkali metals, for example, which all have one valence electron in the outermost shell, relatively unperturbed by the other electrons. So, why cesium rather than sodium or rubidium?

I'm tempted to say "historical accident," because that's probably as good a reason as any. Cesium is relatively convenient to work with in atomic beam sources (other than its tendency to explode violently when it comes in contact with water), so it was as good a choice as any when it came time to decide on a reference atom.

There is a reason to prefer cesium to the other alkalis, though: it has the largest hyperfine splitting of any of them. The microwave radiation from a cesium clock oscillates 9,192,631,770 times per second, a frequency of 9.192 GHz, while the analogous transition in rubidium is only 6.834 GHz. All other things being equal, you get a better clock by using a higher reference frequency.

The figure that determines the quality of the clock is the fractional uncertainty, which is the size of whatever errors you make in the measurement divided by the frequency that you're measuring. The errors are determined by factors in the lab, and are more or less independent of the choice of atom, so you get an improvement of about 30% by using cesium rather than rubidium. That's a big gain in the precision measurement world.
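To put rough numbers on that, here's a quick back-of-the-envelope sketch. The hyperfine frequencies are the standard values for cesium-133 and rubidium-87; the 1 Hz "lab error" is just an assumed placeholder, and the final ratio doesn't depend on it:

```python
# Back-of-the-envelope comparison of fractional uncertainty for a cesium
# clock versus a rubidium clock.  The 1 Hz "lab error" is an arbitrary
# placeholder: only the ratio of the two results matters, and that ratio
# is just the ratio of the two hyperfine frequencies.
F_CS = 9_192_631_770.0   # Hz, Cs-133 ground-state hyperfine splitting (exact, by definition)
F_RB = 6_834_682_611.0   # Hz, Rb-87 ground-state hyperfine splitting (approximate)

lab_error = 1.0          # Hz, assumed atom-independent measurement error

frac_cs = lab_error / F_CS
frac_rb = lab_error / F_RB

print(f"Cs fractional uncertainty: {frac_cs:.2e}")   # ~1.1e-10
print(f"Rb fractional uncertainty: {frac_rb:.2e}")   # ~1.5e-10
print(f"Gain from using Cs instead of Rb: {100 * (frac_rb / frac_cs - 1):.0f}%")
```

The gain is just the ratio of the two frequencies, a bit over 30%.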

Of course, it turns out that cesium has some other properties that make it sub-optimal. In particular, it has a huge collisional cross section for the states of interest, which means that they need to run cesium clocks at low density in order to avoid a frequency shift due to atoms bumping into one another. But this limits the size of the signal they can get, which also limits the uncertainty of cesium clocks. A colleague at Penn State has made a serious argument that rubidium is a better choice for that reason-- the collisional shift in Rb ("God's atom") is tiny.
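As a toy illustration of that trade-off (every number below is a made-up placeholder, not a real cesium cross section or noise level), the two competing effects scale in opposite directions with atom number:

```python
# Toy model of the density trade-off: the collisional shift grows linearly
# with atom number, while the shot-noise-limited statistical uncertainty
# only falls as 1/sqrt(N).  All coefficients are arbitrary placeholders.
from math import sqrt

SHIFT_PER_ATOM = 1e-22   # fractional collisional shift per atom (made-up)
NOISE_COEFF = 1e-13      # single-shot fractional noise for one atom (made-up)

for n_atoms in (1e4, 1e6, 1e8):
    collisional_shift = SHIFT_PER_ATOM * n_atoms
    statistical_uncertainty = NOISE_COEFF / sqrt(n_atoms)
    print(f"N = {n_atoms:.0e}: shift ~ {collisional_shift:.1e}, "
          f"statistical uncertainty ~ {statistical_uncertainty:.1e}")
```

Past some density the collisional shift swamps whatever you gain in signal, which is why a tiny collisional shift is such an attractive property in a clock atom.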

Of course, other people have other favorite reference atoms. When I was in grad school, we worked with metastable xenon, and the justification for doing this at NIST was that you might be able to make a clock on a transition in Xe with a frequency of 137,000 GHz, which would be a big jump up from Cs. And there's a group at NIST in Boulder with an even better standard-- a transition in a trapped and laser-cooled mercury ion with a frequency of 1,064,000 GHz (PDF of their paper).

Why haven't we changed to one of those standards? Inertia, basically. There are a huge number of cesium clocks in service, and they're a well-proven and well-tested technology. People know all the ins and outs of working with them, and there are lots of them operating reliably. Newer proposed standards don't have that installed base, and there are still a few kinks to be worked out in terms of getting everyone comfortable with the idea of a change.

Down the road, though, some sort of optical-frequency ionic or atomic transition is likely to replace cesium as the reference for the best clocks. As remarkable as the precision of the best current clocks is, precision laser spectroscopy offers a chance to do even better.


Back in the Dark Ages (i.e., when the Internet was just a gleam in DARPA's eye) I had a summer job at the Canadian Time Standards Lab, where my country keeps its official atomic clocks (which make their due contribution to UTC). I can tell you that these are the people who put the "anal" in "analysis".

I read one paper by a physicist there about relativistic corrections to be applied to the clock. The definition of the standard second specifies that the cesium atom is at rest. However (at least in the Canadian design), the cesium is moving in a beam at thermal velocities, thus requiring a correction for the relativistic time shift (note that thermal velocity is way below what is normally considered "relativistic"). It seems that when they first designed the clock, they made the simplification of just using the average velocity of the beam, resulting in a correction of about 1 Hz (out of 9.2 GHz). The paper did a re-analysis using a more realistic distribution (Gaussian?) of beam velocities, and concluded that the correction should be only 0.4 Hz.
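A minimal sketch of the kind of correction being described, for anyone who wants to see the scaling: the second-order Doppler (time dilation) shift is Δf/f ≈ v²/2c², and the answer depends on how you average it over the beam's velocity distribution. The oven temperature and the idealized effusive-beam model below are assumptions for illustration, so the numbers won't reproduce the figures recalled above:

```python
# Second-order Doppler (time dilation) correction for a thermal cesium beam,
# comparing "plug in the mean speed" against averaging the shift over an
# idealized effusive-beam speed distribution f(v) ~ v^3 exp(-v^2/alpha^2).
# The oven temperature and beam model are illustrative assumptions only.
import numpy as np

F_CS = 9_192_631_770.0        # Hz, cesium hyperfine frequency
C = 2.998e8                   # m/s, speed of light
K_B = 1.381e-23               # J/K, Boltzmann constant
M_CS = 133 * 1.661e-27        # kg, mass of a cesium-133 atom

T = 373.0                     # K, assumed oven temperature (~100 C)
alpha = np.sqrt(2 * K_B * T / M_CS)      # most probable speed in the oven

v = np.linspace(1.0, 8 * alpha, 20_000)  # speed grid, m/s
w = v**3 * np.exp(-(v / alpha) ** 2)     # effusive-beam weighting
w /= w.sum()

v_mean = (w * v).sum()                   # mean beam speed
v2_mean = (w * v**2).sum()               # mean squared beam speed

shift_simple = F_CS * v_mean**2 / (2 * C**2)  # correction using the average speed
shift_full = F_CS * v2_mean / (2 * C**2)      # correction averaging the shift itself

print(f"mean beam speed:  {v_mean:6.0f} m/s")
print(f"shift from <v>^2: {shift_simple * 1e3:.1f} mHz")
print(f"shift from <v^2>: {shift_full * 1e3:.1f} mHz")
```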

I'll put in a plug for the inertia aspect of the atomic clock standard. It takes a lot of political effort to change the standard against which everything else is calibrated. By sticking with an OK clock (such as cesium), we are going to do well for a number of years. That will give time for better technologies to mature to the point where we *must* move to them. For all intents and purposes, optical clocks, trapped-ion clocks (such as mercury, or, even better, the hybrid aluminum/beryllium entangled clock), and other new technologies are just not needed yet, except for some very high-precision tests of fundamental physics. In the meantime, cesium will work, and the cost of changing is too high.

I read this little piece in the NYT while I was proctoring my final, and I was left wondering what the whole answer was. Thanks for the post.

Cesium, known as Caesium outside the USA, is characterised by a spectrum containing two bright lines in the blue (accounting for its name: from the Latin "caesius" = sky blue). It was discovered spectroscopically in 1860 by Bunsen and Kirchhoff in mineral water from Dürkheim. It is silvery gold, soft, ductile, and the most electropositive and most alkaline of the elements. Caesium, gallium, and mercury are the only three metals that are liquid at or around room temperature.

For more on the use in atomic clocks see
http://cho.usno.navy.mil/cesium.html

For more than two dozen inspiring songs about the element cesium, see
http://www.cs.rochester.edu/users/faculty/nelson/cesium/cesium_songs.ht…

Render unto Caesium what is Caesius's.