The Zeitgeist for today highlights a little New York Times Q & A piece on atomic clocks, answering the question “Why is cesium used in atomic clocks?”
The striking thing about this, to me, is that they don't really answer the question. I mean, they talk about how atomic clocks work in very vague terms (I explained atomic clocks in more detail last August), but the only thing they have on the "why cesium?" question is this:
It so happens, the United States Naval Observatory explains, that cesium 133 atoms have their 55 electrons distributed in an ideal manner for this purpose. All the electrons but the outermost one are confined to orbits in stable shells. The outermost electron is not disturbed much by the others, so how it reacts to microwaves can be accurately determined.
That's true as far as it goes, but it's still not really an answer. It doesn't distinguish between cesium and any of the other alkali metals, for example, which all have one valence electron in the outermost shell, relatively unperturbed by the other electrons. So, why cesium rather than sodium or rubidium?
I’m tempted to say “historical accident,” because that’s probably as good a reason as any. Cesium is relatively convenient to work with in atomic beam sources (other than its tendency to explode violently when it comes in contact with water), so it was as good a choice as any when it came time to decide on a reference atom.
There is a reason to prefer cesium to the other alkalis, though: it has the largest hyperfine splitting of any of them. The microwave radiation from a cesium clock oscillates 9,192,631,770 times per second, a frequency of 9.192 GHz, while the analogous transition in rubidium is at only 6.834 GHz. All other things being equal, you get a better clock by using a higher reference frequency.
The figure that determines the quality of the clock is the fractional uncertainty, which is the size of whatever errors you make in the measurement divided by the frequency that you’re measuring. The errors are determined by factors in the lab, and are more or less independent of the choice of atom, so you get an improvement of about 30% by using cesium rather than rubidium. That’s a big gain in the precision measurement world.
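To put rough numbers on that argument, here's a minimal back-of-the-envelope sketch in Python. The 1 Hz error is a made-up placeholder just to illustrate the scaling (real clock errors are vastly smaller), and the rubidium value is the approximate rubidium-87 hyperfine frequency:

```python
# Rough numbers for the cesium-vs-rubidium comparison. The point is only that
# the same absolute error is a smaller *fraction* of a larger frequency.

F_CS = 9_192_631_770.0   # Hz, cesium-133 hyperfine transition (defines the SI second)
F_RB = 6_834_682_611.0   # Hz, rubidium-87 hyperfine transition (approximate)

delta_f = 1.0            # Hz, assumed lab-limited measurement error, same for both atoms

frac_cs = delta_f / F_CS
frac_rb = delta_f / F_RB

print(f"Cs fractional uncertainty: {frac_cs:.2e}")
print(f"Rb fractional uncertainty: {frac_rb:.2e}")
print(f"Rb/Cs ratio: {frac_rb / frac_cs:.2f}")  # ~1.35: Rb's fractional error is noticeably larger
```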
Of course, it turns out that cesium has some other properties that make it sub-optimal. In particular, it has a huge collisional cross section for the states of interest, which means cesium clocks need to be run at low density in order to avoid a frequency shift due to atoms bumping into one another. But this limits the size of the signal they can get, which also limits the uncertainty of cesium clocks. A colleague at Penn State has made a serious argument that rubidium is a better choice for that reason: the collisional shift in Rb ("God's atom") is tiny.
Of course, other people have other favorite reference atoms. When I was in grad school, we worked with metastable xenon, and the justification for doing this at NIST was that you might be able to make a clock on a transition in Xe with a frequency of 137,000 GHz, which would be a big jump up from Cs. And there's a group at NIST in Boulder with an even better standard: a transition in a trapped and laser-cooled mercury ion with a frequency of 1,064,000 GHz (PDF of their paper).
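For a sense of how big those jumps are, here's the same kind of comparison extended to the optical candidates, using the rounded frequencies quoted in this post (not precision figures), and again assuming the same absolute measurement error for every transition:

```python
# How much headroom the proposed optical references offer over cesium,
# comparing the rounded frequencies quoted above.

references_hz = {
    "Cs hyperfine (microwave)": 9.192631770e9,
    "Rb hyperfine (microwave)": 6.834682611e9,
    "metastable Xe (optical)": 1.37e14,    # ~137,000 GHz
    "Hg+ ion (optical)": 1.064e15,         # ~1,064,000 GHz
}

f_cs = references_hz["Cs hyperfine (microwave)"]
for name, freq in references_hz.items():
    print(f"{name:28s} {freq:10.3e} Hz   ({freq / f_cs:9.1f}x cesium)")
```

With the same absolute error, the mercury-ion transition would give a fractional uncertainty roughly five orders of magnitude smaller than cesium's, which is why optical standards are so attractive.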
Why haven’t we changed to one of those standards? Inertia, basically. There are a huge number of cesium clocks in service, and they’re a well-proven and well-tested technology. People know all the ins and outs of working with them, and there are lots of them operating reliably. Newer proposed standards don’t have that installed base, and there are still a few kinks to be worked out in terms of getting everyone comfortable with the idea of a change.
Down the road, though, some sort of optical-frequency ionic or atomic transition is likely to replace cesium as the reference for the best clocks. Because as remarkable as the precision of the best current clocks is, precision laser spectroscopy offers a chance to do even better.