ABSTRACT

For some 50 years, research and development on quantum absorbers as the basis for frequency standards has led to their adoption as the means of defining the SI units of length and time. They are particularly attractive because absorptions in a given element can be regarded as offering fixed frequencies or energies that are insensitive to changes in material properties or environmental parameters. Whilst this is true to first order, and has allowed the establishment of frequency standards whose stabilities are orders of magnitude better than those of earlier artifact-based standards (e.g. the platinum-iridium metre bar for length and quartz crystal clocks for time), these quantum-based standards are themselves subject to environmental effects at some level of stability, and much work has gone into quantifying these effects. In particular, there has been extensive research into the characterization and improvement of the microwave standard based on the 9.2 GHz ¹³³Cs absorption since its adoption as the primary standard of time in 1967. Early caesium standards used thermal Cs beams and achieved accuracies of ∼10⁻¹², whereas recent advances in laser cooling techniques have led to cold-Cs atomic fountains, in which instabilities below 1 part in 10¹⁵ at 1 day and accuracies of 1 part in 10¹⁵ are already being achieved.
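As a concrete illustration of what these figures mean, a fractional instability can be converted into an absolute frequency excursion; here σ_y denotes the fractional frequency instability and ν₀ the ¹³³Cs hyperfine frequency (standard notation, assumed rather than defined in the abstract):

\[
\sigma_y = \frac{\Delta\nu}{\nu_0}, \qquad \nu_0 = 9\,192\,631\,770\ \text{Hz},
\]
so a fractional instability of \( \sigma_y = 1\times10^{-15} \) corresponds to an absolute frequency excursion of only
\[
\Delta\nu = \sigma_y\,\nu_0 \approx 9.2\ \mu\text{Hz}.
\]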