An atomic clock is a clock whose timekeeping element uses the frequency of an electron transition in atoms, in the microwave, optical, or ultraviolet region of the electromagnetic spectrum, as its frequency standard. Atomic clocks are the most accurate time and frequency standards known. They are used as primary standards for international time distribution services, to control the frequency of television broadcast waves, and in global navigation satellite systems such as GPS.
The principle of operation of an atomic clock is based on atomic physics; it uses the microwave signal that electrons in atoms emit when they change energy levels. Early atomic clocks were based on masers at room temperature. Currently, the most accurate atomic clocks first cool the atoms to near absolute zero by slowing them with lasers, then probe them in atomic fountains inside a microwave-filled cavity. An example is the NIST-F1 atomic clock, one of the national primary time and frequency standards of the United States.
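The gain from cooling can be made concrete with rough numbers. The sketch below is a back-of-the-envelope Python calculation, not a description of any particular clock: the cesium mass, the one-microkelvin cooled temperature, and the one-metre fountain height are assumed illustrative values. It estimates how fast the atoms move thermally and how long a tossed atom remains available for probing in a fountain.

```python
import math

# Rough, illustrative numbers only: why laser cooling enables long probe times.
K_B = 1.380649e-23             # Boltzmann constant, J/K
M_CS = 132.905 * 1.66054e-27   # mass of a cesium-133 atom, kg
G = 9.81                       # gravitational acceleration, m/s^2

def rms_speed(temperature_k: float) -> float:
    """Root-mean-square thermal speed of a cesium atom at the given temperature."""
    return math.sqrt(3 * K_B * temperature_k / M_CS)

def fountain_flight_time(height_m: float) -> float:
    """Time for an atom tossed to height_m to rise and fall back under gravity."""
    return 2 * math.sqrt(2 * height_m / G)

print(f"Room temperature (300 K): ~{rms_speed(300):.0f} m/s")              # ~237 m/s
print(f"Laser-cooled (1 microkelvin): ~{rms_speed(1e-6) * 100:.1f} cm/s")  # ~1.4 cm/s
print(f"Flight time in a 1 m fountain: ~{fountain_flight_time(1.0):.1f} s")  # ~0.9 s
```

At room temperature the atoms cross the apparatus in milliseconds, while laser-cooled atoms tossed in a fountain can be observed for nearly a second, which is what allows the much longer probe times discussed below.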
The accuracy of an atomic clock depends on two factors. The first is the temperature of the sample atoms: colder atoms move much more slowly, allowing longer probe times. The second is the frequency and intrinsic linewidth of the electronic transition: higher frequencies and narrower lines increase the precision.
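Both factors can be summarized by the line quality factor Q, the ratio of the transition frequency to its effective linewidth: longer probe times narrow the line, and higher transition frequencies raise the numerator. The short Python sketch below uses the defined cesium transition frequency, but the ~1 Hz linewidth and the generic 5e14 Hz "optical" frequency are order-of-magnitude assumptions for illustration only.

```python
# Illustrative comparison of line quality factors Q = f0 / delta_f.
F_CS = 9_192_631_770   # cesium-133 hyperfine transition frequency, Hz (SI definition of the second)
F_OPTICAL = 5e14       # order-of-magnitude optical transition frequency, Hz (assumed)
LINEWIDTH = 1.0        # effective linewidth for roughly 1 s of probe time, Hz (assumed)

def quality_factor(frequency_hz: float, linewidth_hz: float) -> float:
    """Line quality factor: transition frequency divided by its effective linewidth."""
    return frequency_hz / linewidth_hz

print(f"Cesium microwave clock: Q ~ {quality_factor(F_CS, LINEWIDTH):.1e}")       # ~9.2e9
print(f"Optical clock:          Q ~ {quality_factor(F_OPTICAL, LINEWIDTH):.1e}")  # ~5.0e14
```

Under these assumed numbers, moving from a microwave to an optical transition at the same linewidth raises Q by roughly five orders of magnitude, which is why optical transitions are attractive for the next generation of standards.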