The use of cold atoms has led to a substantial increase in the accuracy achievable in many atomic physics measurements. This has most notably been demonstrated in atomic clocks relying on the interference of internal states of weakly interacting atoms in free fall. However, it has also introduced an additional layer of experimental complexity which, combined with the physical size of state-of-the-art setups, imposes significant limitations on wider practical applications. Progress will be reported on the development of a compact atomic clock based on cold atoms.
Unprecedented precision has also been demonstrated in atom interferometers relying on the detection of differential phase shifts between atomic wavefunctions of, e.g., different motional states. Sensitivity to external interactions results in a shift of the atomic phase relative to a lab-frame reference, typically the spatial phase of an optical standing wave. This limits practical measurements, as it requires long-term phase stability of the reference, and has motivated the investigation of an atom interferometer inherently insensitive to the phase noise of the readout system. This relies on atomic homodyne detection, which allows the entire interferometric signal to be read out in a single shot.
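The distinction between a lab-frame-referenced readout and a differential one can be illustrated with a toy numerical model (a sketch only, not the actual experimental scheme; the phase values and noise model are assumptions): when both atomic wavefunctions acquire the same readout reference phase within a shot, that phase cancels in their beat signal, whereas a fringe referenced to the lab frame is scrambled by shot-to-shot reference noise.

```python
import numpy as np

rng = np.random.default_rng(1)
phi_a, phi_b = 0.9, 0.2                   # assumed phases of the two arms (rad)
delta = rng.uniform(-np.pi, np.pi, 500)   # per-shot readout reference phase noise

# Lab-frame readout: the reference phase enters each shot directly,
# so large noise washes out the fringe across shots.
lab_frame = np.cos(phi_a - delta)

# Differential ("homodyne-like") readout: both arms acquire the same
# reference phase within a shot, so it cancels and only the differential
# atomic phase phi_a - phi_b survives, shot after shot.
homodyne = np.cos((phi_a + delta) - (phi_b + delta))

print(np.std(lab_frame))   # order-unity spread from reference noise
print(np.std(homodyne))    # zero: every shot yields cos(phi_a - phi_b)
```

In this idealized picture the differential signal is entirely independent of the reference phase; in practice, the attraction of a single-shot readout is that no fringe needs to be reconstructed from many shots taken while the reference drifts.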