ABSTRACT: The motivation for developing alternative detection techniques for nuclear magnetic resonance (NMR) and magnetic resonance imaging (MRI) is to overcome some of the limitations of high-field NMR/MRI instruments, including poor portability, cryogenic requirements, and high cost. Achieving this goal calls for operation at low magnetic field. Because the sensitivity of inductive detection in conventional NMR and MRI scales linearly with magnetic field strength, inductive detection is not optimal at low field. In this contribution, we describe the use of atomic magnetometers as an alternative detection method. Atomic magnetometers possess ultrahigh sensitivity that is independent of the magnetic field strength, making them well suited to low-field detection in NMR and MRI. We first introduce the principle of atomic magnetometry and then discuss recent progress in the field. To compare the sensitivities of atomic magnetometers of different sizes, we define a signal-to-noise ratio for a fixed detection volume, which normalizes the sensitivity with respect to cell size. We then focus on two coupling schemes for NMR and MRI detection with atomic magnetometers. Finally, we discuss the challenges involved in implementing this alternative detection technique for NMR and MRI.