Groundwater, an important source
of drinking water globally, is
susceptible to contamination by naturally occurring metals and radionuclides.
Regional trends in groundwater quality are useful in predicting the
occurrence of contaminants but are difficult to translate to local
scales due to complex contaminant–solid phase associations.
Here, the aqueous phase of sequential extractions is analyzed by
multicollector inductively coupled plasma mass spectrometry (MC-ICP-MS)
to quantify ultratrace radium (Ra) levels in operationally defined
fractions of the aquifer solids. Results demonstrate that local-scale
geochemistry drives Ra partitioning to groundwater in the U.S. Midwestern
Cambrian–Ordovician aquifer system. Analysis of whole-rock
extractions shows that Ra remains close to the site of 238U decay
in most stratigraphic units, where the 238U/226Ra activity ratio is
near the secular equilibrium value; other stratigraphic units,
however, display isotope ratios indicative of Ra leaching.
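For context, the equilibrium benchmark follows from standard decay-chain
kinetics, assuming a closed system over times long relative to the
half-lives of 226Ra and the intervening daughters: each member's activity
grows into that of the 238U parent,

$$
\frac{A\!\left(^{238}\mathrm{U}\right)}{A\!\left(^{226}\mathrm{Ra}\right)}
= \frac{\lambda_{\mathrm{U}}\,N_{\mathrm{U}}}{\lambda_{\mathrm{Ra}}\,N_{\mathrm{Ra}}}
\longrightarrow 1 \quad \text{(secular equilibrium)},
$$

so ratios well above unity record net Ra loss relative to in-growth.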
Additionally, Ra prevalence varies across the examined stratigraphy,
in both whole-rock and sequential extractions; the average whole-rock
226Ra activity in the Maquoketa shale, 70 ± 10 mBq/cm³, is roughly an
order of magnitude higher than that in the St. Peter sandstone,
6 ± 1 mBq/cm³.
This suggests that Ra mobilization depends on both the reactive solid
phases present in the stratigraphy and the influence of local geochemical
conditions on solid phase–contaminant interactions. Variation in
geochemical conditions, such as redox state or competitive ion exchange,
affects Ra partitioning to groundwater differently across stratigraphic
units, depending on the initial solid-phase associations.