Private groundwater wells can be unmonitored sources of contaminated water that harms human health. Developing models that predict exposure could allow residents to take action to reduce risk. Machine learning models have successfully predicted nitrate contamination using geospatial information such as proximity to nitrate sources, but previous models have not considered meteorological factors that change temporally. In this study, we tested random forest (regression and classification) and linear regression models to predict nitrate contamination using rainfall, temperature, and readily available soil parameters. We trained and tested models for (1) all of North Carolina, (2) each geographic region in North Carolina, (3) a three-county region with a high density of animal agriculture, and (4) a three-county region with a low density of animal agriculture. All regression models had poor predictive performance (R² < 0.09). The random forest classification model for the coastal plain showed fair agreement (Cohen's κ = 0.23) in predicting whether contamination occurred; all other classification models had slight or poor predictive performance. Our results show that temporal changes in rainfall and temperature, alone or in combination with soil data, are not sufficient to predict nitrate contamination in most areas of North Carolina. The low rate of contamination (<25%) measured during the study may have contributed to the poor performance of the models.
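
The modeling and evaluation workflow summarized above can be illustrated with a minimal sketch using scikit-learn. This is not the study's code: the feature names, the contamination threshold, and the synthetic data are placeholders assumed for illustration, and the actual predictors, sample sizes, and tuning would differ.

```python
# Minimal sketch of the workflow described in the abstract (not the authors' code).
# Feature names, the contamination threshold, and the synthetic data are
# placeholders assumed for illustration only.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import cohen_kappa_score, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500

# Hypothetical predictors: temporally varying meteorology plus static soil attributes.
X = pd.DataFrame({
    "rainfall_mm_30d": rng.gamma(2.0, 40.0, n),        # rainfall before sampling
    "mean_temp_c_30d": rng.normal(18.0, 6.0, n),        # temperature before sampling
    "soil_drainage_class": rng.integers(1, 8, n),       # ordinal soil drainage code
    "soil_organic_matter_pct": rng.uniform(0.5, 6.0, n),
})
nitrate_mg_l = rng.lognormal(mean=0.0, sigma=1.0, size=n)  # synthetic nitrate values
contaminated = (nitrate_mg_l > 1.0).astype(int)            # placeholder threshold

X_train, X_test, y_reg_train, y_reg_test, y_clf_train, y_clf_test = train_test_split(
    X, nitrate_mg_l, contaminated, test_size=0.25, random_state=0
)

# Regression models: predict nitrate concentration and report R².
for name, model in [("random forest", RandomForestRegressor(random_state=0)),
                    ("linear regression", LinearRegression())]:
    model.fit(X_train, y_reg_train)
    print(f"{name} R2: {r2_score(y_reg_test, model.predict(X_test)):.3f}")

# Classification model: predict whether contamination occurred and report Cohen's kappa.
clf = RandomForestClassifier(random_state=0)
clf.fit(X_train, y_clf_train)
kappa = cohen_kappa_score(y_clf_test, clf.predict(X_test))
print(f"random forest Cohen's kappa: {kappa:.3f}")
```

With purely synthetic, uninformative features as above, both R² and κ would be expected to sit near zero, which mirrors the kind of weak performance the study reports for most regions.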