The linear no-threshold (LNT) model has been the regulatory “law of the land” for decades. Despite its long-standing use, there is significant ongoing scientific disagreement about the applicability of LNT to low-dose radiation risk. A review of the low-dose risk literature of the past 10 y does not provide a clear answer; rather, the body of literature appears split among LNT, non-linear risk functions (e.g., supra- or sub-linear), and hormetic models. Furthermore, recent studies have begun to explore whether radiation plays a role in the development of several non-cancer effects, such as heart disease, Parkinson’s disease, and diabetes, the mechanisms of which are still being investigated. Based on this review, there is insufficient evidence to replace LNT as the regulatory model, despite the fact that it contributes to public radiophobia, unpreparedness in radiation emergency response, and extreme cleanup costs both following radiological or nuclear incidents and during routine decommissioning of nuclear power plants. Rather, additional research is needed to better understand the implications of low doses of radiation. The authors present an approach to contribute meaningfully to the science of low-dose research that incorporates machine learning and Edisonian approaches to data analysis.