The fact that we can build models from data, and refine them as more experimental data become available, is usually taken for granted in scientific inquiry. But how much information can we actually extract, and how precise can we expect a learned model to be, when only a finite amount of data is at our disposal? Nuclear physics demands a high degree of precision from models inferred from the limited number of nuclei that can possibly be produced in the laboratory.
In this manuscript I will introduce some concepts from computational science, such as statistical learning theory and Hamiltonian complexity, and use them to contextualise results concerning the amount of data necessary to extrapolate a mass model to a given precision.
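As an illustration of the kind of guarantee statistical learning theory offers, consider the standard PAC-learning bound for a finite hypothesis class (a textbook result, not one specific to this work; the symbols \(m\), \(\varepsilon\), \(\delta\) and \(\mathcal{H}\) are generic placeholders). With probability at least \(1-\delta\), any hypothesis from \(\mathcal{H}\) that is consistent with \(m\) independent samples has true error at most \(\varepsilon\), provided
\[
m \;\ge\; \frac{1}{\varepsilon}\left(\ln|\mathcal{H}| + \ln\frac{1}{\delta}\right).
\]
Statements of this form, relating the number of samples to the achievable precision, are the kind of result contextualised here for mass models.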