Interest in the psychology of misinformation has exploded in recent years. Despite ample research, to date there is no psychometrically validated instrument to measure people’s ability to detect misinformation. To address this gap, we introduce the Verification done framework, an overarching interpretation schema that simultaneously considers overall veracity discernment together with the distinct, measurable abilities (real news detection, fake news detection) and biases (distrust, a negative judgement bias; naïveté, a positive judgement bias) of which it is composed, thus offering a holistic and nuanced assessment. We then conduct three studies with six independent samples (Ntotal = 7,291) to develop, validate, and apply the Misinformation Susceptibility Test (MIST), the first psychometrically validated measurement instrument of veracity discernment ability. In Study 1 (N = 409), we use a neural network language model to generate items for our test, and use factor analysis and item response theory to create the MIST-20 (20 items; <2 minutes) and the MIST-8 (8 items; <1 minute). In Study 2 (N = 6,461), we confirm model fit in four representative samples (US, UK) drawn from three different sampling platforms (Respondi, CloudResearch, and Prolific). We also explore the MIST’s nomological net, finding good convergent and discriminant validity, and generate age-, region-, and country-specific norm tables. In Study 3 (N = 421), we demonstrate how the MIST can be used in practice to test the effectiveness of interventions within the Verification done framework. We provide instructions for researchers and practitioners on how to implement the MIST as a screening tool, a covariate, or a framework for evaluating intervention effects.