Engineering's design-oriented and problem-solving emphases pose considerable challenges to the use of multiple-choice assessments, and thus the use of digitized forms of assessment within Engineering curricula warrants a substantive evaluation of their validity. To address this research need, novel computer-based assessment infrastructures and delivery protocols were deployed and evaluated in a crossover study. The objective of the study was to investigate the impact of proctored, digitized examinations delivered through a secure, or "lockdown," internet browser interface. Measures tracked in the study included test score validity, learning achievement, and student perceptions within a large undergraduate Mechanical and Aerospace Engineering course. Using a crossover study design, the class was divided into two cohorts. Half of each cohort's formative assessments were delivered in a computer-based testing center, and the other half were delivered on paper. Within each cohort, computerized delivery was interchanged with paper-based delivery during successive topic modules on a mutually exclusive basis. An identical paper-based final exam was delivered to all students. The collected data were analyzed and de-correlated to investigate the topic-specific learning outcomes of digitized assessment. Test score validity and learning outcomes were studied using independent-samples t-test statistical analysis. Results indicate that well-formed digitized exams can yield scores differing by as little as 0.6% from paper-based assessment. Moreover, when the grader labor hours freed up by computerized delivery were reallocated to tutoring and remediation of formative assessments, learning outcomes on the summative assessment were as much as 16.9% higher than with paper-based delivery.
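
As a minimal illustration of the independent-samples t-test comparison described above, the following Python sketch compares scores from the two delivery modes. The score arrays and variable names are hypothetical placeholders for illustration only, not the study's actual data.

```python
# Minimal sketch of an independent-samples t-test between delivery modes.
# The score arrays below are hypothetical placeholders, not the study's data.
import numpy as np
from scipy import stats

# Hypothetical module-exam scores (percent) for each delivery mode
computer_based = np.array([82.1, 75.4, 88.0, 91.3, 69.5, 84.2, 77.8, 90.1])
paper_based = np.array([80.9, 74.8, 87.1, 90.6, 70.2, 83.5, 78.4, 89.3])

# Welch's variant of the independent-samples t-test (no equal-variance assumption)
t_stat, p_value = stats.ttest_ind(computer_based, paper_based, equal_var=False)

print(f"Mean (computer-based): {computer_based.mean():.1f}%")
print(f"Mean (paper-based):    {paper_based.mean():.1f}%")
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```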