Background: Increasing attention is being paid to improving undergraduate science, technology, engineering, and mathematics (STEM) education through wider adoption of research-based instructional strategies (RBIS), but high-quality measures of faculty instructional practice are not available to monitor progress.

Purpose/Hypothesis: The measure of how well an implemented intervention follows the original design is called fidelity of implementation. This framework was used to address the research questions: What is the fidelity of implementation of selected RBIS in engineering science courses? That is, how closely does engineering science classroom practice reflect the intentions of the original developers? Do the critical components that characterize an RBIS discriminate between engineering science faculty members who claimed use of the RBIS and those who did not?

Design/Method: A survey of 387 U.S. faculty teaching engineering science courses (e.g., statics, circuits, thermodynamics) included questions about class time spent on 16 critical components and use of 11 corresponding RBIS. Fidelity was quantified as the percentage of RBIS users who also spent class time on the corresponding critical components. Discrimination between users and nonusers was tested using chi-square tests.

Results: Overall fidelity of the 11 RBIS ranged from 11% to 80% of users spending time on all required components. Fidelity was highest for RBIS with one required component: case-based teaching, just-in-time teaching, and inquiry learning. Thirteen of the 16 critical components discriminated between users and nonusers for all RBIS to which they were mapped.

Conclusions: Results were consistent with the initial mapping of critical components to RBIS. Fidelity of implementation is a potentially useful framework for future work in undergraduate STEM education.
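The abstract describes two quantitative steps: computing fidelity as the percentage of self-reported RBIS users who spend class time on all required critical components, and testing whether each component discriminates between users and nonusers with a chi-square test. The following is a minimal Python sketch of those calculations, using made-up survey records and hypothetical field names (uses_rbis, components); it is not the authors' analysis code.

```python
from scipy.stats import chi2_contingency

# Made-up survey records: whether each respondent reported using an RBIS and
# whether they reported class time on each of its required critical components.
respondents = [
    {"uses_rbis": True,  "components": {"c1": True,  "c2": True}},
    {"uses_rbis": True,  "components": {"c1": True,  "c2": False}},
    {"uses_rbis": False, "components": {"c1": False, "c2": False}},
    {"uses_rbis": False, "components": {"c1": True,  "c2": False}},
]

# Fidelity: percentage of self-reported users who spend time on *all*
# required critical components of the RBIS.
users = [r for r in respondents if r["uses_rbis"]]
faithful = [r for r in users if all(r["components"].values())]
fidelity_pct = 100.0 * len(faithful) / len(users)
print(f"Fidelity: {fidelity_pct:.0f}% of users report all required components")

# Discrimination for one component ("c1"): a 2x2 contingency table of
# RBIS use (yes/no) versus time spent on the component (yes/no),
# tested with a chi-square test of independence.
table = [[0, 0], [0, 0]]
for r in respondents:
    row = 0 if r["uses_rbis"] else 1
    col = 0 if r["components"]["c1"] else 1
    table[row][col] += 1
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```

In the study, the contingency-table step would be repeated for each critical component and each RBIS to which it is mapped; the sketch shows only a single component for illustration.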