Assessments that aim to evaluate student understanding of chemical reactions and reaction mechanisms should ask students to construct written or oral explanations of mechanistic representations, because students can reproduce pictorial mechanism representations with minimal understanding of what those representations mean. Grading such assessments, however, is time-consuming, which limits their use in large-enrollment courses and delays feedback to students. Lexical analysis and logistic regression techniques can be used to evaluate student written responses in STEM courses. In this study, we use lexical analysis and logistic regression to score a constructed-response item that asks students to explain what is happening in a unimolecular nucleophilic substitution (i.e., SN1) reaction and why. We identify three levels of explanation sophistication (i.e., descriptive only, surface-level why, and deeper why) and qualitatively describe student reasoning about four main aspects of the reaction: the leaving group, the carbocation, the nucleophile and electrophile, and the acid–base proton transfer. Responses scored as Level 1 (N = 113, 11%) only describe what is happening in the reaction and do not address the why for any of the four aspects. Level 2 responses (N = 549, 53%) describe why the reaction occurs at a surface level (i.e., using solely explicit features, or mentioning implicit features without deeper explanation) for at least one aspect of the reaction. Level 3 responses (N = 379, 36%) explain the why at a deeper level, inferring implicit features from explicit features explained using electronic effects for at least one reaction aspect. We evaluate the predictive accuracy of two binomial logistic regression models for scoring responses at these levels, achieving 86.9% accuracy on the testing data set when compared to human coding.
The lexical analysis methodology and emergent scoring framework could be used as a foundation from which to develop scoring models for a broader array of reaction mechanisms.
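The two-model scheme described above can be sketched in code. The following is a minimal illustration, not the authors' actual pipeline: it assumes a bag-of-words lexical representation and two binomial logistic regression classifiers, one separating Level 1 (descriptive only) from Levels 2–3 (any "why"), the other separating Level 2 (surface-level why) from Level 3 (deeper why). All example responses, labels, and feature choices here are hypothetical.

```python
# Sketch of two-stage scoring of free-text responses with lexical features
# and binomial logistic regression (illustrative, not the published model).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical toy responses labeled with explanation levels 1-3.
responses = [
    "the bromine leaves and the oh attaches to the carbon",              # 1
    "the leaving group leaves then the nucleophile attacks",             # 1
    "the nucleophile attacks because it is negatively charged",          # 2
    "the carbocation forms because the leaving group is stable",         # 2
    "the tertiary carbocation is stabilized by electron donation",       # 3
    "hyperconjugation stabilizes the carbocation so ionization occurs",  # 3
]
levels = [1, 1, 2, 2, 3, 3]

# Lexical analysis step: bag-of-words counts over the response corpus.
vec = CountVectorizer()
X = vec.fit_transform(responses)

# Model A: Level 1 vs. Levels 2-3 (does the response address any "why"?).
model_a = LogisticRegression().fit(X, [lvl > 1 for lvl in levels])

# Model B: Level 2 vs. Level 3, trained only on Level 2-3 responses.
idx = [i for i, lvl in enumerate(levels) if lvl > 1]
model_b = LogisticRegression().fit(X[idx], [levels[i] == 3 for i in idx])

def score(text):
    """Assign an explanation-sophistication level (1-3) to a response."""
    x = vec.transform([text])
    if not model_a.predict(x)[0]:
        return 1
    return 3 if model_b.predict(x)[0] else 2
```

In practice a model like this would be trained on hundreds of human-coded responses and evaluated on a held-out testing set against the human codes, as the study reports.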
The Lewis acid–base model is key to identifying and explaining the formation and breaking of bonds in many of the reaction mechanisms taught in the sophomore-level, year-long organic chemistry course. Understanding the model is thus essential to success in organic chemistry coursework. Concept inventories exist to identify misunderstandings and misconceptions of acid–base theories; open-ended problems, though, have been shown to provide a more nuanced and holistic picture of how students use acid–base models to explain reactions. The time needed to score such problems, however, limits their use, especially in large-enrollment courses. Given the efficacy of open-ended problems, there is an opportunity to develop methods for analyzing open-ended assessment responses efficiently and effectively. In this study, we establish the importance of assessing "use of the Lewis acid–base model to explain a chemical reaction" by determining the association of model use with summative examination performance. In addition, we generate and evaluate a binomial logistic regression model based on lexical analysis techniques for predicting Lewis acid–base model use in explanations of an acid–base proton-transfer reaction. Our work results in a predictive model that can be used to score the open-ended problem used in our study.
Reaction mechanisms are central to organic chemistry and organic chemistry education. Understanding of reaction mechanisms can be assessed holistically, wherein the entire mechanism is considered; however, we assert that such an evaluation does not account for how learners variably understand mechanistic components (e.g., nucleophile, electrophile) or steps (e.g., nucleophilic attack, proton transfer). For example, a learner may be proficient with proton-transfer steps without being sufficiently proficient with a step in which a nucleophile and an electrophile interact. Herein, we report the development of a generalized rubric to assess the level of explanation sophistication for nucleophiles in written explanations of organic chemistry reaction mechanisms from postsecondary courses. The rubric operationalizes and applies chemistry education research findings by articulating four hierarchical levels of explanation sophistication: absent, descriptive, foundational, and complex. We provide evidence for the utility of the rubric in an assortment of contexts: (a) across stages of an organic chemistry course (i.e., first or second semester), (b) across nucleophile and reaction types, and (c) across prompt variations. We also present a case study detailing how the rubric could be applied in a course to collect assessment data that inform learning and instruction. Our results demonstrate the practical implementation of this rubric to assess understanding of nucleophiles and suggest avenues for establishing rubrics for additional mechanistic components and for understanding and evaluating curricula.