In a first-semester general chemistry course, metacognitive training was implemented as part of an online homework system. Students completed weekly quizzes and multiple practice tests to regularly assess their mastery of chemistry principles. Before each assessment, students predicted their score, and after completing the assessment they received feedback on the accuracy of that prediction. They also received detailed information about their ability on each assessment topic and used this information to create a future study plan. In creating this plan, students rated their general ability by chemistry topic and selected the areas on which to focus their studying. A control section completed the same assessments and received the same topic-level ability feedback, but its students did not predict scores or create study plans. Results indicate that initial assessment performance was identical between the two course sections. However, metacognitive training led to improved performance on each subsequent midterm exam and on the American Chemical Society (ACS) general chemistry final exam. After controlling for instructor differences, metacognitive training improved students' average ACS final exam performance by approximately 4% relative to the control section. Additionally, metacognitive training particularly benefited the bottom quartile of the course, improving their average ACS final exam performance by approximately 10% relative to the control section.
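The abstract does not describe how the homework system computed the prediction-accuracy feedback; the short Python sketch below is a hypothetical illustration of one plausible calculation (a simple signed error between predicted and actual scores), not the authors' implementation.

    # Hypothetical sketch of prediction-accuracy feedback; the abstract does not
    # specify the actual calculation used by the online homework system.
    def prediction_feedback(predicted: float, actual: float) -> str:
        """Compare a student's predicted score with the actual score (both 0-100)."""
        error = predicted - actual  # positive -> overprediction, negative -> underprediction
        if abs(error) <= 5:
            return f"Well calibrated: predicted {predicted:.0f}, scored {actual:.0f}."
        direction = "overestimated" if error > 0 else "underestimated"
        return f"You {direction} your score by {abs(error):.0f} points."

    print(prediction_feedback(85, 72))  # -> "You overestimated your score by 13 points."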
Five years of longitudinal data from general chemistry student assessments at the University of Georgia have been analyzed using item response theory (IRT). Our analysis indicates that minor changes in exam question wording can make significant differences in student performance. The analysis encompasses data from over 6100 students, yielding very small statistical uncertainties. IRT provided new insight into student performance on our assessments, insight that is also important to the chemical education community. In this paper, IRT, in conjunction with computerized testing, shows how nuances in question wording affect student performance on assessments.
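For readers unfamiliar with IRT, one common formulation is the two-parameter logistic (2PL) model; the abstract does not state which IRT model was fit, so the equation below is offered only as an illustrative sketch. It gives the probability that a student of ability \(\theta\) answers item \(i\) correctly, where \(a_i\) is the item's discrimination and \(b_i\) its difficulty:

\[
P_i(\theta) = \frac{1}{1 + e^{-a_i(\theta - b_i)}}
\]

Under a model of this form, a wording change that shifts an item's estimated difficulty \(b_i\) alters the probability of a correct response at every ability level, which is how wording effects would surface in such an analysis.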