Discussions of diagnostic tools that gauge students' conceptual understanding permeate the literature. Many instructors report their class's normalized gain to characterize the change in scores from pre-test to post-test. We describe a new procedure for characterizing these changes: the normalized change, c, the ratio of the gain to the maximum possible gain, or of the loss to the maximum possible loss. We also advocate reporting the average of a class's normalized changes and utilizing a particular statistical and graphical approach for comparing average c values.
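As a concrete illustration of the definition above, a per-student normalized change could be computed as in the following sketch. The function name and the edge-case handling (dropping students who score 0 or the maximum on both tests, where no change was possible) are illustrative assumptions, not code from the paper; scores are assumed to lie on a 0-100 scale.

```python
def normalized_change(pre, post, max_score=100):
    """Illustrative sketch of the normalized change c for one student.

    c is the gain divided by the maximum possible gain when post > pre,
    or the loss divided by the maximum possible loss when post < pre.
    """
    if pre == post:
        if pre in (0, max_score):
            return None  # assumed convention: drop, since no change was possible
        return 0.0
    if post > pre:
        # gain / maximum possible gain
        return (post - pre) / (max_score - pre)
    # loss / maximum possible loss
    return (post - pre) / pre
```

A class average would then be the mean of the per-student c values, omitting any students returned as None.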
In 1993, Rensselaer introduced the first Studio Physics course. Two years later, the Force Concept Inventory (FCI) was used to measure the conceptual learning gain ⟨g⟩ in the course. This was found to be a disappointing 0.22, indicating that Studio Physics was no more effective at teaching basic Newtonian concepts than a traditional course. Our study verified that result, ⟨g_FCI,98⟩ = 0.18 ± 0.12 (s.d.), thereby providing a baseline measurement of conceptual learning gains in Studio Physics I for engineers. These low gains are especially disturbing because the studio classroom appears to be interactive and instructors strive to incorporate modern pedagogies. The goal of our investigation was to determine whether incorporating research-based activities into Studio Physics would have a significant effect on conceptual learning gains. To measure gains, we utilized the Force Concept Inventory and the Force and Motion Conceptual Evaluation (FMCE). In the process of pursuing this goal, we verified the effectiveness of Interactive Lecture Demonstrations [⟨g_FCI⟩ = 0.35 ± 0.06 (s.d.) and ⟨g_FMCE⟩ = 0.45 ± 0.03 (s.d.)] and Cooperative Group Problem Solving (⟨g_FCI⟩ = 0.36 and ⟨g_FMCE⟩ = 0.36), and examined the feasibility of using these techniques in the studio classroom. Further, we assessed conceptual learning in the standard Studio Physics course [⟨g_FCI,98⟩ = 0.18 ± 0.12 (s.d.) and ⟨g_FMCE,98⟩ = 0.21 ± 0.05 (s.d.)]. In this paper, we clarify the issues noted above. We also discuss difficulties first-time users face in implementing these techniques, and implications for the future direction of the Studio Physics courses at Rensselaer.
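For reference, the class-average normalized gain ⟨g⟩ quoted throughout is Hake's ratio of the actual gain to the maximum possible gain. A minimal sketch, assuming pre- and post-test scores expressed as class-average percentages (the function name is illustrative):

```python
def hake_gain(pre_mean, post_mean):
    """Class-average normalized gain <g> = (post - pre) / (100 - pre),
    with scores given as percentages (Hake's definition)."""
    return (post_mean - pre_mean) / (100.0 - pre_mean)

# Example: a class moving from 45% to 56% yields <g> = 0.2,
# comparable to the low Studio Physics gains reported above.
print(hake_gain(45.0, 56.0))  # 0.2
```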
In this paper we compare and contrast students' pretest/post-test performance on the Halloun-Hestenes Force Concept Inventory (FCI) and the Thornton-Sokoloff Force and Motion Conceptual Evaluation (FMCE). Both tests are multiple-choice assessment instruments whose results are used to characterize how well a first-term introductory physics course promotes conceptual understanding. However, the two exams have slightly different content domains, as well as different representational formats; hence, one exam or the other might better fit the interests of a given instructor or researcher. To begin the comparison, we outline how to determine a single-number score for the FMCE and present ranges of normalized gains on this exam. We then compare scores on the FCI and the FMCE for approximately 2000 students enrolled in the Studio Physics course at Rensselaer Polytechnic Institute over a period of eight years (1998-2006) that encompassed significant evolution of the course and many different instructors. We found that the mean score on the FCI is significantly higher than the mean score on the FMCE; however, there is a very strong relationship between scores on the two exams. The slope of a best-fit line drawn through FCI versus FMCE data is approximately 0.54, and the correlation coefficient is approximately r = 0.78, for preinstructional and postinstructional testings combined. In spite of this strong relationship, the assessments measure different normalized gains under identical circumstances. Additionally, students who scored well on one exam did not necessarily score well on the other. We use this discrepancy to uncover some subtle, but important, differences between the exams. We also present ranges of normalized gains for the FMCE in a variety of instructional settings.
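To make the quoted statistics concrete, the slope of a best-fit line and the Pearson correlation coefficient of the kind reported above could be obtained as follows. The arrays here are made-up placeholder data, not the study's scores; real input would be matched per-student FCI and FMCE percentages from the combined pre- and post-instruction testings.

```python
import numpy as np

# Hypothetical matched per-student scores (percent), for illustration only.
fmce = np.array([20.0, 35.0, 50.0, 65.0, 80.0])
fci = np.array([40.0, 48.0, 55.0, 70.0, 78.0])

slope, intercept = np.polyfit(fmce, fci, 1)  # best-fit line, FCI vs FMCE
r = np.corrcoef(fmce, fci)[0, 1]             # Pearson correlation coefficient
print(f"slope = {slope:.2f}, r = {r:.2f}")
```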
To gauge the impact of instruction on students' general expectations about physics and their attitudes about problem solving, we administered two different, but related, survey instruments to students in the first semester of introductory, calculus-based physics at McDaniel College. The surveys we used were the Maryland Physics Expectation Survey (MPEX) and the Attitudes about Problem Solving Survey (APSS). We found that the McDaniel College students' overall responses were more "expert-like" post-instruction: on the MPEX, the students' overall agree/disagree score started at 59/18 and ended at 63/17, and on the APSS, the students' agreement score went from 63 to 79. (All scores are out of 100%.) All of the students to whom we administered the MPEX, and a significant subgroup to whom we administered the APSS, realized these improvements without experiencing any explicit instructional intervention in this course aimed at improving attitudes and expectations. These results contrast with many of the previously reported findings in this area.