2014
DOI: 10.1103/physrevstper.10.010105

Seeking missing pieces in science concept assessments: Reevaluating the Brief Electricity and Magnetism Assessment through Rasch analysis

Abstract: Discipline-based science concept assessments are powerful tools to measure learners' disciplinary core ideas. Among many such assessments, the Brief Electricity and Magnetism Assessment (BEMA) has been broadly used to gauge student conceptions of key electricity and magnetism (E&M) topics in college-level introductory physics courses. Differing from typical concept inventories that focus only on one topic of a subject area, BEMA covers a broad range of topics in the electromagnetism domain. In spite of this fa…


Cited by 30 publications (34 citation statements); references 37 publications.
“…This is a basic fact of EM, but the act of discovering vs. simply being told "it is so" likely increases the impact this has on a student's perception. With this premise in mind, all Physics 261 students were given the BEMA. Designed as a tool for measuring introductory students' conceptual knowledge of EM, the BEMA is purported to be a useful component of PER (Ding, 2014; Pollock & Finkelstein, 2014; Ding et al., 2006). Results for this study (for various reasons discussed below) were not as conclusive as hoped for, but certainly substantial enough to warrant further research.…”
Section: Literature Review
confidence: 88%
“…The conceptual complexity of the subject matter can be particularly challenging for physics students, especially during the second semester, when electricity and magnetism (EM) is the main focus. To accurately assess their perceptions and learning gains, two key elements are necessary: (a) collection of student feedback through self-assessment, focus groups, or individual interviews; and (b) assessment of knowledge and skills through written or practical exams or exercises (Lindsey & Nagel, 2015; Zwicki, Hirokawa, Finkelstein, & Lewandowski, 2014; Thacker et al., 2014; Ding, 2014; Ding, Chabay, Sherwood, & Beichner, 2006; Seymour, Wiese, Hunter, & Daffinrud, 2000). Additionally, the importance of instructors having an adequate (and current) knowledge base, along with the ability to assess students' levels of comprehension and to explain concepts in a meaningful and productive way, has been illustrated (Eylon & Bagno, 2006).…”
Section: Literature Review
confidence: 99%
“…Two common perspectives on test development are Classical Test Theory (CTT) [32] and Item Response Theory (IRT) [33]. The majority of conceptual assessments in physics, at both the introductory and upper-division levels, have been validated using CTT, while only a small number have been developed or analyzed using IRT [34-37]. One significant drawback of CTT is that all test statistics are population dependent.…”
Section: Validation
confidence: 99%
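The population dependence noted in the quoted passage can be illustrated with a small simulation (a hypothetical sketch for illustration only, not drawn from any of the cited papers): under the Rasch model an item's difficulty parameter is fixed on the logit scale, yet its CTT "difficulty" statistic, the proportion of correct responses, shifts with the ability distribution of whichever population happens to take the test.

```python
import math
import random

def rasch_p(theta, b):
    """Rasch model: probability that a learner with ability theta
    answers an item of difficulty b correctly (both on the logit scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

random.seed(0)
B = 0.5  # a single item with fixed Rasch difficulty

def ctt_difficulty(mean_ability, n=20000):
    """CTT item 'difficulty' = observed proportion correct, which
    depends on the ability distribution of the sampled population."""
    correct = 0
    for _ in range(n):
        theta = random.gauss(mean_ability, 1.0)  # sample one examinee
        if random.random() < rasch_p(theta, B):  # simulate a response
            correct += 1
    return correct / n

low = ctt_difficulty(-1.0)   # weaker population: item looks "hard"
high = ctt_difficulty(+1.0)  # stronger population: same item looks "easy"
print(round(low, 2), round(high, 2))
```

The same item, with the same model parameter, yields different CTT statistics in the two samples; the Rasch difficulty parameter, by contrast, is a property of the item itself, which is one motivation for the Rasch reanalysis of the BEMA in the cited paper.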
“…These include, in particular, different forms of a written test, extensively described and compared in the literature, such as free- and multiple-response tests (Wilcox & Pollock, 2014); concept tests such as the Test of Understanding Graphs in Kinematics (Maries & Singh, 2013), the Force Concept Inventory (Hestenes et al., 1992), the Brief Electricity and Magnetism Assessment (Ding et al., 2006), and others (Hitt et al., 2014); constructed-response tests (Slepkov & Shiell, 2014); essay tests (Kruglak, 1955); laboratory skills tests (Doran et al., 1993); and others. Also, many modifications and extensions of these tests have already been proposed in the literature, improving upon their original form (Ding, 2014; Docktor et al., 2015; Wooten et al., 2014; Zwolak & Manogue, 2015). On the other hand, some authors propose to blend formative and summative assessment techniques.…”
Section: Introduction
confidence: 99%