2019
DOI: 10.1186/s40594-019-0169-0

Multiple-true-false questions reveal more thoroughly the complexity of student thinking than multiple-choice questions: a Bayesian item response model comparison

Abstract: Background: Within undergraduate science courses, instructors often assess student thinking using closed-ended question formats, such as multiple-choice (MC) and multiple-true-false (MTF), where students provide answers with respect to predetermined response options. While MC and MTF questions both consist of a question stem followed by a series of options, MC questions require students to select just one answer, whereas MTF questions enable students to evaluate each option as either true or false. We employed…
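The distinction the abstract draws between the two formats can be made concrete with a small sketch. The item content and field names below are illustrative assumptions, not material from the paper: an MC response is a single selected option, whereas an MTF response records a separate true/false judgment for every statement under the same stem.

# Hypothetical illustration of the two closed-ended formats compared in the paper.
# The stem, options, and responses are invented for demonstration only.

mc_item = {
    "stem": "Which statement best describes process X?",
    "options": ["option A", "option B", "option C", "option D"],
    "response": "option B",          # MC: exactly one option is selected
}

mtf_item = {
    "stem": "Evaluate each statement about process X as true or false.",
    "statements": ["statement A", "statement B", "statement C", "statement D"],
    "responses": [False, True, False, True],   # MTF: each statement is judged separately
}

Because an MTF response contains a judgment for every statement, it can expose mixed understanding (a correct idea held alongside a misconception) that a single MC selection cannot, which is the sense in which the paper argues MTF reveals student thinking more thoroughly.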

Cited by 42 publications (28 citation statements) · References 49 publications

“…We designed our assessments to measure learning of the targeted objectives. The assessments included multiple-choice and multiple-true-false questions, which required students to evaluate multiple statements describing a single scenario and helped us to diagnose misconceptions [20-22]. We scored multiple-choice questions with an equal weight to individual multiple-true-false statements.…”
Section: Data Collection and Analysis (mentioning)
Confidence: 99%
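A minimal sketch of one way to implement the equal-weight scoring the citing authors describe, in which a whole multiple-choice question carries the same point value as a single multiple-true-false statement; the function names, point values, and example data are assumptions for illustration, not the authors' code.

# Hypothetical Python scoring sketch, assuming one point per MTF statement and
# one point per whole MC question (the equal weighting described above).

def score_mtf(responses, key):
    """One point for each statement judged correctly as True or False."""
    return sum(int(r == k) for r, k in zip(responses, key))

def score_mc(response, key):
    """One point if the single selected option matches the key."""
    return int(response == key)

# Example: one MTF question with four statements plus one MC question.
mtf_points = score_mtf([True, False, True, True], [True, False, False, True])  # 3 points
mc_points = score_mc("option B", "option B")                                   # 1 point
total = mtf_points + mc_points                                                  # 4 points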
“…, 2017; Couch et al., 2018; Brassil and Couch, 2019). Each question in our assessments was unique, and successive assessments were not designed to evaluate longitudinal learning of a concept (e.g., the “energy charge” concept was measured in the Glycolysis, TCA, and ETC assessments, but the questions were unique to each system).…”
Section: Methods (mentioning)
Confidence: 99%
“…We used multiple true-false (MTF) questions consisting of a question stem that is presented together with a series of statements that students evaluate as true or false (Supporting Files S4 and S8). We selected MTF questions because they can reveal student misconceptions that remain undetected in free-response and multiple-choice question formats (Hubbard et al., 2017; Couch et al., 2018; Brassil and Couch, 2019). Each question in our assessments was unique, and successive assessments were not designed to evaluate longitudinal learning of a concept (e.g., the “energy charge” concept was measured in the Glycolysis, TCA, and ETC assessments, but the questions were unique to each system).…”
Section: Assessment Design and Evaluation (mentioning)
Confidence: 99%
“…For our assessments, we used multiple-true-false (MTF) questions that consist of a question stem that is presented together with a series of statements that are evaluated by students as being true or false. We selected MTF questions because they can reveal student misconceptions that remain undetected in free-response and multiple-choice question formats (53-55). The number of statements for each question is listed in Tables 1 and 2. Before class, students were asked to complete a Cell Collective training module to familiarize themselves with the technology and modeling concepts and to complete the pre-assessment questions online.…”
Section: Methods (mentioning)
Confidence: 99%