2013
DOI: 10.1002/j.2333-8504.2013.tb02340.x
Constructed-Response DIF Evaluations for Mixed-Format Tests

Abstract: In this study, differential item functioning (DIF) methods utilizing 14 different matching variables were applied to assess DIF in the constructed-response (CR) items from 6 forms of 3 mixed-format tests. Results suggested that the methods might produce distinct patterns of DIF results for different tests and testing programs, in that the DIF methods' results might be similar for tests with multiple-choice (MC) and CR scores that are similar in their measurement characteristics but would exhibit larger variation…
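The abstract does not spell out the computations, but one standard polytomous-DIF technique in this family (the standardization approach of Dorans and Schmitt, 1993, cited in the statement below) is the standardized mean difference (SMD): examinees are stratified on a matching variable, such as an MC total score, and the focal-minus-reference difference in conditional item means is averaged using focal-group weights. The Python sketch below is illustrative only; the function name, inputs, and the choice of matching variable are assumptions, not the study's actual code or procedure.

import numpy as np

def smd_dif(item_scores, matching_scores, group, focal_label):
    # Standardized mean difference (SMD) DIF index for a polytomous item.
    # Strata are the distinct values of the matching variable; within each
    # stratum, compare focal and reference conditional item means, then
    # average the differences using focal-group proportions as weights.
    item_scores = np.asarray(item_scores, dtype=float)
    matching_scores = np.asarray(matching_scores)
    group = np.asarray(group)

    focal = group == focal_label
    n_focal = focal.sum()
    if n_focal == 0:
        raise ValueError("no focal-group examinees")

    smd = 0.0
    for s in np.unique(matching_scores):
        in_stratum = matching_scores == s
        f = in_stratum & focal
        r = in_stratum & ~focal
        # Strata where either group is empty contribute nothing: the
        # conditional mean for the missing group is undefined.
        if not f.any() or not r.any():
            continue
        weight = f.sum() / n_focal  # focal-group proportion in this stratum
        smd += weight * (item_scores[f].mean() - item_scores[r].mean())
    return smd

For example, smd_dif(cr_item, mc_total, gender, "F") returns a signed index: a negative value indicates the focal group scores lower on the item than matched reference-group examinees. Dividing the result by the item's score range or standard deviation yields an effect size that is comparable across items.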

Cited by 1 publication (1 citation statement) · References 12 publications
“…2.16 have been considered for polytomously scored items by ETS researchers, including Dorans and Schmitt (1993), Moses et al. (2013), and Zwick et al. (1997). At the time of this writing, there is great interest in developing more innovative items that utilize computer delivery and are more interactive in how they engage examinees.…”
Section: Analyses of Alternate Item Types and Scores
Confidence: 99%