2018
DOI: 10.1111/jedm.12161

Cross‐Country Heterogeneity in Students’ Reporting Behavior: The Use of the Anchoring Vignette Method

Abstract: Self‐reports are an indispensable source of information in education research, but they are often affected by heterogeneity in reporting behavior. Failing to correct for this heterogeneity can lead to invalid comparisons across groups. The researchers use the parametric anchoring vignette method to correct for cross‐country incomparability of students’ reports on teacher's classroom management. Their analysis is based on data from the Programme for International Student Assessment 2012. The results show sig…
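
The abstract refers to the parametric anchoring vignette method. As a rough illustration of how vignettes anchor a self-report, the sketch below implements the simpler nonparametric rescaling (King et al., 2004), not the parametric model used in the paper; the function name, example ratings, and handling of ties are assumptions made here for illustration only.

```python
# Minimal sketch of the *nonparametric* anchoring vignette rescaling
# (King et al., 2004). The paper itself uses the parametric variant;
# this simpler version only illustrates the core idea of re-expressing
# a self-report relative to the same respondent's vignette ratings.
# Ordering violations among vignette ratings are ignored in this sketch.

def rescale_self_report(self_rating, vignette_ratings):
    """Map a self-rating onto a scale anchored by vignette ratings.

    self_rating: the respondent's rating of themselves (e.g., 1-5 Likert).
    vignette_ratings: the same respondent's ratings of hypothetical
        vignette persons, ordered from lowest to highest level described.

    Returns an integer on a 1..(2 * len(vignette_ratings) + 1) scale:
    odd values mean "below/between/above" vignettes, even values mean
    "tied with" a vignette.
    """
    score = 1
    for v in vignette_ratings:
        if self_rating < v:
            return score          # below this vignette
        score += 1
        if self_rating == v:
            return score          # tied with this vignette
        score += 1
    return score                  # above all vignettes


# Example: two students give the same raw self-rating (4) but anchor the
# scale differently, so their rescaled positions differ.
print(rescale_self_report(4, [2, 3, 5]))  # -> 5 (between 2nd and 3rd vignette)
print(rescale_self_report(4, [1, 2, 3]))  # -> 7 (above all vignettes)
```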

Cited by 16 publications (20 citation statements)
References 26 publications

“…In contrast, ERS correction and ipsatization attenuated the negative correlations of math achievement and both constructs, suggesting that they are better at capturing and controlling for response amplification or moderation among individuals. Moreover, the marked changes that resulted from correcting for anchoring vignettes in the current study are in line with previous studies (e.g., He, van de Vijver, et al., 2017; Kyllonen & Bertling, 2014; Vonkova, Zamarro, et al., 2018), which also speaks to the robustness of different operationalizations and modelling approaches in such corrections. Anchoring vignettes appear to be a promising approach to enhance data comparability, given that the psychological meaning of target constructs was not changed in this study, and that anchoring-vignette-rescaled scores tend to show higher measurement invariance in multigroup confirmatory factor analysis of PISA data (e.g., Marksteiner et al., 2019).…”
Section: Discussion (supporting)
confidence: 92%
“…For instance, when overclaiming in math familiarity was corrected for in the PISA data, the weak and nonsignificant correlation between math familiarity and math achievement became strong and significant in 64 countries (Vonkova, Papajoanu, et al., 2018). Additionally, after performing an anchoring vignette correction on the Classroom Management scale in the PISA data, it was revealed that there may be different implicit standards of self-assessment across countries, and this resulted in a substantial change in correlations with students’ test scores and public expenditure per pupil (Vonkova, Zamarro, et al., 2018). The famous motivation-achievement paradox (i.e., a positive correlation between Likert-scale data assessing motivational factors and achievement at the individual level, but a negative correlation when scores are aggregated at the country level) was partially explained or alleviated when anchoring vignette, overclaiming, and ERS corrections were applied to the Likert-scale data (e.g., He & van de Vijver, 2015b; Kyllonen & Bertling, 2014).…”
Section: Correction Procedures For Enhancing Data Comparability (mentioning)
confidence: 99%
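
The statement above contrasts individual-level and country-level correlations. As a rough illustration of that aggregation step only (not the cited authors' code; the input table and the column names 'country', 'motivation', and 'achievement' are assumptions), a minimal pandas sketch:

```python
# Hedged sketch of the aggregation behind the motivation-achievement
# paradox: a correlation that is positive across students can differ in
# sign once scores are averaged per country.
import pandas as pd

def individual_vs_country_correlation(df: pd.DataFrame) -> tuple[float, float]:
    """Return (individual-level r, country-level r) between a Likert-based
    motivation score and achievement.

    df is expected to hold one row per student with the hypothetical
    columns 'country', 'motivation', and 'achievement'.
    """
    # Individual level: pool all students.
    r_individual = df["motivation"].corr(df["achievement"])

    # Country level: correlate country means.
    country_means = df.groupby("country")[["motivation", "achievement"]].mean()
    r_country = country_means["motivation"].corr(country_means["achievement"])

    return r_individual, r_country
```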
“…Our results when comparing the adjusted and the unadjusted familiarity scores with math achievement (a highly significant correlation between adjusted math familiarity and math scores, as opposed to a low, nonsignificant correlation for the unadjusted score) are in line with the findings of other studies. These studies compared math achievement with variables adjusted using different methods indicative of culturally preferred scale usage, such as the anchoring vignette method and extreme/midpoint response style analysis (He & van de Vijver, 2016; Kyllonen & Bertling, 2013; Vonkova et al., 2018). Our results further support the previous findings suggesting that these methods indeed help to adjust for different scale usage among different cultures.…”
Section: Discussion (mentioning)
confidence: 99%
“…In the case of the PISA survey, they document many paradoxical findings when analyzing the data at the country level, including negative correlations between country-level mathematics achievement and mathematics self-concept, mathematics interest, or attitudes toward school. Vonkova, Zamarro, and Hitt (2018) document a negative relationship between students’ assessments of teacher’s classroom management and students’ test results at the country level. He and van de Vijver (2016) focus on the motivation-achievement paradox in international educational achievement tests, where a negative correlation between aggregated country-level achievement and motivation can often be found.…”
mentioning
confidence: 96%
“…It may, for example, be the case that two students with the same level of school behavior evaluate their behavior differently: one as excellent, the other only as good (Vonkova, Bendl, & Papajoanu, 2017). Evidence for differential scale use has been a long-term concern, not only in education research (Buckley, 2009; Chen, Lee, & Stevenson, 1995; Vonkova, Zamarro, & Hitt, 2018) but also in other social sciences research (Bago d'Uva et al., 2011; Kapteyn, Smith, & van Soest, 2007; King et al., 2004; Vonkova & Hullegie, 2011). Thus, even though questionnaires offer a relatively cheap and easy way to obtain large-scale data about school discipline, their results must be interpreted with caution (for a summary of the strengths and weaknesses of questionnaires, see Table 8).…”
Section: Peer Reports (mentioning)
confidence: 99%