A question asking for respondents’ sex is one of the standard sociodemographic characteristics collected in a survey. Until recently, it typically consisted of a simple question (e.g., “Are you…?”) with two answer categories (“male” and “female”). In 2019, Germany introduced the additional sex designation divers for intersex people. In survey methodology, this has led to an ongoing discussion about how to include a third category in questionnaires. We investigate respondents’ understanding of the third category and whether introducing it affects data quality. Moreover, we investigate the understanding of the German term Geschlecht, which covers both sex and gender. To answer our research questions, we implemented different question wordings asking for sex/gender in a non-probability-based online panel in Germany and combined them with open-ended questions. Findings and implications for surveys are discussed.
Surveys measuring the same concept with the same measure on the same population at the same point in time should yield highly similar results. If they do not, this is a strong sign of a lack of reliability, resulting in data that are not comparable across surveys. For the education variable, previous research has identified inconsistencies in the distributions of harmonised education variables, using the International Standard Classification of Education (ISCED), across surveys within the same countries and years. These inconsistencies are commonly explained by differences in measurement, especially in the response categories of the education question, and in the harmonisation applied when classifying country-specific education categories into ISCED. However, other methodological characteristics of surveys, which we regard as ‘containers’ for several characteristics, may also contribute to this finding. We compare the education distributions of nine cross-national surveys with the European Union Labour Force Survey (EU-LFS), which serves as the benchmark. This study analyses 15 survey characteristics to better explain the inconsistencies. The results confirm a predominant effect of the measurement instrument and harmonisation. Different sampling designs also explain inconsistencies, but to a lesser degree. Finally, we discuss the results and limitations of the study and provide ideas for improving data comparability.