Because occupational data play a crucial part in many social and economic analyses, information on the reliability of these data, and in particular on the role of coding agencies, is important. Based on our review of previous research, we develop four hypotheses, which we test using occupation-coded data from the German General Social Survey and field-test data from the German Programme for the International Assessment of Adult Competencies (PIAAC). Because the same data were coded by several agencies, their coding results can be compared directly. As the surveys used different instruments and interviewer training differed, the effects of these factors could also be evaluated.
Our main findings are as follows: the percentage of uncodeable responses is low (1.8–4.9%), but what is classified as “uncodeable” varies between coding agencies. Inter-agency coding reliability is relatively low (κ ≈ 0.5 at the four-digit level), and codings sometimes differ systematically between agencies. The reliability of derived status scores is satisfactory (0.82–0.90). The previously reported negative relationship between answer length and coding reliability was replicated, and effects of interviewer training were demonstrated. Finally, we discuss the importance of establishing common coding rules and present recommendations for overcoming some of the problems in occupation coding.
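To make the agreement measure concrete, the sketch below shows one way to compute Cohen's kappa for two coding agencies assigning one occupation code per open-ended answer. The code and the example four-digit codes are purely illustrative assumptions and are not taken from the study's data or its actual analysis pipeline.

```python
# Minimal sketch: Cohen's kappa for two coders, each assigning one code per response.
# The example codes below are hypothetical 4-digit occupation codes, not study data.
from collections import Counter


def cohen_kappa(codes_a, codes_b):
    """Cohen's kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    # Observed agreement: share of responses receiving identical codes.
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Chance agreement: product of each coder's marginal code frequencies, summed over codes.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a)
    return (observed - expected) / (1 - expected)


# Hypothetical codings of five answers by two agencies.
agency_1 = ["5221", "5221", "7231", "2634", "3412"]
agency_2 = ["5221", "5223", "7231", "2634", "2634"]
print(round(cohen_kappa(agency_1, agency_2), 2))  # 0.5 for this toy example
```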