This is a pre-copyedited, author-produced PDF of an article accepted for publication in "Public Opinion Quarterly" following peer review. In survey research, a consensus has grown regarding the effectiveness of incentives in encouraging survey participation across different survey modes and target populations. Most of this research has been based on surveys from the US, whereas few studies have provided evidence that these results can be generalized to other contexts. This paper is the first to present comprehensive information concerning the effects of incentives on response rates and nonresponse bias across large-scale surveys in Germany. This context can be viewed as a critical test for incentive effects because Germany's population is among the most survey-critical in the world, with very low response rates.
Our results suggest positive incentive effects on response rates, with patterns of effects that are similar to those in previous research: the effect increased with the monetary value of the incentive; cash incentives affected response propensity more strongly than lottery tickets did; and prepaid incentives could be more cost-effective than conditional incentives. We found mixed results for the effects of incentives on nonresponse bias. Regarding large-scale panel surveys, we could not unequivocally confirm that incentives increased response rates in later panel waves.

Survey researchers have been increasingly concerned with decreasing response rates, a trend that has been reported in developed countries over the last several decades (Atrostic et al. 2001; de Leeuw and de Heer 2002; Brick and Williams 2013). Decreasing response rates can lead to biased estimates if the nonresponse is not at random (Rubin 1976). Even when nonresponse is not selective, increasing the sample size as a direct countermeasure incurs higher costs. To increase survey response, several methods have been developed, such as advance letters, special contacting procedure...
More and more surveys are conducted online. While web surveys are generally cheaper and tend to have lower measurement error than other survey modes, especially for sensitive questions, these potential advantages might be offset by larger nonresponse bias. This article compares data quality in a web survey to that in another common mode of survey administration, the telephone. The unique feature of this study is the availability of administrative records for all sampled individuals, combined with random assignment of survey mode. This design allows us to investigate and compare potential bias in survey statistics due to 1) nonresponse error, 2) measurement error, and 3) the combined bias of these two error sources, and hence provides an overall assessment of data quality for two common modes of survey administration, telephone and web. Our results show that, overall, mean estimates on the web are more biased than in the telephone mode. Nonresponse and measurement bias tend to reinforce each other in both modes, with nonresponse bias being somewhat more pronounced in the web mode. While measurement error bias tends to be smaller in the web survey implementation, interestingly, our results also show that the web does not consistently outperform the telephone mode for sensitive questions.