Telephone surveys have been a ubiquitous method of collecting survey data, but the environment for telephone surveys is changing. Many surveys are transitioning from telephone to self-administration or to combinations of modes for both recruitment and survey administration. Survey organizations are conducting these transitions from telephone to mixed modes with only limited guidance from existing empirical literature and best practices. This article summarizes findings by an AAPOR Task Force on how these transitions have occurred across surveys and research organizations. We find that transitions from a telephone to a self-administered or mixed-mode survey are motivated by a desire to control costs, to maintain or improve data quality, or both. Mail is the most common mode for recruiting respondents when transitioning, while recent mixed-mode studies administer the survey by web alone or by mail and web together. Although early studies found that telephone response rates met or exceeded response rates to the self-administered or mixed modes, after about 2013, response rates to the self-administered or mixed modes tended to exceed those for the telephone mode, largely because of a decline in telephone response rates. Transitioning offers opportunities for improved frame coverage and geographic targeting, delivery of incentives, visual design of the instrument, and cost savings, but it also poses challenges related to selecting a respondent within a household, questionnaire length, differences across modes in the use of computerization for skip patterns and other questionnaire design features, and the absence of an interviewer to motivate respondents and provide clarification. Other prominent challenges include surveying youth, conducting surveys in multiple languages, collecting nonsurvey data such as biomeasures or consent to link to administrative data, and estimation with multiple modes.
The literature on survey data fabrication is fairly thin, given the serious threat it poses to data quality. Recent contributions have focused on detecting interviewer fabrication, with an emphasis on statistical detection methods as a way to efficiently target reinterviews. We believe this focus is too narrow. This paper looks at the problem of fabrication in a different way, exploring new data showing that the problem goes beyond interviewer curbstoning. A surprising amount of apparent fabrication is easily detected through comparatively rudimentary methods, such as analysis of duplicate data. We then examine the motivations behind survey data fabrication and explore the utility of fraud investigation frameworks in detecting it. We finish with a brief discussion of the importance of additional research in this area and suggest questions worth exploring further. This paper is a synthesis of presentations given by the authors at an event sponsored by the Washington Statistical Society.
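The duplicate-data screening mentioned above can be illustrated with a short sketch. This is a minimal example, not the authors' method: the file name, column names, and grouping by an interviewer identifier are assumptions made for illustration.

```python
# Minimal sketch of duplicate-record screening for possible fabrication.
# Assumes a flat respondent-level file; "survey_data.csv" and the column
# names below are hypothetical, not from the paper.
import pandas as pd

df = pd.read_csv("survey_data.csv")

# Keep only substantive answer columns (drop IDs, timestamps, interviewer codes).
answer_cols = [c for c in df.columns
               if c not in ("case_id", "interviewer_id", "timestamp")]

# Flag cases whose entire answer pattern is shared with at least one other case.
dup_mask = df.duplicated(subset=answer_cols, keep=False)
duplicates = df[dup_mask]

# Count fully duplicated answer patterns by interviewer; clusters concentrated
# under a single interviewer would warrant targeted reinterviews.
print(duplicates.groupby("interviewer_id").size().sort_values(ascending=False))
```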
Increasing nonresponse rates in federal surveys and potentially biased survey estimates are a growing concern, especially with regard to establishment surveys. Unlike household surveys, not all establishments contribute equally to survey estimates. In agricultural surveys, if an extremely large farm fails to complete a survey, the United States Department of Agriculture (USDA) could underestimate average acres operated, among other quantities. In order to identify likely nonrespondents prior to data collection, the USDA’s National Agricultural Statistics Service (NASS) began modeling nonresponse using Census of Agriculture data and prior Agricultural Resource Management Survey (ARMS) response history. Using an ensemble of classification trees, NASS has estimated nonresponse propensities for ARMS that can be used to predict nonresponse and are correlated with key ARMS estimates.
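The propensity-modeling step can be sketched as below. This is an illustrative example only: a random forest stands in for the unspecified tree ensemble, and the input file, feature names, and outcome coding are assumptions, not NASS's actual implementation.

```python
# Sketch: estimating nonresponse propensities with an ensemble of classification
# trees. Features are assumed to be numeric codes for brevity.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

frame = pd.read_csv("arms_frame_with_coa_features.csv")  # hypothetical file
X = frame[["total_acres", "sales_class", "prior_arms_refusals", "operator_age"]]
y = frame["nonrespondent"]  # 1 = did not complete prior ARMS, 0 = completed

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

model = RandomForestClassifier(n_estimators=500, min_samples_leaf=25,
                               random_state=42)
model.fit(X_train, y_train)

# Predicted nonresponse propensities for holdout records.
propensities = model.predict_proba(X_test)[:, 1]
print("Holdout AUC:", roc_auc_score(y_test, propensities))
```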
Nonresponse rates have been growing over time, leading to concerns about survey data quality. Adaptive designs seek to allocate scarce resources by targeting specific subsets of sampled units for additional effort or a different recruitment protocol. To be effective in reducing nonresponse, the identified subsets of the sample need two key features: 1) their probabilities of response can be changed by altering design features, and 2) once they have responded, their data affect estimates after adjustment. The National Agricultural Statistics Service (NASS) is investigating the use of adaptive design techniques in the Crops Acreage, Production, and Stocks Survey (Crops APS). The Crops APS is a survey of establishments that vary in size and, hence, in their potential impact on estimates. To identify subgroups for targeted designs, we conducted a simulation study that used Census of Agriculture (COA) data as proxies for similar survey items. Different patterns of nonresponse were simulated to identify subgroups whose changed response propensities may reduce estimated nonresponse bias.
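A toy simulation in this spirit is sketched below: a skewed size variable serves as a proxy survey item, nonresponse is simulated as a function of size, and the bias of the respondent mean is compared before and after raising the response propensity of a targeted subgroup. The data-generating values, cut points, and propensity shifts are illustrative assumptions, not the study's actual design.

```python
# Toy nonresponse-bias simulation with a size-dependent response pattern.
import numpy as np

rng = np.random.default_rng(7)
acres = rng.lognormal(mean=5.0, sigma=1.2, size=10_000)  # proxy item

# Simulated pattern: larger operations are less likely to respond.
base_propensity = np.clip(0.8 - 0.1 * np.log10(acres), 0.1, 0.9)

def respondent_mean(propensity):
    """Draw response indicators and return the unadjusted respondent mean."""
    responded = rng.random(acres.size) < propensity
    return acres[responded].mean()

true_mean = acres.mean()
bias_before = respondent_mean(base_propensity) - true_mean

# Targeted design: raise propensities for the largest 10 percent of operations.
targeted = base_propensity.copy()
large = acres > np.quantile(acres, 0.9)
targeted[large] = np.minimum(targeted[large] + 0.2, 0.95)
bias_after = respondent_mean(targeted) - true_mean

print(f"Estimated bias before targeting: {bias_before:.1f} acres")
print(f"Estimated bias after targeting:  {bias_after:.1f} acres")
```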
There are many methods that can be used to test questionnaires, each with its own strengths and weaknesses. The best approaches to questionnaire testing combine different methods to both broaden and strengthen the results. The US Census of Agriculture (COA) is conducted every five years and collects detailed information on agricultural production, inventories, practices, and operator demographics from agricultural establishments. Before each COA, evaluation and testing are conducted to assess new questionnaire items and improve data quality for the subsequent COA. This article describes how a multi-method approach, which we call Bento Box Testing, was applied to establishment questionnaire testing leading up to the 2017 COA. Testing included solicitation of expert opinion, historical data review, cognitive testing, a large-scale field test, and qualitative follow-up interviews. We discuss the benefits of these testing methods, considerations for establishment survey testing, and how their combined results provide a stronger evaluation.