Internet surveys of American Evaluation Association (AEA) members are a common method for studying evaluation practice, but the response rates they obtain are frequently low. To investigate whether material incentives increase response rates to Internet surveys of AEA members, a between-subjects randomized experiment with three treatments and one control was conducted: a randomly selected sample of AEA members was randomly assigned to a no-incentive control condition, a lottery condition, a token incentive condition, or a philanthropic donation incentive condition. The overall response rate was 39.66%; the rates for the four conditions were control = 36.24%, lottery = 44.39%, token incentive = 43.28%, and philanthropic donation = 34.67%. The cost-effectiveness of each condition was also examined, showing that the lottery was the most cost-effective. Other factors potentially influencing response or nonresponse decisions are also discussed.
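The response-rate and cost-effectiveness comparison described above can be sketched as a simple calculation. The invitation counts and incentive costs below are hypothetical placeholders, not figures from the study, which reports only the percentages:

```python
# Sketch of the response-rate and cost-per-response arithmetic described above.
# All counts and dollar amounts are hypothetical illustrations.

def response_rate(completed, invited):
    """Percentage of invited members who completed the survey."""
    return 100 * completed / invited

def cost_per_response(incentive_cost, completed):
    """Incentive dollars spent per completed response."""
    return incentive_cost / completed

# Hypothetical example: 225 invitations per condition.
INVITED = 225
conditions = {
    "control":  {"completed": 82,  "cost": 0.0},    # no incentive
    "lottery":  {"completed": 100, "cost": 100.0},  # one fixed prize, shared over all respondents
    "token":    {"completed": 97,  "cost": 194.0},  # e.g., a $2 token per respondent
    "donation": {"completed": 78,  "cost": 78.0},   # e.g., a $1 donation per respondent
}

for name, c in conditions.items():
    rate = response_rate(c["completed"], INVITED)
    cpr = cost_per_response(c["cost"], c["completed"]) if c["cost"] else 0.0
    print(f"{name}: rate={rate:.2f}%, cost/response=${cpr:.2f}")
```

Note the design point this makes visible: a lottery's fixed prize cost is spread across every respondent, so its cost per response falls as response counts rise, whereas per-respondent incentives scale linearly with participation.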
Background
Billions of dollars are spent annually on grant-funded STEM (science, technology, engineering, and mathematics) education programs. These programs help students stay on track toward STEM careers when standard educational practices do not adequately prepare them for these careers. It is therefore important to know whether reliable and accurate student participation and completion data are being collected about these programs. This multiple case study investigates how student data are collected and reported for a national STEM education program in the United States, the National Science Foundation (NSF) Advanced Technological Education (ATE) program. Our overall aim is to provide insights to funding agencies, STEM education faculty, and others interested in addressing issues related to the collection and reporting of student participation and completion data within their own contexts. Emphasis is placed on the barriers encountered in collecting participation and completion data, particularly with regard to unduplicated participation counts and marketable credential data. The ATE program was selected for this study because a mechanism (known as the ATE Survey) is already in place for annually collecting systematic data across all projects within the program.
Results
A multiple case study, including interviews with principal investigators, allowed for in-depth analysis of the ATE Survey's point-in-time data on project-level participation in various activities, and for identification of the following barriers to tracking student-level data: lack of time and help to gather these data, lack of a consistent system for tracking students across different institutions, and a perceived lack of guidance from the funding agency about what data to track. We also found that different data are needed from different projects to determine a project's true impact; defining "success" the same way across all projects is inadequate.
Conclusions
Although these findings cannot be generalized to the larger ATE population because of the limited sample size, they provide specific insights into the barriers that projects encounter in collecting participation and completion data.