2010
DOI: 10.1080/02602930802687752

A recipe for effective participation rates for web‐based surveys

Abstract: One way many universities have approached the process of better understanding and meeting the needs of their students is through student evaluations. The evaluation data provide not only diagnostic feedback but also useful information in terms of the quality of learning and teaching experiences. In an effective quality cycle, the data gathered are analysed and used to make improvements. This is often referred to as 'closing the loop'. However, for any evaluation data to be of value an important prerequisite fo…

Cited by 53 publications (38 citation statements)
References 9 publications
“…In addition, the difference in responses between online and telephonic surveys was discussed. Although the literature discusses many strategies to improve response rates to online surveys (e.g., see Porter [2004b], Nair, Adams and Mertova [2008], Nulty [2008], and Bennett and Nair [2010], as well as Alderman, Towers and Bannah [2012, 272] for additional references), such strategies pertain mostly to student evaluation surveys conducted shortly before, during or immediately after graduation, while graduate destination surveys pose the difficulty of tracing students several years after graduation. The purpose here is therefore not to repeat a discussion of strategies already known to improve response rates, but to conclude with three methodological suggestions that add to the literature on student surveys in general and graduate destination surveys in particular.…”
Section: Results (mentioning, confidence: 99%)
“…Thirdly, studying low response rates enables us to increase the quality of the data, strengthen any interpretation we make of them and recommend legitimate use (outcome validity) of the results (Adams & Umbach, 2012). Finally, for evaluation data to achieve value, response rates need to be sufficiently high so that they are representative of the student cohort (Bennett & Nair, 2010). In these same terms, Schiekirka and Raupach (2015) argue that as participation in evaluation activities is not mandatory at all universities, self-selection of students providing course ratings might produce biased samples.…”
Section: Literature Review (mentioning, confidence: 99%)
“…To do so, this researcher built a website to advertise the nature and scope of the study. This served multiple purposes, such as acting as a simple access point for all related information, along with indicating the initial announcement and subsequent open response period (Andrews, Nonnecke & Preece, 2003; Archer, 2008; Bennett & Nair, 2010). The survey was advertised on 13 internet/social media forums that cater to expatriates locally (in addition to word of mouth).…”
Section: Visibility (mentioning, confidence: 99%)
“…This researcher also had the survey items reviewed and piloted by several known acquaintances who fit the definition of expatriate distance student, as a formative evaluation for wording and clarity and to point out any discrepancies or errors (Bennett & Nair, 2010; Burford et al., 2009; Morrison, Ross, Kalman & Kemp, 2011). By observing and timing trial runs, the length of time needed to complete the survey was documented and advertised in an effort to increase participation (Andrews et al., 2003; Archer, 2008; Sinkowitz-Cochran, 2013; Trouteaud, 2004).…”
Section: Visibility (mentioning, confidence: 99%)