2018
DOI: 10.7717/peerj.5953

Reassessing public opinion of captive cetacean attractions with a photo elicitation survey

Abstract: Background: Captive cetacean attractions are growing in number globally, their operators citing entertainment, education, and conservation as benefits. Those for and against developing such attractions claim public support. Previous public opinion research, however, shows little consensus, partly due to the introduction of biases in study design that influence participants' responses. Those involved in, or concerned with, developing and licensing these attractions need to better understand what drives the lack o…

Cited by 8 publications (23 citation statements); References 63 publications.

“…We then conducted semi-structured interviews across Alberta's BMAs to gather first-hand accounts, perspectives and experiences with grizzly bear recovery policy from the people who live, work and recreate in these areas (BMA; Laswell, 1971; Clark, 2002; Yin, 2014). We used a key informant list, generated by the provincial governments' carnivore specialist, to develop an initial interview sample of government biologists, landowners (e.g., cattle ranchers, crop farmers), natural resource sector personnel (forestry, petroleum industry, mining), and environmental non-government organizations (ENGOs; Noy, 2008; Drury et al, 2011). Additional participants were identified via chain referral, which enabled us to collect first-hand interview data grounded in the participants' own words, from a diverse range of people across Alberta's BMAs (Biernacki and Waldorf, 1981; Noy, 2008; Goldman et al, 2010; Bixler, 2013; Vernon and Clark, 2015).…”
Section: Methods (mentioning)
confidence: 99%
“…Face-to-face interviews were preferred, though telephone sessions were made available if there were constraints to meeting in-person (Novick, 2008). A semi-structured interview guide informed by similar studies was used, with latitude to explore topics more deeply as they emerged through the interview (Drury et al, 2011; Bennett, 2016). An iterative process of collection-transcription-analysis was used to determine corroboration and saturation of interview data, which included comparing and contrasting data to develop provisional descriptions of the problem perspectives (Patton, 1990; Clark et al, 2008; Rust and Taylor, 2016).…”
Section: Methods (mentioning)
confidence: 99%
“…Instead of possible bias introduced in the wording and response alternatives of closed-ended questions, the locus for bias in open-ended questions is transferred to response interpretation and coding (Burns, 1989; Campbell, Quincy, Osserman, & Pederson, 2013; Passer, 2017). In the case of Wassermann et al (2018), such bias may have been introduced on two levels. First, rather than using a recording device to produce an accurate transcript of respondents' comments, surveyors simply "took notes" on these comments.…”
Section: Wassermann Et Al (2018) Argued That Asking Respondents An O… (mentioning)
confidence: 99%
“…Second, these summarized notes on respondents' comments were then coded as expressing either positive or negative views of each type of marine mammal attraction, and also for any expressed reasons for these opinions, such as appropriate for children (positive) or animal welfare concerns (negative). Wassermann et al (2018) neither provided details about their criteria for classifying these opinions (i.e., operational definitions) nor any measure of intercoder reliability. The lack of these details is extremely problematic, as studies have shown that the high degree of inference needed to categorize open-ended responses can lead to a high probability of initial error and bias in interpretation, often resulting in low levels of agreement during initial coding (e.g., Burns, 1989; Carey et al, 1996; Hagelin, 1999; Hruschka et al, 2004; Passer, 2017).…”
Section: Wassermann Et Al (2018) Argued That Asking Respondents An O… (mentioning)
confidence: 99%
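The intercoder reliability concern raised in the last citation statement can be made concrete with a small worked example. The sketch below is illustrative only: the two coders, the ten comments, and the positive/negative labels are hypothetical assumptions, not data from Wassermann et al (2018) or the photo elicitation survey. It computes raw percent agreement and Cohen's kappa, a chance-corrected agreement measure commonly reported as intercoder reliability.

```python
# Illustrative sketch of intercoder reliability for open-ended survey coding.
# The coders, comments, and labels below are hypothetical, not data from
# Wassermann et al (2018).

from collections import Counter

# Codes assigned independently by two hypothetical coders to the same ten
# open-ended comments ("pos" = positive view, "neg" = negative view).
coder_a = ["pos", "pos", "neg", "pos", "neg", "neg", "pos", "neg", "pos", "pos"]
coder_b = ["pos", "neg", "neg", "pos", "neg", "pos", "pos", "neg", "pos", "neg"]

def percent_agreement(a, b):
    """Share of items on which both coders assigned the same label."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(a)
    observed = percent_agreement(a, b)
    counts_a, counts_b = Counter(a), Counter(b)
    labels = set(a) | set(b)
    # Expected chance agreement from each coder's marginal label frequencies.
    expected = sum((counts_a[lab] / n) * (counts_b[lab] / n) for lab in labels)
    return (observed - expected) / (1 - expected)

print(f"Percent agreement: {percent_agreement(coder_a, coder_b):.2f}")  # 0.70
print(f"Cohen's kappa:     {cohens_kappa(coder_a, coder_b):.2f}")       # 0.40
```

With these hypothetical labels the coders agree on 7 of 10 comments (0.70), yet kappa is only 0.40 once chance agreement is removed, which is why reporting an explicit reliability statistic, rather than raw agreement or no measure at all, matters for highly inferential open-ended coding.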