2020
DOI: 10.1002/ece3.6722

Camera settings and biome influence the accuracy of citizen science approaches to camera trap image classification

Open-access license: This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cited by 7 publications (7 citation statements); references 42 publications.
Citation types: 1 supporting, 6 mentioning, 0 contrasting. Years published: 2022–2024.

“…Instead, in the task of detecting presence or absence of iguanas within an image, we found that the answer selected by five volunteers or more (from 20 or 30 total classifications) — which we refer to as the minimum threshold — was the most likely to be correct. This was somewhat unexpected, since previous work has shown that accuracy increased asymptotically with number of classifications per image (Swanson et al., 2016); however, other projects hosted by Zooniverse have found outcomes more similar to ours (Egna et al., 2020; Hennon et al., 2015; Lawson et al., 2022). We suspect that for some challenging images, highly skilled volunteers were required to identify the iguanas, and that these volunteers were relatively rare.…”
Section: Discussion (supporting)
confidence: 51%
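
To make the minimum-threshold rule quoted above concrete, here is a minimal Python sketch. It assumes each image's classifications are stored as a simple list of "present"/"absent" strings; the function name, data layout, and default threshold are illustrative assumptions, not code from any of the cited papers.

```python
from collections import Counter

def classify_presence(votes, min_threshold=5):
    """Label one image as 'present' or 'absent' from volunteer votes.

    `votes` is a list of answers from independent volunteers, e.g.
    ['present', 'absent', ...]. Following the minimum-threshold rule
    described in the quoted passage, the image is labelled 'present'
    when at least `min_threshold` volunteers (here 5, out of roughly
    20-30 total classifications) reported the animal; otherwise 'absent'.
    """
    counts = Counter(votes)
    return "present" if counts.get("present", 0) >= min_threshold else "absent"

# Example: 6 of 20 volunteers spotted an iguana, so the image is labelled
# 'present' even though a simple majority vote would have said 'absent'.
votes = ["present"] * 6 + ["absent"] * 14
print(classify_presence(votes))  # present
```

The point of the rule, as the quote explains, is that a small number of skilled volunteers can outvote a larger number of misses on hard images, which a plain majority vote would suppress.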
“…Multiple studies have found that researchers can obtain accurate data from volunteers by properly aggregating multiple independent responses for one subject (e.g. an image or audio file; Anton et al., 2018; Egna et al., 2020; Swanson et al., 2016; Torney et al., 2019). Recent research has identified the number of independent classifications required to give accurate results, thereby streamlining the approach whilst still obtaining scientifically valid outcomes (Swanson et al., 2015).…”
Section: Discussion (mentioning)
confidence: 99%
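
The relationship this quote refers to, between the number of independent classifications per image and the accuracy of the aggregated answer, can be illustrated with a toy majority-vote simulation. The per-volunteer accuracy value and all names below are assumptions for illustration only, not results from Swanson et al. (2016) or any other cited study.

```python
import random

def consensus_accuracy(p_correct, n_classifications, trials=10_000):
    """Estimate majority-vote accuracy for a fixed number of classifications.

    Each simulated volunteer answers correctly with probability
    `p_correct` (an assumed per-volunteer accuracy). The consensus label
    is the majority vote over `n_classifications` independent answers,
    with ties broken at random. Returns the fraction of trials in which
    the consensus was correct.
    """
    hits = 0
    for _ in range(trials):
        correct = sum(random.random() < p_correct for _ in range(n_classifications))
        wrong = n_classifications - correct
        if correct > wrong or (correct == wrong and random.random() < 0.5):
            hits += 1
    return hits / trials

# Accuracy climbs steeply at first and then levels off as classifications
# are added, the asymptotic pattern the quoted passages describe.
for n in (1, 3, 5, 10, 20, 30):
    print(n, round(consensus_accuracy(p_correct=0.8, n_classifications=n), 3))
```

Under these assumed parameters the gain from additional classifications flattens quickly, which is why identifying the minimum number needed can streamline a project without sacrificing accuracy.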
“…Camera‐trapping citizen science projects have burgeoned recently and been shown to provide ecologically meaningful data (Hsing et al., 2018; Lasky et al., 2021; McShea et al., 2016; Swanson et al., 2016). Factors such as camera settings and location can impact classification accuracy; specifically, sequences with multiple photos have been found to have higher classification accuracy than those with single photos (Egna et al., 2020). Camera trap videos could allow for easier species identification because movement can make animals easier to locate within the footage, and because more information is available to an observer, such as different views of an animal, their gait or movement profile, and sound.…”
Section: Introduction (mentioning)
confidence: 99%
“…However, probably owing to the concerns outlined above, most camera trap citizen science projects use photographs and there has been little assessment of citizen science classification accuracy of videos (but see McCarthy et al., 2021). Gaining adequate numbers of classifications is important for timely processing and for combining multiple classifications to achieve higher confidence in classification accuracy (Anton et al., 2018; Egna et al., 2020; Hsing et al., 2018; Swanson et al., 2016). Attracting participants and maintaining engagement are, therefore, important considerations for citizen science projects (Meek & Zimmermann, 2016).…”
Section: Introduction (mentioning)
confidence: 99%
“…One potential limitation of this approach is the unknown reliability of the information obtained because of the risk of species misidentification (Miller et al. 2012, Egna et al. 2020). Some methods can reduce this limitation, such as checking the reliability of responses by asking the interviewees to identify the species in pictures (Wiggins et al. 2011, Clare et al. 2019, Balázs et al. 2021).…”
(mentioning)
confidence: 99%