2013
DOI: 10.1080/17538947.2013.839008

Rating crowdsourced annotations: evaluating contributions of variable quality and completeness

Abstract: Crowdsourcing has become a popular means to acquire data about the Earth and its environment inexpensively, but the data-sets obtained are typically imperfect and of unknown quality. Two common imperfections in crowdsourced data are contributions from cheats or spammers and missing cases. The effect of these two imperfections on a method to evaluate the accuracy of crowdsourced data via a latent class model was explored. Using simulated and real data-sets, it was shown that the method is able to der…
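The abstract describes estimating the accuracy of crowdsourced contributions, without ground truth, via a latent class model. The paper's exact model is not reproduced here, but the general idea can be sketched as a minimal two-class latent class model fitted with EM (in the spirit of Dawid-Skene-style approaches): annotator error rates and the unknown true labels are estimated jointly, and missing contributions are simply absent entries. All names and the simulated "spammer" below are illustrative, not taken from the paper.

```python
import random

def em_latent_class(labels, n_iter=50):
    """Estimate annotator reliability without ground truth using a
    two-class latent class model fitted with EM.

    labels: list of items; each item is a dict {annotator: 0 or 1}
            (absent annotators model incomplete contributions).
    Returns (prevalence, sensitivity, specificity), the last two mapping
    each annotator to its estimated rate of correct 1s / correct 0s.
    """
    annotators = sorted({a for item in labels for a in item})
    # initialise item posteriors from the observed mean label
    post = [sum(item.values()) / len(item) if item else 0.5 for item in labels]
    sens = {a: 0.7 for a in annotators}
    spec = {a: 0.7 for a in annotators}
    pi = 0.5
    for _ in range(n_iter):
        # M-step: update class prevalence and per-annotator error rates
        pi = sum(post) / len(post)
        for a in annotators:
            n1 = d1 = n0 = d0 = 1e-9  # smoothing avoids 0/0
            for p, item in zip(post, labels):
                if a in item:
                    n1 += p * item[a]
                    d1 += p
                    n0 += (1 - p) * (1 - item[a])
                    d0 += 1 - p
            sens[a], spec[a] = n1 / d1, n0 / d0
        # E-step: posterior probability that each item's true class is 1
        for i, item in enumerate(labels):
            l1, l0 = pi, 1 - pi
            for a, y in item.items():
                l1 *= sens[a] if y else 1 - sens[a]
                l0 *= 1 - spec[a] if y else spec[a]
            post[i] = l1 / (l1 + l0)
    return pi, sens, spec

# Simulated crowd: two reliable annotators, one spammer, ~20% missing labels.
random.seed(0)
truth = [random.random() < 0.5 for _ in range(300)]
acc = {"good1": 0.9, "good2": 0.85, "spam": 0.5}
labels = []
for z in truth:
    item = {}
    for a, p in acc.items():
        if random.random() < 0.8:  # this annotator labelled this item
            item[a] = int(z if random.random() < p else not z)
    labels.append(item)

pi_hat, sens, spec = em_latent_class(labels)
score = {a: (sens[a] + spec[a]) / 2 for a in acc}
```

Even with a fifth of the labels missing, the reliable annotators' estimated scores separate clearly from the spammer's, which is the property the paper probes under heavier contamination and incompleteness.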

Cited by 13 publications (9 citation statements)
References 50 publications
“…Our current version of database should be seen as a first reference point to provide quantitative and qualitative information of limestone hills in Peninsular Malaysia that can assist users to verify and build upon (e.g. Foody, 2014). We acknowledge that our database may not have had a rigorous ground-truthing of data quality to be considered a ‘final’ map because there are many hills that still require further verification.…”
Section: Results
confidence: 99%
“…Similarly, See et al () employed this method to evaluate the accuracy and consistency of volunteers when labeling land cover and determining the human impact on the environment. In contrast, Foody () explored the redundancy of contributions to situations in which a large proportion of data is provided by poor sources and/or is incomplete.…”
Section: Taxonomy Of Quality Assessment Methods
confidence: 99%
“…[45]). We acknowledge that our data lacked rigorous enough ground-checking for the map to be considered final: there are many hills that require further verification.…”
Section: Implications For Conservation
confidence: 99%