2023
DOI: 10.2196/38412
Agreement Between Experts and an Untrained Crowd for Identifying Dermoscopic Features Using a Gamified App: Reader Feasibility Study

Abstract: Background: Dermoscopy is commonly used for the evaluation of pigmented lesions, but agreement between experts for the identification of dermoscopic structures is known to be relatively poor. Expert labeling of medical data is a bottleneck in the development of machine learning (ML) tools, and crowdsourcing has been demonstrated as a cost- and time-efficient method for the annotation of medical images. Objective: The aim of this study is to demonstrate that c…
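The abstract centers on agreement between experts and an untrained crowd for labeling dermoscopic features. As a minimal sketch only (not the paper's actual analysis pipeline), the snippet below shows one standard way such agreement is often quantified, Cohen's kappa on binary per-image feature labels; the label values and variable names are invented for illustration.

    # Illustrative sketch: quantifying expert-crowd agreement on binary
    # dermoscopic-feature labels (1 = feature present, 0 = absent).
    # All values below are hypothetical placeholders, not data from the study.
    from sklearn.metrics import cohen_kappa_score

    expert_labels = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]   # one expert label per image
    crowd_labels  = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]   # e.g., majority vote of crowd annotators

    kappa = cohen_kappa_score(expert_labels, crowd_labels)
    print(f"Cohen's kappa (expert vs. crowd): {kappa:.2f}")

A kappa near 0 indicates agreement no better than chance, while values approaching 1 indicate strong agreement; chance-corrected measures like this are the usual choice when the feature of interest is rare.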

Cited by 11 publications (13 citation statements) · References: 56 publications
“…Several such projects can take months to obtain high-quality labels (Cocos et al, 2017 ). Third, cost is a major factor in being able to determine the viability of such a project (Kentley et al, 2023 ; Ørting et al, 2020 ). By paying the crowd-sourced workers a total of in daily rewards over 14 days, Centaur Labs obtained 143,209 classification labels.…”
Section: Discussion (mentioning); confidence: 99%
“…This bolsters some of the wisdom-of-the-crowd findings where novices, such as undergraduate psychology students, could learn to classify white blood cell images and, when their judgments were combined, exceeded expert performance (Hasan et al, 2023 ). Non-experts recruited in DiagnosUs with Centaur Labs showed that, with a little training, crowds could identify complex lesion attributes (Kentley et al, 2023 ). This opens up the possibility of expanding the scope of citizen science projects (Cohn, 2008 ; Sullivan et al, 2014 ).…”
Section: Discussion (mentioning); confidence: 99%