Proceedings of the 2018 ACM Conference on Economics and Computation (EC 2018)
DOI: 10.1145/3219166.3219172

Eliciting Expertise without Verification

Abstract: A central question of crowdsourcing is how to elicit expertise from agents. This is even more difficult when answers cannot be directly verified. A key challenge is that sophisticated agents may strategically withhold effort or information when they believe their payoff will be based upon comparison with other agents whose reports will likely omit this information due to lack of effort or expertise. Our work defines a natural model for this setting based on the assumption that more sophisticated agents know t…

Cited by 17 publications (20 citation statements). References 29 publications.
“…Cai et al (2015) propose a continuous effort model for data elicitation, where exerting effort gives agents a lower variance signal. However, in our setting, lower effort reviews may not only be noisier, they may also be systematically biased (Kong and Schoenebeck, 2018). Gao et al (2016) caution that in practice (and as highly relevant in our case), agents have a variety of low-effort signals such as paper length, topic, etc., potentially leading to low-effort coordination into uninformative equilibria.…”
Section: Information Elicitation Without Verification (citation classified as: mentioning)
confidence: 81%
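To make the continuous effort idea in the excerpt above concrete, here is a minimal toy sketch in which exerting more effort yields a lower-variance signal of an underlying value. This is not the actual model of Cai et al. (2015); the inverse-effort variance schedule and the function names are illustrative assumptions.

```python
import random

def observe(true_value: float, effort: float, base_noise: float = 1.0) -> float:
    """Toy continuous-effort observation: higher effort -> lower-variance signal.

    The 1/effort variance schedule is an illustrative assumption, not the
    specific model from Cai et al. (2015).
    """
    assert effort > 0
    std = base_noise / (effort ** 0.5)   # variance shrinks proportionally to 1/effort
    return random.gauss(true_value, std)

# Quick check: empirical squared error falls as effort rises.
true_value = 3.0
for effort in (0.25, 1.0, 4.0):
    errs = [(observe(true_value, effort) - true_value) ** 2 for _ in range(10_000)]
    print(f"effort={effort}: empirical MSE ~ {sum(errs) / len(errs):.3f}")
```

Under this toy schedule, quadrupling effort roughly quarters the mean squared error of the reported signal, which is the sense in which effort buys informativeness here.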
“…The IEWV literature is focused on the problem of eliciting truthful private 'signals' when the mechanism designer is unable to verify the responses; this may be because the ground truth is difficult or impossible for the mechanism to access or there is simply no ground truth since the responses are subjective. The key application domains of interest include tasks like crowdsourcing data, product ratings, community sensing, and peer grading (Cai et al, 2015; Radanovic et al, 2016; Dasgupta and Ghosh, 2013; Kong and Schoenebeck, 2018). More recent work is also interested in ensuring that private signals are obtained with effort.…”
Section: Information Elicitation Without Verification (citation classified as: mentioning)
confidence: 99%
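A common scoring idea in this IEWV line of work is to reward agreement with a peer's report only beyond what chance agreement would predict. The sketch below is a simplified illustration in that spirit, loosely inspired by the binary-signal setting of Dasgupta and Ghosh (2013); the function name, inputs, and the exact chance correction are assumptions for illustration, not the mechanism from any cited paper.

```python
from typing import Dict, Hashable

def peer_agreement_score(
    my_report: int,
    peer_report: int,
    my_other_reports: Dict[Hashable, int],
    peer_other_reports: Dict[Hashable, int],
) -> float:
    """Toy 'agreement minus chance agreement' score for binary (0/1) reports.

    Reward agreement with a peer on a shared task, then subtract the agreement
    rate expected by chance from each agent's report frequencies on disjoint
    tasks. This penalizes uninformative strategies (e.g., always reporting 1),
    which would otherwise collect full agreement credit.
    """
    p_mine = sum(my_other_reports.values()) / max(len(my_other_reports), 1)
    p_peer = sum(peer_other_reports.values()) / max(len(peer_other_reports), 1)
    chance_agreement = p_mine * p_peer + (1 - p_mine) * (1 - p_peer)
    return float(my_report == peer_report) - chance_agreement

# Example: two reviewers agree on a shared item but have different base rates
# of reporting 1 on their disjoint items.
print(peer_agreement_score(1, 1, {"a": 1, "b": 0, "c": 1}, {"x": 0, "y": 0, "z": 1}))
```

The point of the chance-agreement term is that blind coordination on a low-effort signal (always agreeing) scores no better than zero in expectation, which is the intuition behind avoiding the uninformative equilibria discussed in the first excerpt.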