Crowdsourcing is characterized by the externalization of tasks to a crowd of workers. On some platforms the tasks are simple, open to all, and remunerated by micropayments. The simplicity of the tasks makes the crowd highly diverse, but the payment can attract malicious workers. It is essential to identify these malicious workers so that their answers can be discarded. In addition, not all workers are equally qualified for a task, so it can be worthwhile to give more weight to the more qualified ones. In this paper we propose a new method for characterizing contributor profiles and aggregating answers using the theory of belief functions, which can represent answers that are both uncertain and imprecise. To evaluate a contributor's profile, we consider both their qualification for the task and their behaviour while completing it, as reflected in the care they take in answering.
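The aggregation step the abstract refers to is typically done with Dempster's rule of combination, which fuses mass functions from several contributors over a common frame of discernment. The following is a minimal sketch of that rule; the two-species frame and the particular mass values are illustrative assumptions, not data from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) with
    Dempster's rule: intersect focal sets, multiply masses, and
    renormalize by 1 - K, where K is the total conflicting mass."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # empty intersection: conflicting evidence
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Hypothetical frame of discernment with two bird species
theta = frozenset({"sparrow", "robin"})

# Contributor 1: fairly certain it is a sparrow
m1 = {frozenset({"sparrow"}): 0.7, theta: 0.3}
# Contributor 2: imprecise, leaning towards robin
m2 = {frozenset({"robin"}): 0.4, theta: 0.6}

m12 = dempster_combine(m1, m2)
```

Here the conflicting product 0.7 × 0.4 = 0.28 is redistributed by normalization, so the combined mass on "sparrow" is 0.42 / 0.72 ≈ 0.583.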
In the late 1990s, Philippe Smets hypothesized that the more imprecise humans are, the more certain they are. The modeling of human responses with belief functions has received little attention, so it is important to validate Smets' hypothesis. This paper focuses on the experimental validation of this hypothesis in the context of crowdsourcing, the outsourcing of tasks to users of dedicated platforms. Two crowdsourcing campaigns were carried out: in the first, users could give imprecise answers; in the second, they had to be precise. In both experiments, users also had to indicate how certain they were of their answer. The results show that when allowed to be imprecise, users are more certain of their answers.
The theory of belief functions allows the fusion of imperfect data from different sources. Unfortunately, few real, imprecise, and uncertain datasets exist for testing approaches based on belief functions. We have built real bird datasets from numerous human contributions, which we make available to the scientific community. The interest of our datasets is that they consist of human contributions, so the information is naturally uncertain and imprecise, with these imperfections stated directly by the contributors. This article presents the data, their collection through crowdsourcing, and how to obtain belief functions from them.
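One common way to turn such a contribution into a belief function is a simple support function: the stated certainty goes to the (possibly imprecise) set of labels the contributor chose, and the remainder to the whole frame, expressing ignorance. This is a sketch of that standard construction, not necessarily the exact scheme used in the article; the labels are hypothetical.

```python
def mass_from_contribution(answer, certainty, frame):
    """Build a simple support mass function from one human contribution.

    answer    -- the (possibly imprecise) set of labels the contributor chose
    certainty -- the contributor's stated confidence, in [0, 1]
    frame     -- the frame of discernment (all possible labels)
    """
    theta = frozenset(frame)
    a = frozenset(answer)
    if not a or not a <= theta:
        raise ValueError("answer must be a non-empty subset of the frame")
    if not 0.0 <= certainty <= 1.0:
        raise ValueError("certainty must lie in [0, 1]")
    if a == theta or certainty == 0.0:
        return {theta: 1.0}  # vacuous mass function: total ignorance
    m = {a: certainty}
    if certainty < 1.0:
        m[theta] = 1.0 - certainty  # residual mass expresses doubt
    return m

# An imprecise answer ("sparrow or robin") given with certainty 0.8
m = mass_from_contribution({"sparrow", "robin"}, 0.8,
                           {"sparrow", "robin", "finch"})
```

Mass functions built this way can then be fused across contributors with a combination rule such as Dempster's.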