This study aims to identify an appropriate conceptual framework for evaluating crowdsourcing platforms from an open innovation perspective, using a combination of qualitative and quantitative methods. The initial indices of the performance evaluation framework for crowdsourcing platforms are obtained through the Delphi method and expert interviews. Using these factors, a statistical questionnaire is then designed and distributed among users of crowdsourcing platforms to confirm or reject each factor. Finally, the aspects of the performance evaluation framework for crowdsourcing platforms are specified from the perspective of open innovation. Using fuzzy hierarchical analysis (fuzzy AHP), these aspects are prioritized in order of importance: Collaboration, Project design, Moderation, Terms and conditions, UI/UX (user interface and user experience), and Key statistics. Given that crowdsourcing rests on crowd participation and the collective intelligence of users, Collaboration and Project design emerged as the most significant factors in evaluating a crowdsourcing platform.
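To illustrate the prioritization step, the following is a minimal sketch of fuzzy AHP weight derivation using Buckley's geometric-mean method with triangular fuzzy numbers. The pairwise comparison matrix and the three aspects shown are hypothetical examples, not the study's actual data; the study's full hierarchy covers all six aspects.

```python
# Minimal fuzzy AHP sketch (Buckley's geometric-mean method).
# Triangular fuzzy numbers are tuples (l, m, u) with l <= m <= u.
# The comparison matrix below is ILLUSTRATIVE, not the study's data.

aspects = ["Collaboration", "Project design", "Moderation"]

# Hypothetical fuzzy pairwise comparisons (row i relative to column j).
M = [
    [(1, 1, 1),       (1, 2, 3),     (2, 3, 4)],
    [(1/3, 1/2, 1),   (1, 1, 1),     (1, 2, 3)],
    [(1/4, 1/3, 1/2), (1/3, 1/2, 1), (1, 1, 1)],
]

def fuzzy_geometric_mean(row):
    """Component-wise geometric mean of a row of fuzzy numbers."""
    n = len(row)
    prod = [1.0, 1.0, 1.0]
    for (l, m, u) in row:
        prod[0] *= l
        prod[1] *= m
        prod[2] *= u
    return tuple(p ** (1.0 / n) for p in prod)

# Row geometric means r_i, then fuzzy weights w_i = r_i (x) (sum r)^-1.
r = [fuzzy_geometric_mean(row) for row in M]
total = tuple(sum(x[k] for x in r) for k in range(3))
# Inverting a fuzzy number reverses its bounds: (1/u, 1/m, 1/l).
w = [(ri[0] / total[2], ri[1] / total[1], ri[2] / total[0]) for ri in r]

# Defuzzify by the centroid (average of l, m, u) and normalize.
crisp = [(l + m + u) / 3.0 for (l, m, u) in w]
s = sum(crisp)
weights = [c / s for c in crisp]

ranking = sorted(zip(aspects, weights), key=lambda t: -t[1])
for name, wt in ranking:
    print(f"{name}: {wt:.3f}")
```

With this illustrative matrix, the derived ranking places Collaboration first and Project design second, mirroring the ordering reported in the study; in practice the fuzzy judgments would come from the aggregated expert questionnaires.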