Crowdsourcing is an appealing concept for achieving good-enough requirements and just-in-time requirements engineering (RE). A promising form of crowdsourcing in RE is the use of feedback on software systems, generated by a large network of anonymous users of these systems over a period of time. Prior research has identified implicit and explicit user feedback as key for RE practitioners to discover new and changed requirements and to decide which software features to add, enhance, or abandon. However, a structured account of the types and characteristics of user feedback useful for RE purposes is still lacking. This research fills the gap by providing a mapping study of the literature on crowdsourced user feedback employed for RE purposes. Based on the analysis of 44 selected papers, we found nine pieces of metadata that characterize crowdsourced user feedback and that were employed in seven specific RE activities. We also found that published research focuses strongly on crowd-generated comments (explicit feedback) for RE purposes, rather than on application logs or usage-generated data (implicit feedback). Our findings suggest a need to broaden the scope of research effort in order to leverage the benefits of both explicit and implicit feedback in RE.
KEYWORDS: crowdsourced feedback, evidence-based software engineering, large-scale user involvement, requirements engineering, systematic mapping study, user feedback

1 | INTRODUCTION

Crowd-based requirements engineering (RE) is the practice of large-scale user involvement in RE activities. Users are unknown volunteers, massive in number, and their involvement can take a variety of forms. For example, users may generate information that becomes freely available for requirements specialists to use for requirements elicitation, or they may participate in distributed problem-solving by finding workarounds in an application, which in turn may shape the requirements for a subsequent application release. To IT consulting companies and software development organizations, this opportunity for large-scale user involvement offers a way to obtain good-enough requirements and to achieve just-in-time RE, which implies a significant potential to reduce the cost of RE processes. One promising form of crowdsourcing in RE, one that is exploited by businesses and attracts much research attention, is the use of feedback volunteered by large networks of anonymous users of software systems over a period of time in an RE activity such as elicitation, validation, or prioritization.