A systematic mapping study on crowdsourced requirements engineering using user feedback

Chong Wang*, Maya Daneva, Marten van Sinderen, Peng Liang

*Corresponding author for this work

Research output: Contribution to journal › Review article › Academic › peer-review


Abstract

Crowdsourcing is an appealing concept for achieving good-enough requirements and just-in-time requirements engineering (RE). A promising form of crowdsourcing in RE is the use of feedback on software systems, generated over time by a large network of anonymous users of these systems. Prior research indicated that implicit and explicit user feedback is key for RE practitioners to discover new and changed requirements and to decide which software features to add, enhance, or abandon. However, a structured account of the types and characteristics of user feedback useful for RE purposes is still lacking. This research fills the gap by providing a mapping study of literature on crowdsourced user feedback employed for RE purposes. On the basis of the analysis of 44 selected papers, we found nine pieces of metadata that characterized crowdsourced user feedback and that were employed in seven specific RE activities. We also found that the published research has a strong focus on crowd-generated comments (explicit feedback) to be used for RE purposes, rather than on application logs or usage-generated data (implicit feedback). Our findings suggest a need to broaden the scope of research effort in order to leverage the benefits of both explicit and implicit feedback in RE.

Original language: English
Article number: e2199
Journal: Journal of Software: Evolution and Process
Volume: 31
Issue number: 10
Publication status: Published - 1 Oct 2019


Keywords

  • Crowdsourced feedback
  • Evidence-based software engineering
  • Large-scale user involvement
  • Requirements engineering
  • Systematic mapping study
  • User feedback
