Abstract
Crowdsourcing is an appealing concept for achieving good-enough requirements and just-in-time requirements engineering (RE). A promising form of crowdsourcing in RE is the use of feedback on software systems, generated over time by a large network of anonymous users of these systems. Prior research has identified implicit and explicit user feedback as key inputs for RE practitioners to discover new and changed requirements and to decide which software features to add, enhance, or abandon. However, a structured account of the types and characteristics of user feedback useful for RE purposes is still lacking. This research fills that gap by providing a mapping study of the literature on crowdsourced user feedback employed for RE purposes. Based on an analysis of 44 selected papers, we found nine pieces of metadata that characterize crowdsourced user feedback and that were employed in seven specific RE activities. We also found that the published research focuses strongly on crowd-generated comments (explicit feedback) for RE purposes, rather than on application logs or usage-generated data (implicit feedback). Our findings suggest a need to broaden the scope of research effort in order to leverage the benefits of both explicit and implicit feedback in RE.
| Original language | English |
| --- | --- |
| Article number | e2199 |
| Journal | Journal of Software: Evolution and Process |
| Volume | 31 |
| Issue number | 10 |
| Early online date | 15 Jul 2019 |
| DOIs | |
| Publication status | Published - 1 Oct 2019 |
Keywords
- Crowdsourced feedback
- Evidence-based software engineering
- Large-scale user involvement
- Requirements engineering
- Systematic mapping study
- User feedback