A systematic mapping study on crowdsourced requirements engineering using user feedback

Chong Wang*, Maya Daneva, Marten van Sinderen, Peng Liang

*Corresponding author for this work

Research output: Contribution to journal › Review article › Academic › peer-review

1 Citation (Scopus)
7 Downloads (Pure)

Abstract

Crowdsourcing is an appealing concept for achieving good-enough requirements and just-in-time requirements engineering (RE). A promising form of crowdsourcing in RE is the use of feedback on software systems, generated over time by a large network of anonymous users of those systems. Prior research identified implicit and explicit user feedback as key for RE practitioners to discover new and changed requirements and to decide which software features to add, enhance, or abandon. However, a structured account of the types and characteristics of user feedback useful for RE purposes is still lacking. This research fills that gap by providing a mapping study of the literature on crowdsourced user feedback employed for RE purposes. Based on the analysis of 44 selected papers, we found nine pieces of metadata that characterize crowdsourced user feedback and that were employed in seven specific RE activities. We also found that the published research focuses strongly on crowd-generated comments (explicit feedback) for RE purposes, rather than on application logs or usage-generated data (implicit feedback). Our findings suggest a need to broaden the scope of research effort in order to leverage the benefits of both explicit and implicit feedback in RE.

Original language: English
Article number: e2199
Journal: Journal of software: Evolution and Process
Volume: 31
Issue number: 10
DOIs: 10.1002/smr.2199
Publication status: Published - 1 Oct 2019


Keywords

  • Crowdsourced feedback
  • Evidence-based software engineering
  • Large-scale user involvement
  • Requirements engineering
  • Systematic mapping study
  • User feedback

Cite this

@article{a394cb89c048477e9c599dfe02b0e9f4,
title = "A systematic mapping study on crowdsourced requirements engineering using user feedback",
keywords = "Crowdsourced feedback, Evidence-based software engineering, Large-scale user involvement, Requirements engineering, Systematic mapping study, User feedback",
author = "Chong Wang and Maya Daneva and {van Sinderen}, Marten and Peng Liang",
year = "2019",
month = "10",
day = "1",
doi = "10.1002/smr.2199",
language = "English",
volume = "31",
journal = "Journal of software: Evolution and Process",
issn = "2047-7481",
publisher = "Wiley",
number = "10",

}

A systematic mapping study on crowdsourced requirements engineering using user feedback. / Wang, Chong; Daneva, Maya; van Sinderen, Marten; Liang, Peng.

In: Journal of software: Evolution and Process, Vol. 31, No. 10, e2199, 01.10.2019.


TY - JOUR

T1 - A systematic mapping study on crowdsourced requirements engineering using user feedback

AU - Wang, Chong

AU - Daneva, Maya

AU - van Sinderen, Marten

AU - Liang, Peng

PY - 2019/10/1

Y1 - 2019/10/1


KW - Crowdsourced feedback

KW - Evidence-based software engineering

KW - Large-scale user involvement

KW - Requirements engineering

KW - Systematic mapping study

KW - User feedback

UR - http://www.scopus.com/inward/record.url?scp=85074165779&partnerID=8YFLogxK

U2 - 10.1002/smr.2199

DO - 10.1002/smr.2199

M3 - Review article

AN - SCOPUS:85074165779

VL - 31

JO - Journal of software: Evolution and Process

JF - Journal of software: Evolution and Process

SN - 2047-7481

IS - 10

M1 - e2199

ER -