Experimental Evaluation of a Tool for Change Impact Prediction in Requirements Models: Design, Results and Lessons Learned

Arda Goknil, Roderick van Domburg, Ivan Kurtev, Klaas van den Berg, Fons Wijnhoven

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

7 Citations (Scopus)
193 Downloads (Pure)

Abstract

Commercial tools such as IBM Rational RequisitePro and DOORS support semi-automatic change impact analysis for requirements. These tools capture relations between requirements and allow tracing the paths they form. In most of these tools, however, relation types convey nothing about the meaning of a relation beyond its direction. When a change is introduced to a requirement, the requirements engineer analyzes the impact of the change on related requirements. When the semantic information needed to determine precisely how requirements are related to each other is missing, the requirements engineer generally has to assume worst-case dependencies based on the available syntactic information alone. We developed a tool that uses the formal semantics of requirements relations to support change impact analysis and prediction in requirements models. The tool, TRIC (Tool for Requirements Inferencing and Consistency checking), works on models that explicitly represent requirements and the relations among them with their formal semantics. In this paper we report on the evaluation of how TRIC improves the quality of change impact predictions. A quasi-experiment is systematically designed and executed to empirically validate the impact of TRIC. We conduct the quasi-experiment with 21 master's degree students predicting change impact for five change scenarios in a real software requirements specification. The participants are assigned Microsoft Excel, IBM RequisitePro or TRIC to perform change impact prediction for the change scenarios. It is hypothesized that using TRIC positively affects the quality of change impact predictions, and two formal hypotheses are developed. As a result of the experiment, we are not able to reject the null hypotheses, and thus we are not able to show experimentally the effectiveness of our tool. In the paper we discuss the reasons for this failure to reject the null hypotheses.
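The abstract contrasts worst-case impact tracing over untyped relations with semantics-aware tracing over typed relations. The following is a minimal hypothetical sketch of that difference, not TRIC's actual API or inference rules: the relation types, the set of types treated as change-propagating, and all identifiers (PROPAGATING_TYPES, RELATIONS, impacted) are illustrative assumptions.

# Hypothetical sketch: change impact propagation over typed requirements
# relations. Without relation semantics, every traced path must be assumed
# to propagate the change (worst case); with semantics, only relation types
# known to propagate a change are followed. Propagation rules are invented
# for illustration and are not TRIC's.
from collections import deque

# Relation types assumed (for this sketch) to propagate a change.
PROPAGATING_TYPES = {"requires", "refines", "contains"}

# (source, relation_type, target) triples forming a toy requirements model.
RELATIONS = [
    ("R1", "requires", "R2"),
    ("R2", "refines", "R3"),
    ("R2", "conflicts", "R4"),   # semantics say: no impact propagation
    ("R3", "contains", "R5"),
]

def impacted(changed, use_semantics=True):
    """Return the requirements reachable from `changed`.

    With use_semantics=False every relation is traversed (worst case);
    with use_semantics=True only propagating relation types are followed.
    """
    seen, queue = {changed}, deque([changed])
    while queue:
        current = queue.popleft()
        for src, rel, dst in RELATIONS:
            if src != current or dst in seen:
                continue
            if use_semantics and rel not in PROPAGATING_TYPES:
                continue
            seen.add(dst)
            queue.append(dst)
    return seen - {changed}

print(impacted("R1", use_semantics=False))  # worst case: R2, R3, R4, R5
print(impacted("R1", use_semantics=True))   # pruned:     R2, R3, R5

In this toy model, semantics-aware tracing excludes R4 (reached only via a conflicts relation), illustrating how formal relation semantics can shrink the predicted impact set relative to the worst-case assumption.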
Original language: English
Title of host publication: 2014 IEEE 4th International Model-Driven Requirements Engineering Workshop (MoDRE)
Subtitle of host publication: Proceedings
Editors: Ana Moreira, Pablo Sánchez, Gunter Mussbacher, Joao Araújo
Place of publication: Piscataway, NJ
Publisher: IEEE
Number of pages: 10
ISBN (Print): 978-1-4799-6343-0
DOIs
Publication status: Published - 25 Aug 2014
Event: 2014 IEEE 4th International Model-Driven Requirements Engineering Workshop (MoDRE) - Karlskrona, Sweden
Duration: 25 Aug 2014 – 25 Aug 2014

Conference

Conference: 2014 IEEE 4th International Model-Driven Requirements Engineering Workshop (MoDRE)
Abbreviated title: MoDRE
Country/Territory: Sweden
City: Karlskrona
Period: 25/08/14 – 25/08/14

Keywords

  • METIS-304279
  • IR-91418
