Automated performance evaluation of service-oriented systems

Fredericus Gerrit Brand van den Berg

    Research output: Thesis › PhD Thesis - Research UT, graduation UT

    3 Citations (Scopus)
    534 Downloads (Pure)


    An embedded system is a software-intensive system that has a dedicated function within a larger system, e.g., medical equipment. Embedded systems have increased significantly in complexity over time and are confronted with stringent cost constraints.
    Embedded systems interact with their environments in a time-critical way, so that their safety is predominantly determined by their performance. Good performance is hard to achieve, because: (i) performance evaluation is hardly ever an integral part of software engineering; (ii) system architectures are increasingly heterogeneous, parallel and distributed; (iii) systems are often designed for many product families and different configurations; and (iv) measurements that give insight into system performance tend to be expensive to obtain.
    In this thesis, we consider so-called service(-oriented) systems, a special class of embedded systems. A service system provides services to its environment, accessible via so-called service requests, and generates exactly one service response to each request. These service requests are functionally isolated, but can affect each other's performance due to competition for shared resources. Evaluating the performance of service systems before implementation is hard; it calls for models that combine timing aspects with mechanisms to deal with uncertainty.
    We propose a new performance evaluation framework for service systems: the system designer models the performance of a system, yielding a high-level performance model. Under the hood, this model is transformed into an underlying performance model, which is then evaluated to yield performance results.
    The literature reports on many so-called toolsets: collections of connected tools that automate (part of) the performance evaluation process for the system designer.
    Many toolsets, however, require the user to have knowledge of formal performance evaluation techniques, provide a model that is not domain-specific, and return only approximate results. The goal of this thesis is to overcome these shortcomings.
    Therefore, we have designed and implemented iDSL, a language and toolchain for performance evaluation of service systems, which makes use of the so-called Modest language and toolchain to evaluate models using advanced model checking algorithms and discrete-event simulations. Modest has been selected because of its process algebra roots, its relatively readable language, and its support for multiple formalisms and ways of analysis.
    For accurate results, iDSL uses a new evaluation technique, which yields full latency distributions. For this purpose, iDSL uses an algorithm on top of Modest, in which many probabilistic model checking iterations are performed in a systematic way.
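    The idea of obtaining a full latency distribution from repeated probabilistic queries can be illustrated with a small sketch (this is not the iDSL implementation; the function names are hypothetical, and a stub stands in for the model-checking call that the Modest toolset would answer):

    ```python
    import math

    def check_latency_bound(t_ms):
        # Stub for a probabilistic model-checking query "P(latency <= t_ms)".
        # A real backend would analyse the underlying performance model;
        # here a fixed hypothetical distribution keeps the sketch runnable.
        return 1.0 - math.exp(-t_ms / 40.0)

    def latency_distribution(t_min, t_max, step):
        """Issue one probabilistic query per candidate latency bound,
        assembling the cumulative latency distribution over [t_min, t_max]."""
        return [(t, check_latency_bound(t))
                for t in range(t_min, t_max + 1, step)]

    # Query bounds from 0 ms to 200 ms in steps of 20 ms.
    cdf = latency_distribution(0, 200, 20)
    for t, p in cdf:
        print(f"P(latency <= {t:3d} ms) = {p:.3f}")
    ```

    Each query is independent, so the bounds could also be explored adaptively (e.g., by bisection) rather than on a fixed grid; the result in either case is a full distribution rather than a single mean or bound.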
    The applicability of iDSL on real systems has been assessed by conducting several case studies on interventional X-ray systems throughout the development of iDSL. Interventional X-ray systems are dependable medical systems that support minimally-invasive surgeries.
    In conclusion, iDSL delivers a fully automated performance evaluation chain, from a high-level model to visualized performance results, for service systems. This approach is not only very efficient, it also puts advanced formal performance evaluation techniques at the fingertips of system designers, without burdening them with technical details.
    Original language: English
    Qualification: Doctor of Philosophy
    Awarding Institution
    • University of Twente
    • Haverkort, Boudewijn R.H.M., Supervisor
    • Hooman, J.J.M., Supervisor, External person
    Award date: 14 Jun 2017
    Place of Publication: Enschede
    Print ISBNs: 978-90-365-4359-0
    Publication status: Published - 14 Jun 2017

