Empirical validation of a quality framework for evaluating modelling languages in MDE environments.
In: Software Quality Journal, Vol. 29 (2021-06-01), No. 2, pp. 275-307
In previous research, we proposed the multiple modelling quality evaluation framework (MMQEF), a method and tool for evaluating modelling languages in model-driven engineering (MDE) environments. Rather than being exclusive, MMQEF aims to complement other quality evaluation methods such as SEQUAL. However, to date, MMQEF has not been validated beyond some proofs of concept. This paper evaluates the applicability of the MMQEF method in comparison with other existing methods. We performed an evaluation in which the subjects had to detect quality issues in modelling languages. A group of expert professionals and two experimental objects (i.e. two combinations of different modelling languages based on real industrial practices) were used. To analyse the results, we applied quantitative approaches, i.e. statistical tests on the results of the performance measures and the perception of subjects. We ran four replications of the experiment in Colombia between 2016 and 2019, with a total of 50 professionals. The results of the quantitative analysis show a low performance for all of the methods, but a positive perception of MMQEF.

Conclusions: The application of modelling language quality evaluation methods within MDE settings is indeed tricky, and subjects did not succeed in identifying all quality problems. This experiment paves the way for additional investigation on the trade-offs between the methods and potential situational guidelines (i.e. circumstances under which each method is convenient). We encourage further inquiries on industrial applications to incrementally improve the method and tailor it to the needs of professionals working in real industrial environments.
Title: | Empirical validation of a quality framework for evaluating modelling languages in MDE environments. |
---|---|
Author(s): | Giraldo, Fáber D. ; Chicaiza, Ángela J. ; España, Sergio ; Pastor, Óscar |
Journal: | Software Quality Journal, Vol. 29 (2021-06-01), No. 2, pp. 275-307 |
Published: | 2021 |
Media type: | academicJournal |
ISSN: | 0963-9314 (print) |
DOI: | 10.1007/s11219-021-09554-1 |