This is an author produced version of a paper published in:
Abstract

The integration of usable and flexible analysis support in modelling environments is a key success factor in Model-Driven Development. In this paradigm, models are the core asset from which code is automatically generated, and thus ensuring model correctness is a fundamental quality-control activity. For this purpose, a common approach consists of transforming the system models into formal semantic domains for verification. However, if the analysis results are not presented to the end user in a suitable way (e.g. in terms of the original language), they may become useless.

In this paper we present a novel DSVL called BaVeL that facilitates the flexible annotation of verification results obtained in semantic domains to different formats, including the context of the original language. BaVeL is used in combination with a consistency framework, providing support for the whole verification life cycle: acquisition of additional input data, transformation of the system models into semantic domains, verification, and flexible annotation of analysis results.

The approach has been empirically validated by its implementation in the AToM³ meta-modelling tool, and tested with several DSVLs. In this paper we present a case study for the analysis of a notation in the area of Digital Libraries, where the analysis is performed by transformations into Petri nets and a process algebra.