2014
DOI: 10.1007/978-3-662-45234-9_30

Back-To-Back Testing of Model-Based Code Generators

Abstract: In this paper, we present the testing approach of the Genesys code generator framework. The employed approach is based on back-to-back testing, which tests the translation performed by a code generator from a semantic perspective rather than just checking for syntactic correctness of the generation result. We describe the basic testing framework and show that it scales in three dimensions: parameterized tests, testing across multiple target platforms, and testing on multiple meta-levels. In particular, …
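The core idea sketched in the abstract, comparing the execution of the source model against the execution of the generated code, can be illustrated with a small sketch. All names below (ModelInterpreter-style Executable, backToBack, the string-based output) are hypothetical placeholders and not part of the Genesys API; this is only a minimal illustration of the comparison step, assuming both executions expose a comparable observable result.

```java
import java.util.List;
import java.util.Objects;

// Minimal back-to-back comparison sketch (hypothetical names, not the Genesys API):
// run the source model via an interpreter and the generated code on the same
// inputs, then compare the observable results.
public class BackToBackCheck {

    interface Executable {            // common view on "something that can be executed"
        String run(String input);     // assumption: behaviour observable as a string
    }

    static boolean backToBack(Executable modelInterpreter,
                              Executable generatedCode,
                              List<String> testInputs) {
        for (String input : testInputs) {
            String expected = modelInterpreter.run(input);  // reference: model semantics
            String actual   = generatedCode.run(input);     // translation under test
            if (!Objects.equals(expected, actual)) {
                System.err.println("Mismatch for input '" + input + "': "
                        + expected + " != " + actual);
                return false;                               // semantic deviation found
            }
        }
        return true;                                        // all observed behaviours agree
    }
}
```

Running the same check against several target platforms (one Executable per generated artefact) would correspond to the cross-platform dimension mentioned in the abstract.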

Cited by 8 publications (9 citation statements). References 40 publications.
“…They use Simulink as a simulation environment for models. In [54], the authors present the testing approach of the Genesys code generator framework, which tests the translation performed by a code generator from a semantic perspective rather than just checking for syntactic correctness of the generation result. Basically, Genesys realizes back-to-back testing by executing both the source model and the generated code on top of different target platforms.…”
Section: Testing and Code Generators
Mentioning, confidence: 99%
“…Most of the previous work on code generator testing focuses on checking the correct functional behaviour of generated code. Most of these research efforts rely on comparing the model execution to the generated code execution. This is known in the software testing community as the equivalence, comparative, or back-to-back testing approach.…”
Section: Related Work
Mentioning, confidence: 99%
“…A reliable and acceptable way to increase the confidence in the correctness of a code generator family is to validate and check the functionality of the generated code, which is a common practice for compiler validation and testing [11,25,26]. Therefore, developers try to check the syntactic and semantic correctness of the generated code by means of different techniques, such as static analysis and test suites, and ensure that the code behaves correctly.…”
Section: Functional Correctness of a Code Generator Family
Mentioning, confidence: 99%
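The distinction drawn in this statement between syntactic and semantic correctness of generated code can be illustrated with a small sketch: first check that the generated source compiles at all, then check that the compiled artefact behaves as expected. The file name, class name, method, and expected value are hypothetical placeholders; this is not taken from the cited paper, only a minimal sketch assuming the compiler output directory is already on the class path.

```java
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

// Sketch: a syntactic check (does the generated source compile?) followed by a
// semantic check (does the compiled code behave as expected?).
// All names and the expected value are hypothetical placeholders.
public class GeneratedCodeCheck {

    public static void main(String[] args) throws Exception {
        String generatedSource = "out/GeneratedService.java";  // hypothetical path

        // Syntactic check: ask the system Java compiler to compile the generated file
        // (requires running on a JDK, otherwise getSystemJavaCompiler() returns null).
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        int status = compiler.run(null, null, null, generatedSource);
        if (status != 0) {
            throw new AssertionError("Generated code does not compile (syntactic error)");
        }

        // Semantic check: load and run the compiled class (assumed to be on the class
        // path) and compare its observable result against the expected behaviour.
        Class<?> generated = Class.forName("GeneratedService");
        Object result = generated.getMethod("compute", int.class)
                .invoke(generated.getDeclaredConstructor().newInstance(), 21);
        if (!Integer.valueOf(42).equals(result)) {
            throw new AssertionError("Generated code compiles but behaves incorrectly");
        }
        System.out.println("Generated code passed both checks");
    }
}
```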
“…We use InfluxDB, an open-source distributed time-series database, as a back-end to record data. InfluxDB allows the user to execute SQL-like queries on the database.…”
Section: Back-end Database Component
Mentioning, confidence: 99%
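As a brief illustration of the SQL-like queries mentioned here, the sketch below issues an InfluxQL query through the influxdb-java client, which is one common way to talk to the 1.x line of InfluxDB. The URL, credentials, database name, measurement, and field are hypothetical placeholders and are not taken from the cited paper.

```java
import org.influxdb.InfluxDB;
import org.influxdb.InfluxDBFactory;
import org.influxdb.dto.Query;
import org.influxdb.dto.QueryResult;

// Sketch of an SQL-like (InfluxQL) query against InfluxDB via the influxdb-java client.
// Connection details, database, measurement, and field names are hypothetical.
public class InfluxQueryExample {
    public static void main(String[] args) {
        InfluxDB influxDB = InfluxDBFactory.connect("http://localhost:8086", "user", "password");

        // InfluxQL is deliberately close to SQL: aggregate a field over a time window.
        Query query = new Query(
                "SELECT MEAN(\"duration_ms\") FROM \"test_runs\" "
              + "WHERE time > now() - 1h GROUP BY time(10m)",
                "testresults");                     // hypothetical database name

        QueryResult result = influxDB.query(query);
        result.getResults().forEach(System.out::println);  // print the raw result series

        influxDB.close();
    }
}
```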