10th International Symposium on Temporal Representation and Reasoning, 2003 and Fourth International Conference on Temporal Logic
DOI: 10.1109/time.2003.1214874

Counterexample-guided abstraction refinement

Abstract: We present an automatic iterative abstraction-refinement methodology in which the initial abstract model is generated by an automatic analysis of the control structures in the program to be verified. Abstract models may admit erroneous (or "spurious") counterexamples. We devise new symbolic techniques which analyze such counterexamples and refine the abstract model correspondingly. The refinement algorithm keeps the size of the abstract state space small due to the use of abstraction functions which …
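The iterative loop the abstract describes can be sketched generically. The helper names and the toy interval abstraction below are illustrative assumptions, not the paper's actual algorithm:

```python
def cegar(model, bad, abstract, check, concretize, refine):
    """Generic counterexample-guided abstraction refinement loop (sketch).

    abstract(model)        -> initial abstract model (an over-approximation)
    check(abs_model, bad)  -> an abstract counterexample, or None if safe
    concretize(model, cex) -> a concrete counterexample, or None if spurious
    refine(abs_model, cex) -> a finer abstraction ruling out the spurious cex
    """
    abs_model = abstract(model)
    while True:
        cex = check(abs_model, bad)
        if cex is None:
            return ("verified", None)        # safe in the abstraction => safe
        if concretize(model, cex) is not None:
            return ("violated", cex)         # genuine counterexample
        abs_model = refine(abs_model, cex)   # spurious: refine and retry


# Toy instantiation (assumption for illustration): the concrete "model" is a
# set of reachable values, the abstraction is a list of covering intervals,
# and "bad" is a value that must never be reachable.
def abstract(states):
    return [(min(states), max(states))]      # one coarse interval

def check(intervals, bad):
    return bad if any(lo <= bad <= hi for lo, hi in intervals) else None

def concretize(states, cex):
    return cex if cex in states else None

def refine(intervals, cex):
    out = []
    for lo, hi in intervals:
        if lo <= cex <= hi:                  # split around the spurious value
            if lo < cex:
                out.append((lo, cex - 1))
            if cex < hi:
                out.append((cex + 1, hi))
        else:
            out.append((lo, hi))
    return out
```

Here `cegar({0, 2, 4, 6}, 5, abstract, check, concretize, refine)` first finds the spurious abstract counterexample 5 inside the coarse interval (0, 6), splits the interval to exclude it, and then reports `("verified", None)`.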

Cited by 309 publications (450 citation statements)
References 17 publications
“…Trau [2] is an SMT string solver based on the flattening technique introduced in [1], where flat automata are used to capture simple patterns of common constraints. It relies on a Counter-Example Guided Abstraction Refinement (CEGAR) framework [38] where an under- and an over-approximation module interact to increase the string solving precision. In addition, Trau implements string transduction by reduction to context-free membership constraints.…”
Section: Word-based SCS Approaches
Confidence: 99%
“…accuracy of ML models and also to gain more assurance in the overall system containing the ML component. A more detailed study of using misclassifications (ML component-level counterexamples) to improve the accuracy of the neural network is presented in [12]; this approach is termed counterexample-guided data augmentation, inspired by counterexample-guided abstraction refinement (CEGAR) [8] and similar paradigms.…”
Section: Semantic Training
Confidence: 99%
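The counterexample-guided data augmentation idea mentioned in this citation can be sketched as a simple retraining loop. The 1-nearest-neighbour "model" and every function name below are illustrative assumptions, not the actual method of [12]:

```python
def cegar_style_augmentation(train, label_fn, fit, predict, candidates, rounds=5):
    """Sketch: add misclassified inputs (ML component-level counterexamples)
    back into the training set and retrain, until none remain."""
    model = fit(train)
    for _ in range(rounds):
        misses = [(x, label_fn(x)) for x in candidates
                  if predict(model, x) != label_fn(x)]
        if not misses:
            break                            # no counterexamples left
        train = train + misses               # augment with counterexamples
        model = fit(train)
    return model


# Toy instantiation (assumption): a 1-nearest-neighbour classifier on integers.
def fit(train):
    return list(train)                       # "training" just stores the points

def predict(model, x):
    return min(model, key=lambda p: abs(p[0] - x))[1]

label_fn = lambda x: int(x >= 5)             # ground truth to be learned
model = cegar_style_augmentation(
    [(0, 0)], label_fn, fit, predict, list(range(10)))
```

Starting from a single training point, the loop pulls every misclassified candidate into the training set, after which `predict(model, 7)` returns `1` and `predict(model, 2)` returns `0`.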
“…This approach can be viewed as the dual of counterexample-guided abstraction refinement [9]. CEGAR starts from an abstract model that represents an over-approximation of the system dynamics, and uses counterexamples (FPs) to refine the model, thereby reducing the FP rate.…”
Section: Reducing the False Negative Rate
Confidence: 99%