2021
DOI: 10.48550/arxiv.2102.10452
Preprint

Spotting Silent Buffer Overflows in Execution Trace through Graph Neural Network Assisted Data Flow Analysis

Zhilong Wang,
Li Yu,
Suhang Wang
et al.

Abstract: A software vulnerability can be exploited without any visible symptoms. Although such silent program executions can cause very serious damage, the general problem of analyzing silent yet harmful executions when no source code is available remains open. In this work, we propose a graph neural network (GNN) assisted data flow analysis method for spotting silent buffer overflows in execution traces. The new method combines a novel graph structure (denoted DFG+) beyond data-flow graphs, a tool to …
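To make the starting point of the method concrete, here is a hedged sketch (not the paper's DFG+, whose exact structure is only partially described in the truncated abstract) of building a plain def-use data-flow graph from an execution trace — the kind of graph DFG+ is said to extend. The instruction format and register names are illustrative assumptions.

```python
# Sketch: def-use data-flow graph from a toy instruction trace.
# Each trace entry is (instruction id, mnemonic, {defined regs, used regs}).
from collections import defaultdict

trace = [
    ("i0", "mov", {"def": ["rax"], "use": []}),
    ("i1", "add", {"def": ["rbx"], "use": ["rax", "rbx"]}),
    ("i2", "mov", {"def": ["rcx"], "use": ["rbx"]}),
    ("i3", "cmp", {"def": [],      "use": ["rcx", "rax"]}),
]

def build_dfg(trace):
    """Add an edge def_site -> use_site for each value an instruction reads."""
    last_def = {}                 # register -> instruction that last wrote it
    edges = defaultdict(list)
    for inst_id, _, ops in trace:
        for reg in ops["use"]:
            if reg in last_def:   # value flows from its last definition
                edges[last_def[reg]].append(inst_id)
        for reg in ops["def"]:
            last_def[reg] = inst_id
    return dict(edges)

print(build_dfg(trace))
# -> {'i0': ['i1', 'i3'], 'i1': ['i2'], 'i2': ['i3']}
```

On a real binary-level trace the same idea applies per memory cell and register, which is what makes the graph large enough to motivate GNN-assisted analysis.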

Cited by 1 publication (1 citation statement, published 2023)
References 29 publications
“…The Inst2vec [39] model builds semantic flow graphs on top of the LLVM compiler framework [40] and trains a skip-gram model to obtain vector representations of the graphs. The BRGCN [41] model extracts instruction-level heterogeneous data-flow graphs, whose edges capture relationships such as data flow, variable adjacency, and read-write dependencies, and uses an R-GCN [42] for heterogeneous representation learning.…”
Section: Graph-based Approaches
Confidence: 99%
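The R-GCN mentioned in the citing passage aggregates neighbor messages with a separate weight matrix per edge relation. Below is a minimal NumPy sketch of one such layer over a heterogeneous data-flow graph; the relation names, graph, and dimensions are illustrative assumptions, not taken from BRGCN [41].

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, d_in, d_out = 4, 8, 8
h = rng.normal(size=(n_nodes, d_in))      # node features (e.g. instruction embeddings)

# (src, dst) edge lists per relation; relations are assumptions mirroring the text
relations = {
    "data_flow":  [(0, 1), (1, 2)],
    "adjacency":  [(0, 3)],
    "read_write": [(2, 3)],
}

W_self = rng.normal(size=(d_in, d_out)) * 0.1
W_rel = {r: rng.normal(size=(d_in, d_out)) * 0.1 for r in relations}

def rgcn_layer(h, relations, W_self, W_rel):
    """h_i' = ReLU( W_0 h_i + sum_r sum_{j in N_r(i)} (1/c_{i,r}) W_r h_j )."""
    out = h @ W_self
    for r, edges in relations.items():
        deg = np.zeros(len(h))            # c_{i,r}: in-degree per relation
        for _, dst in edges:
            deg[dst] += 1
        for src, dst in edges:
            out[dst] += (h[src] @ W_rel[r]) / deg[dst]
    return np.maximum(out, 0.0)           # ReLU

h_next = rgcn_layer(h, relations, W_self, W_rel)
print(h_next.shape)                       # -> (4, 8)
```

Keeping a distinct `W_r` per relation is what lets the model treat a data-flow edge differently from a read-write edge, which a homogeneous GCN cannot do.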