2021
DOI: 10.48550/arxiv.2112.15491
Preprint

Semantics-Recovering Decompilation through Neural Machine Translation

Abstract: Decompilation transforms low-level program languages (PLs) (e.g., binary code) into high-level PLs (e.g., C/C++). It is widely used when analysts perform security analysis on software whose source code is unavailable, such as vulnerability search and malware analysis. However, current decompilation tools typically require substantial expert effort, sometimes spanning years, to craft the decompilation rules, and these rules also demand long-term maintenance as the syntax of the high-level or low-level PL changes. …
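As a rough illustration of the neural-machine-translation framing described in the abstract, the sketch below treats decompilation as sequence-to-sequence translation from low-level (assembly) tokens to high-level (C) tokens with a standard Transformer. This is not the paper's implementation: the tokenization scheme, vocabulary sizes, and hyperparameters are illustrative assumptions, and positional encodings are omitted for brevity.

```python
# Minimal sketch (not the authors' implementation) of decompilation framed as
# neural machine translation: assembly token ids -> C token ids via a Transformer.
# Vocabularies, shapes, and hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn

SRC_VOCAB = 8000   # assumed size of the low-level (assembly) token vocabulary
TGT_VOCAB = 8000   # assumed size of the high-level (C) token vocabulary
PAD_ID = 0

class NeuralDecompiler(nn.Module):
    def __init__(self, d_model=256, nhead=8, layers=4):
        super().__init__()
        self.src_emb = nn.Embedding(SRC_VOCAB, d_model, padding_idx=PAD_ID)
        self.tgt_emb = nn.Embedding(TGT_VOCAB, d_model, padding_idx=PAD_ID)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=layers, num_decoder_layers=layers,
            batch_first=True)
        self.out = nn.Linear(d_model, TGT_VOCAB)

    def forward(self, src_ids, tgt_ids):
        # Causal mask so each C token only attends to earlier target tokens.
        tgt_mask = nn.Transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        hidden = self.transformer(
            self.src_emb(src_ids), self.tgt_emb(tgt_ids), tgt_mask=tgt_mask)
        return self.out(hidden)

# Toy training step on random token ids; real training data would pair
# disassembled binaries with their C source, tokenized consistently.
model = NeuralDecompiler()
src = torch.randint(1, SRC_VOCAB, (2, 64))   # batch of assembly token ids
tgt = torch.randint(1, TGT_VOCAB, (2, 48))   # batch of C token ids
logits = model(src, tgt[:, :-1])             # teacher forcing: shifted target
loss = nn.functional.cross_entropy(
    logits.reshape(-1, TGT_VOCAB), tgt[:, 1:].reshape(-1), ignore_index=PAD_ID)
loss.backward()
```

At inference time such a model would generate the C token sequence autoregressively (e.g., with greedy or beam search) from the assembly input alone.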

Cited by 1 publication (1 citation statement)
References 13 publications
“…[2019] tries to address limitations of neural decompilation, with two sequential phases: code sketch generation and iterative error correction. Finally, Liang et al. [2021b] use a method close to ours, and train Transformer models to translate between binary code and C.…”
Section: Related Work
Confidence: 99%