2022
DOI: 10.48550/arxiv.2207.03578
Preprint

Code Translation with Compiler Representations

Abstract: In this paper, we leverage low-level compiler intermediate representations (IR) to improve code translation. Traditional transpilers rely on syntactic information and handcrafted rules, which limits their applicability and produces unnatural-looking code. Applying neural machine translation (NMT) approaches to code has successfully broadened the set of programs on which one can get a natural-looking translation. However, they treat the code as sequences of text tokens, and still do not differentiate well enoug…

Cited by 5 publications (3 citation statements) | References 12 publications
“…Neural Machine Translation. Since the advent of sequence-to-sequence models [64], neural machine translation has been applied to programming language translation tasks [11,12,26], including unsupervised settings [13,55,65]. Training data is often extracted from coding websites [42].…”
Section: Other Approaches
confidence: 99%
“…It first compiles source code from different languages into a distilled intermediate representation, then decompiles that representation into the target language. Szafraniec et al. [31] proposed a method that uses low-level compiler intermediate representations to improve program translation. Ahmad et al. [32] proposed a new approach using code summaries with a bidirectional model.…”
Section: Unsupervised Program Translation
confidence: 99%
“…To build neural style transfer models for code, we rely on extensive work in code language modeling, where large numbers of programs from open-source repositories are used to pre-train models over code (Ahmad et al. (2021), Wang et al. (2021), Feng et al. (2020)); these models have, in turn, been used in a variety of downstream tasks such as code translation or generation of code from natural language (Szafraniec et al. (2022), Chen et al. (2021)). To our knowledge, CodeStylist is the only system that exploits these language models to perform the rather complex task of code style transfer.…”
Section: Introduction
confidence: 99%