2018
DOI: 10.48550/arxiv.1807.01784
Preprint
Program Language Translation Using a Grammar-Driven Tree-to-Tree Model

Cited by 3 publications (6 citation statements)
References 0 publications
“…The input form of the T2Seq is a tree structure and the output is the traversal sequence of AST nodes. • Tree-to-Tree-grammar Model (T2T+Grammar): Drissi et al. [37] improved the decoding method on the basis of T2T, adding grammar rules of the target language to constrain the model to generate grammatically correct programs.…”
Section: Baseline
confidence: 99%
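The T2Seq baseline described above takes a tree as input and emits the AST as a flat node sequence. A minimal sketch of that linearization, using a hypothetical `Node` class and a pre-order traversal (the exact traversal order in the cited work may differ):

```python
from dataclasses import dataclass, field

# Minimal AST node (illustrative structure, not the authors' exact one).
@dataclass
class Node:
    label: str
    children: list = field(default_factory=list)

def preorder(node):
    """Linearize an AST into the node-label sequence a tree-to-sequence
    decoder would be trained to emit (pre-order, depth-first)."""
    yield node.label
    for child in node.children:
        yield from preorder(child)

# AST for the expression `a + b`
tree = Node("BinOp", [Node("Name:a"), Node("Add"), Node("Name:b")])
print(list(preorder(tree)))  # ['BinOp', 'Name:a', 'Add', 'Name:b']
```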
“…They also incorporate the concept of parent attention feeding to enhance the accuracy of prediction results. Based on the Tree-to-Tree model [23], Drissi et al. [37] added syntax rules as constraints at each time step during decoding to enforce grammatical correctness in program generation. Lin et al. [38] masked paths of the abstract syntax tree during pre-training and required the model to recover them.…”
Section: Abstract Syntax Tree-based Program Translation
confidence: 99%
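The grammar-constrained decoding that the citing papers attribute to Drissi et al. can be sketched as masking, at each time step, every production whose left-hand side does not match the non-terminal currently being expanded. The toy grammar and the `grammar_masked_distribution` helper below are illustrative assumptions, not the paper's implementation:

```python
import math
import random

# A toy context-free grammar for the target language: each production
# expands one non-terminal. (Illustrative grammar, not from the paper.)
RULES = [
    ("Expr", ["Expr", "+", "Term"]),
    ("Expr", ["Term"]),
    ("Term", ["Term", "*", "Factor"]),
    ("Term", ["Factor"]),
    ("Factor", ["(", "Expr", ")"]),
    ("Factor", ["id"]),
]

def grammar_masked_distribution(logits, frontier_nonterminal):
    """Mask (set to -inf) every rule whose left-hand side does not match
    the non-terminal being expanded, then renormalize with softmax.
    This is the decode-time constraint: only grammatically valid rules
    can receive probability mass."""
    masked = [
        l if RULES[i][0] == frontier_nonterminal else float("-inf")
        for i, l in enumerate(logits)
    ]
    m = max(masked)
    exps = [math.exp(l - m) for l in masked]
    z = sum(exps)
    return [e / z for e in exps]

random.seed(0)
logits = [random.uniform(-1, 1) for _ in RULES]  # stand-in for decoder scores
probs = grammar_masked_distribution(logits, "Factor")
# Only the two Factor-productions (indices 4 and 5) keep probability mass.
print([round(p, 3) for p in probs])
```

Sampling or arg-maxing from the masked distribution at every expansion step guarantees the emitted derivation is a valid parse in the target grammar.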
“…A technique known as tree-to-tree encoding and decoding uses parse trees and deep neural networks to convert source code from one language to another [12]. This approach includes a training phase to enhance the encoding process.…”
Section: Related Work
confidence: 99%
“…Among them, Chen et al. (2018) used the attention mechanism to locate the corresponding subtree in the source tree and used the information of the subtree to guide the non-terminal expansion. Drissi et al. (2018) used grammar rules of the target language to generate grammatically correct programs. Ahmad et al. (2022) used program summaries as an intermediate language and performed a reverse translation, similar to target-to-NL-to-source generation, to train the model.…”
Section: Translation of Programming Languages
confidence: 99%
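The subtree-locating attention attributed to Chen et al. (2018) above can be sketched as plain dot-product attention over source-subtree encodings; the `attend` helper and the random vectors below are illustrative stand-ins, not the authors' architecture:

```python
import math
import random

def attend(query, subtree_vectors):
    """Dot-product attention over source-subtree encodings: the decoder's
    query scores each subtree, and the softmax-weighted sum becomes the
    context vector that guides the next non-terminal expansion."""
    scores = [sum(q * v for q, v in zip(query, vec)) for vec in subtree_vectors]
    m = max(scores)
    weights = [math.exp(s - m) for s in scores]
    z = sum(weights)
    weights = [w / z for w in weights]
    context = [
        sum(w * vec[d] for w, vec in zip(weights, subtree_vectors))
        for d in range(len(query))
    ]
    return weights, context

random.seed(1)
dim = 4
# Stand-ins for learned encodings of three source subtrees and one query.
subtrees = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(3)]
query = [random.uniform(-1, 1) for _ in range(dim)]
weights, context = attend(query, subtrees)
print([round(w, 3) for w in weights])
```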