2019 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2019.8851751

Automatic Source Code Summarization with Extended Tree-LSTM

Abstract: Neural machine translation models are used to automatically generate documentation from given source code, since this can be regarded as a machine translation task. Source code summarization, one component of automatic documentation generation, produces a natural-language summary from given source code. This suggests that techniques used in neural machine translation, such as Long Short-Term Memory (LSTM), can also be applied to source code summarization. However, there is a considerable difference between…

Cited by 78 publications (58 citation statements) · References 23 publications

“…Recent works in code summarization utilize structural information of a program in the form of an Abstract Syntax Tree (AST), which can be encoded using tree-structured encoders such as Tree-LSTM (Shido et al., 2019), Tree-Transformer (Harer et al., 2019), and Graph Neural Networks (LeClair et al., 2020). In contrast, Hu et al. (2018a) proposed a structure-based traversal (SBT) method to flatten the AST into a sequence and showed improvement over the AST-based methods.…”
Section: Related Work
confidence: 99%
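
The SBT flattening contrasted in the statement above is easy to illustrate. The following is a minimal sketch, not code from the cited work: the `Node` class with `label` and `children` attributes is a hypothetical AST representation, and the key idea is that bracket tokens paired with the node label keep the tree shape recoverable from the flat sequence.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    # Hypothetical AST node: a label plus ordered children.
    label: str
    children: list["Node"] = field(default_factory=list)

def sbt(node: Node) -> list[str]:
    """Structure-based traversal (SBT): flatten an AST into a token
    sequence in which the tree structure remains recoverable, because
    every subtree is wrapped in brackets annotated with its root label."""
    tokens = ["(", node.label]
    for child in node.children:
        tokens.extend(sbt(child))
    tokens.extend([")", node.label])
    return tokens

# Example: a toy AST for the expression `a + b`.
tree = Node("BinOp", [Node("Name_a"), Node("Add"), Node("Name_b")])
print(" ".join(sbt(tree)))
# ( BinOp ( Name_a ) Name_a ( Add ) Add ( Name_b ) Name_b ) BinOp
```

The flattened sequence can then be fed to an ordinary sequence encoder, which is what makes SBT an alternative to tree-structured encoders.
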
“…Liang and Zhu (2018) proposed a tree-based recursive neural network to represent the syntax tree of code. Shido et al. (2019) represented source code using the tree-structured encoder of Tree-LSTM. Harer et al. (2019) adopted a Tree-Transformer to encode the structure of ASTs.…”
Section: Related Work
confidence: 99%
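
As a concrete reference point for the Tree-LSTM encoder these statements mention, below is a minimal PyTorch sketch of a Child-Sum Tree-LSTM cell (Tai et al., 2015), the base formulation that Shido et al. (2019) extend; the paper's extension to arbitrary numbers of ordered children is not reproduced here, and the class name and dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ChildSumTreeLSTMCell(nn.Module):
    """Child-Sum Tree-LSTM cell: the per-node update used when encoding
    a tree bottom-up. Child states are summed for the input/output/update
    gates, while each child gets its own forget gate."""

    def __init__(self, x_dim: int, h_dim: int):
        super().__init__()
        # Input, output and update gates share one projection each.
        self.W_iou = nn.Linear(x_dim, 3 * h_dim)
        self.U_iou = nn.Linear(h_dim, 3 * h_dim, bias=False)
        # Forget gate, applied separately to every child.
        self.W_f = nn.Linear(x_dim, h_dim)
        self.U_f = nn.Linear(h_dim, h_dim, bias=False)

    def forward(self, x, child_h, child_c):
        # x: (x_dim,) node embedding.
        # child_h, child_c: (num_children, h_dim); pass zero-row tensors
        # of shape (0, h_dim) at the leaves.
        h_tilde = child_h.sum(dim=0)                         # summed child states
        i, o, u = (self.W_iou(x) + self.U_iou(h_tilde)).chunk(3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f = torch.sigmoid(self.W_f(x) + self.U_f(child_h))   # one gate per child
        c = i * u + (f * child_c).sum(dim=0)
        h = o * torch.tanh(c)
        return h, c
```

A whole AST is then encoded by applying the cell at every node in post-order, feeding it the states of its children; the root's hidden state serves as the code representation.
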
“…On the other hand, Shido et al. (2019), Harer et al. (2019), LeClair et al. (2020), and Scarselli et al. (2008) proposed tree-based models to capture the features of the source code. These models use the structural information from parse trees but hardly consider the sequence information of code tokens.…”
Section: Introduction
confidence: 99%
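
The sequence-versus-structure distinction drawn above can be made concrete with Python's standard library: the token stream is what a sequence encoder consumes, while the AST traversal is what a tree encoder consumes, with surface token order no longer explicit. A minimal illustration (the snippet and printed comments are examples, not taken from the cited works):

```python
import ast
import io
import tokenize

src = "def add(a, b):\n    return a + b\n"

# Sequence view: the flat token stream a sequence model (e.g. an LSTM) reads.
token_seq = [t.string for t in tokenize.generate_tokens(io.StringIO(src).readline)
             if t.string.strip()]

# Tree view: the AST nodes a tree-structured model reads; token order is
# implicit in the tree, not given as a sequence.
tree_nodes = [type(n).__name__ for n in ast.walk(ast.parse(src))]

print(token_seq)   # ['def', 'add', '(', 'a', ',', 'b', ')', ':', 'return', 'a', '+', 'b']
print(tree_nodes)  # ['Module', 'FunctionDef', 'arguments', 'Return', 'arg', 'arg', 'BinOp', ...]
```
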
“…We set the threshold to 0.8, which means that a cross-language input code pair is considered semantically identical if the cosine similarity between them is greater than 0.8. For evaluation, we select the LSTM (Sundermeyer et al., 2012), Tree-LSTM (Shido et al., 2019), TBCNN (Mou et al., 2016), and GraphCodeBERT models as baselines. Code-to-Code Search: During the software development process, developers often look for code snippets that offer similar functionality (Kim et al., 2018).…”
Section: Dataset and Downstream Tasks
confidence: 99%
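
The decision rule in the last statement is simple to state in code. Below is a minimal sketch assuming two precomputed embedding vectors from some cross-language code encoder (the encoder itself, e.g. GraphCodeBERT, is out of scope here); the function name and signature are illustrative.

```python
import torch
import torch.nn.functional as F

def semantically_identical(emb_a: torch.Tensor, emb_b: torch.Tensor,
                           threshold: float = 0.8) -> bool:
    """Treat a cross-language code pair as semantically identical when
    the cosine similarity of its two code embeddings exceeds the
    threshold (0.8 in the statement above)."""
    sim = F.cosine_similarity(emb_a.unsqueeze(0), emb_b.unsqueeze(0)).item()
    return sim > threshold

# Toy usage with random 256-d embeddings (real embeddings would come
# from a trained encoder).
a, b = torch.randn(256), torch.randn(256)
print(semantically_identical(a, b))
```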