2023
DOI: 10.1007/978-981-19-9225-4_23
Seq2Code: Transformer-Based Encoder-Decoder Model for Python Source Code Generation

Cited by 1 publication (1 citation statement)
References 11 publications
“…Laskari et al. [79] discussed Seq2Code, a transformer-based solution for translating natural language problem statements into Python source code. Using an encoder-decoder transformer design with multi-head attention and separate embeddings for special characters, the model demonstrated improved perplexity compared to similarly structured models.…”
Section: Code Generation Process
confidence: 99%
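The citation statement above names multi-head attention as the core mechanism of the encoder-decoder design. As a point of reference only, the sketch below shows scaled dot-product multi-head attention in plain NumPy; the weights are random, and the function names and shapes are illustrative assumptions, not the Seq2Code paper's actual implementation or parameters.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax over the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads, rng):
    # x: (seq_len, d_model) token representations.
    # Hypothetical minimal sketch: random projection weights stand in
    # for learned parameters.
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0
    d_head = d_model // num_heads
    # Query, key, value, and output projections.
    Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) * 0.1
                      for _ in range(4))
    # Project, then split the model dimension across heads:
    # (seq_len, d_model) -> (num_heads, seq_len, d_head).
    q = (x @ Wq).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ Wk).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ Wv).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    # Scaled dot-product attention per head: (num_heads, seq_len, seq_len).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)
    # Weighted sum of values, heads re-merged into d_model.
    out = (attn @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo, attn

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 16))        # 5 tokens, model width 16
out, attn = multi_head_attention(x, num_heads=4, rng=rng)
```

Each attention row is a probability distribution over the input positions, which is what lets the decoder attend to relevant parts of the problem statement when emitting code tokens.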