2022
DOI: 10.48550/arxiv.2212.09666
Preprint

MultiCoder: Multi-Programming-Lingual Pre-Training for Low-Resource Code Completion

Cited by 1 publication (1 citation statement, published 2024). References: 0 publications.
“…First, some studies fine-tuned foundational LLMs with LRPL datasets [12]. While this approach demands considerable datasets and computational power, it has not been applied to generative tasks yet [31]. Second, some studies used prompt engineering techniques.…”
Section: Related Work, 2.1 LLMs for Computational Programming and Modeling (citation type: mentioning)
Confidence: 99%