Proceedings of the 44th International Conference on Software Engineering 2022
DOI: 10.1145/3510003.3510042
Fast changeset-based bug localization with BERT

Abstract: Modern Deep Learning (DL) architectures based on transformers (e.g., BERT, RoBERTa) are exhibiting performance improvements across a number of natural language tasks. While such DL models have shown tremendous potential for use in software engineering applications, they are often hampered by insufficient training data. Particularly constrained are applications that require project-specific data, such as bug localization, which aims at recommending code to fix a newly submitted bug report. Deep learning models f…
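To make the task concrete, here is a minimal sketch of BERT-based bug localization as embedding retrieval: encode the bug report and candidate changesets with an off-the-shelf checkpoint and rank changesets by cosine similarity. This is not the paper's pipeline; the model name (`bert-base-uncased`), mean pooling, and ranking loop are illustrative assumptions.

```python
# Hedged sketch: rank changesets against a bug report with a generic
# pre-trained BERT encoder (HuggingFace transformers). Illustrative only.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL = "bert-base-uncased"  # assumption: any BERT-family checkpoint
tokenizer = AutoTokenizer.from_pretrained(MODEL)
encoder = AutoModel.from_pretrained(MODEL)

def embed(text: str) -> torch.Tensor:
    """Mean-pool the last hidden layer into one vector per text."""
    inputs = tokenizer(text, truncation=True, max_length=512,
                       return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**inputs).last_hidden_state   # (1, seq, 768)
    mask = inputs["attention_mask"].unsqueeze(-1)      # (1, seq, 1)
    return (hidden * mask).sum(1) / mask.sum(1)        # (1, 768)

def rank_changesets(bug_report: str, changesets: list[str]) -> list[tuple[int, float]]:
    """Score each changeset by cosine similarity to the bug report."""
    query = embed(bug_report)
    scores = [
        torch.nn.functional.cosine_similarity(query, embed(cs)).item()
        for cs in changesets
    ]
    return sorted(enumerate(scores), key=lambda p: p[1], reverse=True)

# Toy usage: the fix-bearing changeset should score above the unrelated one.
report = "NullPointerException when saving a file with an empty name"
candidates = [
    "fix: guard against empty filename in FileSaver.save()",
    "docs: update README badges",
]
print(rank_changesets(report, candidates))
```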

Cited by 29 publications (1 citation statement) · References 82 publications
“…The results show that the model achieves promising results superior to state-of-the-art n-gram models, and the model learns better on some specific datasets (e.g., Android) when code abstraction is used. Ciborowska et al. [79] apply BERT to the bug localization problem with the goal of improved retrieval quality, especially on bug reports where straightforward textual similarity would not suffice. Recently, Salza et al. [80] investigate how transfer learning can be applied to code search by pre-training and fine-tuning a BERT-based model on combinations of natural language and source code.…”
Section: Applications of Pre-trained Models in SE
Citation type: mentioning (confidence: 99%)
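One way to get retrieval quality beyond surface-level textual similarity, as the statement above describes, is a cross-encoder: BERT reads the bug report and a changeset as a single sentence pair, so the two texts can attend to each other rather than being matched on shared tokens. The sketch below is an assumption-laden illustration, not the exact setup of [79]: it uses a generic `bert-base-uncased` checkpoint whose classification head is untrained and would need fine-tuning on labeled (bug report, changeset) pairs.

```python
# Hedged sketch of cross-encoder relevance scoring for one
# (bug report, changeset) pair. Checkpoint and head are illustrative;
# the head must be fine-tuned before the scores mean anything.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL = "bert-base-uncased"  # assumption: fine-tune on labeled pairs first
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL, num_labels=2)

def relevance(bug_report: str, changeset: str) -> float:
    """P(relevant) for one pair; both texts jointly encoded."""
    inputs = tokenizer(bug_report, changeset, truncation=True,
                       max_length=512, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits            # (1, 2)
    return torch.softmax(logits, dim=-1)[0, 1].item()
```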