2022
DOI: 10.1016/j.eswa.2022.117607
Automatic detection of Long Method and God Class code smells through neural source code embeddings

Cited by 37 publications (56 citation statements)
References 36 publications
“…The ML classifier trained using source code metrics as features outperformed the ML classifier trained using CodeT5 embeddings on both tasks and achieved the F-measure of 0.87 on the Long Method task and 0.91 on the Large Class task. This result differs from our findings in [10], where we found CuBERT [21] source code embeddings better capture the semantics needed to tackle the code smell detection problem. We may attribute this difference to the datasets used in the studies.…”
Section: Introduction (contrasting)
confidence: 99%
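The F-measure scores quoted above (0.87 for Long Method, 0.91 for Large Class) are the harmonic mean of precision and recall on binary smell predictions. A minimal sketch of how such a score is computed; the labels and predictions below are illustrative, not data from the study:

```python
def f_measure(y_true, y_pred):
    """Harmonic mean of precision and recall for binary labels (1 = smelly)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical example: 4 methods, ground truth vs. classifier output
y_true = [1, 0, 1, 1]
y_pred = [1, 0, 0, 1]
score = f_measure(y_true, y_pred)  # precision 1.0, recall 2/3 -> F-measure 0.8
```

In practice such a score would come from a library routine (e.g. scikit-learn's `f1_score`) applied to the classifier's predictions on a held-out code smell dataset.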
“…In the DL approach, the relevant features are automatically inferred and hold the potential to capture semantics we are currently unable to express through existing code metrics. In contrast, as we showed in [10], the metric extraction process may be brittle, time-consuming, and not scalable. Secondly, the CodeT5 approach enables us to leverage the power of transfer learning.…”
Section: Introduction (mentioning)
confidence: 92%