Ninth International Conference on Quality Software (QSIC 2009)
DOI: 10.1109/QSIC.2009.47
A Bayesian Approach for the Detection of Code and Design Smells

Cited by 175 publications (115 citation statements)
References 14 publications
“…This is pretty much consistent with results of the previous study made by Ratiu et al for God classes [11]. This result highlights the usefulness of smell detectors providing a measure of severity for each identified smell, as in the approach by Khomh et al [20]. In this way, developers can focus their attention on smells that are more likely to represent a threat from their point of view.…”
Section: Discussion (supporting)
Confidence: 89%
“…Khomh et al [20] use quality metrics to train a Bayesian belief network aiming at detecting bad smells. The main novelty of that approach is that it provides a likelihood that a code component is affected by a smell, instead of just providing a Boolean value like the previous techniques.…”
Section: A. Code Bad Smell Detection (mentioning)
Confidence: 99%
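The distinction drawn in the statement above can be illustrated with a toy sketch: a classic rule-based detector returns a hard yes/no, while a probabilistic detector in the spirit of a Bayesian belief network combines per-metric evidence into a graded likelihood. All metric names, thresholds, and probabilities below are illustrative assumptions, not values from the paper.

```python
def boolean_god_class(loc: int, n_methods: int) -> bool:
    """Rule-based detector: hard thresholds, Boolean answer.
    Thresholds are hypothetical, for illustration only."""
    return loc > 500 and n_methods > 20


def likelihood_god_class(loc: int, n_methods: int) -> float:
    """Toy probabilistic detector: fuses evidence from two metrics
    via a naive (independence-assuming) Bayesian update and returns
    the posterior probability that the class is a God class."""
    prior = 0.1  # assumed base rate of God classes in the code base
    # Assumed conditional probabilities P(metric observation | smell / clean)
    p_loc_smell, p_loc_clean = (0.90, 0.20) if loc > 500 else (0.10, 0.80)
    p_met_smell, p_met_clean = (0.85, 0.25) if n_methods > 20 else (0.15, 0.75)
    numerator = prior * p_loc_smell * p_met_smell
    denominator = numerator + (1 - prior) * p_loc_clean * p_met_clean
    return numerator / denominator


# A class just under one threshold: the Boolean rule silently rejects it,
# while the probabilistic detector still reports a graded likelihood.
print(boolean_god_class(480, 35))                 # False
print(round(likelihood_god_class(480, 35), 3))
print(round(likelihood_god_class(600, 35), 3))    # higher: both metrics exceed
```

The graded output is what lets developers rank candidate smells by severity, as the surrounding citation statements note.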
“…While the research community devoted a lot of effort on understanding [41], [42], [43], [44], [45], [46], [47], [48], [49], [50], [51], [52] and detecting [53], [54], [55], [56], [57], [58], [59], [60], [61] design flaws occurring in production code, smells affecting test code have only been partially explored.…”
Section: A. About Test Smells (mentioning)
Confidence: 99%
“…We chose not to use existing detection tools (Marinescu 2004;Khomh et al 2009b;Sahin et al 2014;Tsantalis and Chatzigeorgiou 2009;Moha et al 2010;Oliveto et al 2010;Palomba et al 2015a) because (i) none of them has ever been applied to detect all the studied code smells and (ii) their detection rules are generally more restrictive to ensure a good compromise between recall and precision and thus may miss some smell instances. To verify this claim, we evaluated the behavior of three existing tools, i.e., DECOR (Moha et al 2010), JDeodorant (Tsantalis and Chatzigeorgiou 2009), and HIST (Palomba et al 2015a) on one of the systems used in the empirical study, i.e., Apache Cassandra 1.1.…”
Section: Research Questions and Planning (mentioning)
Confidence: 99%