2009
DOI: 10.1109/tse.2008.71

What Types of Defects Are Really Discovered in Code Reviews?

Cited by 161 publications (136 citation statements); references 55 publications.
“…Similar to Mäntylä's results [32], the dominant change categories are code comments (20%) and identifiers (10%). However, our study shows that 7-35% of review comments are discarded, and that a substantial 10-22% of the total changes are not triggered by review suggestions; this is not considered in prior studies.…”
Section: Introduction (supporting); confidence: 80%
“…The standard was the basis for two classifications by IBM and HP [13]. Wagner et al [52] and Duraes and Madeira [16] topology on this empirically validated classification scheme [32]. They analysed code review comments and extrapolated the types of defects found, reporting a 75:25 ratio between maintainability and functional defects.…”
Section: Research Methods - RQ1 and RQ2 (mentioning); confidence: 99%