2013 ACM / IEEE International Symposium on Empirical Software Engineering and Measurement 2013
DOI: 10.1109/esem.2013.38
Evaluating Software Product Metrics with Synthetic Defect Data

Cited by 2 publications (1 citation statement) · References 9 publications
“…It is not surprising then that several studies have attempted to compare software metrics to better understand their similarities and differences. Work by Stuckman et al (2013) for example, used synthetic defect datasets to assess the performance of metrics, complemented by a formal mathematical model. One of the results of the study was that a relatively small set of source code metrics conveyed the same information as a larger set.…”
Section: Analysis of Software Metrics
confidence: 99%
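The cited finding — that a small set of source code metrics conveys the same information as a larger set — rests on metrics being highly correlated with one another. The sketch below is a hypothetical illustration of that idea, not the paper's actual model: it generates synthetic modules where defect counts depend on size, then shows that a second metric driven by size is nearly redundant with it. All variable names and distributions here are assumptions chosen for illustration.

```python
import random
import statistics

random.seed(0)

# Hypothetical synthetic dataset: module size (LOC) drives both a
# complexity-like metric and the defect count, plus some noise.
n = 500
loc = [random.randint(50, 2000) for _ in range(n)]            # lines of code
complexity = [l / 40 + random.gauss(0, 3) for l in loc]       # size-driven metric
defects = [max(0, round(l / 300 + random.gauss(0, 1))) for l in loc]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# A near-1 correlation means the second metric adds little information
# beyond the first for predicting defects.
print(f"corr(LOC, complexity) = {pearson(loc, complexity):.2f}")
print(f"corr(LOC, defects)    = {pearson(loc, defects):.2f}")
```

With synthetic data like this, the ground-truth defect process is known by construction, which is what makes it possible to evaluate how much predictive information each metric actually carries.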