Proceedings of the 36th International Conference on Software Engineering 2014
DOI: 10.1145/2568225.2568229
Characterizing and detecting performance bugs for smartphone applications

Cited by 237 publications (178 citation statements)
References 13 publications
“…We found that 251 places in 51 projects execute long-running operations in the UI event thread. This also confirms the findings of a recent study by Liu et al. [27], which shows that 21% of reported responsiveness bugs in an Android corpus arise because developers tend to forget to encapsulate long-running operations in AsyncTask.…”
Section: Introduction (supporting)
confidence: 90%
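The responsiveness bug pattern quoted above is a long-running operation executed directly on the UI event thread; the fix the citing papers describe is to offload it to a background worker (AsyncTask on Android). As a minimal, platform-neutral sketch of that pattern, the example below stands in `ExecutorService` for AsyncTask and a `Thread.sleep` for a real network fetch, so it runs on plain Java; all names here (`OffloadExample`, `slowFetch`, `fetchOffEventThread`) are illustrative, not from the cited paper.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch of the fix for the responsiveness pattern: run the slow
// operation on a background executor instead of the event (UI) thread.
public class OffloadExample {
    private static final ExecutorService background =
            Executors.newSingleThreadExecutor();

    // Simulates a long-running operation such as a network fetch.
    static String slowFetch() {
        try {
            Thread.sleep(50); // stand-in for real I/O latency
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "fetched";
    }

    // Submits slowFetch to the background thread and waits for the result.
    // A real UI thread would keep pumping events and consume the result
    // via a completion callback (AsyncTask.onPostExecute on Android)
    // rather than blocking on get().
    static String fetchOffEventThread() throws Exception {
        Future<String> f = background.submit(OffloadExample::slowFetch);
        return f.get();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(fetchOffEventThread()); // prints "fetched"
        background.shutdown();
    }
}
```

Blocking on `get()` is kept here only so the sketch is self-contained; in an app it would reintroduce the very blocking the pattern avoids.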
“…Liu et al. [27] empirically study three performance bug patterns in Android apps, one of which is running long-running operations in the main thread. They also propose an approach to detect such operations statically.…”
Section: Related Work (mentioning)
confidence: 99%
“…They all have different focuses. Some of them [49] compare the qualitative differences between performance bugs and non-performance bugs across impact, context, fix, and fix validation; some [21] look at how performance bugs are introduced, how they manifest, and how they are fixed; some [32] focus on performance bugs in smartphone applications. Different from all previous studies, our study aims to provide guidance for performance problem diagnosis, and hence focuses on how performance problems are noticed and reported by end users.…”
Section: Empirical Study of Performance Bugs (mentioning)
confidence: 99%
“…They all have different focuses. Some of them [41] compare the qualitative differences between performance bugs and non-performance bugs across impact, context, fix, and fix validation; some [18] look at how performance bugs are introduced, how they manifest, and how they are fixed; some [25] focus on performance bugs in smartphone applications. Different from all previous studies, our study aims to provide guidance for performance problem diagnosis, and hence focuses on how performance problems are noticed and reported by end users.…”
Section: Empirical Study of Performance Bugs (mentioning)
confidence: 99%