IEEE International Symposium on - ISPASS Performance Analysis of Systems and Software, 2004
DOI: 10.1109/ispass.2004.1291356
Structures for phase classification

Abstract: Most

Cited by 75 publications (68 citation statements)
References 13 publications
“…It examines the application's execution path to detect hardware independent phases [21,14]. Such phases can be readily missed by performance counter based phase detection, while changes in executed code reflect changes in many different metrics [20,21,5,22,9,18]. To leverage this, ScarPhase monitors what code is executed by dividing the application into windows and using hardware performance counters to sample which branches execute in a window.…”
Section: Phase Detection
Confidence: 99%
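The excerpt above describes a window-based, code-signature approach to phase detection: execution is split into fixed-size windows, the branches sampled in each window form a signature, and windows with similar signatures are grouped into the same phase. A minimal sketch of that idea follows; the function names, the Manhattan distance metric, and the similarity threshold are illustrative assumptions, not details taken from ScarPhase itself.

```python
from collections import Counter

def signature(branch_samples):
    """Normalize a window's sampled branch addresses into a frequency vector.
    (Stand-in for the per-window branch histograms the excerpt describes.)"""
    counts = Counter(branch_samples)
    total = sum(counts.values())
    return {addr: c / total for addr, c in counts.items()}

def distance(v1, v2):
    """Manhattan distance between two sparse frequency vectors."""
    keys = set(v1) | set(v2)
    return sum(abs(v1.get(k, 0.0) - v2.get(k, 0.0)) for k in keys)

def classify_windows(windows, threshold=0.5):
    """Assign each window a phase ID: reuse the nearest known phase
    signature if it is within `threshold`, otherwise open a new phase.
    The threshold value here is an arbitrary choice for illustration."""
    signatures = []   # one representative vector per phase
    phase_ids = []
    for samples in windows:
        v = signature(samples)
        best, best_d = None, threshold
        for pid, sig in enumerate(signatures):
            d = distance(v, sig)
            if d < best_d:
                best, best_d = pid, d
        if best is None:
            signatures.append(v)
            best = len(signatures) - 1
        phase_ids.append(best)
    return phase_ids
```

Because the signatures are built from executed code (branch addresses) rather than performance-counter values, two windows with the same code footprint land in the same phase even if their hardware metrics drift, which is the hardware-independence property the excerpt emphasizes.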
“…based on simulation results which often correspond to small performance differences. Recent research works on sampling have similar accuracy targets [6,7,13].…”
Section: Introduction and Related Work
Confidence: 99%
“…While the recent surge of research articles on sampling started with rather large sample sizes (100M in the first SimPoint article [12]), it has later shifted to very small intervals (1,000 in SMARTS [14]), and it is now converging to intermediate sizes (1M and 10M in SimPoint [10,6]), and even to varying sizes in EXPERT [7] and SimPoint VLI [5] (ranging from 52,000 to 6.1M in EXPERT, and 100M to 500M in SimPoint VLI). With 100M samples, warm-up is not an issue, at least with current cache sizes.…”
Section: Introduction and Related Work
Confidence: 99%
“…Two previous works were in representative sampling. Lau et al [5] extensively examined different phase features based on data accesses. Their features are related to the access patterns, but do not directly measure the data locality.…”
Section: Reuse Distance Distribution (RDD) Definition
Confidence: 99%
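The last excerpt contrasts access-pattern features with direct measures of data locality such as the reuse distance distribution. For reference, a reuse distance is the number of distinct addresses touched between two consecutive accesses to the same address; a minimal (and deliberately unoptimized, O(n·m)) sketch of computing it from an address trace is shown below. The function name and trace format are assumptions for illustration.

```python
def reuse_distances(trace):
    """For each access in the trace, return the number of distinct
    addresses touched since the previous access to the same address;
    first-time accesses get float('inf')."""
    last_seen = {}    # address -> index of its most recent access
    distances = []
    for i, addr in enumerate(trace):
        if addr in last_seen:
            # distinct addresses seen strictly between the two uses
            window = set(trace[last_seen[addr] + 1 : i])
            distances.append(len(window))
        else:
            distances.append(float("inf"))
        last_seen[addr] = i
    return distances
```

A histogram of these distances is the reuse distance distribution; unlike raw access-pattern features, it directly bounds cache behavior, since an access hits in a fully associative LRU cache of size C exactly when its reuse distance is below C.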