2013
DOI: 10.1145/2494232.2465756

Reuse-based online models for caches

Abstract: We develop a reuse distance/stack distance based analytical modeling framework for efficient, online prediction of cache performance for a range of cache configurations and replacement policies (LRU, PLRU, RANDOM, NMRU). Our framework unifies existing cache miss rate prediction techniques such as Smith's associativity model, Poisson variants, and hardware way-counter based schemes. We also show how to adapt LRU way-counters to work when the number of sets in the cache changes. As an example application, we demon…
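
As a rough illustration of the kind of computation such a framework performs, the sketch below builds a stack-distance histogram from an address trace and converts it into a set-associative LRU miss-rate estimate in the spirit of Smith's associativity model (a reference with fully-associative stack distance d is assumed to hit in an A-way set if fewer than A of the d intervening distinct blocks map to the same set). This is not the paper's algorithm; the function names, the trace format, and the uniform-mapping assumption are illustrative only.

```python
# Sketch: stack-distance-based LRU miss-rate prediction (illustrative, not the
# paper's exact method). Assumes blocks map uniformly and independently to sets.
from math import comb
from collections import defaultdict

def stack_distance_histogram(trace):
    """For each access, count the distinct blocks touched since the previous
    access to the same block (infinity marks cold, first-time references)."""
    hist = defaultdict(int)
    stack = []                                # LRU stack, most recent block at the end
    for addr in trace:
        if addr in stack:
            depth = len(stack) - 1 - stack.index(addr)   # O(n) scan; fine for a sketch
            hist[depth] += 1
            stack.remove(addr)
        else:
            hist[float("inf")] += 1           # cold (compulsory) reference
        stack.append(addr)
    return hist

def lru_miss_rate(hist, num_sets, assoc):
    """Smith-style estimate: a reference with fully-associative stack distance d
    hits in an A-way set iff fewer than A of the d intervening distinct blocks
    map to its set (binomial with per-set probability 1/num_sets)."""
    total = sum(hist.values())
    hits = 0.0
    p = 1.0 / num_sets
    for d, count in hist.items():
        if d == float("inf"):
            continue                          # cold references always miss
        p_hit = sum(comb(d, k) * p**k * (1 - p)**(d - k) for k in range(assoc))
        hits += count * p_hit
    return 1.0 - hits / total

# Usage example on a toy trace of block addresses.
trace = [0, 1, 2, 0, 3, 0, 1, 4, 2, 0]
hist = stack_distance_histogram(trace)
print(lru_miss_rate(hist, num_sets=2, assoc=2))
```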

Cited by 21 publications (24 citation statements)
References 46 publications

Citation statements:

“…Adaptive methods could be used to improve efficiency, while providing rigorous error bounds unlike prior models. Otherwise, this application of cache calculus is similar to prior reuse-distance-based models [3,4,8].…”
Section: Numerical Solution
Confidence: 99%

“…Prior models [1,3,4,5,6,8] aim to provide an efficient alternative to simulation, to, for example, accelerate design space exploration [1,5,8] or perform dynamic cache partitioning [3]. By contrast, our main goal is to augment simulation through simpler, closed-form models.…”
Section: Introduction
Confidence: 99%

“…Second, since modern LLCs use hashing and achieve high effective associativity, replacement candidates form a representative sample of cached lines [5,41,44]. Details of array organization, which have been the focus of prior studies [1,15,36,47,55], are relatively unimportant in modern LLCs. Our model leverages this insight by modeling replacement as a probabilistic process affecting individual lines, not sets.…”
Section: Modern Cache Architecture
Confidence: 99%
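
The cited paper's own model is not reproduced in this excerpt; as a loose illustration of the line-based probabilistic view it describes, the sketch below estimates the miss rate of a cache under random replacement by treating each resident line as surviving every eviction with probability 1 - 1/N and iterating the implied miss rate to a fixed point. The reuse-distance format, the fixed-point iteration, and all names are assumptions for illustration only.

```python
# Illustrative sketch (not the cited model): replacement as a probabilistic
# process on individual lines rather than sets. Under random replacement in a
# cache of N lines, a line survives each eviction with probability 1 - 1/N; a
# reference with reuse distance d (accesses) sees roughly m*d evictions, where
# m is the overall miss rate, giving a fixed point in m.

def random_replacement_miss_rate(reuse_hist, num_lines, iters=100):
    """reuse_hist maps reuse distance (accesses since the previous reference to
    the same block; None for cold references) to a count of references."""
    total = sum(reuse_hist.values())
    m = 1.0                                    # start from an all-miss guess
    for _ in range(iters):
        hits = 0.0
        for d, count in reuse_hist.items():
            if d is None:
                continue                       # cold references always miss
            p_survive = (1.0 - 1.0 / num_lines) ** (m * d)
            hits += count * p_survive
        m = 1.0 - hits / total                 # update and iterate to a fixed point
    return m

# Usage example with a toy reuse-distance histogram.
example_hist = {4: 50, 16: 30, 64: 15, None: 5}
print(random_replacement_miss_rate(example_hist, num_lines=32))
```
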
“…Existing cache models [1,15,36,47,55] tend to focus on traditional, set-associative caches using simple replacement policies like least-recently used (LRU), pseudo-LRU, or random replacement. But modern processors do not use LRU (or pseudo-LRU) for the LLC. Modern LLCs instead employ high-performance replacement policies that greatly improve cache performance over traditional policies like LRU (Sec.…”
Section: Introduction
Confidence: 99%