2016
DOI: 10.1109/lca.2015.2442972
PATer: A Hardware Prefetching Automatic Tuner on IBM POWER8 Processor

Cited by 11 publications (20 citation statements)
References 3 publications
“…Little research has been done on parallel workloads. Li et al. [45] apply a machine learning model to predict the best settings for the data prefetcher in different parallel workloads. Their search space is small compared to our work and they do not consider possible interactions between knobs.…”
Section: Data Prefetching (mentioning)
confidence: 99%
“…The work in [21] reports an average 8% performance increase from applying AREP. The automatic prefetching tuner (PATer) for the POWER8 processor [22] provides a way of tuning the prefetch configuration.…”
Section: Investigation of These Intermediate Hardware Buffers or Caches (mentioning)
confidence: 99%
“…However, applying PATer requires an offline training phase with representative workloads. The authors of [22] report a 1.4× improvement in processing speed but do not consider energy consumption. Again, tuning is for a cache-based system, not the cache-less processors considered in this paper.…”
Section: Investigation of These Intermediate Hardware Buffers or Caches (mentioning)
confidence: 99%
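The offline training phase described in the statement above can be pictured as fitting a classifier that maps per-workload hardware counter features to the best prefetcher setting found during profiling. The sketch below is a minimal illustration under assumed counter features and a generic scikit-learn decision tree; neither the features nor the model are taken from the PATer paper itself.

    # Hedged sketch of an offline training phase for a prefetcher auto-tuner.
    # Feature names, values, and the choice of model are illustrative
    # assumptions, not the actual PATer implementation.
    from sklearn.tree import DecisionTreeClassifier

    # One row per representative workload: hardware counter features sampled
    # while running under a default prefetcher setting.
    # Hypothetical columns: L1 miss rate, L3 miss rate, loads per instruction.
    features = [
        [0.021, 0.008, 0.31],   # workload A
        [0.093, 0.042, 0.28],   # workload B
        [0.005, 0.001, 0.35],   # workload C
    ]

    # Label per workload: the prefetcher configuration (e.g. an encoded
    # register value) that gave the best runtime during offline profiling.
    best_config = [3, 17, 0]

    model = DecisionTreeClassifier(max_depth=4)
    model.fit(features, best_config)
    # Predict a setting for a previously unseen workload's counters.
    print(model.predict([[0.050, 0.020, 0.30]]))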
“…The mechanism provides the opportunity to turn off the prefetcher but only in those cases where prefetching hurts the system performance, and no policy is devised to turn on the prefetcher again. More recently, PATer [52] has been proposed. It uses a prediction model based on machine learning with the aim of dynamically tuning the prefetcher parameters of the IBM POWER8, which has 2^25 possible configurations.…”
Section: Prefetching (mentioning)
confidence: 99%
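For scale, 2^25 is about 33.5 million candidate settings, which is what makes an exhaustive per-workload sweep impractical and motivates a learned prediction model. A rough back-of-the-envelope, where the one-second-per-trial cost is purely an assumption for illustration:

    # Back-of-the-envelope: why a 2**25 configuration space rules out
    # exhaustive tuning. The one-second trial cost is an assumed figure.
    configs = 2 ** 25                   # 33,554,432 candidate prefetcher settings
    seconds_per_trial = 1.0             # assumed cost of benchmarking one setting
    total_hours = configs * seconds_per_trial / 3600
    print(f"{configs:,} configurations -> roughly {total_hours:,.0f} hours of profiling")
    # A learned prediction model replaces this sweep with a single inference
    # over runtime counters, which is the approach the quoted statement describes.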
“…applications) compete with regular memory requests for off-chip main memory bandwidth. Therefore, since prefetching is a speculative technique, it increases the total number of accesses to main memory [6,52,25]. Under high memory bandwidth consumption, this can translate into significant performance losses for individual applications that are affected by the prefetches of their co-runners.…”
Section: Introduction (mentioning)
confidence: 99%