Proceedings of the 11th International Conference on Supercomputing (ICS '97), 1997
DOI: 10.1145/263580.263633

Adaptive data prefetching using cache information

Abstract: Data prefetching, which brings data close to the processor before it is actually used, can be done in parallel with computation, thus hiding memory access latency and avoiding the need for the processor to stall. For maximum effectiveness, the prefetching parameters, such as prefetch offset and prefetch degree, must be adapted to match programs and system conditions. Prefetch adaptability is the ability to change when prefetch requests are issued and how much data is prefetched. We propose a…
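A minimal sketch of the two parameters the abstract names, prefetch offset and prefetch degree, in a software-prefetched streaming loop. This is not the paper's mechanism; it only illustrates what the parameters control. It assumes GCC/Clang's __builtin_prefetch, a 64-byte cache line, and hypothetical variable names; an adaptive scheme would adjust prefetch_offset and prefetch_degree at run time from observed cache behaviour.

#include <stddef.h>

#define LINE_DOUBLES 8               /* doubles per 64-byte cache line (assumed) */

static size_t prefetch_offset = 16;  /* elements ahead of the current index */
static size_t prefetch_degree = 2;   /* cache lines fetched per request point */

double sum_array(const double *a, size_t n)
{
    double sum = 0.0;
    for (size_t i = 0; i < n; i++) {
        /* Once per cache line, issue prefetches for data expected soon. */
        if (i % LINE_DOUBLES == 0) {
            for (size_t d = 0; d < prefetch_degree; d++) {
                size_t target = i + prefetch_offset + d * LINE_DOUBLES;
                if (target < n)
                    __builtin_prefetch(&a[target], 0 /* read */, 1 /* low temporal locality */);
            }
        }
        sum += a[i];
    }
    return sum;
}

A larger offset hides more latency but risks prefetching data that is evicted before use; a larger degree fetches more per request point but increases bus traffic and cache pollution, which is why adapting both to the program and system conditions matters.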

Cited by 13 publications (4 citation statements) · References 6 publications
“…A comprehensive survey of most popular prefetching schemes for multiprocessors (also applicable to uniprocessors) is presented in [27]. This survey paper discusses various prefetching schemes including pure hardware [2], [10], [4], [30], [11], software [5], [8], [19], [20], [15] and integrated schemes.…”
Section: Related Work
confidence: 99%
“…We define a virtual resource that captures the expected efficiency of read-ahead [Revel et al 1997;Ki and Knowles 1997]. We build our metric of read-ahead confidence on the adaptive read-ahead logic recently introduced in the Linux kernel [Fengguang 2007].…”
Section: Calculating Resource Utilization
confidence: 99%
“…al [20] mitigate the negative effects of prefetching on performance by prefetching only when the memory bus is idle (to reduce contention), and prefetching to lower replacement priorities than demand misses (to reduce cache pollution). Ki and Knowles [31] used extra cache bits to increase prefetching's accuracy. Srinivasan, et.…”
Section: Related Work
confidence: 99%
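The last citation statement describes the paper's use of extra cache bits to improve prefetch accuracy. The sketch below is only an illustration of that general idea, not the paper's actual hardware design: per-line bookkeeping bits record whether a prefetched line was ever referenced, and the observed accuracy is used to grow or shrink the prefetch degree. All names, thresholds, and the adaptation window are assumptions.

#include <stdbool.h>
#include <stddef.h>

typedef struct {
    unsigned long tag;
    bool valid;
    bool prefetched;   /* line was brought in by a prefetch               */
    bool referenced;   /* line was touched by a demand access afterwards  */
} cache_line_t;

typedef struct {
    size_t useful;     /* prefetched lines later referenced by a demand access */
    size_t useless;    /* prefetched lines evicted without any reference       */
    unsigned degree;   /* current prefetch degree                              */
} prefetch_stats_t;

/* Called on a demand access that hits a line already in the cache. */
static void on_demand_hit(cache_line_t *line, prefetch_stats_t *st)
{
    if (line->prefetched && !line->referenced)
        st->useful++;
    line->referenced = true;
}

/* Called when a line is evicted; adapt the degree from observed accuracy. */
static void on_evict(cache_line_t *line, prefetch_stats_t *st)
{
    if (line->prefetched && !line->referenced)
        st->useless++;

    size_t total = st->useful + st->useless;
    if (total >= 64) {                                 /* assumed adaptation window */
        if (st->useful * 2 < total && st->degree > 1)
            st->degree--;                              /* mostly useless: back off  */
        else if (st->useful * 4 > total * 3 && st->degree < 8)
            st->degree++;                              /* mostly useful: fetch more */
        st->useful = st->useless = 0;
    }
}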