Records of the 2004 International Workshop on Memory Technology, Design and Testing, 2004
DOI: 10.1109/mtdt.2004.1327979

SF-LRU cache replacement algorithm

Abstract: In this paper we propose a novel replacement algorithm, SF-LRU (Second Chance-Frequency Least Recently Used), that combines LRU (Least Recently Used) and LFU (Least Frequently Used) using the second-chance concept. A comprehensive comparison is made between our algorithm and both the LRU and LFU algorithms. Experimental results show that SF-LRU significantly reduces the number of cache misses compared to the other two algorithms. Simulation results show that our algorithm can provide a maximum value of app…
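The abstract describes SF-LRU only at a high level (LRU plus LFU, tied together by a second-chance check), and the paper's exact combined recency/frequency formula is not shown in this excerpt. The following is an illustrative sketch of that idea under one plausible reading: evict the least-recently-used block, unless its access frequency exceeds that of the next eviction candidate, in which case it gets a second chance (its frequency is reset) and the runner-up is evicted instead. All class and method names here are this sketch's own, not the paper's.

```python
from collections import OrderedDict

class SFLRUCache:
    """Sketch of an SF-LRU-style cache: LRU eviction with a
    frequency-based second chance (illustrative approximation)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()   # key -> value, oldest (LRU) first
        self.freq = {}              # key -> access count

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        self.freq[key] += 1
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
            self.freq[key] += 1
        else:
            if len(self.data) >= self.capacity:
                self._evict()
            self.freq[key] = 1
        self.data[key] = value

    def _evict(self):
        keys = list(self.data)
        lru = keys[0]
        runner_up = keys[1] if len(keys) > 1 else None
        victim = lru
        # Second chance: if the LRU block is accessed more often than
        # the next candidate, keep it (resetting its frequency) and
        # evict the runner-up instead.
        if runner_up is not None and self.freq[lru] > self.freq[runner_up]:
            self.freq[lru] = 1
            victim = runner_up
        del self.data[victim]
        del self.freq[victim]
```

For example, a block that is old in recency order but accessed frequently survives an eviction that plain LRU would have applied to it.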

Cited by 42 publications (21 citation statements)
References 11 publications
“…Traditional cache algorithms include LFU (least frequently used) [26], LRU (least recently used) [27], LRU-K [28], and so on. The LFU algorithm replaces the object with the lowest access frequency, treating objects with high access frequency as high-value.…”
Section: Existing Research Work
confidence: 99%
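The contrast between LRU and LFU in the statement above can be made concrete with a tiny reference trace where the two policies pick different victims: LRU evicts the block touched longest ago even if it is accessed often, while LFU evicts the block with the lowest access count even if it was just touched. The trace and variable names are illustrative only.

```python
from collections import Counter

# A reference trace where LRU and LFU disagree: 'a' is accessed often
# but not recently, while 'c' is recent but rare.
trace = ['a', 'a', 'a', 'b', 'c', 'b']

last_use = {key: i for i, key in enumerate(trace)}  # most recent position
counts = Counter(trace)                              # access frequency

lru_victim = min(last_use, key=last_use.get)  # least recently used -> 'a'
lfu_victim = min(counts, key=counts.get)      # least frequently used -> 'c'
```

This divergence is exactly the gap a combined recency/frequency scheme such as SF-LRU aims to close.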
“…Traditional cache algorithm, such as LFU (least frequency used) [26], LRU (least recently used) [27], LRU-K [28], and so on. LFU algorithm often replaces the object with least frequency and value of object is high with the high frequency.…”
Section: Existing Research Workmentioning
confidence: 99%
“…A cache is a component that stores data transparently [5] so that requests for that data can be served faster in the future. The data that is stored within a cache might be values that have been computed earlier or duplicates of original values that are stored somewhere else.…”
Section: Preliminaries
confidence: 99%
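The "values computed earlier" case above is the familiar memoization pattern; a minimal sketch using Python's standard `functools.lru_cache` (which, fittingly, implements an LRU policy) shows a repeated request being served without recomputation. The `square` function and `calls` log are illustrative stand-ins for an expensive computation.

```python
from functools import lru_cache

calls = []  # log of real evaluations, to observe cache hits

@lru_cache(maxsize=None)
def square(n):
    # Stands in for an expensive computation.
    calls.append(n)
    return n * n

square(10)   # computed and stored
square(10)   # served transparently from the cache; no second evaluation
```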
“…Cache pollution caused by mis-prediction in any processor increases the cache miss rate, because the processor loads unnecessary data from memory into the cache, as in [2]. To resolve the cache miss and cache pollution problems, several cache replacement policies, such as random, first-in-first-out, least frequently used, and least recently used, have been employed, as in [3,4]. The random cache replacement policy replaces cache items randomly, and the first-in-first-out policy replaces the item that was loaded into the cache first.…”
Section: Fig. 1 Structure of On-Chip Multiprocessor with L1, L2
confidence: 99%
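The first-in-first-out policy named in the statement above ignores access patterns entirely: the victim is always the oldest-loaded item, even if it was just used. A minimal sketch (class and method names are this example's own):

```python
from collections import deque

class FIFOCache:
    """First-in-first-out cache: evicts the item loaded earliest,
    regardless of how recently or frequently it was accessed."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.order = deque()  # keys in insertion order, oldest first
        self.data = {}

    def get(self, key):
        return self.data.get(key)  # lookups do not affect eviction order

    def put(self, key, value):
        if key not in self.data:
            if len(self.data) >= self.capacity:
                oldest = self.order.popleft()  # evict first-loaded item
                del self.data[oldest]
            self.order.append(key)
        self.data[key] = value
```

Note that, unlike LRU, a `get` here does not refresh an item's position, which is precisely why FIFO can evict hot data.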