2011
DOI: 10.1007/978-3-642-24151-2_2
Enhanced Adaptive Insertion Policy for Shared Caches

Cited by 6 publications (5 citation statements)
References 14 publications
“…The work by C. Li [10] establishes policies for updating the cache by inclusion. This policy is briefly criticized for the limited cache sizes on MANET nodes.…”
Section: Outcomes From the Parallel Researches (mentioning)
confidence: 99%
“…The second technique is LFU, which assumes that data requested frequently from the server in the past will be requested frequently again. In general, one recognized problem with cache replacement is that when a relatively large data item is requested, it may end up replacing many small cached items in the cache.…”
Section: State‐of‐the‐art Cache Replacement Algorithms (mentioning)
confidence: 99%
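To make the LFU idea in the passage above concrete, the sketch below shows a minimal frequency-counting cache with a byte budget; the class name `LFUCache`, the byte-based capacity, and the per-key counters are illustrative assumptions, not code from the cited works. The eviction loop also exhibits the problem the passage notes: admitting one large item can push out many small, frequently used items.

```python
from collections import defaultdict

class LFUCache:
    """Minimal least-frequently-used (LFU) cache with a byte-capacity limit (illustrative sketch)."""

    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.used = 0
        self.items = {}               # key -> (value, size_bytes)
        self.freq = defaultdict(int)  # key -> access count

    def get(self, key):
        if key not in self.items:
            return None               # cache miss
        self.freq[key] += 1
        return self.items[key][0]

    def put(self, key, value, size_bytes):
        if key in self.items:         # replacing an existing entry
            self.used -= self.items[key][1]
            del self.items[key]
        # Admitting one large item can evict many small, popular items,
        # which is exactly the problem the quoted passage points out.
        while self.items and self.used + size_bytes > self.capacity:
            victim = min(self.items, key=lambda k: self.freq[k])
            self.used -= self.items[victim][1]
            del self.items[victim]
            del self.freq[victim]
        if size_bytes <= self.capacity:
            self.items[key] = (value, size_bytes)
            self.used += size_bytes
            self.freq[key] += 1
```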
“…In the literature, many variables have been used to decide which data items to remove from the cache, such as data size, time of last access, distance from the node that accesses the document, time when the replica was stored in the cache, client mobility, update frequency, and retrieval delay. Because the associations between these major variables are unknown or poorly understood, finding the optimal combination is itself a major part of the cache-replacement problem.…”
Section: Introduction (mentioning)
confidence: 99%
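Purely as an illustration of how such variables might be combined (none of the cited works prescribes this exact formula, and client mobility is omitted for brevity), the sketch below folds them into a single weighted eviction score; every name and weight here is an assumption.

```python
import time
from dataclasses import dataclass

@dataclass
class CacheEntry:
    size_bytes: int        # data size
    last_access: float     # time of last access (epoch seconds)
    hop_distance: int      # hops to the node serving the original document
    stored_at: float       # time the replica was stored in the cache
    update_freq: float     # observed source updates per second (staleness risk)
    retrieve_delay: float  # measured delay to re-fetch the item (seconds)

def eviction_score(entry, now=None, w_size=1.0, w_idle=1.0,
                   w_stale=1.0, w_refetch=1.0):
    """Higher score = better eviction candidate. The weights are free
    parameters; as the quoted passage notes, choosing how to combine these
    variables is itself the hard part of the problem."""
    now = time.time() if now is None else now
    idle = now - entry.last_access                    # long-idle entries are less valuable
    staleness = entry.update_freq * (now - entry.stored_at)
    refetch_cost = entry.hop_distance + entry.retrieve_delay
    return (w_size * entry.size_bytes + w_idle * idle +
            w_stale * staleness - w_refetch * refetch_cost)

def choose_victim(cache):
    """cache: dict mapping key -> CacheEntry (must be non-empty); returns the key to evict."""
    return max(cache, key=lambda k: eviction_score(cache[k]))
```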
“…The second technique is called least frequently used (LFU): the least requested data items are the first to leave the cache [16]. In general, one recognised problem with cache replacement is that when a relatively large data item is requested, it may end up replacing many small cached items in the cache [10–13, 17, 18].…”
Section: State‐of‐the‐art Cache Replacement Algorithm (mentioning)
confidence: 99%
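A common size-aware mitigation for the large-item problem described above is to weight access frequency by item size, in the spirit of GreedyDual-Size-Frequency; the sketch below is only an illustration of that idea, not the policy of the cited paper, and all parameter names are assumptions.

```python
def size_aware_priority(freq, size_bytes, fetch_cost=1.0, inflation=0.0):
    """Size-aware priority: the entry with the lowest priority is evicted first.
    Dividing by size keeps a single large, rarely reused item from displacing
    many small, frequently reused ones."""
    return inflation + (freq * fetch_cost) / max(size_bytes, 1)

# Example: a 1 MB item hit twice scores far lower than a 1 KB item hit twice,
# so the small, equally popular entries are retained.
print(size_aware_priority(freq=2, size_bytes=1_000_000))  # 2e-06
print(size_aware_priority(freq=2, size_bytes=1_000))      # 0.002
```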
“…Many variables [1, 10–13] have been used to decide which data items to remove from the cache when it is full, such as data size, time of last access, access count, distance from the node that accesses the document, time when the replica was stored in the cache, client mobility, update frequency, and retrieval delay. Therefore, selecting which of these parameters to use in the cache replacement algorithm, and which data elements to evict when the cache is full, are great challenges.…”
Section: Introduction (mentioning)
confidence: 99%