Proceedings of the 10th International Symposium on Highly-Efficient Accelerators and Reconfigurable Technologies 2019
DOI: 10.1145/3337801.3337820
A Layer-Adaptable Cache Hierarchy by a Multiple-layer Bypass Mechanism

Cited by 6 publications (3 citation statements)
References 14 publications
“…Over the years, the industry has adopted large shared cache memories to allow several applications to share hardware resources and run concurrently in a multi-task environment. Consequently, the cache memory hierarchy consumes a large portion of the total area and energy budget [Egawa et al 2019]. However, due to the heterogeneous nature of the applications, such memories are not guaranteed to be used efficiently.…”
Section: Introduction (mentioning; confidence: 99%)
“…Generally, these applications use a large portion of the shared resources. Consequently, identifying and avoiding noisy neighbors can improve general system performance by reducing conflicts and decreasing cache pollution [Egawa et al 2019].…”
Section: Introduction (mentioning; confidence: 99%)