2007
DOI: 10.1145/1230800.1230806

A predictive decode filter cache for reducing power consumption in embedded processors

Abstract: With advances in semiconductor technology, power management has increasingly become a very important design constraint in processor design. In embedded processors, instruction fetch and decode consume more than 40% of processor power. This calls for development of power minimization techniques for the fetch and decode stages of the processor pipeline. For this, filter cache has been proposed as an architectural extension for reducing the power consumption. A filter cache is placed between the CPU and the instr…

Cited by 4 publications (3 citation statements)
References 19 publications
“…The hierarchy is very simple, more than other structures proposed in literature. It does not have any particular promoting mechanism between L1 and L0 [17] or prediction mechanism [18]. "Conventional" leakage-saving techniques (Figure 3-a) can select the lines to put into low-power state by using many different policies [7,19,20].…”
Section: The Filtered-power Saving Architecture
confidence: 99%
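The quoted passage refers to "conventional" leakage-saving techniques that put idle cache lines into a low-power state. One well-known policy of that kind is cache decay: a line that has not been accessed for a fixed interval is power-gated, losing its contents. The sketch below is purely illustrative and is not taken from the cited works; the direct-mapped layout, line count, and decay interval are assumptions.

```python
# Illustrative sketch (assumptions, not the cited design): a cache-decay
# policy where lines idle longer than `decay_interval` cycles are gated
# into a low-power, state-losing mode. A later access to a decayed line
# therefore misses and must be refilled.
class DecayCache:
    def __init__(self, n_lines=8, decay_interval=1000):
        self.tags = [None] * n_lines          # tag stored in each line
        self.powered = [False] * n_lines      # is the line powered on?
        self.last_access = [None] * n_lines   # cycle of last access
        self.decay_interval = decay_interval

    def tick(self, cycle):
        """Power down any line idle for at least the decay interval."""
        for i, t in enumerate(self.last_access):
            if self.powered[i] and t is not None \
                    and cycle - t >= self.decay_interval:
                self.powered[i] = False
                self.tags[i] = None           # contents lost when gated

    def access(self, cycle, addr):
        """Return True on hit; fill (and re-power) the line on miss."""
        i = addr % len(self.tags)
        hit = self.powered[i] and self.tags[i] == addr
        self.tags[i] = addr
        self.powered[i] = True
        self.last_access[i] = cycle
        return hit
```

The trade-off the citation statement alludes to is visible here: a too-short decay interval saves leakage but turns reuses of gated lines into extra misses.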
“…Because the performance degradation may not be acceptable in modern high-performance embedded environments, researchers have studied ways of addressing this weakness [4]- [6]. These studies have focused on mitigating miss penalty of filter cache or increasing the hit rate of filter cache.…”
Section: Introduction
confidence: 99%
“…In order to reduce miss penalty of the filter cache, a predictive filter cache has been proposed [4]. In this structure, a predictor is added to the filter cache system and the predictor selects the access path between L1 cache and filter cache.…”
Section: Introduction
confidence: 99%
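The quoted statement describes the key idea of a predictive filter cache: a predictor decides, per fetch, whether to access the small L0 filter cache or to bypass it and go straight to L1, avoiding the filter-cache miss penalty on predicted misses. The following is a minimal, hedged sketch of that idea, not the cited paper's actual design; the one-bit per-PC predictor, table size, and 4-byte line geometry are all assumptions made for illustration.

```python
# Hedged sketch (assumed structure, not the paper's design): a filter
# cache (L0) fronted by a one-bit per-PC predictor. Predicted hit ->
# probe L0 first; predicted miss -> bypass L0 and read L1 directly,
# so no L0 miss penalty is paid on correctly predicted misses.
class PredictiveFilterCache:
    def __init__(self, l0_lines=16, table_size=64):
        self.l0 = {}                       # tag -> line contents (tiny L0)
        self.l0_lines = l0_lines
        self.table = [True] * table_size   # True = predict L0 hit
        self.table_size = table_size

    def fetch(self, pc, l1_read):
        slot = pc % self.table_size
        tag = pc // 4                      # assume 4-byte lines
        if self.table[slot]:               # predicted hit: probe L0
            if tag in self.l0:
                return self.l0[tag], "L0 hit"
            self.table[slot] = False       # mispredict: bypass next time
            line = l1_read(pc)             # pay L0 miss penalty, go to L1
            self._fill(tag, line)
            return line, "L0 miss -> L1"
        line = l1_read(pc)                 # predicted miss: straight to L1
        self._fill(tag, line)
        self.table[slot] = True            # line is in L0 now; try it next
        return line, "bypass -> L1"

    def _fill(self, tag, line):
        if tag in self.l0:
            return
        if len(self.l0) >= self.l0_lines:
            self.l0.pop(next(iter(self.l0)))  # simple FIFO eviction
        self.l0[tag] = line
```

In hardware the predictor would be consulted in parallel with next-PC generation; the dictionary-based L0 and FIFO eviction here only stand in for a small tagged SRAM array.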