2010 IEEE 16th International Conference on Embedded and Real-Time Computing Systems and Applications (RTCSA 2010)
DOI: 10.1109/rtcsa.2010.8
Combining Prefetch with Instruction Cache Locking in Multitasking Real-Time Systems

Cited by 14 publications (21 citation statements: 1 supporting, 20 mentioning, 0 contrasting; citing publications from 2013 to 2022).
References 15 publications.
“…Also, hardware proposals such as instruction cache locking and prefetch (e.g. [28]) and detailed analysis methods on conventional instruction caches (e.g. [2]) may guarantee to always hit in certain chunks of instructions.…”
Section: Discussion On Matrix Multiplication (mentioning)
confidence: 99%
“…As the impact of cache locking on energy efficiency has not been reported (to the best of our knowledge), we intend to bridge such gap by implementing techniques like [16] and [2] within our experimental environment. We also intend to generalize our algorithms for handling unlocked data caches.…”
Section: Discussion (mentioning)
confidence: 99%
“…Those who prescribe cache locking [4,14] (to trade-off performance for predictability) argue that cache-aware WCET analysis [8,21] often neglects the interference between tasks [15]. They prescribe a combination of instruction prefetching and cache locking [16] [2]. Such works, however, target the minimization of WCET as a single objective and do not report the impact on energy efficiency.…”
Section: Prefetching Under Real-time Constraints (mentioning)
confidence: 99%
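The statements above all turn on the same mechanism: prefetch a hot code region into the instruction cache, then lock it, so that WCET analysis can count every later fetch from that region as a guaranteed hit. The C sketch below illustrates the idea only; the two intrinsics are hypothetical placeholders (real lockdown interfaces are platform-specific, typically exposed through coprocessor or cache-controller registers) and are not the API of the cited paper.

```c
#include <stdint.h>

#define CACHE_LINE_SIZE 32u  /* assumed I-cache line size in bytes */

/* Hypothetical platform hooks -- NOT a real API; actual lockdown is
 * performed through platform-specific cache-control registers. */
extern void icache_prefetch_line(const void *addr); /* fetch one line */
extern void icache_lock_current_way(void);          /* pin fetched lines */

/* Prefetch every line of [start, start+len) into the I-cache, then
 * lock the filled way so those instructions cannot be evicted:
 * every subsequent fetch from this region is a guaranteed hit. */
static void lock_code_region(const void *start, uint32_t len)
{
    uintptr_t p   = (uintptr_t)start & ~(uintptr_t)(CACHE_LINE_SIZE - 1);
    uintptr_t end = (uintptr_t)start + len;

    for (; p < end; p += CACHE_LINE_SIZE)
        icache_prefetch_line((const void *)p);

    icache_lock_current_way();
}
```

In a multitasking setting such a routine would run once per task at load time, with the region chosen (as in the cited work) to minimize the task set's worst-case execution times rather than average performance.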