Proceedings of the 5th IEEE/ACM International Conference on Hardware/Software Codesign and System Synthesis 2007
DOI: 10.1145/1289816.1289853

Compile-time decided instruction cache locking using worst-case execution paths

Abstract: Caches are notorious for their unpredictability. It is difficult or even impossible to predict whether a memory access results in a definite cache hit or miss. This unpredictability is highly undesirable for real-time systems. The Worst-Case Execution Time (WCET) of software running on an embedded processor is one of the most important metrics during real-time system design. The WCET depends to a large extent on the total amount of time spent for memory accesses. In the presence of caches, WCET analysis must always…

Cited by 63 publications (75 citation statements)
References 8 publications
“…The static approach grants a fixed area of cache space to each task; the dynamic approaches allow each task to use different areas of cache space according to their needs. Algorithms have been specified to determine cache allocations to minimize task WCETs [Arnaud and Puaut 2006;Falk et al 2007] and to maximize schedulability of all tasks [Campoy et al 2002;Liu et al 2012]. The static approach prevents all CRPD, while the dynamic approaches replace it with the cache fill penalty incurred during each context switch.…”
Section: Reducing CRPD by Cache Locking and Partitioning
confidence: 99%
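
The compile-time selection of cache contents mentioned in this citation can be illustrated with a simplified greedy heuristic: the blocks that contribute most to the worst-case execution path are locked first, until the cache is full. The sketch below is only illustrative and uses assumed inputs (per-block worst-case execution counts, block sizes, cache capacity, miss penalty); it is not the exact algorithm of Falk et al., which iteratively recomputes the worst-case execution path after each locking decision.

# Simplified sketch of static instruction-cache content selection.
# Assumed inputs (hypothetical, for illustration only):
#   wcep_count[b] - number of times block b executes on the worst-case path
#   block_size[b] - size of block b in bytes
#   capacity      - lockable instruction-cache capacity in bytes
#   miss_penalty  - extra cycles of a cache miss compared to a guaranteed hit

def select_locked_blocks(wcep_count, block_size, capacity, miss_penalty):
    # Rank blocks by the WCET reduction they promise per byte of cache used.
    def gain_per_byte(b):
        return (wcep_count[b] * miss_penalty) / block_size[b]

    locked, used = [], 0
    for b in sorted(wcep_count, key=gain_per_byte, reverse=True):
        if used + block_size[b] <= capacity:
            locked.append(b)
            used += block_size[b]
    return locked

# Example: three blocks competing for a 64-byte locked region.
counts = {"loop_body": 1000, "init": 1, "error_handler": 2}
sizes  = {"loop_body": 32, "init": 64, "error_handler": 32}
print(select_locked_blocks(counts, sizes, capacity=64, miss_penalty=10))
# -> ['loop_body', 'error_handler']: the hot loop body wins over the rarely executed init code.
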
“…Cache locking improves timing predictability because all the memory accesses to the locked memory blocks are guaranteed to be cache hits. Moreover, by carefully selecting the memory blocks to lock, cache locking can greatly improve performance [19], [9], [14], [7].…”
Section: Introduction
confidence: 99%
“…Most cache locking techniques aimed at improving the WCET employ static cache locking [9], [15], [18], [7]. Static locking loads and locks the memory blocks at program startup, and the locked content remains unchanged throughout the program execution.…”
Section: Introduction
confidence: 99%
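
The effect described in these citation statements can be made concrete with a small back-of-the-envelope computation: every access to a statically locked block is a guaranteed hit, while accesses to unlocked blocks must be assumed to miss in a safe WCET bound. The sketch below uses made-up latencies and access counts to show how the choice of locked blocks changes the memory-access contribution to that bound.

# Sketch: memory-access contribution to the WCET bound under static cache locking.
# All numbers (latencies, access counts) are illustrative assumptions.

HIT_CYCLES  = 1    # access to a locked block: guaranteed cache hit
MISS_CYCLES = 20   # access to an unlocked block: assumed miss in the WCET bound

def wcet_memory_cycles(wcep_accesses, locked):
    """Sum worst-case memory-access cycles along the worst-case path."""
    return sum(
        count * (HIT_CYCLES if block in locked else MISS_CYCLES)
        for block, count in wcep_accesses.items()
    )

# Accesses on the worst-case execution path of a hypothetical program:
accesses = {"loop_body": 1000, "init": 1, "error_handler": 2}

print(wcet_memory_cycles(accesses, locked=set()))           # nothing locked
print(wcet_memory_cycles(accesses, locked={"init"}))         # poor choice of locked block
print(wcet_memory_cycles(accesses, locked={"loop_body"}))    # hot block locked
# -> 20060, 20041, 1060: carefully choosing the locked block cuts the bound dramatically.
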
“…Thus, locking the entire cache resolves the problem of timing unpredictability. More importantly, by carefully choosing the memory blocks to be locked, the WCET estimate can be reduced compared to cache modeling techniques without locking [6,14].…”
Section: Introduction
confidence: 99%