2008
DOI: 10.1007/s11265-008-0244-0
Storage Estimation and Design Space Exploration Methodologies for the Memory Management of Signal Processing Applications

Abstract: The storage requirements in data-dominated signal processing systems, whose behavior is described by array-based, loop-organized algorithmic specifications, have an important impact on the overall energy consumption, data access latency, and chip area. This paper gives a tutorial overview of the existing techniques for the evaluation of the data memory size …

(This research was sponsored in part by the U.S. National Science Foundation (DAP 0133318). F. Balasa, Southern Utah University, Cedar City, UT, USA.)

Cited by 14 publications (6 citation statements)
References 66 publications
“…For instance, for the Darwin algorithm with 27 arrays we observe the longest time for the overall flow, i.e., 2.5 s [Balasa et al 2008]. Assuming that the array size computation for the polyhedral approach takes 81 ms per array, which was the lowest value obtained during our experiments, the total time dedicated to the computation of the maximum number of alive elements for all the arrays is 2.187 s; this is the dominant task, as it takes roughly 88% of the total time.…”
Section: Motivation (mentioning)
confidence: 73%
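A minimal sketch checking the timing breakdown quoted above, assuming the figures given in the statement (27 arrays, 81 ms per array, 2.5 s overall flow); the script and its variable names are illustrative, not part of the cited works:

```python
# Back-of-the-envelope check of the timing breakdown quoted above.
# All figures come from the citation statement; the names are illustrative.

NUM_ARRAYS = 27        # arrays in the Darwin algorithm
PER_ARRAY_S = 0.081    # assumed per-array size-computation time (81 ms)
TOTAL_FLOW_S = 2.5     # reported time of the overall flow

size_computation_s = NUM_ARRAYS * PER_ARRAY_S        # 27 * 0.081 = 2.187 s
share_of_total = size_computation_s / TOTAL_FLOW_S   # ~0.875

print(f"max-alive-elements computation: {size_computation_s:.3f} s "
      f"({share_of_total:.1%} of the {TOTAL_FLOW_S} s flow)")
```

Running it prints 2.187 s and about 87.5% of the flow, consistent with the roughly 88% quoted.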
“…The time required to explore one instance now becomes an important factor in the overall time of the DSE, as this step must be applied many times to explore a set of different scenarios for the application under study, and the process must be repeated for every array in the application. To provide an intuition of the gains in the overall flow, we use the figures reported in [Balasa et al 2008] for STOREQ, a tool developed for the main parts of these steps [Kjeldsberg et al 2003]. This tool is typically used in the inner core of a loop transformation exploration kernel.…”
Section: Motivation (mentioning)
confidence: 99%
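To make the scaling argument concrete, a small illustrative calculation of how per-instance estimation time adds up across a design space exploration; the scenario count and per-array time below are hypothetical placeholders, not values taken from the cited works:

```python
# Illustrative scaling of DSE time when per-array memory-size estimation
# sits in the inner loop of the exploration.  The scenario count and the
# per-array time are hypothetical placeholders.

num_arrays = 27               # e.g., the Darwin algorithm discussed above
num_scenarios = 200           # hypothetical number of loop-transformation variants
per_array_estimate_s = 0.081  # assumed per-array estimation time

total_estimation_s = num_scenarios * num_arrays * per_array_estimate_s
print(f"estimation time over the whole DSE: {total_estimation_s:.1f} s")  # ~437.4 s
```

Even modest per-array times dominate once they are multiplied by the number of arrays and the number of explored scenarios, which is why faster estimation directly shortens the exploration.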
“…Various approaches have been investigated to reduce memory accesses. Two compile-time optimizations, redundant load/store elimination and loop-invariant load/store migration, can reduce explicit redundant memory accesses.…”
Section: Related Work (mentioning)
confidence: 99%
“…As explained in [21], the former optimization techniques often require a partial system synthesis and the execution of time-consuming algorithms. Although these techniques provide an exact or highly optimized memory requirement, they may be too slow to be used in the rapid prototyping context.…”
Section: Related Work (mentioning)
confidence: 99%
“…Although these techniques provide an exact or highly optimized memory requirement, they may be too slow to be used in the rapid prototyping context. In [21], Balasa et al. survey existing estimation techniques that provide a reliable memory size approximation in a reasonable computation time. The main difference between these estimation techniques and our bounding method is the abstraction level considered.…”
Section: Related Work (mentioning)
confidence: 99%