2003 Design, Automation and Test in Europe Conference and Exhibition
DOI: 10.1109/date.2003.1253582
A new algorithm for energy-driven data compression in VLIW embedded processors

Abstract: This paper presents a new algorithm for on-the-fly data compression in high-performance VLIW processors. The algorithm aggressively targets energy minimization of some of the dominant factors in the SoC energy budget (i.e., main memory accesses and the high-throughput global bus). Based on a differential technique, both the new algorithm and the HW compression unit have been developed to efficiently manage data compression and decompression within a high-performance industrial processor architecture, under strict real…
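The abstract does not spell out the differential technique beyond the truncated text above, so the following C sketch is purely illustrative of the general idea behind delta-based compression of data words between the processor and main memory; it is not the paper's actual algorithm, and the 1-byte-delta / escape-plus-literal encoding, function names, and constants are all hypothetical.

```c
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

/* Illustrative delta codec (hypothetical, not the paper's scheme):
 * each 32-bit word is stored either as a 1-byte signed delta from
 * the previous word, or as an escape byte followed by the full
 * 4-byte literal when the delta does not fit in 8 bits. */
#define ESCAPE 0x80  /* reserved code (would be -128) signalling a literal */

size_t delta_compress(const uint32_t *in, size_t n, uint8_t *out)
{
    uint32_t prev = 0;
    size_t pos = 0;

    for (size_t i = 0; i < n; i++) {
        int64_t delta = (int64_t)in[i] - (int64_t)prev;
        if (delta >= -127 && delta <= 127) {
            out[pos++] = (uint8_t)(int8_t)delta;  /* small delta: 1 byte */
        } else {
            out[pos++] = ESCAPE;                  /* escape marker */
            out[pos++] = (uint8_t)(in[i] >> 24);  /* full literal, big-endian */
            out[pos++] = (uint8_t)(in[i] >> 16);
            out[pos++] = (uint8_t)(in[i] >> 8);
            out[pos++] = (uint8_t)(in[i]);
        }
        prev = in[i];
    }
    return pos;  /* compressed size in bytes */
}

size_t delta_decompress(const uint8_t *in, size_t nbytes, uint32_t *out)
{
    uint32_t prev = 0;
    size_t pos = 0, i = 0;

    while (pos < nbytes) {
        if (in[pos] == ESCAPE) {                  /* literal follows */
            uint32_t w = ((uint32_t)in[pos + 1] << 24) |
                         ((uint32_t)in[pos + 2] << 16) |
                         ((uint32_t)in[pos + 3] << 8)  |
                          (uint32_t)in[pos + 4];
            out[i++] = w;
            prev = w;
            pos += 5;
        } else {                                  /* 1-byte signed delta */
            prev = (uint32_t)((int64_t)prev + (int8_t)in[pos++]);
            out[i++] = prev;
        }
    }
    return i;  /* number of words recovered */
}

int main(void)
{
    uint32_t data[8] = {1000, 1003, 1001, 1010,
                        5000000, 5000002, 5000001, 5000010};
    uint8_t buf[8 * 5];   /* worst case: every word escapes to 5 bytes */
    uint32_t back[8];

    size_t c = delta_compress(data, 8, buf);
    size_t n = delta_decompress(buf, c, back);
    printf("compressed %zu words into %zu bytes (from %zu)\n",
           n, c, sizeof data);
    return 0;
}
```

On slowly varying data such a scheme emits mostly 1-byte deltas, which is the kind of redundancy a differential compressor can exploit to cut main-memory and bus traffic, the two energy contributors the abstract targets.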

Cited by 9 publications (3 citation statements)
References 11 publications
“…While such specialized architectures generate very good results from both power and performance angles, the resulting design may not be very flexible as the data access pattern of the entire application should be captured in a single memory configuration. Data compression has also been used to reduce memory footprint and energy consumption in the past [1,2,10,21,22]. In [4] data compression is used to reduce the energy consumption in DRAMs by increasing the effectiveness of low-power operating modes.…”
Section: Related Work
confidence: 99%
“…Moreover, they lack the ability to physically pack instructions to reduce the hardware cost, program code size, and energy consumption in memory. In [11], the authors managed to reduce the program code size and the memory access energy cost in VLIW architectures by applying instruction compression/decompression between memory and cache. However, it also requires complex compression algorithms and hardware implementation, and the power consumption of the processor has not been effectively reduced.…”
Section: Related Work
confidence: 99%
“…It is to be noted that addressing this problem is particularly important for embedded MpSoCs as these systems are primarily targeted at embedded environments that process large amounts of data. Data compression has been one of the ways of reducing memory footprint and energy consumption in the past [8,4]. Consequently, one could expect that it could also be used in an MpSoC-based environment. If used correctly, data compression can improve the execution cycles and reduce memory power consumption, as it reduces the amount of data that needs to be accessed from the memory and communicated over the bus (i.e., by storing more data in the on-chip memory, we reduce the volume and frequency of off-chip memory activity).…”
Section: Introduction
confidence: 99%