2016 Annual Conference on Information Science and Systems (CISS)
DOI: 10.1109/ciss.2016.7460522

Cache aided wireless networks: Tradeoffs between storage and latency

Abstract: We investigate the fundamental information theoretic limits of cache-aided wireless networks, in which edge nodes (or transmitters) are endowed with caches that can store popular content, such as multimedia files. This architecture aims to localize popular multimedia content by proactively pushing it closer to the edge of the wireless network, thereby alleviating backhaul load. An information theoretic model of such networks is presented, which includes the introduction of a new metric, namely the normalized delivery time…

Cited by 85 publications (129 citation statements)
References 17 publications
“…[1][2][3][4][5][6][7][8][9][10][11][12][13]). In particular, in a network with only one transmitter broadcasting to several receivers, it was shown in [2] that local delivery attains only a small fraction of the gain that caching can offer, and by designing a particular pattern in cache placement at the users and exploiting coding in delivery, a significantly larger global throughput gain can be achieved, which is a function of the entire cache throughout the network.…”
Section: Introduction (mentioning)
confidence: 99%
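To make the local-versus-global distinction in the statement above concrete, a brief sketch (assuming, as is standard, that the cited reference [2] is the Maddah-Ali–Niesen coded caching setup with K users, a library of N files, and a per-user cache of M files): the delivery rate achieved by coded placement and delivery is

\[
R(M) \;=\; K\Bigl(1 - \frac{M}{N}\Bigr)\cdot \frac{1}{1 + KM/N},
\]

where the factor \(1 - M/N\) is the local caching gain available even with uncoded delivery, while the extra factor \(1/(1 + KM/N)\) is the global gain, which grows with the aggregate cache size \(KM\) across all users rather than with any single cache.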
“…As a follow-up, this work has been extended to the case of multiple transmitters in [3], where it was shown that the gain of caching can be improved if several transmitters have access to the entire library of files. Caching at the transmitters was also considered in [4,5] and used to induce collaboration between transmitters in the network. It is also shown in [7] that caches at the transmitters can improve load balancing and increase the opportunities for interference alignment.…”
Section: Introduction (mentioning)
confidence: 99%
“…An alternative performance measure, called the normalized delivery time (NDT), is introduced in [15], in order to account for the latency in the system. The NDT is defined as the asymptotic delivery time per bit in the high-power, long-blocklength regime, that is,…”
Section: System Model (mentioning)
confidence: 99%
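The quoted passage ends where the formula would appear; a sketch of the NDT definition as it is usually stated (notation assumed here: \(T(L,P)\) denotes the worst-case delivery time for files of \(L\) bits each at transmit power \(P\)):

\[
\delta \;=\; \lim_{P \to \infty}\; \lim_{L \to \infty}\; \sup\, \frac{T(L,P)}{L/\log P},
\]

so the delivery time is normalized by \(L/\log P\), the time needed to send one file over an interference-free point-to-point link whose capacity scales as \(\log P\); an NDT of 1 then corresponds to ideal, interference-free delivery.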
“…The authors show that, for this particular approach, ZF is not needed for order-optimal delivery. Finally, in [15], the authors introduce the normalized delivery time (NDT) as a measure of performance for cache-aided networks. They characterize the NDT for 2 × 2 and 3 × 3 interference networks with caches only at the transmitter side.…”
Section: Introduction (mentioning)
confidence: 99%
“…The rate-memory tradeoff is investigated in device-to-device (D2D) networks [6] and under a secrecy constraint [5]. In [7], the authors study the tradeoff between the cache memory at edge nodes and the transmission latency. The rate-memory tradeoff of multi-layer coded caching networks is studied in [8].…”
Section: Introduction (mentioning)
confidence: 99%