2020
DOI: 10.1109/tcomm.2019.2950930

Centralized Caching and Delivery of Correlated Contents Over Gaussian Broadcast Channels

Abstract: Content delivery in a multi-user cache-aided broadcast network is studied, where a server holding a database of correlated contents communicates with the users over a Gaussian broadcast channel (BC). The minimum transmission power required to satisfy all possible demand combinations is studied when the users are equipped with caches of equal size. Assuming uncoded cache placement, a lower bound on the required transmit power as a function of the cache capacity is derived. An achievable centralized caching sche…
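For orientation only: the power-memory trade-off described in the abstract rests on delivering (possibly coded multicast) messages over a Gaussian BC, and for a degraded Gaussian BC the minimum total power needed to deliver a target rate to each user follows from superposition coding with successive decoding. The Python sketch below computes that baseline power for assumed noise variances and target rates; it is a self-contained illustration, not the caching scheme or the bound from the paper, and the function name and numbers are invented for this example.

```python
def min_power_gaussian_bc(rates_bits, noise_vars):
    """Minimum total transmit power that delivers rate rates_bits[k]
    (bits per real channel use) to user k of a degraded Gaussian broadcast
    channel with noise variance noise_vars[k], using superposition coding:
    each user decodes (and cancels) the layers meant for weaker users and
    treats the layers meant for stronger users as noise.
    """
    # Allocate layer powers from the strongest user (smallest noise) outward;
    # each constraint R_k = 0.5 * log2(1 + P_k / (N_k + I_k)) is met with equality.
    order = sorted(range(len(noise_vars)), key=lambda k: noise_vars[k])
    interference = 0.0   # total power of layers already allocated (stronger users)
    total_power = 0.0
    for k in order:
        p_k = (2 ** (2 * rates_bits[k]) - 1) * (noise_vars[k] + interference)
        interference += p_k
        total_power += p_k
    return total_power

# Example: two users needing 1 bit/channel use each; user 0 has the cleaner channel.
print(min_power_gaussian_bc([1.0, 1.0], [1.0, 2.0]))   # 18.0
```

In the cache-aided setting the per-user target rates themselves shrink as the cache size grows, because cached and coded multicast content reduces what must be transmitted; that dependence of the required power on the cache capacity is what the abstract's trade-off characterizes.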

Cited by 5 publications (3 citation statements)
References 26 publications
“…Various types of caching strategies, such as Markov chain model-based universal caching [78], maximum distance separable (MDS) coded edge caching [82], mobility-aware greedy coded caching [83], proactive caching [85], and cooperative content caching [86], have been proposed for NOMA systems, and their performance is analyzed mostly in terms of the cache hit rate. Efficient caching techniques improve system performance by reducing link cost [78], content access delay [79], required transmit power [80], backhaul traffic load [82], [83], and delivery latency [86], and by enhancing the successful decoding probability [81]. To cope with sudden changes in content popularity and to replace cached content, online popularity prediction techniques such as the popularity prediction model (PPM) and the Grassmannian prediction model (GPM) [81], and the popularity prediction cache model [79], [92], have been proposed.…”
Section: B. Caching Delivery Strategy (mentioning)
confidence: 99%
“…Centralized caching: it employs a central controller that possesses a global view of the entire network status when determining caching schemes. 5 Moreover, it can obtain the optimal caching performance through optimal caching decisions. Nevertheless, it faces a huge challenge with the large number of mobile users in 5G wireless networks.…”
Section: A Brief Review of MECs (mentioning)
confidence: 99%
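As a toy illustration of the centralized approach in the statement above (a minimal sketch; the function name, node identifiers, and popularity values are invented for this example, and the policy shown is deliberately the simplest one), a controller with a global view of the demand statistics computes a placement and pushes it to every caching node:

```python
def centralized_placement(global_popularity, cache_size, nodes):
    """Toy centralized placement: a controller that knows the global request
    statistics tells every caching node to store the top-`cache_size` contents.
    """
    ranked = sorted(global_popularity, key=global_popularity.get, reverse=True)
    # The controller makes one global decision and distributes it to the nodes.
    return {node: ranked[:cache_size] for node in nodes}

# Hypothetical request statistics gathered centrally by the controller.
popularity = {"A": 0.5, "B": 0.3, "C": 0.15, "D": 0.05}
print(centralized_placement(popularity, cache_size=2, nodes=["bs1", "bs2"]))
# {'bs1': ['A', 'B'], 'bs2': ['A', 'B']}
```

Caching the same items everywhere is only the most naive centralized policy; the point of the global view is that the controller could instead coordinate which node stores which items, which is how centralized schemes can reach the optimal placement mentioned in the statement.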
“…The control procedure of the caching units can be either centralized or distributed. In a centralized process, all the caching nodes are controlled by a central entity, which gathers the required information and instructs the nodes to cache and serve specific content [105]-[107]. In a distributed technique, the networking nodes themselves identify the most suitable content for caching and serve user requests locally [45], [108].…”
Section: Caching Strategies (mentioning)
confidence: 99%
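By contrast, the distributed technique in the last statement can be sketched as each node choosing what to keep from the requests it observes locally, with no central entity involved (again a toy illustration; the class and method names are invented):

```python
from collections import Counter

class DistributedCacheNode:
    """Toy distributed caching node: keeps the locally most requested contents,
    deciding entirely on its own from locally observed demand.
    """
    def __init__(self, cache_size):
        self.cache_size = cache_size
        self.local_counts = Counter()
        self.cache = set()

    def on_request(self, content_id):
        self.local_counts[content_id] += 1
        hit = content_id in self.cache
        # Re-rank using only local observations and keep the local top items.
        self.cache = {c for c, _ in self.local_counts.most_common(self.cache_size)}
        return hit   # True -> served from the local cache, False -> fetched remotely

# Nodes that see different local demand end up caching different content.
n1, n2 = DistributedCacheNode(1), DistributedCacheNode(1)
for req in ["A", "A", "B"]:
    n1.on_request(req)
for req in ["C", "C", "A"]:
    n2.on_request(req)
print(n1.cache, n2.cache)   # {'A'} {'C'}
```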