2018 IEEE Global Communications Conference (GLOBECOM)
DOI: 10.1109/glocom.2018.8647616
Federated Learning Based Proactive Content Caching in Edge Computing

Abstract: Content caching is a promising approach in edge computing to cope with the explosive growth of mobile data on 5G networks, where contents are typically placed on local caches for fast and repetitive data access. Due to the capacity limit of caches, it is essential to predict the popularity of files and cache those popular ones. However, the fluctuating popularity of files makes the prediction a highly challenging task. To tackle this challenge, many recent works propose learning based approaches which gather th…

Cited by 123 publications (73 citation statements); references 13 publications.
“…In fact, the training data is always distributed at vehicles and unlikely to be uploaded, considering bandwidth overhead and privacy issues. Fortunately, federated learning has the potential to realize distributed learning [22], [23]. To achieve high cache efficiency as well as to protect users' privacy, Yu et al. [22] propose a federated learning based proactive content caching scheme which does not require gathering users' data centrally for training.…”
Section: A. Related Work and Challenges
confidence: 99%
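The distributed training the statement describes can be illustrated with a minimal federated averaging (FedAvg) sketch: each client fits a model on its own data and only the weights are sent to the server for aggregation. The linear model and all names here are illustrative assumptions, not code from the cited paper.

```python
# Toy FedAvg sketch: clients share model weights, never raw data.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: gradient descent on a linear model.
    The raw (X, y) never leave the client."""
    w = weights.copy()
    for _ in range(epochs):
        w -= lr * X.T @ (X @ w - y) / len(y)
    return w

def federated_round(global_w, clients):
    """Server step: average client updates, weighted by local data size."""
    total = sum(len(y) for _, y in clients)
    return sum(len(y) / total * local_update(global_w, X, y)
               for X, y in clients)

# Three clients with private data drawn around the same ground truth.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w + rng.normal(scale=0.1, size=50)))

w = np.zeros(2)
for _ in range(20):
    w = federated_round(w, clients)
# w now approximates true_w without any client uploading raw data
```

Weighting by local dataset size keeps the aggregate unbiased when clients hold different amounts of data, which is the standard FedAvg choice.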
“…User preferences can be predicted in advance or on a regular basis (e.g., hourly, daily, or weekly) through systematic learning and analysis of user social behavior [36, 37]. In this paper, considering the privacy and security of users, we adopt the federated learning method [7] to accurately predict the content popularity in the region.…”
Section: System Model
confidence: 99%
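The periodic prediction described above can be sketched with a simple request-rate estimator feeding a top-k placement rule. The exponential moving average used here is an assumption for illustration, not the predictor from the cited work.

```python
# Toy proactive-caching loop: predict per-content popularity each window,
# then cache the k items with the highest predicted scores.
def update_scores(scores, window_counts, alpha=0.5):
    """Blend the latest window's request counts into running scores
    (exponential moving average; higher alpha reacts faster)."""
    items = set(scores) | set(window_counts)
    return {i: alpha * window_counts.get(i, 0) + (1 - alpha) * scores.get(i, 0.0)
            for i in items}

def proactive_cache(scores, k):
    """Place the k items predicted to be most popular next window."""
    return set(sorted(scores, key=scores.get, reverse=True)[:k])

scores = {}
scores = update_scores(scores, {"a": 8, "b": 5, "c": 1})
scores = update_scores(scores, {"c": 9, "b": 4})  # popularity shifts to "c"
cached = proactive_cache(scores, k=2)  # → {"b", "c"}: "a" is aged out
```

Because the moving average discounts old windows, the shifted demand toward "c" displaces the formerly popular "a" at the next placement, which is the behavior a proactive scheme needs.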
“…Due to the limited storage capacity of edge nodes, network performance can be effectively improved by predicting content popularity and proactively caching the most popular items. However, most existing caching schemes are designed for highly controlled environments where users need to upload local private data to a central server, which may pose privacy and security risks [7]. Furthermore, as the number of users and the volume of data grow, the unreliability and communication cost of wireless networks cannot be ignored.…”
Section: Introduction
confidence: 99%
“…The main reason for the poor performance of the LRU and LFU solutions is that their algorithms do not consider the future popularity of contents. As a result, these solutions do not adapt well to dynamically changing content popularity and achieve low cache efficiency [89], [113]. For instance, the LFU framework cannot reach good performance in IoT environments because it does not consider the saltation and timeliness of IoT data popularity [113].…”
Section: FL-based Edge Caching
confidence: 99%
“…Although many studies have been done to address the issues of edge caching and in-network caching, many issues remain open for more research. For example, the work in [34], [89], [113], [120], [131] considered the future popularity of content while making cache decisions. In [113], various fundamental questions about IoT data popularity and related popularity-based caching were considered.…”
Section: Edge Caching and In-network Caching
confidence: 99%