2022
DOI: 10.48550/arxiv.2208.08606
Preprint

AoI-based Temporal Attention Graph Neural Network for Popularity Prediction and Content Caching

Abstract: With the fast development of network technology and the rapid growth of network equipment, data throughput is increasing sharply. To handle the backhaul bottleneck in cellular networks and satisfy users' latency requirements, network architectures such as the information-centric network (ICN) proactively keep a limited amount of popular content at the edge of the network based on predicted results. Meanwhile, the interactions between the content (e.g., deep neural network models, Wikipedia-al…
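The caching idea the abstract alludes to reduces, in its simplest form, to a ranking problem: an edge node with limited storage keeps only the items whose predicted popularity is highest. Below is a minimal Python sketch of that selection step; the item ids, scores, and capacity are hypothetical placeholders, not values or an API from the paper.

```python
# Minimal sketch: proactive edge caching driven by predicted popularity scores.
# Given one score per content item, keep only the top-`capacity` items.
def select_cache(predicted_popularity: dict[str, float], capacity: int) -> set[str]:
    """Return the ids of the `capacity` items with the highest predicted score."""
    ranked = sorted(predicted_popularity, key=predicted_popularity.get, reverse=True)
    return set(ranked[:capacity])

if __name__ == "__main__":
    # Hypothetical scores produced by some popularity-prediction model.
    scores = {"dnn_model_a": 0.91, "wiki_article_b": 0.42, "video_c": 0.77, "doc_d": 0.15}
    print(select_cache(scores, capacity=2))  # {'dnn_model_a', 'video_c'}
```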

Cited by 1 publication (6 citation statements)
References 24 publications
“…Table II demonstrates the prediction performance of our proposed models (i.e., M1-STGN, M2-STGN), and their variants, as well as the original TGNs in [16], [17]. With a sparse dataset, it is clear that our models can yield better results in both transductive task and inductive task, even when we aggregate the semantic information only by summation.…”
Section: B. Results (mentioning, confidence: 99%)
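The phrase "aggregate the semantic information only by summation" suggests a simple mechanism: map each genre (or other semantic label) to an embedding, sum the embeddings of all labels attached to an item, and fold that sum into the item representation before scoring a user-item link. The PyTorch sketch below illustrates that idea under assumed dimensions; the class name SemanticSum, the layer sizes, and the linear scoring head are illustrative choices, not the cited M1-STGN/M2-STGN architecture.

```python
import torch
import torch.nn as nn

NUM_GENRES, DIM = 8, 16  # assumed sizes for the sketch

class SemanticSum(nn.Module):
    def __init__(self):
        super().__init__()
        self.genre_emb = nn.Embedding(NUM_GENRES, DIM)  # one embedding per genre
        self.score = nn.Linear(2 * DIM, 1)              # simple link-prediction head

    def forward(self, user_h, item_h, item_genres):
        # item_genres: (batch, NUM_GENRES) multi-hot; the matmul sums the
        # embeddings of all genres attached to each item.
        sem = item_genres @ self.genre_emb.weight       # (batch, DIM) summed semantics
        item_h = item_h + sem                           # attach semantics to the item
        return self.score(torch.cat([user_h, item_h], dim=-1))

# Usage with random embeddings for a batch of 4 user-item pairs.
model = SemanticSum()
user_h, item_h = torch.randn(4, DIM), torch.randn(4, DIM)
genres = torch.bernoulli(torch.full((4, NUM_GENRES), 0.3))
print(model(user_h, item_h, genres).shape)  # torch.Size([4, 1])
```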
“…1, it is intractable for classical GNN-based methods to predict the user preference in a sparse graph. But the attachment of semantics constructs more implicit structural patterns, which facilitates preference inference and may further improve the prediction performance of TGN models [17]. Additionally, a content might possess multiple genres (e.g., a fictional action movie containing both fiction and action genres) and the preferred genre might vary across users as well.…”
Section: Genre (mentioning, confidence: 99%)
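The claim that "the attachment of semantics constructs more implicit structural patterns" can be pictured on a toy graph: two items that share no user become connected through a common genre node once item-genre edges are added, which gives a graph model something to propagate over even when interactions are sparse. The snippet below is purely illustrative, with made-up users, items, and genres.

```python
# Toy example: attaching genre nodes densifies a sparse user-item graph.
from collections import defaultdict

edges = [
    ("u1", "i1"),        # sparse user-item interactions
    ("u2", "i2"),
    ("i1", "g:action"),  # item-genre attachments (an item may carry several genres)
    ("i1", "g:fiction"),
    ("i2", "g:action"),
]

adj = defaultdict(set)
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

def two_hop_neighbors(node: str) -> set[str]:
    """Nodes reachable in two hops; these are the implicit structural links."""
    return {n2 for n1 in adj[node] for n2 in adj[n1] if n2 != node}

# i1 and i2 share no user, yet they meet through the shared genre node.
print(two_hop_neighbors("i1"))  # {'i2'}
```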