Proceedings of the Web Conference 2021 (2021)
DOI: 10.1145/3442381.3450020
Dynamic Embeddings for Interaction Prediction

Cited by 6 publications (4 citation statements). References 32 publications.

“…With regard to representing sequential interaction networks, a significant amount of work has been proposed to learn the user and item embeddings from their historical interaction records in Euclidean space [4,6,12,14,27,33]. Recurrent models such as Time-LSTM [59], Time-Aware LSTM [3] and RRN [53] capture dynamics of users and items by endowing them with a long short-term memory.…”
Section: Dataset
confidence: 99%
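
As a rough illustration of the recurrent approach described in the statement above, the following is a minimal sketch of updating a dynamic user embedding from a time-stamped interaction history. It is written in PyTorch with illustrative module names and dimensions; it is not code from Time-LSTM, Time-Aware LSTM, or RRN.

import torch
import torch.nn as nn

# Minimal sketch (not the cited papers' code): a user's dynamic embedding is
# read off the final LSTM state over their time-stamped interaction history.
class DynamicUserEncoder(nn.Module):
    def __init__(self, item_dim: int, hidden_dim: int):
        super().__init__()
        # Each step consumes the interacted item's embedding plus the time gap
        # since the previous interaction (a crude stand-in for the time-aware
        # gates used by Time-LSTM / Time-Aware LSTM).
        self.rnn = nn.LSTM(item_dim + 1, hidden_dim, batch_first=True)

    def forward(self, item_embs: torch.Tensor, time_gaps: torch.Tensor) -> torch.Tensor:
        # item_embs: (batch, seq_len, item_dim); time_gaps: (batch, seq_len)
        x = torch.cat([item_embs, time_gaps.unsqueeze(-1)], dim=-1)
        _, (h_n, _) = self.rnn(x)
        return h_n[-1]  # (batch, hidden_dim): dynamic user embedding

# Usage: encode 4 users, each with 10 past interactions in a 32-d item space.
encoder = DynamicUserEncoder(item_dim=32, hidden_dim=64)
user_emb = encoder(torch.randn(4, 10, 32), torch.rand(4, 10))
print(user_emb.shape)  # torch.Size([4, 64])

Here the elapsed time is simply concatenated to the input; the cited time-aware variants instead feed it through dedicated gates inside the recurrent cell.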
“…• Random walk based models: CAW [51], CTDNE [33] are two temporal network models adopting causal and anonymous random walks.
• Sequence network embedding models: in this category, JODIE [28], HILI [10] and DeePRed [27] are three state-of-the-art methods in generating embeddings from sequential networks employing recursive networks.
• Hyperbolic models: we compare SINCERE with HGCF [42], which is a hyperbolic method on collaborative filtering.…”
Section: Experiments 4.1 Datasets and Compared Algorithms
confidence: 99%
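
To make the recursive-update category above concrete, the following is a minimal, hypothetical sketch of the mutual update idea behind JODIE-style models, in which each interaction updates the user embedding from the item embedding and vice versa. Module names and dimensions are assumptions, not the authors' implementations.

import torch
import torch.nn as nn

# Hypothetical sketch of a mutually recursive update: after an interaction,
# each side's embedding is refreshed from the other side's embedding plus the
# elapsed time since its last interaction.
class MutualUpdate(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.user_cell = nn.GRUCell(dim + 1, dim)  # input: item emb + time delta
        self.item_cell = nn.GRUCell(dim + 1, dim)  # input: user emb + time delta

    def forward(self, u: torch.Tensor, i: torch.Tensor, dt: torch.Tensor):
        # u, i: (batch, dim) current embeddings; dt: (batch,) time deltas.
        u_new = self.user_cell(torch.cat([i, dt.unsqueeze(-1)], dim=-1), u)
        i_new = self.item_cell(torch.cat([u, dt.unsqueeze(-1)], dim=-1), i)
        return u_new, i_new

# Usage: update a batch of 8 user/item pairs after their latest interaction.
update = MutualUpdate(dim=64)
u, i = torch.randn(8, 64), torch.randn(8, 64)
u, i = update(u, i, torch.rand(8))

A GRU cell is used here purely for brevity; the cited models differ in the exact recurrent cell, projection steps, and training objective.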
“…To investigate this subject, we conduct an in-depth study by referencing previous research [14], [17]. We develop a template to process movie reviews from a deliberately selected public dataset using LLMs to generate user and item representations.…”
Section: Introduction
confidence: 99%