2019
DOI: 10.48550/arxiv.1908.05435
Preprint

Temporal Collaborative Ranking Via Personalized Transformer

Liwei Wu, Shuqing Li, Cho-Jui Hsieh, et al.

Abstract: The collaborative ranking problem has been an important open research question, as most recommendation problems can be naturally formulated as ranking problems. While much of collaborative ranking methodology assumes static ranking data, the importance of temporal information to improving ranking performance is increasingly apparent. Recent advances in deep learning, especially the discovery of various attention mechanisms and newer architectures in addition to the widely used RNN and CNN in natural language processing…

Cited by 1 publication (2 citation statements)
References 24 publications
“…On one hand, we propose a novel way to encode long-range graph interactions without requiring any training, using Bloom filters as the backbone [119]. On the other hand, with the help of a new embedding-layer regularization called Stochastic Shared Embeddings (SSE) [117], we can also introduce personalization for the state-of-the-art sequential recommendation model and achieve much better ranking performance with our personalized model [118]; personalization is crucial for the success of recommender systems, unlike most natural language tasks. This new regularization not only helps existing collaborative filtering and collaborative ranking algorithms but also benefits natural language processing methods in areas such as machine translation and sentiment analysis [117].…”
Section: Contributions and Outline of This Thesis
confidence: 99%
“…However, to the best of our knowledge, until now, they have not been used to encode graphs, nor has this encoding been applied to recommender systems. So it would be interesting to extend our work to other recommender systems settings, such as [118] and [117].…”
Section: Related Work
confidence: 99%
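For readers unfamiliar with the Stochastic Shared Embeddings (SSE) regularization mentioned in the citation statements above [117]: during training it replaces each looked-up embedding index with another index drawn at random with some small probability, so that gradient updates are stochastically shared across embedding rows. Below is a minimal sketch of the SSE-SE variant as a PyTorch module, assuming a uniform replacement distribution; the class name SSEEmbedding, the default probability p = 0.01, and the usage shapes are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn


class SSEEmbedding(nn.Module):
    """Embedding layer with SSE-SE-style regularization (illustrative sketch).

    During training, each looked-up index is swapped for a uniformly random
    index with probability p, so gradient updates are stochastically shared
    across embedding rows; at evaluation time it behaves like nn.Embedding.
    Class name and default p are assumptions, not the paper's released code.
    """

    def __init__(self, num_embeddings: int, embedding_dim: int, p: float = 0.01):
        super().__init__()
        self.embedding = nn.Embedding(num_embeddings, embedding_dim)
        self.num_embeddings = num_embeddings
        self.p = p

    def forward(self, indices: torch.Tensor) -> torch.Tensor:
        if self.training and self.p > 0:
            # Decide which positions to replace, then draw uniform random indices.
            swap = torch.rand_like(indices, dtype=torch.float) < self.p
            random_idx = torch.randint_like(indices, high=self.num_embeddings)
            indices = torch.where(swap, random_idx, indices)
        return self.embedding(indices)


# Usage sketch: a user-embedding table for a sequential recommender.
user_emb = SSEEmbedding(num_embeddings=10_000, embedding_dim=64, p=0.01)
batch_user_ids = torch.randint(0, 10_000, (32,))
vectors = user_emb(batch_user_ids)  # shape (32, 64)

In the personalized-transformer setting the quotes refer to, this kind of stochastic index replacement is applied to the embedding tables (e.g. user and item embeddings, possibly with different probabilities); the exact placement and probabilities should be taken from the cited papers rather than from this sketch.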