Proceedings of the 34th ACM International Conference on Supercomputing 2020
DOI: 10.1145/3392717.3392748
Fast distributed bandits for online recommendation systems

Abstract: Contextual bandit algorithms are commonly used in recommender systems, where content popularity can change rapidly. These algorithms continuously learn latent mappings between users and items, based on contexts associated with them both. Recent recommendation algorithms that learn clustering or social structures between users have exhibited higher recommendation accuracy. However, as the number of users and items in the environment increases, the time required to generate recommendations deteriorates significantly…
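
For orientation, here is a minimal sketch of a disjoint LinUCB-style contextual bandit, the family of algorithms the abstract refers to. The class and parameter names are illustrative assumptions, not the paper's implementation.

import numpy as np

class LinUCBArm:
    """Per-item linear model: A accumulates context outer products, b accumulates rewards."""
    def __init__(self, dim, alpha=1.0):
        self.A = np.eye(dim)      # regularized design matrix
        self.b = np.zeros(dim)    # reward-weighted context sum
        self.alpha = alpha        # exploration strength

    def ucb(self, x):
        """Upper confidence bound on the expected reward for context x."""
        A_inv = np.linalg.inv(self.A)
        theta = A_inv @ self.b                                   # ridge-regression estimate
        return theta @ x + self.alpha * np.sqrt(x @ A_inv @ x)

    def update(self, x, reward):
        self.A += np.outer(x, x)
        self.b += reward * x

def recommend(arms, x):
    """Pick the item whose upper confidence bound is highest for this user context."""
    return max(range(len(arms)), key=lambda k: arms[k].ucb(x))

# Toy usage: 5 items, 6-dimensional joint user/item context.
arms = [LinUCBArm(dim=6) for _ in range(5)]
x = np.random.rand(6)
chosen = recommend(arms, x)
arms[chosen].update(x, reward=1.0)   # feed back the observed click / no-click

Rebuilding the confidence bound for every item on every request is what becomes expensive as the catalogue grows, which is the scaling problem the abstract points to.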

Cited by 52 publications (12 citation statements) | References 23 publications

Citation statements (ordered by relevance):
“…Moreover, HMAB aims to learn latent parameters for the nodes in the hierarchy tree, which is totally different from our goal of exploring users' latent interests. Distributed bandit algorithms, such as DCCB [14] and DistCLUB [20], aim to speed up the computation by distributing the workload in parallel. However, these methods do not address the problem of searching over an enormous item set, so the computational cost is still too high to answer users' requests online (for example, responding to 100 concurrent user requests within 10 milliseconds in a scenario involving one million items).…”
Section: Cluster-of-Bandit Algorithms (citation type: mentioning)
Confidence: 99%
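
To make the latency point in the excerpt concrete, below is a rough sketch of the simplest way to distribute the per-request scoring step across worker processes. It is illustrative only, omits the exploration bonus, and is not DCCB's or DistCLUB's actual partitioning scheme.

import numpy as np
from concurrent.futures import ProcessPoolExecutor

def score_shard(args):
    """Score one shard of items: expected reward = item_features @ theta."""
    item_features, theta = args          # (items_in_shard, d) and (d,)
    return item_features @ theta

def recommend(item_features, theta, n_workers=4):
    """Split the item matrix into shards, score them in parallel, return the argmax.
    Illustrative sketch only; no exploration term and no DistCLUB-specific logic."""
    shards = np.array_split(item_features, n_workers)
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        scores = np.concatenate(list(pool.map(score_shard, [(s, theta) for s in shards])))
    return int(np.argmax(scores))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    items = rng.standard_normal((1_000_000, 16))   # one million 16-dimensional item vectors
    theta = rng.standard_normal(16)                # one user's learned preference vector
    print(recommend(items, theta))

Even sharded this way, the scan stays linear in the catalogue size, which is exactly the limitation the excerpt raises for million-item catalogues under a roughly 10 ms response budget.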
“…Hao et al. [19] address a similar problem by employing clustering to group users in ad-hoc social-network environments. Mahadik et al. [32] present DistCLUB, a scalable algorithm for the problem. Gentile, Li, and Zappella [14] investigate online clustering; the paper presents a rigorous analysis of the problem.…”
Section: Literature Review (citation type: mentioning)
Confidence: 99%
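
The online clustering attributed to Gentile, Li, and Zappella [14] can be summarised as an edge-deletion rule: every user keeps a ridge-regression estimate of their preference vector, and two users remain in the same cluster only while their estimates could still coincide. The sketch below is a compressed, assumed rendering of that rule; the confidence-width formula and constants are illustrative, not the paper's exact quantities.

import numpy as np

def confidence_width(n_obs, alpha=1.0):
    """Illustrative confidence radius that shrinks as a user accumulates observations."""
    return alpha * np.sqrt((1.0 + np.log(1.0 + n_obs)) / (1.0 + n_obs))

def keep_edge(theta_i, theta_j, n_i, n_j, alpha=1.0):
    """Keep users i and j connected while their estimates are not provably different."""
    gap = np.linalg.norm(theta_i - theta_j)
    return gap <= confidence_width(n_i, alpha) + confidence_width(n_j, alpha)

def clusters(users, edges):
    """Connected components of the surviving user graph are the current clusters."""
    adj = {u: set() for u in users}
    for i, j in edges:
        adj[i].add(j)
        adj[j].add(i)
    seen, components = set(), []
    for u in users:
        if u in seen:
            continue
        stack, comp = [u], set()
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(adj[v] - comp)
        seen |= comp
        components.append(comp)
    return components

Users in the same component can then pool their observations into one shared bandit model, which is the accuracy benefit the clustering-based recommenders in this review aim for.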