2024
DOI: 10.54254/2755-2721/68/20241404
Advancing decision-making strategies through a comprehensive study of Multi-Armed Bandit algorithms and applications

Yang Kuang

Abstract: Multi-Armed Bandit (MAB) strategies play a pivotal role in decision-making algorithms by adeptly managing the exploration-exploitation trade-off in environments characterized by multiple options and constrained resources. This paper delves into the core MAB algorithms, including Explore-Then-Commit (ETC), Thompson Sampling, and Upper Confidence Bound (UCB). It provides a detailed examination of their theoretical underpinnings and their application across diverse sectors such as recommender systems, healthcare, …
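The abstract names UCB among the core algorithms for balancing exploration and exploitation. As a rough illustration of that idea (not code from the paper), the following is a minimal toy simulation of the standard UCB1 rule on Bernoulli arms: each arm is pulled once, then every round the arm maximizing the empirical mean plus a confidence bonus is chosen. The arm means and horizon below are arbitrary example values.

```python
import math
import random

def ucb1(means, horizon, seed=0):
    """Toy UCB1 simulation on Bernoulli arms with the given true means.

    At round t, pull the arm maximizing
        empirical_mean(i) + sqrt(2 * ln(t) / pulls(i)),
    then observe a Bernoulli reward. Returns (total reward, pull counts).
    """
    rng = random.Random(seed)
    k = len(means)
    counts = [0] * k      # number of pulls per arm
    sums = [0.0] * k      # cumulative reward per arm
    total = 0.0
    for t in range(1, horizon + 1):
        if t <= k:
            arm = t - 1   # initialization: pull each arm once
        else:
            arm = max(
                range(k),
                key=lambda i: sums[i] / counts[i]
                + math.sqrt(2 * math.log(t) / counts[i]),
            )
        reward = 1.0 if rng.random() < means[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
        total += reward
    return total, counts
```

Over a long enough horizon, the confidence bonus shrinks for frequently pulled arms, so the sketch concentrates its pulls on the arm with the highest true mean while still occasionally re-checking the others.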
