2018 IEEE International Symposium on Information Theory (ISIT)
DOI: 10.1109/isit.2018.8437543

Universal Batch Learning with Log-Loss

Cited by 15 publications (14 citation statements)
References 10 publications
“…Theorem 1 (Fogel and Feder (2018)). The universal learner, denoted as the pNML, minimizes the regret for the worst case test label…”
Section: Notation and Preliminaries (mentioning)
Confidence: 96%
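For reference, the min-max criterion this statement refers to has a standard form; the following is a sketch in commonly used pNML notation (the symbols below are assumptions, not quoted from the paper). Given a training set $z^N = \{(x_i, y_i)\}_{i=1}^{N}$ and a test feature $x$, the learner assigns $q(y \mid x)$ and the pNML solves

\[
\min_{q} \max_{y} \log \frac{p_{\hat{\theta}(z^N, x, y)}(y \mid x)}{q(y \mid x)},
\qquad
q_{\mathrm{pNML}}(y \mid x) = \frac{p_{\hat{\theta}(z^N, x, y)}(y \mid x)}{\sum_{y'} p_{\hat{\theta}(z^N, x, y')}(y' \mid x)},
\]

where $\hat{\theta}(z^N, x, y)$ denotes the maximum-likelihood model fitted to the training data together with the candidate test pair $(x, y)$; the achieved worst-case regret is the log of the normalizer.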
“…The pNML learner. The pNML learner is the min-max solution of the supervised batch learning in the individual setting (Fogel and Feder, 2018). For sequential prediction it is termed the conditional normalized maximum likelihood (Rissanen and Roos, 2007; Roos and Rissanen, 2008).…”
Section: DNN Adaptation (mentioning)
Confidence: 99%
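As a rough illustration of the idea in this statement (not code from the cited paper), the sketch below computes a pNML assignment over a finite label set; fit_ml and likelihood are hypothetical placeholders for a maximum-likelihood fitting routine and the fitted model's predictive probability.

```python
import numpy as np

def pnml_predict(train_x, train_y, test_x, label_set, fit_ml, likelihood):
    """Minimal pNML sketch: for each candidate test label y, refit the model
    on the training set plus (test_x, y), record the probability that model
    assigns to y, then normalize over all candidate labels."""
    scores = []
    for y in label_set:
        # Hypothetical helper: ML estimate over training data plus the candidate pair.
        theta_y = fit_ml(np.append(train_x, [test_x], axis=0),
                         np.append(train_y, [y]))
        # Probability the refit model assigns to its own candidate label.
        scores.append(likelihood(theta_y, test_x, y))
    scores = np.asarray(scores, dtype=float)
    normalizer = scores.sum()
    pnml_probs = scores / normalizer   # pNML distribution over the label set
    regret = np.log(normalizer)        # worst-case regret (log normalization factor)
    return pnml_probs, regret
```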