2004
DOI: 10.1162/089976604323057452

Information Geometry of U-Boost and Bregman Divergence

Abstract: We aim at an extension of AdaBoost to U-Boost, within the paradigm of building a stronger classification machine from a set of weak learning machines. A geometric understanding of the Bregman divergence defined by a generic convex function U leads to the U-Boost method, in the framework of information geometry extended to the space of finite measures over a label set. We propose two versions of the U-Boost learning algorithm, according to whether the domain is restricted to the space of probability functions. …
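
For orientation, here is a minimal sketch of the Bregman divergence the abstract invokes, assuming the standard definition generated by a differentiable convex function U on the reals (the paper extends this construction to finite measures over the label set, and its exact conventions may differ in detail):

    D_U(f, g) = \sum_{x} \left[ U(f(x)) - U(g(x)) - U'(g(x)) \, (f(x) - g(x)) \right] \ge 0,

with equality if and only if f = g when U is strictly convex. Under this convention, taking U(t) = e^t and reading f and g as log-measures reduces D_U to the extended Kullback-Leibler divergence, the case corresponding to AdaBoost's exponential loss.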

Cited by 149 publications (123 citation statements). References 15 publications.

“…For properties and applications of Bregman geometry see Eguchi (2005) and Murata et al. (2004). However, most of these do not depend on the Bregman form, and continue to hold for a general decision geometry.…”
Section: Bregman Geometry (mentioning)
confidence: 99%
“…The first and typical one in the machine learning community is AdaBoost [19], for the minimization of the exponential loss. Boosting methods for various other objective functions, such as likelihood, L2-loss, mixtures of the exponential loss and naive loss, U-loss, AUC, and pAUC [4], [5], [7], [13], [14], [20], have been considered and applied to real data analysis. However, boosting methods for purposes other than prediction seem to have received little attention; see [21]-[23].…”
Section: Boosting for Density Estimation (mentioning)
confidence: 99%
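
As background for the quoted statement, below is a minimal, self-contained sketch (not code from the paper or from [19]) of AdaBoost read as stagewise minimization of the exponential loss, with decision stumps as weak learners; all names are illustrative.

import numpy as np

def stump_predict(X, feature, threshold, polarity):
    # A decision stump: predict +1/-1 by thresholding one feature.
    return polarity * np.where(X[:, feature] <= threshold, 1.0, -1.0)

def best_stump(X, y, w):
    # Pick the stump with the lowest weighted 0-1 error.
    best, best_err = None, np.inf
    for feature in range(X.shape[1]):
        for threshold in np.unique(X[:, feature]):
            for polarity in (1.0, -1.0):
                pred = stump_predict(X, feature, threshold, polarity)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best, best_err = (feature, threshold, polarity), err
    return best, best_err

def adaboost(X, y, n_rounds=20):
    # y in {-1, +1}. Keeping w_i proportional to exp(-y_i F(x_i)) makes
    # each round a coordinate-descent step on the exponential loss
    # sum_i exp(-y_i F(x_i)).
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(n_rounds):
        stump, err = best_stump(X, y, w)
        err = min(max(err, 1e-12), 1.0 - 1e-12)  # keep alpha finite
        if err >= 0.5:           # weak learner no better than chance
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        pred = stump_predict(X, *stump)
        w = w * np.exp(-alpha * y * pred)  # exponential-loss reweighting
        w = w / w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    # Sign of the weighted vote F(x) = sum_m alpha_m h_m(x).
    F = sum(alpha * stump_predict(X, *stump) for alpha, stump in ensemble)
    return np.sign(F)

# Toy usage: learn sign(x - 0.5) from one feature.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 1))
y = np.where(X[:, 0] > 0.5, 1.0, -1.0)
model = adaboost(X, y)
print("training accuracy:", np.mean(predict(model, X) == y))

The reweighting line is the only place the exponential loss enters; in the U-Boost view, replacing that update (and the alpha formula) with one derived from another generator U gives the generalized algorithms the quoted passage describes.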
“…Boosting has great applicability to the minimization of various loss functions. A class of U-loss functions is discussed in close association with U-entropy and U-divergence [5], [6], where U is a generator function on the real line, such as an exponential function. Any U-loss function can employ the idea of boosting with a simple change from AdaBoost.…”
Section: Introduction (mentioning)
confidence: 99%
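
To make the quoted statement concrete, here is a hedged sketch of the shape a U-loss takes for binary classification; this is a simplified form, not the paper's full definition. For a sample (x_1, y_1), ..., (x_n, y_n) with y_i in {-1, +1}, a discriminant function F, and a convex generator U,

    L_U(F) = \frac{1}{n} \sum_{i=1}^{n} U\big(-y_i F(x_i)\big).

Taking U(t) = e^t gives exactly the exponential loss minimized by AdaBoost; other generators give other members of the family, and the paper's full construction adds a normalization that depends on whether one works over probability functions (cf. the abstract). The "simple change from AdaBoost" in the quotation amounts to swapping the generator in the reweighting and coefficient steps.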
“…In particular, this extension is useful for proposing boosting methods [10]-[16].…”
Section: Power Divergence (mentioning)
confidence: 99%