2012
DOI: 10.4064/am39-2-4
Probabilistic comparison of weighted majority rules

Cited by 1 publication (2 citation statements)
References 0 publications
“…In this review, we will try to keep the presentation as simple as possible by presenting the problems, the hypotheses, and the ideas behind the methods without going into too much technical detail while still presenting the state‐of‐the‐art results in the field. In particular, we will start from the first works of the 1960s about probability inequalities (Anguita, Boni, & Ridella, ; Anguita, Ghelardoni, Ghio, Oneto, & Ridella, ; Arlot, ; Arlot & Celisse, ; Bennett, ; Bentkus, ; Bernstein, ; Clopper & Pearson, ; Devroye & Wagner, ; Efron, , ; Efron & Tibshirani, ; Hoeffding, ; Kohavi, ; Massart, ; Maurer & Pontil, ; Oneto, Ghio et al, ; Talagrand, ), and proceed with the asymptotic analysis (Abu‐Mostafa, ; Blumer, Ehrenfeucht, Haussler, & Warmuth, ; Floyd & Warmuth, ; Vapnik, ) of the 1970s, and concentration inequalities (Bobkov & Ledoux, ; Boucheron et al, , ; Bousquet, ; Klein & Rio, ; Ledoux, , ; Talagrand, , , ) of the 1980s, then move to the finite sample analysis (Ambroladze, Parrado‐Hernández, & Shawe‐Taylor, ; Anguita, Ghio et al, ; Anguita, Ghio, Oneto, & Ridella, ; Audibert, ; Audibert & Bousquet, ; Bartlett, Boucheron, & Lugosi, ; Bartlett, Bousquet, & Mendelson, , ; Bartlett & Mendelson, ; Bégin, Germain, Laviolette, & Roy, , ; Berend & Kontorovitch, ; Blanchard & Massart, ; Catoni, ; Gelman, Carlin, Stern, & Rubin, ; Germain, Lacasse, Laviolette, & Marchand, ; Germain, Lacasse, Laviolette, Marchand, & Roy, ; Germain, Lacoste, Marchand, Shanian, & Laviolette, ; Koltchinskii, , ; Lacasse, Laviolette, Marchand, Germain, & Usunier,…”
Section: The “Five W” of MS and EE (mentioning)
Confidence: 99%
“…In fact, many state‐of‐the‐art algorithms search for a weighted combination of simpler rules (Germain et al, ): bagging (Breiman, , ), boosting (Schapire et al, ; Schapire & Singer, ), and Bayesian approaches (Gelman et al, ), or even kernel methods (Vapnik, ) and neural networks (Bishop, ). The major open problems in this scenario are how to weight the different rules in order to obtain good performance (Berend & Kontorovitch, ; Catoni, ; Lever et al, , ; Nitzan & Paroush, ; Parrado‐Hernández et al, ), how this performance can be assessed (Catoni, ; Donsker & Varadhan, ; Germain et al, , ; Lacasse et al, ; Langford & Seeger, ; Laviolette & Marchand, , ; Lever et al, , ; London et al, ; McAllester, , , ; Shawe‐Taylor & Williamson, ; Tolstikhin & Seldin, ; Van Erven, ), and how this theoretical framework can be exploited for deriving new learning approaches or for applying it in other contexts (Audibert, ; Audibert & Bousquet, ; Bégin et al, ; Germain et al, ; McAllester, ; Morvant, ; Ralaivola et al, ; Roy et al, ; Seeger, , ; Seldin et al, , ; Seldin & Tishby, , ; Shawe‐Taylor & Langford, ). The PAC‐Bayes approach is one of the sharpest analysis frameworks in this context, since it can provide tight bounds on the risk of the Gibbs classifier (GC), also called the randomized (or probabilistic) classifier, and the Bayes classifier (BC), also called the weighted majority vote classifier (Germain et al, ).…”
Section: PAC‐Bayes Theory (mentioning)
Confidence: 99%
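The quoted passage contrasts the two objects that PAC-Bayes bounds speak about: the Gibbs classifier, which draws a single rule at random according to a posterior weighting, and the Bayes classifier, which takes the weighted majority vote of all rules. A minimal sketch of the distinction, using hypothetical threshold voters and illustrative weights (all names and values here are assumptions, not taken from the paper under review):

```python
import random

# Hypothetical voters: each maps an input x to a label in {-1, +1}.
voters = [
    lambda x: 1 if x > 0 else -1,
    lambda x: 1 if x > 2 else -1,
    lambda x: -1,  # a constant (deliberately weak) rule
]
# Illustrative posterior weights rho over the voters (must sum to 1).
rho = [0.5, 0.3, 0.2]

def bayes_classifier(x):
    """Weighted majority vote: sign of the rho-weighted sum of votes."""
    score = sum(w * h(x) for w, h in zip(rho, voters))
    return 1 if score > 0 else -1

def gibbs_classifier(x, rng=random):
    """Randomized classifier: draw ONE voter according to rho, use its vote."""
    h = rng.choices(voters, weights=rho, k=1)[0]
    return h(x)
```

Here `bayes_classifier(3)` returns 1 (weighted score 0.5 + 0.3 - 0.2 = 0.6 > 0), while `gibbs_classifier(3)` returns 1 with probability 0.8 and -1 with probability 0.2; PAC-Bayes theory bounds the expected risk of the latter, from which bounds on the former follow.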