1990
DOI: 10.1214/aos/1176347872
Asymptotic Analysis of Penalized Likelihood and Related Estimators

Cited by 159 publications (105 citation statements)
References 9 publications
“…The algorithms under consideration approximately solve a linear programming problem, but tend to perform sub-optimally on noisy data. From the margin bound (43), this is indeed not surprising. The minimum on the right hand side of (43) is not necessarily achieved with the maximum (hard) margin (largest θ).…”
Section: Optimization of the Margins
confidence: 85%
“…However, when using boosting procedures on noisy real-world data, it turns out that regularization (e.g. [103,186,143,43]) is mandatory if overfitting is to be avoided (cf. Section 6).…”
Section: Learning From Data and the PAC Property
confidence: 99%
“…This fundamental property goes under the name of representer theorem. The result, which goes back to (Kimeldorf and Wahba 1971) for squared loss functions, extends to differentiable loss functions; see (Cox and O'Sullivan 1990) and (Poggio and Girosi 1992). More recently, it has been shown that the representer theorem holds in a very general setting.…”
confidence: 81%
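The representer theorem referenced in the excerpt above is commonly stated as follows; this is a standard textbook formulation under a reproducing-kernel-Hilbert-space (RKHS) setting, not quoted from the cited papers:

```latex
% Minimizing a regularized empirical risk over an RKHS H_K with kernel K,
\min_{f \in \mathcal{H}_K} \; \sum_{i=1}^{n} L\bigl(y_i, f(x_i)\bigr) \;+\; \lambda \, \|f\|_{\mathcal{H}_K}^{2},
% the minimizer admits a finite expansion over the training points:
\qquad f^{\star}(x) \;=\; \sum_{i=1}^{n} \alpha_i \, K(x, x_i).
```

That is, even though the optimization runs over an infinite-dimensional function space, the solution is determined by only n coefficients, which is what makes kernel methods computationally tractable.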
“…Examples include penalized least square regression, penalized logistic regression, penalized density estimation, and regularization procedures used in more general nonlinear inverse problems. Cox and O'Sullivan (1990) provided a general framework for studying regularization methods. The method of regularization has two components: a data fit functional component and a regularization penalty component.…”
Section: Support Vector Machines for the Nonstandard Situation
confidence: 99%
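The two-component structure described in the excerpt above, a data-fit functional plus a regularization penalty, can be illustrated with penalized least squares (ridge regression). This is a minimal sketch with synthetic data; the function name and values are illustrative and not taken from Cox and O'Sullivan (1990):

```python
import numpy as np

def penalized_ls(X, y, lam):
    """Minimize ||y - X b||^2 + lam * ||b||^2.

    The first term is the data-fit component; the second is the
    regularization penalty. The closed-form solution solves
    (X^T X + lam I) b = X^T y.
    """
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Synthetic regression problem (illustrative data).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=50)

b_small = penalized_ls(X, y, lam=0.01)  # light penalty: dominated by data fit
b_large = penalized_ls(X, y, lam=1e6)   # heavy penalty: coefficients shrink toward zero
```

Varying `lam` trades the two components off against each other, which is the balance the cited framework analyzes asymptotically.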