1997
DOI: 10.2139/ssrn.20534
A Data-Dependent Skeleton Estimate and a Scale-Sensitive Dimension for Classification

Abstract: The classical binary classification problem is investigated when it is known in advance that the posterior probability function (or regression function) belongs to some class of functions. We introduce and analyze a method which effectively exploits this knowledge. The method is based on minimizing the empirical risk over a carefully selected "skeleton" of the class of regression functions. The skeleton is a covering of the class based on a data-dependent metric, especially fitted for classification. A new scale-…
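The idea in the abstract can be sketched concretely: evaluate each candidate regression function on the sample, greedily extract an ε-cover (the "skeleton") under the empirical L1 metric, then run empirical risk minimization over that finite skeleton only. The sketch below is a minimal illustration under assumed conventions (a finite candidate class represented by its values on the sample, thresholding at 1/2 for classification); all function and variable names are hypothetical and not from the paper.

```python
import numpy as np

def empirical_metric(f_vals, g_vals):
    # Data-dependent L1 distance between two candidate functions,
    # both evaluated on the same unlabeled sample points (assumed setup).
    return np.mean(np.abs(f_vals - g_vals))

def build_skeleton(class_vals, eps):
    # Greedy eps-cover of the class under the empirical metric.
    # class_vals: list of arrays, each one candidate f evaluated on the sample.
    skeleton = []
    for f in class_vals:
        # Keep f only if it is more than eps away from everything kept so far.
        if all(empirical_metric(f, g) > eps for g in skeleton):
            skeleton.append(f)
    return skeleton

def skeleton_estimate(class_vals, labels, eps):
    # Empirical risk minimization restricted to the skeleton:
    # classify with the plug-in rule 1{f >= 1/2} and count errors.
    skeleton = build_skeleton(class_vals, eps)
    def risk(f):
        preds = (f >= 0.5).astype(int)
        return np.mean(preds != labels)
    return min(skeleton, key=risk)
```

For instance, with three candidates where two are empirically close, the cover collapses them into one representative before ERM is run, which is exactly why covering numbers (rather than the raw class size) govern the analysis.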

Cited by 1 publication (1 citation statement). References 17 publications (4 reference statements).
“…Covering numbers enter naturally in learning problems when one replaces an infinite class of rules by a carefully selected finite subset of rules, and then attempts to use empirical risk minimization (or other learning rule) over this finite subcollection. Work along these lines can be found in [40], [53], [102], [155], [175], [274], and [285].…”
Section: B. VC Dimension and Empirical Risk Minimization
confidence: 99%