2006
DOI: 10.1007/11776420_26

Improved Lower Bounds for Learning Intersections of Halfspaces

Abstract: Recent work of Klivans, Stavropoulos, and Vasilyan initiated the study of testable learning with distribution shift (TDS learning), where a learner is given labeled samples from training distribution D, unlabeled samples from test distribution D′, and the goal is to output a classifier with low error on D′ whenever the training samples pass a corresponding test. Their model deviates from all prior work in that no assumptions are made on D′. Instead, the test must accept (with high probability) when the ma…

Cited by 7 publications (5 citation statements)
References 21 publications
“…Until recently, the problem was known to be hard only for proper learning: if the learner's output hypothesis must be from a restricted class of functions (e.g., intersections of halfspaces), then the learning problem is NP-hard with respect to randomized reductions [4,2]. Klivans and Sherstov [16] have since obtained a 2^{Ω(√n)} lower bound on the sample complexity of learning intersections of √n halfspaces in the statistical query (SQ) model, an important restriction of the PAC model. Since the SQ model is stronger than PAC, the lower bounds in [16] do not imply hardness in the PAC model, the subject of this paper.…”
Section: Previous Results
confidence: 99%
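As context for the statements above, an intersection of halfspaces labels a point positive only if it satisfies every one of k linear threshold constraints. A minimal sketch (the weight matrix and thresholds here are hypothetical example data, not from any cited paper):

```python
import numpy as np

def intersection_of_halfspaces(W, theta, x):
    """Label x positive (1) iff x lies in every halfspace w_i . x >= theta_i.

    W: (k, n) matrix, one row per halfspace; theta: (k,) thresholds.
    """
    return int(np.all(W @ x >= theta))

# Two halfspaces in R^2 whose intersection is the closed first quadrant.
W = np.array([[1.0, 0.0], [0.0, 1.0]])
theta = np.zeros(2)
print(intersection_of_halfspaces(W, theta, np.array([2.0, 3.0])))   # 1 (inside)
print(intersection_of_halfspaces(W, theta, np.array([-1.0, 3.0])))  # 0 (outside)
```

The hardness results quoted above concern learning such functions from examples when k grows with n (e.g., k = √n), not evaluating them, which is trivial as shown.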
“…Klivans and Sherstov [16] have since obtained a 2^{Ω(√n)} lower bound on the sample complexity of learning intersections of √n halfspaces in the statistical query (SQ) model, an important restriction of the PAC model. Since the SQ model is stronger than PAC, the lower bounds in [16] do not imply hardness in the PAC model, the subject of this paper. We are not aware of any other results on the difficulty of learning intersections of halfspaces.…”
Section: Previous Results
confidence: 99%
“…without relying on any computational hardness assumptions) [16,4]. This result is a special case of lower bounds based on the SQ dimension, which was introduced by Blum et al. [4] and studied in a number of subsequent works [5,6,26,18,22]. Limitations on learnability in the SQ model imply limitations on evolutionary algorithms.…”
Section: Introduction
confidence: 85%
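To make the SQ model referenced above concrete: an SQ learner never sees labeled examples, only estimates of expectations E_x[query(x, f(x))], each accurate to some tolerance τ. The sketch below computes the exact expectation over a small uniform domain (a real oracle may perturb it by up to τ); the parity example illustrates the classic intuition behind SQ lower bounds, since parity has zero correlation with every single coordinate:

```python
from itertools import product

def sq_oracle(query, target, domain):
    """Exact value of E_x[query(x, target(x))] under the uniform distribution
    on a finite domain. An SQ oracle may answer with any value within
    tolerance tau of this quantity; the learner sees only such answers.
    """
    return sum(query(x, target(x)) for x in domain) / len(domain)

cube = list(product([0, 1], repeat=3))          # the Boolean cube {0,1}^3
parity = lambda x: (-1) ** sum(x)               # target concept, +/-1 valued
corr_with_x0 = lambda x, y: ((-1) ** x[0]) * y  # correlation with coordinate 0

print(sq_oracle(corr_with_x0, parity, cube))    # 0.0
```

Because every such correlation query returns (nearly) zero, no small number of statistical queries can distinguish parity from other parities, which is the mechanism exponential SQ lower bounds exploit.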
“…There are numerous negative results known for proper learning of such concepts [38,2], and for learning in the Statistical Query model [32]. Based on certain cryptographic assumptions, Kearns and Valiant showed that constant depth threshold circuits cannot be learned over a certain distribution using any representation [24].…”
Section: Learning Thresholds of Halfspaces
confidence: 99%