2020
DOI: 10.48550/arxiv.2002.04840
Preprint

Efficient active learning of sparse halfspaces with arbitrary bounded noise

Abstract: In this work we study active learning of homogeneous $s$-sparse halfspaces in $\mathbb{R}^d$ under label noise. Even in the absence of label noise this is a challenging problem, and only recently have label complexity bounds of the form $\tilde{O}\big(s \cdot \mathrm{polylog}\big(d, \frac{1}{\epsilon}\big)\big)$ been established in Zhang (2018) for computationally efficient algorithms under the broad class of isotropic log-concave distributions. In contrast, under high levels of label noise, the label complexity bounds achieved by computationally efficient algorithms are much worse […]
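To make the problem setup in the abstract concrete, here is a minimal sketch of margin-based active learning of a sparse halfspace under bounded label noise. This is an illustrative toy, not the algorithm analyzed in the paper: the perceptron-style update, the hard-thresholding step, the band-shrinkage schedule, and all constants are assumptions chosen for readability.

```python
import numpy as np

rng = np.random.default_rng(0)

d, s = 100, 5                  # ambient dimension, sparsity
eta = 0.2                      # bounded noise: flip probability < 1/2

# Ground-truth s-sparse unit vector (the target halfspace is homogeneous)
w_star = np.zeros(d)
w_star[:s] = rng.normal(size=s)
w_star /= np.linalg.norm(w_star)

def draw(n):
    """Unlabeled pool from an isotropic Gaussian (a log-concave distribution)."""
    return rng.normal(size=(n, d))

def noisy_label(x):
    """Label oracle: true sign, flipped with probability eta."""
    y = np.sign(w_star @ x) or 1.0
    return -y if rng.random() < eta else y

def hard_threshold(v, s):
    """Keep the s largest-magnitude coordinates, zero out the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-s:]
    out[idx] = v[idx]
    return out

# Margin-based query loop (illustrative sketch, not the paper's algorithm)
w = np.zeros(d); w[0] = 1.0    # arbitrary unit-norm start
band = 1.0                     # width of the query region around the boundary
queries = 0
for phase in range(8):
    X = draw(5000)
    near = np.abs(X @ w) <= band        # only points near the current boundary
    for x in X[near][:200]:             # ...are sent to the labeler
        y = noisy_label(x)
        queries += 1
        if y * (w @ x) <= 0:            # perceptron-style correction
            w = w + 0.1 * y * x
    w = hard_threshold(w, s)            # exploit sparsity of the target
    w /= np.linalg.norm(w)
    band *= 0.7                         # shrink the query band each phase

X_test = draw(20000)
err = np.mean(np.sign(X_test @ w) != np.sign(X_test @ w_star))
print(f"label queries: {queries}, disagreement with w*: {err:.3f}")
```

The point of the sketch is the label-efficiency mechanism: labels are requested only inside a shrinking band around the current decision boundary, and the sparsity of the target is exploited by thresholding, which is what lets label complexity scale with $s$ and only polylogarithmically with $d$ in the regime the paper studies.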

Cited by 1 publication (1 citation statement)
References 74 publications (84 reference statements)
“…Finally, some prior works on agnostic learning of halfspaces have considered various extensions of the problem, such as active agnostic learning (Awasthi et al., 2014; Yan and Zhang, 2017), agnostic learning of sparse halfspaces with sample complexity scaling logarithmically in the ambient dimensionality (Shen and Zhang, 2021), and agnostic learning under weaker noise models such as random classification noise (Blum et al., 1998; Dunagan and Vempala, 2008), Massart's noise model (Awasthi et al., 2015, 2016; Zhang et al., 2020; Diakonikolas et al., 2019, 2020b, 2021; Chen et al., 2020), and the Tsybakov noise model (Diakonikolas et al., 2020c; Zhang and Li, 2021). We do not consider these extensions in our work.…”
Section: Related Work
confidence: 99%