2014
DOI: 10.1137/120868402

On the Weight of Halfspaces over Hamming Balls

Abstract: For S ⊆ {0,1}^n, a Boolean function f : S → {−1, 1} is a halfspace over S if there exist w ∈ ℝ^n and θ ∈ ℝ such that f(x) = sign(w · x − θ) for all x ∈ S. We give bounds on the size of integer weights w_1, ..., w_n ∈ ℤ that are required to represent halfspaces over Hamming balls S = {x ∈ {0,1}^n : x_1 + ··· + x_n ≤ k}. Such weight bounds for halfspaces over Hamming balls have immediate consequences for the performance of learning algorithms in the common scenario of learning from very high-dimensional…
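The definition above can be checked directly by brute force. The following Python sketch is my own illustration, not code from the paper; the weights and threshold are arbitrary assumptions. It enumerates the Hamming ball {x ∈ {0,1}^n : x_1 + ··· + x_n ≤ k} and evaluates an integer-weight halfspace on it, using the convention sign(t) = +1 when t ≥ 0.

```python
# Minimal sketch (not from the paper): evaluating an integer-weight halfspace
# over the Hamming ball S = {x in {0,1}^n : x_1 + ... + x_n <= k}.

from itertools import combinations

def halfspace(w, theta):
    """Return f(x) = sign(w . x - theta), mapping into {-1, +1}."""
    def f(x):
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) - theta >= 0 else -1
    return f

def hamming_ball(n, k):
    """Yield every x in {0,1}^n with at most k ones."""
    for weight in range(k + 1):
        for ones in combinations(range(n), weight):
            x = [0] * n
            for i in ones:
                x[i] = 1
            yield tuple(x)

# Hypothetical example: n = 4, k = 2, integer weights w and threshold theta.
w, theta = [3, 2, 1, 1], 2
f = halfspace(w, theta)
for x in hamming_ball(4, 2):
    print(x, f(x))
```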

Cited by 2 publications (1 citation statement)
References 23 publications
“…The case of Hamming balls {0,1}^n_{≤k}, consisting of all vectors with at most k ones, has received some attention. Long and Servedio [21] gave bounds on the weights of PTFs of degree d = 1. Their main motivation for studying this setting comes from learning theory: in scenarios involving learning from categorical data, the common representation for examples is a one-hot encoded vector, which may have an extremely large number of features, but only a small fraction of them can be active at the same time.…”
Section: Introduction (mentioning)
confidence: 99%
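To make the one-hot motivation concrete, here is a small, hypothetical illustration (not taken from the cited work): with k categorical attributes, the one-hot encoded example is a 0/1 vector with at most k ones, no matter how large the total number of features n becomes, so examples lie in the Hamming ball {0,1}^n_{≤k}.

```python
# Illustration (assumed example, not from the cited work): one-hot encoding a
# categorical example. With k attributes, the encoded vector has at most k ones,
# regardless of how many features n the encoding produces in total.

def one_hot(example, vocab):
    """Encode a dict of attribute -> value into a single 0/1 feature vector."""
    x = []
    for attr, values in vocab.items():
        x.extend(1 if example.get(attr) == v else 0 for v in values)
    return x

vocab = {
    "color": ["red", "green", "blue"],
    "shape": ["circle", "square", "triangle", "hexagon"],
}
x = one_hot({"color": "blue", "shape": "square"}, vocab)
print(x, "ones:", sum(x))  # n = 7 features, at most k = 2 of them active
```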