2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2017.7953385
Stable recovery of sparse vectors from random sinusoidal feature maps

Abstract: Random sinusoidal features are a popular approach for speeding up kernel-based inference in large datasets. Prior to the inference stage, the approach suggests performing dimensionality reduction by first multiplying each data vector by a random Gaussian matrix, and then computing an element-wise sinusoid. Theoretical analysis shows that a sufficient number of such features can be reliably used for subsequent inference in kernel classification and regression. In this work, we demonstrate that with a …
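The feature map described in the abstract (project with a random Gaussian matrix, then apply an element-wise sinusoid) can be sketched in a few lines. The following is a minimal illustration, not the paper's code; the function name, the bandwidth parameter sigma, and the choice of sin are assumptions made here for concreteness.

```python
import numpy as np

def random_sinusoidal_features(X, m, sigma=1.0, seed=0):
    """Map data X (n_samples x d) to m random sinusoidal features.

    Each row x is projected by a random Gaussian matrix A (m x d) and passed
    through an element-wise sinusoid, as described in the abstract. The 1/sigma
    scaling and the use of sin (rather than a cos/sin pair) are illustrative
    assumptions, not taken from the paper.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    A = rng.normal(scale=1.0 / sigma, size=(m, d))  # random Gaussian matrix
    return np.sin(X @ A.T)                          # element-wise sinusoid

# Example: 100 points in R^20 mapped to 512 sinusoidal features
X = np.random.default_rng(1).normal(size=(100, 20))
Z = random_sinusoidal_features(X, m=512)
print(Z.shape)  # (100, 512)
```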

Cited by 4 publications (18 citation statements) · References 31 publications
“…As discussed in [5], the block diagonal structure of D and C allows the signal reconstruction problem to be reduced to a sequence of decoupled scalar estimation problems; such a decoupling enables the estimation of each entry of z and u independently of all other entries. We also assume that the analog values of the input signal lie within a range known a priori.…”
(mentioning, confidence: 99%)
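To see why a block diagonal structure decouples reconstruction into scalar problems, a toy sketch is given below. The matrices D and C of the citing paper's model are not reproduced; a plain diagonal least-squares problem stands in for them, which is an assumption made purely for illustration.

```python
import numpy as np

# Toy illustration (not the citing paper's exact model): when the mixing
# matrix is block diagonal with 1x1 blocks (i.e., diagonal), the joint
# least-squares problem min_z ||y - D z||^2 splits into independent
# scalar problems, one per entry: z_i = y_i / d_i.
rng = np.random.default_rng(0)
n = 5
d = rng.uniform(1.0, 2.0, size=n)                # diagonal of D
D = np.diag(d)
z_true = rng.normal(size=n)
y = D @ z_true

z_joint = np.linalg.lstsq(D, y, rcond=None)[0]   # solve the coupled system
z_scalar = y / d                                 # decoupled scalar estimates
print(np.allclose(z_joint, z_scalar))            # True
```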
“…There are several different ways of doing this, including the multi-shot UHDR method of [3]. Here, we describe a novel approach, based on the MF-Sparse algorithm of [5]. We assume that the entries of z belong to some bounded set Ω ⊂ R. Fix l ∈ [q] and form θ = exp(i u).…”
Section: Modulo Recovery (mentioning, confidence: 99%)
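The step θ = exp(i u) can be illustrated with a short sketch: mapping samples onto the complex unit circle removes any unknown integer multiples of 2π, which is the property exploited in modulo recovery. The variable names below are illustrative, not taken from [3] or [5].

```python
import numpy as np

# Sketch of the theta = exp(i*u) step: on the complex unit circle, any
# integer multiple of 2*pi added to u becomes invisible, which is what
# lets modulo (wrapped) measurements be handled.
rng = np.random.default_rng(0)
u = rng.uniform(-1.0, 1.0, size=8)
wraps = 2 * np.pi * rng.integers(-3, 4, size=8)   # unknown 2*pi jumps

theta = np.exp(1j * u)
theta_wrapped = np.exp(1j * (u + wraps))
print(np.allclose(theta, theta_wrapped))          # True: the wraps vanish
```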
“…In the periodic case, we use the approach of [17] both for designing the matrix X and the link function g. Specifically, we let X be factorized as X = DB, where D ∈ R^{m×q} and B ∈ R^{q×n} have some specific structures; please see Section 2 for details. Again, for this case, we demonstrate a novel two-stage algorithm that stably estimates the components θ_1 and θ_2.…”
Section: Summary Of Contributions (mentioning, confidence: 99%)
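A hedged sketch of the X = DB factorization with the stated shapes is shown below. The specific structures of D and B (the citing paper's Section 2) are not reproduced: both are drawn as random Gaussians purely to check dimensions, and sin stands in for the periodic link g.

```python
import numpy as np

# Sketch of the factorized measurement operator X = D B with the stated
# shapes only; the structured choices of D and B are assumptions replaced
# here by random Gaussian matrices for illustration.
rng = np.random.default_rng(0)
m, q, n = 64, 32, 256
D = rng.normal(size=(m, q))     # D in R^{m x q}
B = rng.normal(size=(q, n))     # B in R^{q x n}
X = D @ B                       # composed operator, X in R^{m x n}

theta = np.zeros(n)
theta[rng.choice(n, size=5, replace=False)] = rng.normal(size=5)  # sparse signal
y = np.sin(X @ theta)           # measurements through a periodic link (sin assumed)
print(X.shape, y.shape)         # (64, 256) (64,)
```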
“…In this section, we focus on periodic link functions that are either sinusoidal (complex-exponential) or monotonic within each period. We start with the sinusoidal (complex-exponential) link function and follow the approach of [17]. In [17], the authors proposed an algorithm called MF-Sparse for recovering an underlying signal that is arbitrarily sparse, or is the superposition of two arbitrarily sparse components, but they only considered sinusoidal link functions.…”
Section: Periodic Link Functions (mentioning, confidence: 99%)
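MF-Sparse itself is not reproduced here. As a minimal stand-in, the sketch below shows the complex-exponential link y = exp(i·Xθ) and the fact that, as long as Xθ stays inside the principal period, taking the phase angle linearizes the measurements; a sparse solver (replaced here by plain least squares) can then be applied. The scaling of θ and X is an assumption made so that no phase wrapping occurs.

```python
import numpy as np

# Not the MF-Sparse algorithm: only the forward model and the observation
# that np.angle inverts the complex-exponential link inside (-pi, pi].
rng = np.random.default_rng(0)
m, n, s = 80, 200, 4
X = rng.normal(size=(m, n)) / np.sqrt(m)

theta = np.zeros(n)
theta[rng.choice(n, size=s, replace=False)] = 0.1 * rng.normal(size=s)

y = np.exp(1j * (X @ theta))            # sinusoidal (complex-exponential) link
linear = np.angle(y)                    # equals X @ theta when no wrap occurs

# Least squares on the linearized system as a stand-in for a sparse solver.
theta_hat = np.linalg.lstsq(X, linear, rcond=None)[0]
print(np.linalg.norm(X @ theta_hat - X @ theta) < 1e-8)  # measurements matched
```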