2017 IEEE 7th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP)
DOI: 10.1109/camsap.2017.8313172
One-Bit compressive sampling with time-varying thresholds for multiple sinusoids

Cited by 32 publications (20 citation statements). References 23 publications.
“…Similarly, Gianelli et al [32] describe the properties of the maximum likelihood estimator (MLE) of the parameters of a noisy single-sinusoidal signal after binary quantization. The analysis is extended to the binary quantization of multiple sinusoids in [33], where a nonlinear least squares estimator is described to solve the problem. Finally, numerical methods for the estimation of the likelihood function may resort to iterative procedures such as the expectation-maximization approach [34]–[36].…”
Section: State of the Art
confidence: 99%
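The binary-quantization setting these excerpts discuss can be sketched as follows; all names, parameter values, and the single-sinusoid setup are illustrative, not taken from the cited papers:

```python
import numpy as np

# Illustrative sketch of one-bit sampling with time-varying thresholds:
# a noisy sum-of-sinusoids signal is compared sample-by-sample against a
# known threshold sequence, and only the sign of the comparison is kept.

def one_bit_measurements(amps, freqs, phases, thresholds, noise_std, rng):
    """Return sign(x[n] + w[n] - h[n]) for x[n] a sum of sinusoids."""
    n = np.arange(len(thresholds))
    x = sum(a * np.cos(w * n + p) for a, w, p in zip(amps, freqs, phases))
    noise = rng.normal(scale=noise_std, size=len(n))
    return np.sign(x + noise - thresholds)

rng = np.random.default_rng(0)
N = 64
h = rng.uniform(-1.0, 1.0, size=N)   # time-varying thresholds
y = one_bit_measurements([1.0], [0.3], [0.5], h, 0.1, rng)
```

Varying the thresholds over time is what makes amplitude recovery possible: with a fixed threshold, scaling the signal would leave the sign pattern largely unchanged.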
“…For given ω, the above optimization problem is convex in {a_k}_{k=1}^K, {b_k}_{k=1}^K and λ. Therefore, for fixed ω, globally optimal methods can be employed to find the minimizing values of {a_k}_{k=1}^K, {b_k}_{k=1}^K and λ [26], [34]. With this fact in mind, the ML estimator can be summarized as follows.…”
Section: Maximum Likelihood Estimation and 1bRELAX
confidence: 99%
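The inner convex step this excerpt describes can be sketched as follows. For a fixed frequency ω, the one-bit Gaussian negative log-likelihood is a sum of concave log-CDF terms composed with a function linear in the in-phase/quadrature amplitudes (a, b), so it is convex in (a, b) and a generic solver finds the global minimizer. The names, the K = 1 setup, and the known-σ assumption are illustrative simplifications, not the cited papers' exact formulation:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)
N, omega, sigma = 256, 0.7, 0.2
n = np.arange(N)
a_true, b_true = 1.2, -0.8
h = rng.uniform(-2.0, 2.0, size=N)   # known time-varying thresholds
x = a_true * np.cos(omega * n) + b_true * np.sin(omega * n)
y = np.sign(x + rng.normal(scale=sigma, size=N) - h)   # signed measurements

def nll(theta):
    """One-bit Gaussian negative log-likelihood; convex in (a, b) for fixed omega."""
    a, b = theta
    z = y * (a * np.cos(omega * n) + b * np.sin(omega * n) - h) / sigma
    return -norm.logcdf(z).sum()

a_hat, b_hat = minimize(nll, x0=[0.0, 0.0]).x
```

Because the problem is convex, the zero initialization is harmless: any starting point reaches the same minimizer.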
“…Specifically, 1bMMRELAX is more than an order of magnitude faster than 1bRELAX, while maintaining similar sinusoidal parameter estimation accuracy. Regarding certain penalized/sparse algorithms proposed in the literature for sinusoidal parameter estimation from signed measurements [34], [36], we do not consider them in this comparative study because they are slow for large values of N and have worse performance than 1bRELAX [34]. Example 2: To further illustrate the resolution capability and estimation accuracy of 1bCLEAN, 1bMMRELAX and 1bRELAX, we consider a signal composed of two sinusoids with a small frequency separation and the same amplitudes and phases.…”
Section: B. Sinusoidal Parameter Estimation
confidence: 99%
“…and λ as functions of ω. The details on the implementation of the K-dimensional frequency search can be found in [24].…”
Section: A. Maximum Likelihood Estimation
confidence: 99%
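The outer frequency search these excerpts refer to can be sketched, reduced to K = 1 for brevity: scan a coarse frequency grid, solve the inner convex amplitude fit at each candidate ω, and keep the ω with the lowest profiled negative log-likelihood. The grid, signal parameters, and names below are illustrative, and a full K-dimensional search would replace the 1-D grid with a K-dimensional one (which is exactly why the excerpts call it computationally demanding):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(3)
N, sigma = 256, 0.2
n = np.arange(N)
omega_true, a_true, b_true = 0.7, 1.0, 0.6
h = rng.uniform(-2.0, 2.0, size=N)   # known time-varying thresholds
x = a_true * np.cos(omega_true * n) + b_true * np.sin(omega_true * n)
y = np.sign(x + rng.normal(scale=sigma, size=N) - h)

def profile_nll(omega):
    """Profiled NLL: minimize the convex one-bit NLL over (a, b) at fixed omega."""
    def nll(theta):
        a, b = theta
        z = y * (a * np.cos(omega * n) + b * np.sin(omega * n) - h) / sigma
        return -norm.logcdf(z).sum()
    return minimize(nll, x0=[0.0, 0.0]).fun

grid = np.linspace(0.05, 3.0, 60)
omega_hat = grid[np.argmin([profile_nll(g) for g in grid])]
```

In practice the coarse grid estimate would be refined locally; the grid-then-refine structure is what the relaxation-based methods in the excerpts accelerate.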
“…Note that as the number of sinusoids K and the size of the signed measurement matrix increase, this method can become computationally prohibitive and thus, more efficient algorithms, such as the relaxation based methods in [18], [24], may be considered.…”
Section: A. Maximum Likelihood Estimation
confidence: 99%