2019
DOI: 10.1016/j.sigpro.2019.02.003

Sparse Bayesian learning with multiple dictionaries

Abstract: Sparse Bayesian learning (SBL) has emerged as a fast and competitive method to perform sparse processing. The SBL algorithm, which is developed using a Bayesian framework, approximately solves a non-convex optimization problem using fixed point updates. It provides comparable performance and is significantly faster than convex optimization techniques used in sparse processing. We propose a signal model which accounts for dictionary mismatch and the presence of errors in the weight vector at low signal-to-noise…

Cited by 66 publications (61 citation statements)
References 49 publications
“…In acoustics this framework has been used for range estimation 30 and for sparse estimation via the sparse Bayesian learning approach. 31,32 In the latter, the sparsity is controlled by a diagonal prior covariance matrix, where entries with zero prior variance will force the posterior variance and mean to be zero. With prior knowledge and assumptions about the data, Bayesian approaches to parameter estimation can prevent overfitting.…”
Section: F Bayesian Machine Learning (mentioning)
confidence: 99%
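The zero-prior-variance mechanism described in the snippet above can be illustrated with a short numerical sketch. This is a minimal real-valued example, not code from the paper; the problem sizes, seed, and variable names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 8, 12                        # sensors, dictionary atoms (illustrative)
A = rng.standard_normal((N, M))     # real-valued dictionary for simplicity
y = rng.standard_normal(N)          # one measurement vector
sigma2 = 0.1                        # noise variance (assumed known)

gamma = rng.uniform(0.5, 2.0, size=M)  # diagonal of the prior covariance
gamma[[2, 7]] = 0.0                    # zero prior variance on two atoms

# Data covariance under the SBL model: Sigma_y = sigma^2 I + A Gamma A^T
Sigma_y = sigma2 * np.eye(N) + (A * gamma) @ A.T

# Posterior mean: mu = Gamma A^T Sigma_y^{-1} y
mu = gamma * (A.T @ np.linalg.solve(Sigma_y, y))

# Diagonal of the posterior covariance: gamma_m - gamma_m^2 * a_m^T Sigma_y^{-1} a_m
post_var = gamma - gamma**2 * np.einsum('nm,nm->m', A, np.linalg.solve(Sigma_y, A))
```

Because the posterior mean is Γ Aᵀ Σ_y⁻¹ y and the posterior covariance is Γ − Γ Aᵀ Σ_y⁻¹ A Γ, any atom with γ_m = 0 has exactly zero posterior mean and variance, which is how the diagonal prior covariance enforces sparsity.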
“…This update rule was used in [42]. Using an exponent of 1 (instead of 0.5) in (36) gives the update equation used in [39,43,57].…”
Section: Source Power Estimation (mentioning)
confidence: 99%
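The multiplicative fixed-point update discussed in the snippets above can be sketched as follows, with the exponent exposed as a parameter `p` (0.5 or 1). This is an illustrative sketch assuming the common SBL data-covariance model Σ = σ²I + AΓAᴴ with a known noise variance; the problem sizes, seed, and variable names are assumptions, not taken from the cited papers:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, L = 10, 30, 50                # sensors, dictionary atoms, snapshots

# Complex Gaussian dictionary with roughly unit-norm columns
A = (rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))) / np.sqrt(2 * N)

# Two active atoms with zero-mean complex normal amplitudes, plus noise
active = [3, 17]
X = np.zeros((M, L), dtype=complex)
X[active] = (rng.standard_normal((2, L)) + 1j * rng.standard_normal((2, L))) / np.sqrt(2)
sigma2 = 0.01
noise = np.sqrt(sigma2 / 2) * (rng.standard_normal((N, L)) + 1j * rng.standard_normal((N, L)))
Y = A @ X + noise

S_y = Y @ Y.conj().T / L            # sample covariance of the snapshots
gamma = np.ones(M)                  # candidate source powers (prior variances)
p = 0.5                             # fixed-point exponent; p = 1 gives the other variant

for _ in range(300):
    # Model data covariance: Sigma = sigma^2 I + A Gamma A^H
    Sigma = sigma2 * np.eye(N) + (A * gamma) @ A.conj().T
    SiA = np.linalg.solve(Sigma, A)                              # Sigma^{-1} A
    num = np.real(np.einsum('nm,nm->m', SiA.conj(), S_y @ SiA))  # a^H Sigma^{-1} S_y Sigma^{-1} a
    den = np.real(np.einsum('nm,nm->m', A.conj(), SiA))          # a^H Sigma^{-1} a
    gamma = gamma * (num / den) ** p                             # multiplicative update
```

After convergence, `gamma` peaks at the active atoms while the remaining entries are driven toward zero; the choice of exponent trades convergence speed against step aggressiveness.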
“…Specifically, we derive noise variance estimates and demonstrate this for compressive beamforming [37][38][39][40][41] using multiple measurement vectors (MMV), also called multiple snapshots. We solve the MMV problem using the sparse Bayesian learning (SBL) framework [39,42,43]. We assume the source signals to jointly follow a zero-mean multivariate complex normal distribution with unknown power levels.…”
Section: Introduction (mentioning)
confidence: 99%
“…Second, the SBL procedure has been shown to be robust to undesirable dictionary structure such as the combination of high coherence 2 and diverse column magnitudes [13], structure that violates sufficient conditions for ℓ1-based sparse estimation algorithms but that nonetheless commonly appears in real-world applications such as imaging [14]. Third, many applications make use of the SBL framework because its probabilistic framework allows other application-specific priors or constraints to be incorporated into the inference process in a simple and interpretable way (e.g., [15]). Finally, the SBL inference procedure allows the noise variance (sparsity) parameter to be learned automatically, reducing the need for manual parameter tuning compared with similar algorithms.…”
Section: Introduction (mentioning)
confidence: 99%