2016 IEEE International Conference on Image Processing (ICIP)
DOI: 10.1109/icip.2016.7533085
Robust Bayesian method for simultaneous block sparse signal recovery with applications to face recognition

Abstract: In this paper, we present a novel Bayesian approach to recover simultaneously block sparse signals in the presence of outliers. The key advantage of our proposed method is the ability to handle nonstationary outliers, i.e. outliers which have time varying support. We validate our approach with empirical results showing the superiority of the proposed method over competing approaches in synthetic data experiments as well as the multiple measurement face recognition problem.

Cited by 10 publications (10 citation statements); references 23 publications.
“…The existing SSR algorithms are commonly developed by simplifying the NP-hard problem into a constrained optimization problem, either through greedy approximations or by applying L1-norm or Lp-norm constraints on the decomposition coefficients to find sub-optimal solutions. Many SSR algorithms have been developed in the past two decades, such as matching pursuit (MP) [4], greedy basis pursuit [5], sparse Bayesian learning (SBL) [6], and nonconvex regularization [7], and applications of SSR extend into many fields [8][9][10][11][12].…”
Section: Ultrasonic Echo Estimation via Sparse Signal Representation
confidence: 99%
“…RVM has been shown to achieve much higher SR rates than many deterministic approaches [31] and has the potential to outperform algorithms based on the formulation in (1) in the re-id task. We build upon the signal model we introduced in [32] and refer to our approach as the Relevance Subject Machine (RSM).…”
Section: Contribution
confidence: 99%
“…The combination of block sparsity, robustness to sparse noise, and joint sparsity over multiple observations was first studied by us in [32]. The signal model considered was…”
Section: Bayesian Sparse Recovery
confidence: 99%
“…This letter considers the recovery¹ of high dimensional structured sparse signals from low dimensional linear measurements, a problem relevant in many signal processing and machine learning applications [1]-[4]. This letter considers two structured sparse recovery scenarios given by a) the multiple measurement vector (MMV) model and b) the block sparse (BS) model.…”
Section: Introduction
confidence: 99%
“…In BS, we consider a regression model with L = 1. However, the p entries of B are divided into p_b = p/l_b non-overlapping blocks of equal size l_b such that the entries in…
¹ The following notations are used: X[i, j] denotes the (i, j)-th entry of a matrix X. X[:, I] and X[I, :] denote the columns and rows of matrix X indexed by I. X^T, X^-1, and X^† represent the transpose, inverse, and Moore-Penrose pseudoinverse of X, respectively.”
Section: Introduction
confidence: 99%
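The MMV and block-sparse structures described in the excerpts above can be sketched in NumPy. This is a minimal illustration only: the dimensions, the random dictionary A, and the least-squares baseline are assumptions for demonstration, not the Bayesian recovery algorithm proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper):
# n measurements, p unknowns, L observation vectors (MMV case),
# and p_b = p / l_b non-overlapping blocks of equal size l_b.
n, p, L, l_b = 20, 40, 3, 4
p_b = p // l_b

# Block-sparse coefficient matrix B: only a few blocks are active,
# and in the MMV setting the active blocks are shared across all L columns.
B = np.zeros((p, L))
active_blocks = rng.choice(p_b, size=2, replace=False)
for k in active_blocks:
    B[k * l_b:(k + 1) * l_b, :] = rng.standard_normal((l_b, L))

# Linear measurements Y = A B + noise.
A = rng.standard_normal((n, p))
Y = A @ B + 0.01 * rng.standard_normal((n, L))

# The notation from the quoted footnote, in NumPy terms:
#   X[i, j]  -> X[i, j]
#   X[:, I]  -> X[:, I]              (columns indexed by I)
#   X[I, :]  -> X[I, :]              (rows indexed by I)
#   X^T      -> X.T
#   X^-1     -> np.linalg.inv(X)
#   X^†      -> np.linalg.pinv(X)    (Moore-Penrose pseudoinverse)
B_ls = np.linalg.pinv(A) @ Y  # least-squares baseline; ignores block structure

print(B_ls.shape)
```

A structure-aware recovery method (such as the paper's Bayesian approach) would exploit the shared block support across the L columns, which the plain pseudoinverse baseline above does not.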