2019
DOI: 10.1109/tsp.2019.2943225
Non-Negative Orthogonal Greedy Algorithms

Abstract: Orthogonal greedy algorithms are popular sparse signal reconstruction algorithms. Their principle is to select atoms one by one. A series of unconstrained least-squares subproblems of gradually increasing size is solved to compute the approximation coefficients, which is efficiently performed using a fast recursive update scheme. When dealing with non-negative sparse signal reconstruction, a series of non-negative least-squares subproblems has to be solved. Fast implementation becomes tricky since each subproblem…
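The selection-then-refit loop the abstract describes can be sketched as a plain Orthogonal Matching Pursuit in Python. This is an illustrative sketch assuming numpy is available; the function name `omp` is mine, and the paper's fast recursive update scheme is not reproduced (each least-squares problem is re-solved from scratch here).

```python
import numpy as np

def omp(A, b, k):
    """Sketch of Orthogonal Matching Pursuit: select k atoms one by one,
    re-solving an unconstrained least-squares problem on the growing support.
    Illustrative only; no recursive (Cholesky/QR update) acceleration."""
    n = A.shape[1]
    support = []
    x = np.zeros(n)
    residual = b.copy()
    for _ in range(k):
        # select the atom most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        # unconstrained least squares restricted to the selected atoms
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        x = np.zeros(n)
        x[support] = coef
        residual = b - A @ x
    return x, support
```

On a noiseless, sufficiently incoherent instance this loop recovers the true support exactly, which is the baseline behavior the non-negative variants below aim to preserve.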

Cited by 33 publications (27 citation statements) | References 42 publications
“…Popular examples are orthogonal matching pursuit (OMP) [13] and orthogonal least squares (OLS) [14]. Nonnegative variants of these algorithms have been proposed; see [15] and the references therein. They aim to solve (2) heuristically, but theoretical recovery guarantees are similarly limited.…”
Section: Related Work
confidence: 99%
“…• Nonnegative OMP (NNOMP) and Nonnegative OLS (NNOLS) [15], for which the Matlab codes are provided by the authors. Synthetic test cases are built by generating a random matrix A ∈ R_+^{m×n} and a random k-sparse vector x_true ∈ R_+^n, computing b = A x_true, and trying to recover x_true with A, b and k as parameters of the sparse NNLS algorithm.…”
Section: Comparison On Synthetic Datasets
confidence: 99%
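The synthetic protocol quoted above is easy to reproduce. Below is a minimal sketch assuming numpy and scipy are available; the helper name `make_sparse_nnls_instance` is mine. An overdetermined instance (m > n) is used so that plain `scipy.optimize.nnls` already recovers x_true exactly in the noiseless case, which makes it a convenient sanity check before benchmarking greedy solvers.

```python
import numpy as np
from scipy.optimize import nnls

def make_sparse_nnls_instance(m, n, k, rng):
    """Build a synthetic test case as described in the quoted protocol:
    a random non-negative matrix A and a random k-sparse non-negative
    vector x_true, with noiseless observation b = A @ x_true."""
    A = rng.random((m, n))                   # A in R_+^{m x n}
    x_true = np.zeros(n)
    support = rng.choice(n, size=k, replace=False)
    x_true[support] = rng.random(k) + 0.5    # strictly positive entries
    return A, x_true, A @ x_true
```

Since A has full column rank with probability one and x_true is feasible with zero residual, the NNLS minimizer is unique and equals x_true.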
“…For example, the ℓ1 relaxation can be easily extended to the nonnegative setting [5,6]. Nonnegative extensions of greedy algorithms have also been introduced [7,8].…”
Section: Introduction
confidence: 99%
“…This yields an increase of computation time since NNLS subproblems do not have closed-form solutions, and an iterative subroutine is needed. In [27], following the early work of [36], we proposed fully recursive implementations. We showed that non-negative greedy algorithms yield accurate empirical results and that their computation cost is of the same order of magnitude as those of Oxx for moderate size problems.…”
Section: Introduction
confidence: 99%
“…Non-negative OMP was first introduced by Bruckstein et al [5] under the name OMP, and then renamed NNOMP in [36] (see also [21,27]). It relies on the repeated maximization of the positive inner product between the residual vector and the dictionary atoms, followed by the resolution of an NNLS problem.…”
Section: Introduction
confidence: 99%
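The NNOMP principle quoted above (pick the atom with the largest positive inner product with the residual, then refit by NNLS on the support) can be sketched as follows. This is a naive illustration assuming numpy and scipy; the function name `nnomp` is mine, and the paper's fast recursive active-set implementation is not reproduced — each NNLS subproblem is solved from scratch with `scipy.optimize.nnls`.

```python
import numpy as np
from scipy.optimize import nnls

def nnomp(A, b, k):
    """Naive sketch of NNOMP: repeated maximization of the positive inner
    product between residual and atoms, followed by an NNLS refit on the
    current support. Stops early if no atom has positive correlation."""
    n = A.shape[1]
    support = []
    x = np.zeros(n)
    residual = b.copy()
    for _ in range(k):
        corr = A.T @ residual
        j = int(np.argmax(corr))          # positive inner product criterion
        if corr[j] <= 0:                  # no atom can reduce the residual
            break
        support.append(j)
        coef, _ = nnls(A[:, support], b)  # non-negative least squares refit
        x = np.zeros(n)
        x[support] = coef
        residual = b - A @ x
    return x, support
```

Note the key difference from plain OMP: the selection uses the signed (not absolute) correlation, and the refit enforces x ≥ 0, so each NNLS subproblem needs an iterative solver rather than a closed-form least-squares update.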