2018
DOI: 10.1109/tsp.2018.2791949

Weighted LASSO for Sparse Recovery With Statistical Prior Support Information

Cited by 29 publications (10 citation statements)
References 16 publications
“…2) Standard LASSO: The channel is recovered by solving the convex optimization problem (5). Standard LASSO estimates the channel without exploiting the burst sparsity property [24], [27]. 3) Burst LASSO: In this approach, standard LASSO is modified by using a lifting transformation to convert the burst sparsity to the block model [13], [25].…”
Section: Numerical Results (mentioning)
confidence: 99%
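The excerpt above uses standard LASSO as a baseline for sparse channel recovery. As a point of reference, here is a minimal sketch of that baseline, assuming the usual formulation min ||y - Ah||^2 + lam*||h||_1; the specific problem (5) of the citing paper and the lifting transformation of burst LASSO are not reproduced here, and all dimensions and the regularization weight are illustrative assumptions.

```python
# Hedged sketch of standard LASSO sparse recovery (baseline only).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

n, m, k = 256, 64, 8                          # channel length, measurements, sparsity (assumed)
A = rng.standard_normal((m, n)) / np.sqrt(m)  # sensing / pilot matrix (assumed Gaussian)
h = np.zeros(n)
h[rng.choice(n, k, replace=False)] = rng.standard_normal(k)  # sparse channel
y = A @ h + 0.01 * rng.standard_normal(m)     # noisy measurements

lam = 0.01                                    # regularization weight, hand-picked for the sketch
lasso = Lasso(alpha=lam, fit_intercept=False, max_iter=10000)
lasso.fit(A, y)
h_hat = lasso.coef_                           # LASSO channel estimate

print("recovered support:", np.flatnonzero(np.abs(h_hat) > 1e-3))
```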
“…The challenge yet to be overcome is that these algorithms have been developed for certain special cases of massive MIMO channels, e.g., where the users' channels share the same scatterers in the propagation environment or the same sparsity pattern in the sub-channels, properties that may not hold in many scenarios. Another direction taken in the literature [22]- [24] is to improve CS-based methods by exploiting partial support information. In all these works, the channel recovery problem is formulated as a weighted linear optimization problem.…”
Section: Introduction (mentioning)
confidence: 99%
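The excerpt above refers to weighting an l1-regularized recovery problem with partial support information, which is the theme of the cited paper. The sketch below is a generic weighted LASSO with a prior support, not the specific weight rule of the paper or of [22]-[24]; the weight values and the column-rescaling reduction to a standard LASSO are illustrative assumptions.

```python
# Hedged sketch: weighted LASSO with (possibly imperfect) prior support information.
# Indices the prior marks as likely nonzero receive a smaller l1 weight.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n, m, k = 256, 48, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)
true_support = rng.choice(n, k, replace=False)
h = np.zeros(n)
h[true_support] = rng.standard_normal(k)
y = A @ h + 0.01 * rng.standard_normal(m)

# Imperfect prior support: most true indices plus a couple of wrong ones (assumption).
prior_support = np.concatenate([true_support[:6], rng.choice(n, 2, replace=False)])
w = np.ones(n)
w[prior_support] = 0.3                    # weaker penalty where the prior says "active"

# min ||y - A x||^2 + lam * sum_i w_i |x_i| reduces to a standard LASSO
# via the change of variables z_i = w_i * x_i (i.e. rescaling column i by 1/w_i).
A_tilde = A / w
lasso = Lasso(alpha=0.01, fit_intercept=False, max_iter=10000)
lasso.fit(A_tilde, y)
h_hat = lasso.coef_ / w                   # undo the change of variables

print("estimated support:", np.flatnonzero(np.abs(h_hat) > 1e-3))
```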
“…We note that Lian et al. recently studied (4) from both theoretical and experimental aspects in [28],…”
Section: Introduction (mentioning)
confidence: 89%
“…Compared with other model selection methods such as the least absolute shrinkage and selection operator (LASSO) method [27], the LARS method is more stable and computationally efficient. Nevertheless, some factors may negatively affect the accuracy and stability of the LARS method.…”
Section: The Adaptive LARS Method, 1) The Potential Problems of the LARS Method (mentioning)
confidence: 99%
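For context on the LARS procedure mentioned above, the following is a minimal sketch of computing a LARS regularization path with scikit-learn's lars_path; the data, dimensions, and noise level are illustrative assumptions and the adaptive LARS variant discussed in the citing paper is not implemented here.

```python
# Hedged sketch: LARS path on synthetic sparse regression data.
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(2)
n_samples, n_features, k = 80, 40, 5
X = rng.standard_normal((n_samples, n_features))
beta = np.zeros(n_features)
beta[rng.choice(n_features, k, replace=False)] = rng.standard_normal(k)
y = X @ beta + 0.05 * rng.standard_normal(n_samples)

# alphas: decreasing regularization levels along the path;
# coefs[:, j] is the coefficient vector at step j;
# with method="lar" variables are added one at a time and never dropped.
alphas, active, coefs = lars_path(X, y, method="lar")
print("first variables to enter the model:", active[:k])
```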