2019
DOI: 10.1088/1751-8121/ab3e89
Cross validation in sparse linear regression with piecewise continuous nonconvex penalties and its acceleration

Abstract: We investigate the signal reconstruction performance of sparse linear regression in the presence of noise when piecewise continuous nonconvex penalties are used. Among such penalties, we focus on the smoothly clipped absolute deviation (SCAD) penalty. The contributions of this study are three-fold: We first present a theoretical analysis of a typical reconstruction performance, using the replica method, under the assumption that each component of the design matrix is given as an independent and identically distributed …
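For context, the SCAD penalty named in the abstract has the standard piecewise form introduced by Fan and Li (2001); the notation below (regularization parameter \lambda > 0, nonconvexity parameter a > 2) follows that convention rather than the paper itself:

p_\lambda(\beta) =
  \begin{cases}
    \lambda |\beta|, & |\beta| \le \lambda, \\
    \dfrac{2 a \lambda |\beta| - \beta^2 - \lambda^2}{2(a-1)}, & \lambda < |\beta| \le a\lambda, \\
    \dfrac{\lambda^2 (a+1)}{2}, & |\beta| > a\lambda.
  \end{cases}

The penalty agrees with the \ell_1 penalty near the origin, bends continuously in the intermediate region, and saturates at a constant for large |\beta|, which makes it a piecewise-defined, continuous, and nonconvex penalty.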

Cited by 4 publications (2 citation statements)
References: 43 publications
“…Annealing the values of the nonconvexity parameters in signal reconstruction is a possible solution to achieve fine tuning. In linear regression problems with piecewise nonconvex penalties, annealing the nonconvexity parameters is efficient for obtaining a stable solution path and the associated cross-validation error [23]. Further, monitoring the time evolution of ε is significant for efficient control.…”
Section: Approximate Message Passing With Nonconvexity Control
Confidence: 99%
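The annealing strategy described in this passage can be illustrated schematically. The sketch below is a hypothetical warm-started ISTA loop with the SCAD thresholding rule, lowering the nonconvexity parameter a step by step; it is not the algorithm of the cited paper [23], and the function names, schedule, and unit step size are assumptions made for the illustration.

import numpy as np

def scad_threshold(z, lam, a):
    # SCAD thresholding rule (Fan and Li, 2001) for unit step size; requires a > 2.
    out = np.zeros_like(z)
    absz = np.abs(z)
    small = absz <= 2 * lam
    mid = (absz > 2 * lam) & (absz <= a * lam)
    large = absz > a * lam
    out[small] = np.sign(z[small]) * np.maximum(absz[small] - lam, 0.0)
    out[mid] = ((a - 1) * z[mid] - np.sign(z[mid]) * a * lam) / (a - 2)
    out[large] = z[large]
    return out

def scad_ista_annealed(A, y, lam, a_schedule, n_iter=200):
    # Warm-started ISTA: the estimate from one value of a initializes the next stage.
    # Assumes the spectral norm of A is at most 1 so that a unit step size is valid.
    x = np.zeros(A.shape[1])
    for a in a_schedule:          # e.g. from large a (nearly convex) to small a
        for _ in range(n_iter):
            x = scad_threshold(x + A.T @ (y - A @ x), lam, a)
    return x

Warm starting across the schedule is what keeps the solution path stable: each stage starts from the estimate obtained with a slightly less nonconvex penalty instead of from scratch.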
“…Actually, AMP for LASSO is one case of a mismatched model, but its convergence is guaranteed due to the convex nature of LASSO [32]. The failure of AMP can occur when the mismatched model is defined by a non-convex cost function [33].…”
Section: Introduction
Confidence: 99%
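For context on the AMP-for-LASSO iteration mentioned in this passage, here is a minimal sketch of the standard AMP recursion with a soft-thresholding denoiser and Onsager correction (in the spirit of Donoho, Maleki, and Montanari); it is not code from the cited works [32, 33], and the fixed threshold lam and the synthetic-data setup are illustrative assumptions.

import numpy as np

def soft_threshold(x, t):
    # Elementwise soft-thresholding denoiser eta(x; t).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def amp_lasso(A, y, lam, n_iter=50):
    # Standard AMP iteration for LASSO with a fixed threshold lam.
    # A: (n, p) design matrix, y: (n,) observations.
    n, p = A.shape
    x = np.zeros(p)                      # current estimate
    z = y.copy()                         # corrected residual
    for _ in range(n_iter):
        r = A.T @ z + x                  # effective noisy estimate fed to the denoiser
        x_new = soft_threshold(r, lam)
        onsager = (np.count_nonzero(x_new) / n) * z   # Onsager reaction term
        z = y - A @ x_new + onsager
        x = x_new
    return x

# Small synthetic example (illustrative only)
rng = np.random.default_rng(0)
n, p, k = 200, 400, 20
A = rng.standard_normal((n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[rng.choice(p, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true + 0.01 * rng.standard_normal(n)
x_hat = amp_lasso(A, y, lam=0.05)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

The Onsager term is what distinguishes AMP from plain iterative soft thresholding; as the quoted passage notes, convergence of this recursion is guaranteed for the convex LASSO cost but can fail when the cost function is nonconvex.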