2018
DOI: 10.1080/10618600.2017.1390470
Gradient-based Regularization Parameter Selection for Problems With Nonsmooth Penalty Functions

Abstract: In high-dimensional and/or non-parametric regression problems, regularization (or penalization) is used to control model complexity and induce desired structure. Each penalty has a weight parameter that indicates how strongly the structure corresponding to that penalty should be enforced. Typically the parameters are chosen to minimize the error on a separate validation set using a simple grid search or a gradient-free optimization method. It is more efficient to tune parameters if the gradient can be determined…
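The idea in the abstract can be made concrete with a toy example. The following is a minimal sketch, not the algorithm of Feng and Simon: it tunes a single ridge (smooth) penalty weight by differentiating the validation error with respect to the weight through the closed-form ridge fit, whereas the paper targets nonsmooth penalties; the data splits X_tr, y_tr, X_val, y_val and the step size are assumed inputs.

# A minimal sketch, not the paper's algorithm: the ridge penalty is smooth
# and its training solution has a closed form, so the validation error can
# be differentiated with respect to the penalty weight by automatic
# differentiation and the weight updated by gradient descent instead of a
# grid search.
import jax
import jax.numpy as jnp

def validation_error(log_lam, X_tr, y_tr, X_val, y_val):
    lam = jnp.exp(log_lam)                      # log scale keeps the weight positive
    p = X_tr.shape[1]
    # Closed-form ridge fit on the training split.
    beta = jnp.linalg.solve(X_tr.T @ X_tr + lam * jnp.eye(p), X_tr.T @ y_tr)
    resid = X_val @ beta - y_val
    return jnp.mean(resid ** 2)                 # validation mean squared error

grad_fn = jax.grad(validation_error)            # gradient w.r.t. log_lam

def tune(X_tr, y_tr, X_val, y_val, steps=100, lr=0.1):
    log_lam = jnp.log(1.0)                      # arbitrary starting weight
    for _ in range(steps):
        g = grad_fn(log_lam, X_tr, y_tr, X_val, y_val)
        log_lam = log_lam - lr * g              # gradient step on validation error
    return jnp.exp(log_lam)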

Cited by 16 publications (22 citation statements) · References 28 publications
“…Additionally, we have found that the performance of SPLAT is not overly sensitive to the precise choices of tuning parameters, as seen in Figure 4. A recent work by Feng and Simon 27 may provide a more elegant approach to address this issue of tuning parameter selection, but we leave a more thorough investigation of this to future work.…”
Section: Algorithm for SPLAT
confidence: 99%
“…the combinations of a set of parameters [1000, 100, 10, 1, 0.1, 0.05, 0.01, 0.005, 0.001, 0.0001] are gradually tuned to obtain the best results. In the future, more effort will be made to investigate optimal parameter selection methods for multiple-constrained regularization with non-smooth penalty functions as a separate study, based on state-of-the-art research outcomes, such as the most recently reported work in [31].…”
Section: Discussion
confidence: 99%
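For comparison, the grid-style tuning described in the quoted passage amounts to evaluating a validation criterion at each candidate weight and keeping the best one. A hedged sketch, where fit_model and validation_mse are hypothetical placeholders rather than functions from the citing paper:

# A hedged sketch of grid-style tuning: each candidate penalty weight is
# evaluated on a validation split and the weight with the smallest
# validation error is kept. fit_model and validation_mse are hypothetical
# placeholders for the citing paper's training and scoring routines.
CANDIDATES = [1000, 100, 10, 1, 0.1, 0.05, 0.01, 0.005, 0.001, 0.0001]

def grid_select(fit_model, validation_mse, candidates=CANDIDATES):
    best_weight, best_err = None, float("inf")
    for weight in candidates:
        model = fit_model(weight)        # train with this penalty weight
        err = validation_mse(model)      # score on held-out data
        if err < best_err:
            best_weight, best_err = weight, err
    return best_weight, best_err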
“…This method is reported to be able to find a solution close to the global optimum. 21 In the validation phase, the MSE is given by:…”
Section: Proposed Methods
confidence: 99%
“…Since there is not more than one hyper-parameter to obtain, the simple grid search method is applied. This method is reported to be able to find a solution close to the global optimum. In the validation phase, the MSE is given by: $\mathrm{MSE} = \frac{1}{M}\sum_{j=1}^{M} e^{2}(j)$, where $M$ is the number of validation data points.…”
Section: Proposed Methods
confidence: 99%
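The validation MSE quoted above is just the average of squared per-sample prediction errors. A small illustrative sketch, not tied to any specific model in the citing paper:

# Illustrative only: the quoted validation MSE averages the squared
# per-sample prediction errors e(j) over the M validation samples.
import numpy as np

def validation_mse(y_true, y_pred):
    e = np.asarray(y_true) - np.asarray(y_pred)   # per-sample errors e(j)
    return np.mean(e ** 2)                        # (1/M) * sum of e(j)^2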