2016
DOI: 10.1016/j.csda.2015.08.019
A modified local quadratic approximation algorithm for penalized optimization problems

Cited by 32 publications (26 citation statements)
References 23 publications
“…That is, LASSO automatically deletes unnecessary covariates. LASSO is known to have many desirable properties for regression models with a large number of covariates, and various efficient optimization algorithms are available for linear regression as well as for generalized linear models [8-10]. To our knowledge, this study is the first to develop a logistic LASSO regression model for diagnosing breast cancer based on radiologic findings and CDD.…”
Section: Introduction
Confidence: 99%
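The logistic LASSO mentioned above can be illustrated with a minimal sketch: an L1-penalized logistic regression fitted by proximal gradient descent (ISTA). This is a generic illustration of how the L1 penalty zeroes out unnecessary covariates, not the cited study's implementation; the step size and penalty level are assumptions chosen for standardized data.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the proximal map of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def logistic_lasso(X, y, lam, step=0.1, n_iter=500):
    """Fit L1-penalized logistic regression by proximal gradient
    descent (ISTA).  y must be coded in {0, 1}.  Illustrative sketch:
    step=0.1 assumes roughly standardized columns of X."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        mu = 1.0 / (1.0 + np.exp(-X @ beta))   # fitted probabilities
        grad = X.T @ (mu - y) / n              # gradient of mean NLL
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta
```

At convergence, coordinates whose score is below the penalty level are exactly zero, which is the automatic deletion of covariates the excerpt refers to.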
“…This modification enables the MLQA algorithm to possess the ascent property (Lee et al., 2016). For readers, we summarize the algorithm in Algorithm 1.…”
Section: Algorithm
Confidence: 99%
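The local quadratic approximation idea behind MLQA can be sketched as follows: the nonsmooth penalty is majorized by a quadratic at the current iterate, and the resulting ridge-type problem is solved; the "modified" part adds a safeguard so each iterate never worsens the true objective (the monotonicity the excerpt calls the ascent property). This is a simplified illustration for lasso-penalized least squares with a backtracking safeguard, not the authors' exact Algorithm 1.

```python
import numpy as np

def lqa_lasso(X, y, lam, n_iter=50, eps=1e-8):
    """Local quadratic approximation (LQA) for lasso-penalized least
    squares with a monotone backtracking safeguard, in the spirit of
    the modified LQA.  Illustrative sketch, not the cited algorithm."""
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # start from OLS

    def objective(b):
        r = y - X @ b
        return 0.5 * r @ r / n + lam * np.sum(np.abs(b))

    for _ in range(n_iter):
        # Quadratic majorizer of |b_j| at the current iterate:
        # |b_j| <= b_j^2 / (2|beta_j|) + |beta_j| / 2
        w = lam / (np.abs(beta) + eps)           # per-coordinate ridge weights
        cand = np.linalg.solve(X.T @ X / n + np.diag(w), X.T @ y / n)
        # Safeguard: back off toward the current iterate until the
        # true objective does not increase (monotonicity guarantee).
        t = 1.0
        while objective(beta + t * (cand - beta)) > objective(beta) and t > 1e-4:
            t *= 0.5
        beta = beta + t * (cand - beta)
    return beta
```

Without the safeguard, plain LQA can stall or oscillate once coefficients approach zero; the backtracking step is what makes the iteration monotone.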
“…We show that under some regularity conditions, the MCL is asymptotically equivalent to an oracle-type estimator which is selection consistent even when the number of variables is larger than the sample size. We also develop an efficient algorithm for the MCL by applying the concave-convex procedure (Yuille and Rangarajan, 2003) and the modified local quadratic approximation algorithm (Lee et al., 2016). We conduct numerical studies via simulations and data analysis to show that the MCL can perform better than other penalized estimators for the high-dimensional GLM.…”
Section: Introduction
Confidence: 99%
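The concave-convex procedure (CCCP) used in the excerpt splits a nonconvex penalty into a convex L1 part plus a differentiable concave part, then linearizes the concave part so each step is a weighted lasso. As a sketch using the familiar SCAD penalty (a standard nonconvex example, not the MCL penalty itself; a = 3.7 is the conventional default):

```python
import numpy as np

def scad(t, lam, a=3.7):
    """SCAD penalty, evaluated elementwise on |t|."""
    t = np.abs(t)
    return np.where(t <= lam, lam * t,
           np.where(t <= a * lam,
                    (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1)),
                    lam**2 * (a + 1) / 2))

def scad_concave_grad(t, lam, a=3.7):
    """Derivative of the concave part h(t) = scad(t) - lam * t
    (t >= 0), which CCCP linearizes at the current iterate."""
    t = np.abs(t)
    return np.where(t <= lam, 0.0,
           np.where(t <= a * lam, (a * lam - t) / (a - 1) - lam, -lam))
```

Each CCCP step then penalizes coordinate j with the effective weight `lam + scad_concave_grad(|beta_j|, lam)`; for large coefficients this weight is exactly zero, which is how SCAD-type penalties avoid the bias of the plain lasso.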
“…One main reason comes from the lack of efficient computational algorithms that implement the penalized estimators. Although some unified algorithms have been studied before (Kwon and Kim, 2012; Lee et al., 2016), data analysts still find it cumbersome to work with nonconvex penalties for multinomial logistic regression.…”
Section: Introduction
Confidence: 99%