2019
DOI: 10.48550/arxiv.1905.10432
Preprint
Cross validation approaches for penalized Cox regression

Abstract: Cross validation is commonly used for selecting tuning parameters in penalized regression, but its use in penalized Cox regression models has received relatively little attention in the literature. Due to its partial likelihood construction, carrying out cross validation for Cox models is not straightforward, and there are several potential approaches for implementation. Here, we propose two new cross-validation methods for Cox regression and compare them to approaches that have been proposed elsewhere. Our pr…

Cited by 4 publications (7 citation statements)
References 20 publications

“…For example, glmnet uses cross-validated partial likelihood. In our implementation, however, we instead implement a cross-validated linear predictors criterion proposed in Dai and Breheny (2019). Specifically, for each of the J populations, we randomly assign subjects to one of K folds, K_1^{(j)}, …”
Section: Tuning Criterion
confidence: 99%
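
The cross-validated linear predictors criterion referenced above can be sketched as follows: out-of-fold linear predictors are assembled across folds and then plugged into a single partial likelihood evaluated on the full data set. This is a minimal illustration, not the cited implementation; `fit_fn` is a placeholder for whatever penalized Cox fitter is being used, and the function names and fold assignment are assumptions.

```python
import numpy as np

def cox_partial_loglik(time, status, eta):
    # Breslow-style partial log-likelihood of linear predictors eta on the
    # full data (no special handling of ties).
    order = np.argsort(time)
    time, status, eta = time[order], status[order], eta[order]
    # Risk set of subject i = everyone still at risk at time_i; a reverse
    # cumulative sum of exp(eta) gives the risk-set denominators.
    log_denom = np.log(np.cumsum(np.exp(eta)[::-1])[::-1])
    return np.sum(status * (eta - log_denom))

def cv_linear_predictors_criterion(X, time, status, lam, fit_fn, K=10, seed=None):
    # Assemble out-of-fold linear predictors, then evaluate the partial
    # likelihood once on the full data set.
    rng = np.random.default_rng(seed)
    folds = rng.integers(0, K, size=len(time))   # random fold assignment
    eta_cv = np.empty(len(time))
    for k in range(K):
        test = folds == k
        # fit_fn is a placeholder for any penalized Cox fitter that returns a
        # coefficient vector (e.g. a lasso-penalized fit at penalty lam).
        beta_k = fit_fn(X[~test], time[~test], status[~test], lam)
        eta_cv[test] = X[test] @ beta_k          # out-of-fold linear predictors
    return cox_partial_loglik(time, status, eta_cv)
```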
“…We found (10) worked better than cross-validated partial likelihood, particularly for applications where many subjects are censored. This is partly because cross-validated linear predictors do not require constructing a risk set separately for each fold: see Dai and Breheny (2019) for more on this approach.…”
Section: Tuning Criterion
confidence: 99%
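
For comparison, a basic within-fold cross-validated partial likelihood looks like the sketch below (reusing `cox_partial_loglik` and `fit_fn` from the sketch above). The difference is that the partial likelihood is evaluated on each held-out fold separately, so the risk sets are built from that fold alone; with heavy censoring a fold may contain very few events, which is the instability the excerpt alludes to. This is a generic sketch, not necessarily the exact criterion used by glmnet.

```python
def cv_partial_likelihood_criterion(X, time, status, lam, fit_fn, K=10, seed=None):
    # Evaluate the partial likelihood fold by fold: each held-out fold gets
    # its own risk sets, built only from the subjects in that fold.
    rng = np.random.default_rng(seed)
    folds = rng.integers(0, K, size=len(time))
    total = 0.0
    for k in range(K):
        test = folds == k
        beta_k = fit_fn(X[~test], time[~test], status[~test], lam)
        eta_test = X[test] @ beta_k
        total += cox_partial_loglik(time[test], status[test], eta_test)
    return total
```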
“…While it is clear that the solution to (6) is the solution to (3), there are many redundancies in the n² variables θ_{i,j}, which would impose a substantial burden on memory and storage. Instead, we use that θ_{i,j} = −θ_{j,i} and the fact that if δ_i = 0 and δ_j = 0, the value of θ_{i,j} does not affect (6) to reduce the number of constraints. Thus, letting…”
Section: Formulation and Updating Equations
confidence: 99%
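
The reduction described in this excerpt can be illustrated with a small counting sketch (a hypothetical helper, not the cited paper's code): antisymmetry means only pairs with i < j need their own variable, and pairs in which both subjects are censored can be dropped entirely.

```python
import numpy as np

def reduced_pair_indices(status):
    # Keep one variable per unordered pair (i < j), using theta_{i,j} = -theta_{j,i},
    # and drop pairs where both subjects are censored (delta_i = delta_j = 0),
    # since those do not enter the objective.
    n = len(status)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if status[i] == 1 or status[j] == 1]

# With heavy censoring the reduction is substantial:
status = np.array([1, 0, 0, 1, 0, 0])
print(len(reduced_pair_indices(status)), "of", len(status) * (len(status) - 1) // 2, "pairs kept")
# -> 9 of 15 pairs kept
```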
“…Tuning parameters are often chosen by cross-validation, which requires the choice of a performance metric. In this section, we propose a new performance metric inspired by that of Dai and Breheny [6], who studied various approaches for tuning parameter selection when fitting proportional hazards models. Let V_1, …”
Section: A New Tuning Parameter Selection Criterion
confidence: 99%
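
Since the excerpt is cut off before the new metric is defined, only the generic selection loop it plugs into is sketched here: compute the cross-validated metric over a grid of tuning parameters and keep the maximizer. The grid, the `cv_criterion` callable, and the assumption that larger values are better are all placeholders.

```python
import numpy as np

def select_tuning_parameter(lambdas, cv_criterion):
    # cv_criterion(lam) -> float is whatever fold-based performance metric
    # is in use (left abstract here); larger is assumed to be better.
    scores = np.array([cv_criterion(lam) for lam in lambdas])
    return lambdas[int(np.argmax(scores))], scores

# Example usage with a placeholder grid:
# lam_best, scores = select_tuning_parameter(np.logspace(-3, 0, 30), cv_criterion)
```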
“…Here (h, β)_prev and (h, β)_new are the previous and new estimated set of covariates, respectively. To validate that the selected covariates do not overfit the patient data, we use leave-one-out cross-validation (LOOCV) on the dataset and predict linear estimators [3] as η_i = h_i · β_{−i} and η = (η_1, η_2, …”
Section: Survival Analysis
confidence: 99%
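
The leave-one-out construction in this excerpt corresponds to the linear-predictors idea with K = n; a minimal sketch is below. `fit_fn` is again a placeholder for the Cox fitter, and the assembled η vector would then be scored against the observed survival data, for example with a full-data partial likelihood as in the earlier sketch.

```python
import numpy as np

def loocv_linear_estimators(H, time, status, fit_fn):
    # eta_i = h_i . beta_{-i}, where beta_{-i} is estimated with subject i
    # left out; the assembled vector eta = (eta_1, ..., eta_n) can then be
    # evaluated against the observed survival data to check for overfitting.
    n = len(time)
    eta = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        beta_i = fit_fn(H[keep], time[keep], status[keep])  # leave subject i out
        eta[i] = H[i] @ beta_i
    return eta
```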