2016
DOI: 10.1016/j.csda.2015.10.007
Regularized quantile regression under heterogeneous sparsity with application to quantitative genetic traits

Abstract: Genetic studies often involve quantitative traits. Identifying genetic features that influence quantitative traits can help to uncover the etiology of diseases. The quantile regression method considers the conditional quantiles of the response variable, and is able to characterize the underlying regression structure in a more comprehensive manner. On the other hand, genetic studies often involve high-dimensional genomic features, and the underlying regression structure may be heterogeneous in terms of both effect …

Cited by 27 publications (14 citation statements). References 21 publications.
“…The model is also advantageous in the sense of no conditional assumption about the distribution of the data (Sherwood and Wang 2016). Lastly, the model minimizes the sum of squares of the residuals by adopting the linear programming method (He et al 2016). Despite the listed advantages, the model is ineffective in terms of estimating a large number of fixed effects and exhibits incidental parameter problems when T is small.…”
Section: Methods and Empirical Results (mentioning; confidence: 99%)
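The statement above notes that quantile regression is fit by linear programming. As a minimal illustration (not the cited paper's own implementation; the function name is ours), the check-loss minimization can be cast as a linear program over the positive and negative parts of the residuals and solved with `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

def quantile_regression(X, y, tau=0.5):
    """Fit the tau-th conditional quantile of y given X via linear programming.

    Minimizes the check loss sum_i rho_tau(y_i - x_i' beta), rewritten as an LP
    with positive/negative residual parts u_i, v_i >= 0:
        min  tau * sum(u) + (1 - tau) * sum(v)
        s.t. X beta + u - v = y,  beta free,  u >= 0,  v >= 0.
    """
    n, p = X.shape
    # Decision vector: [beta (p free vars), u (n), v (n)]
    c = np.concatenate([np.zeros(p), tau * np.ones(n), (1 - tau) * np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]
```

With tau = 0.5 this gives median regression; varying tau traces out the conditional quantiles the abstract refers to.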
“…Implications of these challenges are apparent in the drawbacks (Davatzikos, 2004 ) of voxel-based analyses (Ashburner and Friston, 2000 ) and motivate many of the dimension reduction and regularization techniques commonly applied to neuroimaging data. These methods include pattern identification and discrimination (Fan et al, 2007 ), which rely on sample sizes not feasible in small neuroimaging studies, and sparse regularization for feature extraction or selection (Batmanghelich et al, 2011 ; Sabuncu and Van Leemput, 2012 ; He et al, 2016 ; Yu et al, 2019 ; Su et al, 2020 ). Despite successful applications of the latter to medical data (Krishnapuram et al, 2005 ; Zou and Hastie, 2005 ; Ryali et al, 2010 ), these approaches are unstable with respect to selected or extracted features.…”
Section: Introduction (mentioning; confidence: 99%)
“…We formulate this problem in terms of sparse regression with L 1 (lasso), TV, and L 2 group lasso penalties, with the last of these combining information across multiple modalities. These penalties have been widely used in the development of robust methods for medical data (He et al, 2016 ; Yu et al, 2019 ). Together, the penalties regularize spatial effect estimates for better interpretability, both within and between modalities.…”
Section: Introduction (mentioning; confidence: 99%)
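The L1 (lasso) and L2 group-lasso penalties mentioned above are commonly applied through their proximal operators inside an iterative solver. A minimal sketch of the two operators (function names are ours, not from the cited works):

```python
import numpy as np

def soft_threshold(z, lam):
    # Proximal operator of the L1 (lasso) penalty lam * ||z||_1:
    # elementwise shrinkage toward zero, exact zeros below lam.
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def group_soft_threshold(z, lam):
    # Proximal operator of the L2 (group lasso) penalty lam * ||z||_2:
    # shrinks the whole block; the block is set entirely to zero
    # when its norm falls below lam, selecting groups in or out together.
    norm = np.linalg.norm(z)
    if norm <= lam:
        return np.zeros_like(z)
    return (1.0 - lam / norm) * z
```

The elementwise operator yields sparsity within coefficients, while the group operator zeroes out whole blocks, which is how the group lasso combines information across multiple modalities.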
“…To overcome this weakness in this paper, we developed a multicategory generalized DWD method that is capable of performing variable selection and classification simultaneously. Our approach incorporates sparsity and group structure information via the sparse group lasso penalty (see [ 20 , 21 , 22 , 23 , 24 ]).…”
Section: Introduction (mentioning; confidence: 99%)
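For the sparse group lasso penalty cited above, a standard result is that its proximal operator composes elementwise soft-thresholding (the L1 part) with group-level shrinkage (the L2 part). A hedged sketch for a single group, assuming the usual penalty form lam1 * ||z||_1 + lam2 * ||z||_2 (the function name is illustrative):

```python
import numpy as np

def prox_sparse_group_lasso(z, lam1, lam2):
    """Proximal operator of lam1 * ||z||_1 + lam2 * ||z||_2 for one group.

    First apply elementwise soft-thresholding (within-group sparsity),
    then shrink the whole group by its L2 norm (group-level selection).
    """
    s = np.sign(z) * np.maximum(np.abs(z) - lam1, 0.0)  # L1 soft-threshold
    norm = np.linalg.norm(s)
    if norm <= lam2:
        return np.zeros_like(s)  # entire group selected out
    return (1.0 - lam2 / norm) * s
```

This composition is what lets the penalty perform variable selection both within and between groups simultaneously, as the statement describes.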