2018
DOI: 10.1214/17-aos1567
Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space

Abstract: This paper considers the estimation of the sparse additive quantile regression (SAQR) in high-dimensional settings. Given the nonsmooth nature of the quantile loss function and the nonparametric complexities of the component function estimation, it is challenging to analyze the theoretical properties of ultrahigh-dimensional SAQR. We propose a regularized learning approach with a two-fold Lasso-type regularization in a reproducing kernel Hilbert space (RKHS) for SAQR. We establish nonasymptotic oracle inequali…
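The nonsmoothness the abstract refers to is that of the quantile check (pinball) loss, which is kinked at the origin. A minimal sketch of this standard loss (the function name and vectorized form are illustrative, not from the paper):

```python
import numpy as np

def pinball_loss(u, tau):
    """Quantile check (pinball) loss: rho_tau(u) = u * (tau - 1{u < 0}).

    Nonsmooth at u = 0, which is the source of the analytical
    difficulty for quantile regression noted in the abstract.
    """
    u = np.asarray(u, dtype=float)
    return u * (tau - (u < 0).astype(float))
```

For residual u = y - f(x), positive residuals are weighted by tau and negative ones by 1 - tau, so minimizing the average loss targets the conditional tau-quantile rather than the conditional mean.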

Cited by 47 publications (30 citation statements). References 47 publications.
“…Under the sparse and additive settings, similar results have been published for various stationary models such as additive regression (Huang et al.; Meier et al.; Ravikumar et al.) and additive quantile regression (Lv et al.). The existing additive regressions correspond to the least-squares loss or quantile loss, which are sums of i.i.d. random variables.…”
Section: Theoretical Results (supporting)
confidence: 60%
“…This has been verified under some mild conditions on the underlying distribution (Steinwart and Christmann 2011; Lv et al. 2016).…”
Section: Assumption A1 (mentioning)
confidence: 56%
“…While the first combined penalty leads to a group-lasso formulation, it appears to lack theoretical justification. The second term has been proved to enjoy some theoretical properties (Lv et al. 2016; Raskutti et al. 2012); however, it still requires solving a second-order cone program (SOCP). To enjoy both theoretical properties and computational efficiency, we propose to equip the additive QR model with the new combination of the sparsity and smoothness penalty I n ( f ) in (2).…”
Section: Regularized QR With Smoothness-Sparsity Penalty (mentioning)
confidence: 99%
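The two-fold penalty discussed in this citation statement combines an empirical norm (for sparsity across components) with an RKHS norm (for smoothness), in the style of Raskutti et al. and Lv et al. A hedged sketch for one additive component of the form f_j(·) = Σ_i α_i k(·, x_i), where the function name and the specific weighting λ₁·‖f‖_n + λ₂·‖f‖_H are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def twofold_penalty(K, alpha, lam1, lam2):
    """Sketch of a two-fold Lasso-type penalty for one RKHS component.

    For f_j(.) = sum_i alpha_i k(., x_i) with Gram matrix K:
      - empirical norm  ||f_j||_n = sqrt(mean of f_j(x_i)^2)  (sparsity)
      - RKHS norm       ||f_j||_H = sqrt(alpha' K alpha)      (smoothness)
    Returns lam1 * ||f_j||_n + lam2 * ||f_j||_H.
    """
    f_vals = K @ alpha                        # f_j evaluated at sample points
    emp_norm = np.sqrt(np.mean(f_vals ** 2))  # ||f_j||_n
    rkhs_norm = np.sqrt(alpha @ K @ alpha)    # ||f_j||_H
    return lam1 * emp_norm + lam2 * rkhs_norm
```

Summing this penalty over the d additive components and adding it to the averaged pinball loss yields a regularized objective of the kind the statement contrasts with the SOCP-based alternatives.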
“…We get H_W = L^2(J^m) iff W ⊂ P(R^m) contains all the rational points. For additional details regarding reproducing kernel Hilbert spaces, see, e.g., [28,29,30,31].…”
Section: Fourier Expansion (mentioning)
confidence: 99%