Proceedings of the 25th International Conference on Machine Learning (ICML '08), 2008
DOI: 10.1145/1390156.1390262
Bi-level path following for cross validated solution of kernel quantile regression

Saharon Rosset

Abstract: Modeling of conditional quantiles requires specification of the quantile being estimated and can thus be viewed as a parameterized predictive modeling problem. Quantile loss is typically used, and it is indeed parameterized by a quantile parameter. In this paper we show how to follow the path of cross validated solutions to regularized kernel quantile regression. Even though the bi-level optimization problem we encounter for every quantile is non-convex, the manner in which the optimal cross-validated solution…
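The objective the abstract refers to is concrete enough to sketch. Below is a minimal NumPy illustration of regularized kernel quantile regression at a single (τ, λ) point, fitted by subgradient descent; this is not the paper's bi-level path-following algorithm, and the names (`fit_kqr`, `rbf_kernel`) and hyperparameter values are illustrative assumptions.

```python
import numpy as np

def pinball_loss(residual, tau):
    # Quantile (pinball) loss: tau * r for r >= 0, (tau - 1) * r for r < 0.
    return np.where(residual >= 0, tau * residual, (tau - 1) * residual)

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian RBF kernel matrix between the rows of X1 and X2.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kqr(X, y, tau=0.5, lam=1.0, gamma=1.0, lr=0.05, n_iter=5000):
    # Subgradient descent on
    #   sum_i pinball(y_i - f(x_i); tau) + (lam / 2) * ||f||_H^2,
    # with the representer form f(x) = sum_j alpha_j K(x, x_j).
    K = rbf_kernel(X, X, gamma)
    alpha = np.zeros(len(y))
    for _ in range(n_iter):
        r = y - K @ alpha
        # d pinball / d f(x_i): -tau where r_i > 0, (1 - tau) where r_i < 0.
        g = np.where(r > 0, -tau, 1.0 - tau)
        alpha -= lr * (K @ g + lam * (K @ alpha)) / len(y)
    return alpha
```

Predictions at new points are `rbf_kernel(X_new, X, gamma) @ alpha`. The paper's contribution is precisely to avoid re-running such a solver over a grid of (τ, λ) values by following the cross-validated solution path directly.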

Cited by 8 publications (21 citation statements, all classified as "mentioning"). References 25 publications. Citing publications appeared between 2010 and 2017.

Citation statements (ordered by relevance):
“…Cross-validation could be replaced by the computation of a suitable model selection criterion, as done in previous work (Li and Zhu 2008). For quantile regression, Rosset (2008) proposes a bi-level solution path algorithm for varying regularization parameter and varying quantile τ , thus elegantly avoiding the problem of quantile crossing (Koenker 2005). An extension of the path algorithm of Sect.…”
Section: Discussion
confidence: 99%
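The quantile-crossing problem this statement credits the τ-path with avoiding is easy to exhibit: quantile curves fitted independently at different τ need not be ordered. A small check, reusing the illustrative `fit_kqr` / `rbf_kernel` sketch above (again an assumption, not the paper's algorithm):

```python
import numpy as np

# Reusing fit_kqr / rbf_kernel from the sketch above (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(0, 4, size=(60, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=60)

# Fit the 0.25 and 0.75 quantile curves independently of each other.
alphas = {tau: fit_kqr(X, y, tau=tau, lam=0.1, gamma=1.0) for tau in (0.25, 0.75)}
K = rbf_kernel(X, X, gamma=1.0)
q25, q75 = K @ alphas[0.25], K @ alphas[0.75]

# Independently fitted quantile curves are not guaranteed to be ordered:
print("crossings:", int(np.sum(q25 > q75)))
```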
“…In this section, we use the CCPRs of both (λ, η)-SVM and 2C-SVM to explore the region [0, 1] × [0, 1] of the (λ, η) parameter space and the lower-triangle region of [0, 1] × [0, 1] of the (C+, C−) parameter space, respectively. Once the 1.5 square units have been completely covered by the CCPRs, we can fit solutions of CS-SVM for all values of (C+, C−) according to (14)–(16) and (20)–(21). This means that the complete two-dimensional solution surface of CS-SVM is determined.…”
Section: Bi-parameter Space Partition Algorithm
confidence: 99%
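The CS-SVM traced in that statement assigns separate misclassification costs C+ and C− to the two classes. A hedged stand-in for the brute-force grid search that the bi-parameter partition algorithm renders unnecessary, using scikit-learn's `class_weight` (which rescales C per class); the data and cost grid here are made up for illustration:

```python
import numpy as np
from sklearn.svm import SVC

# Toy linearly separable-ish data with labels in {+1, -1}.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)

# Per-class costs (C+, C-) realized as C * class_weight[label];
# scanning such a grid is what a solution-surface algorithm avoids.
for C_plus, C_minus in [(1.0, 0.5), (1.0, 1.0), (0.5, 1.0)]:
    clf = SVC(kernel="linear", C=1.0, class_weight={1: C_plus, -1: C_minus})
    clf.fit(X, y)
    print(C_plus, C_minus, clf.score(X, y))
```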
“…To overcome the first difficulty, solution path algorithms were proposed for many learning models [12], [13], [14], [15], [16], [17], [18], [20], to fit the entire solution path for every value of the parameter, which avoids retraining the classifier many times under different parameter settings. Specifically, Hastie et al. [12] proposed a solution path approach for C-SVM.…”
Section: Introduction
confidence: 99%
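The appeal of such path algorithms is that a single call yields solutions for an entire grid of regularization values instead of one fit per parameter setting. scikit-learn exposes this idea for the lasso (not Hastie et al.'s C-SVM path, but the same principle):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import lasso_path

# One call computes coefficient vectors for a whole grid of
# regularization strengths, rather than refitting per value.
X, y = make_regression(n_samples=80, n_features=20, noise=5.0, random_state=0)
alphas, coefs, _ = lasso_path(X, y, n_alphas=50)
print(alphas.shape, coefs.shape)  # (50,), (20, 50): one coefficient vector per alpha
```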