2018
DOI: 10.1080/02331888.2018.1500579

Composite quantile regression for massive datasets

Abstract: Analysis of massive datasets is challenging owing to limitations of computer primary memory. Composite quantile regression (CQR) is a robust and efficient estimation method. In this paper, we extend CQR to massive datasets and propose a divide-and-conquer CQR method. The basic idea is to split the entire dataset into several blocks, apply the CQR method to the data in each block, and finally combine these regression results via a weighted average. The proposed approach significantly reduces the required amount…
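The divide-and-conquer scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: CQR is solved here via a standard linear-programming reformulation of the check-loss minimisation (common slope vector, one intercept per quantile level), and the block estimates are combined with a block-size-weighted average — the paper's weighting scheme may differ.

```python
import numpy as np
from scipy.optimize import linprog

def cqr_fit(X, y, taus):
    """Composite quantile regression on one data block.

    Jointly estimates a common slope vector beta and one intercept per
    quantile level tau_k by minimising the summed check losses, cast as
    a linear program: each residual is split into positive/negative
    parts u, v >= 0 with costs tau and (1 - tau).
    """
    n, p = X.shape
    K = len(taus)
    off_u = p + K            # start of the u variables
    off_v = p + K + K * n    # start of the v variables
    n_vars = p + K + 2 * K * n

    c = np.zeros(n_vars)
    A_eq = np.zeros((K * n, n_vars))
    b_eq = np.zeros(K * n)
    for k, tau in enumerate(taus):
        rows = slice(k * n, (k + 1) * n)
        c[off_u + k * n: off_u + (k + 1) * n] = tau
        c[off_v + k * n: off_v + (k + 1) * n] = 1.0 - tau
        # y_i = b_k + x_i' beta + u_ki - v_ki for each observation i
        A_eq[rows, :p] = X
        A_eq[rows, p + k] = 1.0
        A_eq[rows, off_u + k * n: off_u + (k + 1) * n] = np.eye(n)
        A_eq[rows, off_v + k * n: off_v + (k + 1) * n] = -np.eye(n)
        b_eq[rows] = y

    # beta and the intercepts are free; u, v are non-negative
    bounds = [(None, None)] * (p + K) + [(0, None)] * (2 * K * n)
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    return res.x[:p], res.x[p:p + K]

def dc_cqr(X, y, taus, n_blocks):
    """Divide-and-conquer CQR: fit CQR on each block, then combine the
    slope estimates by a block-size-weighted average."""
    blocks = np.array_split(np.arange(len(y)), n_blocks)
    betas, weights = [], []
    for ix in blocks:
        beta, _ = cqr_fit(X[ix], y[ix], taus)
        betas.append(beta)
        weights.append(len(ix))
    return np.average(np.vstack(betas), axis=0, weights=weights)
```

Because each block is processed independently, only one block needs to be held in primary memory at a time, which is the point of the divide-and-conquer construction.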

Cited by 27 publications (8 citation statements). References 42 publications.
“…This is not surprising because the LAD estimator is just the minimizer of the MAE on the training set. This phenomenon is consistent with the results in real data example of Jiang et al (2018).…”
Section: Asymmetric Data (supporting)
confidence: 93%
“…However, this is a common condition in DC quantile methods with one-shot aggregation; even for parametric models, this condition is commonly used (see, e.g., Zhao et al, 2015; Jiang et al, 2018; Chen and Zhou, 2020).…”
Section: Regularity Conditions (mentioning)
confidence: 99%
“…where f(x_i) is an estimator of f(x_i). We employ EBIC in (7) to select L and λ, and the results are shown in Table 1. To reduce the computational load, we consider all the combinations of λ ∈ {0, 0.05, .…”
Section: Prediction Performance (mentioning)
confidence: 99%
“…A block average quantile regression (QR) approach for the massive dataset is proposed in [6] by combining the divide-and-conquer method with QR. Jiang [7] extended the work of [6] to composite quantile regression (CQR) for massive datasets. Recently, Chen et al [8] studied QR under memory constraint for massive datasets.…”
Section: Introduction (mentioning)
confidence: 99%
“…We denote this algorithm as EIS_Q. Second, we propose three efficient importance sampling imputation algorithms in composite quantile regression (denoted as EIS_CQ) using IP, majorize-minimization (MM) and coordinate descent (CD), based on the publicly available cqrReg package for R [4, 8, 9, 21]. For brevity, we denote the above three EIS algorithms in composite quantile regression as EIS_CQ^IP, EIS_CQ^MM and EIS_CQ^CD, respectively.…”
Section: Introduction (mentioning)
confidence: 99%