2019
DOI: 10.1016/j.csda.2019.02.007
Dependence modelling in ultra high dimensions with vine copulas and the Graphical Lasso

Cited by 12 publications (5 citation statements)
References 33 publications
“…Traditional variable selection methods for regression can also be applied, for example, the forward selection approach. Moreover, recent papers proposed methods for learning sparse vine copula models [21,22], which can be potentially used as a variable selection method for copula regression. Table 4.…”
Section: Results (mentioning)
confidence: 99%
“…This method is implemented in the vinecopulib library. A separate line of research (Müller & Czado 2018, 2019a) exploits connections between vine copulas and Gaussian directed acyclic graphs to find sparsity patterns. A completely different approach is to use dimension reduction techniques before employing a copula model (as in Tagasovska et al 2019).…”
Section: Structure Selection and High-dimensional Models (mentioning)
confidence: 99%
“…The advantage is, of course, the flexibility, but the drawback is that this approach can be computationally unaffordable in terms of estimation, and there is always a possibility of overfitting. Recently, Müller & Czado (2019) proposed a novel three-step approach that overcomes the computational limitation. First, Gaussian methods are applied to split data sets into feasibly small subsets, then parsimonious and flexible vine copulas are applied, and finally, these submodels are reconciled into one joint model.…”
Section: Copulas (mentioning)
confidence: 99%
“…First, Gaussian methods are applied to split data sets into feasibly small subsets, then parsimonious and flexible vine copulas are applied, and finally, these submodels are reconciled into one joint model. We refer the reader to Müller & Czado (2019) for more details and for an example of the use of vine copulas in high dimensions. A modified Bayesian information criterion (BIC) tailored to sparse vine copula models by Nagler et al (2019) represents another approach for dealing with the computational cost and overfitting in vine copula models for high dimensions.…”
Section: Copulas (mentioning)
confidence: 99%
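The first step of the three-step approach quoted above (using a Gaussian method to split the variables into feasibly small subsets) can be illustrated with a minimal toy sketch. This is not the cited authors' implementation: it simply estimates a sparse precision matrix with scikit-learn's GraphicalLasso and takes the connected components of the resulting dependence graph as the variable subsets; all data and thresholds here are illustrative assumptions.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Toy data (assumption): two mutually independent 3-variable blocks,
# so the estimated precision matrix should be nearly block-diagonal.
cov_a = np.array([[1.0, 0.8, 0.6], [0.8, 1.0, 0.7], [0.6, 0.7, 1.0]])
cov_b = np.array([[1.0, 0.7, 0.5], [0.7, 1.0, 0.6], [0.5, 0.6, 1.0]])
X = np.hstack([
    rng.multivariate_normal(np.zeros(3), cov_a, size=500),
    rng.multivariate_normal(np.zeros(3), cov_b, size=500),
])

# Gaussian step: estimate a sparse precision matrix via the Graphical
# Lasso; zero off-diagonal entries encode conditional independencies.
gl = GraphicalLasso(alpha=0.1).fit(X)
adj = (np.abs(gl.precision_) > 1e-6).astype(int)
np.fill_diagonal(adj, 0)

# Split the variables into feasibly small subsets: the connected
# components of the estimated conditional-dependence graph.
n_subsets, labels = connected_components(csr_matrix(adj), directed=False)
print(n_subsets, labels)
```

Steps two and three of the cited approach (fitting parsimonious vine copulas on each subset and reconciling the submodels into one joint model) are not shown; in practice they would use a vine copula library such as vinecopulib, mentioned in the excerpts above.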