2022
DOI: 10.1016/j.jedc.2022.104530
Identification of Structural VAR Models via Independent Component Analysis: A Performance Evaluation Study

Cited by 8 publications (6 citation statements) · References 71 publications
“…They can be used, following the established approach, both for principal component analysis (PCA, [13][14][15]) and (with certain reservations) for non-negative matrix factorization (NMF, [16][17][18]). SVD can also be used to improve the results of independent component analysis (ICA, [19][20][21]). SVD is convenient to apply because it places no restrictions on the structure of the original data matrix (which must be square for the LU [22] or Schur decomposition [23], symmetric positive definite for the Cholesky decomposition [24], and non-negative for NMF).…”
Section: Output
confidence: 99%
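The quoted passage notes that SVD requires no special structure of the data matrix and can serve as a preprocessing step for PCA and ICA. A minimal numpy sketch of that use (the data matrix and mixing coefficients here are synthetic and purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 1000 observations of 3 correlated variables.
X = rng.standard_normal((1000, 3)) @ np.array([[2.0, 0.5, 0.0],
                                               [0.5, 1.0, 0.3],
                                               [0.0, 0.3, 0.5]])

# Center the data, then take the SVD of the centered matrix; no
# squareness or definiteness of X is required.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# PCA scores and whitened data follow directly from the SVD factors.
pca_scores = U * s                  # principal components (equals Xc @ Vt.T)
X_white = U * np.sqrt(Xc.shape[0])  # whitened data: identity covariance

cov = X_white.T @ X_white / Xc.shape[0]
print(np.allclose(cov, np.eye(3)))  # identity covariance after whitening
```

The whitened matrix is exactly the kind of input that ICA routines expect, which is why SVD is often used to "improve" ICA results in practice.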
“…At the same time, these components were assumed to have a non-Gaussian distribution and to originate from independent sources. To determine the independent components, one either minimizes mutual information based on the Kullback-Leibler divergence [19] or maximizes "non-Gaussianity" [20,21] (using measures such as the kurtosis coefficient and negentropy). In the context of the dimensionality-reduction problem, the application of ICA is straightforward: represent the input data as a mixture of components, separate them, and select a certain number of them.…”
Section: Output
confidence: 99%
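The passage above mentions kurtosis as a measure of "non-Gaussianity" used by ICA contrast functions. A small numpy sketch of that measure (the source distributions are hypothetical examples, chosen so that one is sub-Gaussian and one super-Gaussian):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000

# Two independent non-Gaussian sources (hypothetical, unit variance):
# a uniform source (sub-Gaussian) and a Laplace source (super-Gaussian).
S = np.column_stack([
    rng.uniform(-np.sqrt(3), np.sqrt(3), n),
    rng.laplace(0, 1 / np.sqrt(2), n),
])

def excess_kurtosis(y):
    # Sample excess kurtosis: a simple measure of non-Gaussianity,
    # close to zero for Gaussian data.
    y = (y - y.mean()) / y.std()
    return np.mean(y ** 4) - 3.0

g = rng.standard_normal(n)       # Gaussian control signal
print(excess_kurtosis(S[:, 0]))  # clearly negative (uniform)
print(excess_kurtosis(S[:, 1]))  # clearly positive (Laplace)
print(excess_kurtosis(g))        # near zero
```

ICA procedures such as FastICA search for linear combinations of the (whitened) data that push this measure as far from zero as possible, since a mixture of independent sources is closer to Gaussian than the sources themselves.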
“…The speed at which new implementations of identification based on non-Gaussianity are developed makes comprehensive systematic comparisons difficult; the current options described in Section 4 include various parametric models, moment-based approaches in which the researcher must choose which higher moments to use and which restrictions to impose on cross-moments, and a plethora of non-parametric estimators from the ICA literature. Moneta and Pallante (2022) is the only simulation comparison of which I am aware. There is ample scope for further and updated simulation studies comparing the leading alternatives, as well as "pre-tests" to select suitable moments containing relevant identifying variation.…”
Section: Discussion
confidence: 99%
“…For the latter, there is an important choice of which coskewness and/or cokurtosis restrictions to impose, or of which contrast function to use for ICA. Moneta and Pallante (2022) compare a variety of estimators in a simulation study. They include FastICA, the PML estimator of Gouriéroux et al. (2017), and two other ICA approaches based on Givens matrices; unfortunately, they do not include recent moment-based estimators.…”
Section: Choosing the Right Functional Form
confidence: 99%
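As a rough illustration of the kind of exercise such simulation comparisons run, the following numpy-only sketch simulates a bivariate SVAR(1) with non-Gaussian (Laplace) shocks and identifies the structural shocks with a Givens-rotation step that maximizes excess kurtosis. All parameter values are hypothetical, and this grid-search estimator is only a toy stand-in for the Givens-matrix ICA approaches mentioned above:

```python
import numpy as np

rng = np.random.default_rng(2)
T, k = 5000, 2

# Hypothetical SVAR(1): y_t = A y_{t-1} + B e_t, with independent,
# unit-variance Laplace (hence non-Gaussian) structural shocks e_t.
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])
B = np.array([[1.0, 0.5],
              [0.3, 1.0]])
E = rng.laplace(scale=1 / np.sqrt(2), size=(T, k))
Y = np.zeros((T, k))
for t in range(1, T):
    Y[t] = A @ Y[t - 1] + B @ E[t]

# Step 1: OLS of y_t on y_{t-1}; the residuals estimate u_t = B e_t.
X_lag, Y_cur = Y[:-1], Y[1:]
coef, *_ = np.linalg.lstsq(X_lag, Y_cur, rcond=None)
U_res = Y_cur - X_lag @ coef

# Step 2: whiten the residuals via SVD (identity sample covariance).
Uc = U_res - U_res.mean(axis=0)
Uw, s, Vt = np.linalg.svd(Uc, full_matrices=False)
Z = Uw * np.sqrt(len(Uc))

def abs_excess_kurtosis(y):
    y = (y - y.mean()) / y.std()
    return abs(np.mean(y ** 4) - 3.0)

# Step 3: rotate the whitened residuals by the Givens angle that
# maximizes total |excess kurtosis| -- the ICA identification step.
def rotate(theta):
    c, s_ = np.cos(theta), np.sin(theta)
    return Z @ np.array([[c, -s_], [s_, c]])

grid = np.linspace(0, np.pi, 360, endpoint=False)
best = max(grid, key=lambda th: sum(abs_excess_kurtosis(rotate(th)[:, i])
                                    for i in range(k)))
E_hat = rotate(best)

# Each recovered shock should correlate strongly (up to sign and
# permutation) with exactly one true structural shock.
C = np.abs(np.corrcoef(E_hat.T, E[1:].T)[:k, k:])
print(C.round(2))
```

A simulation study in this spirit would repeat the exercise across sample sizes, shock distributions, and competing ICA estimators, scoring each by how well the recovered shocks match the true ones.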
“…Lu et al. (2009), for instance, propose an integration of ICA with support vector regression for forecasting the Nikkei 225 opening index and the TAIEX closing index; see also Liu and Wang (2011) and Chowdhury et al. (2018). Moreover, Moneta and Pallante (2020) compare the performance of different ICA estimators within the structural vector autoregressive (SVAR) framework on US government spending and tax-cut data. Ceffer et al. (2019) model and predict the future values of financial time series, applying SVAR with ICA for data pre-processing.…”
Section: Literature Review
confidence: 99%