2017
DOI: 10.1080/00949655.2017.1362405
Testing diagonality of high-dimensional covariance matrix under non-normality

Cited by 4 publications (2 citation statements)
References: 22 publications
“…Yang and Pan (2015) extend the canonical correlation through regularization to the high-dimensional case. Another test, using block correlation matrices, is proposed in Bao et al. (2017); Srivastava, Kollo, and von Rosen (2011) and Xu (2017) provide diagonality tests that relax the normality assumption, and a similarity-coefficient-based treatment is given in Ahmad (2019).…”
Section: Introduction
confidence: 99%
“…This estimator is unbiased under the normality assumption on the population, but is biased in the non-normal case. Some studies indicate that test procedures which work well under the normality assumption may suffer a loss of performance when that assumption is dropped (see, for example, [4,5]). Therefore, for the test problem (1) in the non-normal case, Srivastava et al. [6] modified the test statistic in [1] by providing an unbiased estimator of tr Σ_i^2 for general models; Ahmad [7] constructed a test statistic based on U-statistics under the additional assumption of zero mean vectors.…”
confidence: 99%
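The contrast between a normality-based plug-in and a U-statistic-type estimator of tr Σ² can be sketched as follows. This is a minimal illustration, not the actual statistics of [1], [6], or [7]: the function names, the zero-mean assumption, and the t-distributed test data are assumptions made here for the example; the U-statistic shown is the standard unbiased estimator of tr Σ² for zero-mean, non-normal data.

```python
import numpy as np


def tr_sigma2_plugin(X):
    """Plug-in tr(S^2) using the sample covariance S.

    Normality-based bias corrections for this quantity generally
    fail to remain unbiased when the data are non-normal.
    """
    S = np.cov(X, rowvar=False)             # p x p sample covariance
    return np.trace(S @ S)


def tr_sigma2_ustat(X):
    """U-statistic-type estimator of tr(Sigma^2) for zero-mean data:
    (1 / (n (n - 1))) * sum_{i != j} (x_i' x_j)^2,
    which is unbiased without any normality assumption.
    """
    n = X.shape[0]
    G = X @ X.T                              # Gram matrix, G[i, j] = x_i' x_j
    off_diag_sq = np.sum(G ** 2) - np.sum(np.diag(G) ** 2)
    return off_diag_sq / (n * (n - 1))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 50, 200                           # high-dimensional setting: p >> n
    # Heavy-tailed t_5 data scaled to unit variance, so Sigma = I_p
    # and the target value is tr(Sigma^2) = p = 200.
    X = rng.standard_t(df=5, size=(n, p)) / np.sqrt(5 / 3)
    print("plug-in tr(S^2):      ", tr_sigma2_plugin(X))
    print("U-statistic estimator:", tr_sigma2_ustat(X))
```

Running the sketch with p much larger than n shows the plug-in tr(S²) drifting far above the target value while the U-statistic stays close to it, which is the kind of degradation under high dimensionality and non-normality that the cited works address.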