2022
DOI: 10.1214/21-sts841

A Regression Perspective on Generalized Distance Covariance and the Hilbert–Schmidt Independence Criterion

Abstract: In a seminal paper, Sejdinovic et al. (Ann. Statist. 41 (2013) 2263-2291) showed the equivalence of the Hilbert-Schmidt Independence Criterion (HSIC) and a generalization of distance covariance. In this paper, the two notions of dependence are unified with a third prominent concept for independence testing, the "global test" introduced in (J. R. Stat. Soc. Ser. B. Stat. Methodol. 68 (2006) 477-493). The new viewpoint provides novel insights into all three test traditions, as well as a unified overall view of …

Cited by 6 publications (2 citation statements)
References 66 publications
“…The traditional method for doing this is to specify a statistical model that captures the desired relationship, such as a polynomial regression model. However, the findings in Edelmann and Goeman (2022) suggest that utilizing the Hilbert-Schmidt Independence Criterion (HSIC) with a kernel tailored to the anticipated associations may prove to be a more effective approach.…”
Section: Discussion (mentioning; confidence: 99%)
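To make the contrast in the quoted passage concrete, here is a minimal, hypothetical Python sketch (not code from the cited paper) of an HSIC permutation test in which the kernel on x is a polynomial kernel, i.e. a kernel "tailored" to an anticipated polynomial-type association:

```python
import numpy as np

def hsic(x, y, kernel_x, kernel_y):
    """Biased empirical HSIC: trace(K H L H) / n**2, H the centering matrix."""
    n = len(x)
    K = kernel_x(x[:, None], x[None, :])          # n x n Gram matrix for x
    L = kernel_y(y[:, None], y[None, :])          # n x n Gram matrix for y
    H = np.eye(n) - np.ones((n, n)) / n           # centering matrix
    return np.trace(K @ H @ L @ H) / n**2

# Hypothetical kernel choices: a degree-2 polynomial kernel on x targets
# quadratic-type associations; a Gaussian kernel on y stays generic.
poly_kernel = lambda a, b: (1.0 + a * b) ** 2
gauss_kernel = lambda a, b: np.exp(-0.5 * (a - b) ** 2)

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = x**2 + 0.5 * rng.normal(size=200)             # quadratic dependence on x

stat = hsic(x, y, poly_kernel, gauss_kernel)
# Permutation null distribution: shuffling y breaks any dependence on x.
null = [hsic(x, rng.permutation(y), poly_kernel, gauss_kernel) for _ in range(500)]
p_value = (1 + sum(s >= stat for s in null)) / (1 + len(null))
print(f"HSIC = {stat:.4f}, permutation p-value = {p_value:.3f}")
```

The kernel choice plays the role that the model specification plays in a regression-based test: a polynomial kernel concentrates power on polynomial alternatives in roughly the way a polynomial regression model would.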
“…In our opinion this topic would be a valuable addition to a graduate course on mathematical statistics, because distance covariance is a general method with interesting properties and wide ranging applications, for instance in variable selection (Chen et al, 2018), sparse contingency tables (Zhang, 2019), independent component analysis (Matteson and Tsay, 2017), and time series (Davis et al, 2018). It can be computed fast (Huo and Székely, 2016; Chaudhuri and Hu, 2019), and there are interesting connections with other dependence measures (Edelmann and Goeman, 2022). Its robustness to outliers was studied recently (Leyder et al, 2024).…”
(mentioning; confidence: 99%)
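For readers unfamiliar with the statistic discussed in this passage, a minimal sketch of the naive O(n²) squared sample distance covariance via double-centered distance matrices is given below. This is an illustrative implementation only, not the fast O(n log n) algorithms of Huo and Székely (2016) or Chaudhuri and Hu (2019):

```python
import numpy as np

def dcov_sq(x, y):
    """Naive O(n^2) squared sample distance covariance (Szekely et al., 2007)."""
    x = np.atleast_2d(x.T).T                      # ensure (n, p) shape for 1-D input
    y = np.atleast_2d(y.T).T
    a = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)   # pairwise distances in x
    b = np.linalg.norm(y[:, None, :] - y[None, :, :], axis=-1)   # pairwise distances in y
    # Double centering: subtract row and column means, add the grand mean.
    A = a - a.mean(axis=0) - a.mean(axis=1, keepdims=True) + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1, keepdims=True) + b.mean()
    return (A * B).mean()                         # average of elementwise products

rng = np.random.default_rng(1)
x = rng.normal(size=300)
y = np.sin(x) + 0.3 * rng.normal(size=300)        # nonlinear dependence, zero-ish correlation
print(f"squared distance covariance: {dcov_sq(x, y):.4f}")
```

The statistic is zero in the population exactly under independence of x and y, which is what makes it attractive for the independence-testing applications listed in the quoted passage.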