2018
DOI: 10.1007/s10462-018-9666-7

Independence test and canonical correlation analysis based on the alignment between kernel matrices for multivariate functional data

Abstract: In the case of vector data, Gretton et al. (Algorithmic Learning Theory, Springer, Berlin, pp 63-77, 2005) defined the Hilbert-Schmidt independence criterion (HSIC), and Cortes et al. (J Mach Learn Res 13:795-828, 2012) subsequently introduced the concept of centered kernel target alignment (KTA). In this paper we generalize these measures of dependence to the case of multivariate functional data. In addition, based on these measures between two kernel matrices (we use the Gaussian kernel), we constructed an independence test and n…
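For orientation only, below is a minimal NumPy sketch of the two vector-data dependence measures the abstract starts from (Gaussian-kernel HSIC and centered kernel target alignment between two kernel matrices). It is not the paper's functional-data generalization; the function names, the bandwidth choice, and representing each functional sample by a vector of basis coefficients are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel_matrix(X, sigma=1.0):
    """Gaussian kernel matrix for the rows of X (n samples x d features)."""
    sq = np.sum(X**2, axis=1)
    dists = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-dists / (2.0 * sigma**2))

def center_kernel(K):
    """Double-centre a kernel matrix: Kc = H K H with H = I - (1/n) 11^T."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def hsic(K, L):
    """Biased empirical HSIC (Gretton et al., 2005) from kernel matrices K, L."""
    n = K.shape[0]
    return np.sum(center_kernel(K) * center_kernel(L)) / (n - 1) ** 2

def centered_kta(K, L):
    """Centered kernel target alignment (Cortes et al., 2012) between K and L."""
    Kc, Lc = center_kernel(K), center_kernel(L)
    return np.sum(Kc * Lc) / (np.linalg.norm(Kc, "fro") * np.linalg.norm(Lc, "fro"))

# Toy usage: X and Y would hold, e.g., basis coefficients of the two functional samples.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Y = X + 0.5 * rng.normal(size=(100, 5))   # Y depends on X
K, L = gaussian_kernel_matrix(X), gaussian_kernel_matrix(Y)
print(hsic(K, L), centered_kta(K, L))
```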

Cited by 22 publications (17 citation statements)
References 29 publications

“…[15] have developed a two-sample test for distributions based on generalisations of a finite-dimensional test by utilising functional principal component analysis, and [16] have derived kernels over functions to be used with MMD for the two-sample test. Independence testing for functional data using kernels was recently proposed in [17], but assumes the samples lie on a finite-dimensional subspace of the function space, an assumption not required in our work. Moreover, ref.…”
Section: Related Work
Mentioning confidence: 99%
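As context for the MMD-based two-sample testing mentioned above, a generic biased squared-MMD estimator with a Gaussian kernel might be sketched as follows. This is not the kernel construction of [16]; treating each functional observation as a vector of values on a shared grid, and the function names and bandwidth, are assumptions made only for illustration.

```python
import numpy as np

def gaussian_cross_kernel(A, B, sigma=1.0):
    """Gaussian kernel matrix between the rows of A and the rows of B."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-sq / (2.0 * sigma**2))

def mmd2_biased(X, Y, sigma=1.0):
    """Biased squared-MMD estimate between samples X (m x d) and Y (n x d)."""
    Kxx = gaussian_cross_kernel(X, X, sigma)
    Kyy = gaussian_cross_kernel(Y, Y, sigma)
    Kxy = gaussian_cross_kernel(X, Y, sigma)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()

# Curves discretised on a common grid can be passed directly as the rows of X and Y.
```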
“…Other studies aim at extending and utilizing the measures in a general manner, proposing measure-based methods for statistical tests [44]-[46], robustness improvement [47]-[49], algorithmic strategies for detecting multiple dependences [50], [51], feature selection [52]-[56], and feature extraction [57], [58]. The remaining studies apply the measures within specific domains, i.e. the applications given in Section I-A.…”
Section: CC(X, Y) =
Mentioning confidence: 99%
“…However, Pomann et al [2016] have developed a two-sample test for distributions based on generalisations of a finite-dimensional test by utilising functional principal component analysis, and Wynne and Duncan [2020] have derived kernels over functions to be used with MMD for the two-sample test. Independence testing for functional data using kernels was recently proposed in Górecki et al [2018], but assumes the samples lie on a finite-dimensional subspace of the function space, an assumption not required in our work. Moreover, Zhang et al [2018] have developed computationally efficient methods to test for independence on high-dimensional distributions and large sample sizes by using eigenvalues of centred kernel matrices to approximate the distribution under the null hypothesis H0 instead of simulating a large number of permutations.…”
Section: Related Work
Mentioning confidence: 99%
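To make the contrast in the last statement concrete, the sketch below shows the permutation-based route for calibrating a kernel independence test, i.e. the step that the eigenvalue approximation of Zhang et al [2018] is designed to avoid. The use of the HSIC statistic, the permutation count, and the p-value convention are assumptions of this illustration, not details taken from any of the cited papers.

```python
import numpy as np

def hsic_permutation_test(K, L, n_perm=500, seed=0):
    """Kernel independence test: HSIC statistic calibrated with a permutation null.

    Rows and columns of the second kernel matrix are permuted to mimic the null
    hypothesis of independence; the p-value is the share of permuted statistics
    at least as large as the observed one.
    """
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    Kc, Lc = H @ K @ H, H @ L @ H
    stat = np.sum(Kc * Lc) / (n - 1) ** 2
    null = np.empty(n_perm)
    for b in range(n_perm):
        p = rng.permutation(n)
        null[b] = np.sum(Kc * Lc[np.ix_(p, p)]) / (n - 1) ** 2
    p_value = (1 + np.sum(null >= stat)) / (1 + n_perm)
    return stat, p_value
```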