Some hypothesis tests based on random projection
2017
DOI: 10.1007/s00180-017-0732-4

Cited by 7 publications (4 citation statements)
References 31 publications
“…(2,∞) n statistics. We will consider three competitors: the well-known distance covariance test proposed in [24] and adapted to perform better in high dimensions in [23], the Hilbert-Schmidt Independence Criterion proposed in [9], and the test proposed more recently in [6] based on random projections. Basically, this test is based on the idea of choosing K pairs of random directions and observing that, if X and Y are independent, then the projections of X and Y onto each of the K pairs of directions are independent.…”
Section: Comparison With Other Tests In High Dimension
mentioning, confidence: 99%
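The projection step described in this excerpt can be sketched as follows. This is a minimal illustration only: the function name projected_pairs, its interface, and the choice of Gaussian directions normalized to the unit sphere are assumptions for the sketch, not details taken from [6].

import numpy as np

def projected_pairs(X, Y, K, rng=None):
    """Project X (n x p) and Y (n x q) onto K pairs of random unit directions.

    Returns a list of K tuples (x_proj, y_proj) of one-dimensional samples.
    Drawing each direction as a normalized Gaussian vector is an illustrative
    assumption; the test only requires K randomly chosen direction pairs.
    """
    rng = np.random.default_rng(rng)
    p, q = X.shape[1], Y.shape[1]
    pairs = []
    for _ in range(K):
        a = rng.standard_normal(p)
        a /= np.linalg.norm(a)   # unit direction for X
        b = rng.standard_normal(q)
        b /= np.linalg.norm(b)   # unit direction for Y
        pairs.append((X @ a, Y @ b))
    return pairs

If X and Y are independent, then any function of X is independent of any function of Y, so each projected pair (a'X, b'Y) is itself independent; this is the property the K univariate tests in the next excerpt exploit.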
“…If at least one of these tests rejects the hypothesis of independence, then H_0 is rejected. To work at the 5% level, it is proposed in [6] to use a Bonferroni correction, that is, to carry out each of the K one-dimensional tests at level 0.05/K (a p-value below 0.05/K counts as a rejection). We will call this the RPK test.…”
Section: Comparison With Other Tests In High Dimension
mentioning, confidence: 99%
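A minimal sketch of the Bonferroni-corrected decision rule described above, reusing the hypothetical projected_pairs helper from the earlier sketch. The use of Spearman's rank correlation as the one-dimensional independence test is an assumption made for illustration; the quoted text only requires some valid univariate test applied to each projected pair.

from scipy import stats

def rpk_test(X, Y, K=20, alpha=0.05, rng=None):
    """Reject independence of X and Y if any of the K univariate tests on the
    randomly projected pairs has a p-value below alpha / K (Bonferroni).

    Assumes projected_pairs (the hypothetical helper sketched earlier) is in
    scope; Spearman's rank correlation stands in for the univariate test.
    """
    pvals = []
    for x_proj, y_proj in projected_pairs(X, Y, K, rng=rng):
        _, p = stats.spearmanr(x_proj, y_proj)  # p-value of the 1-D independence test
        pvals.append(p)
    reject = any(p < alpha / K for p in pvals)  # Bonferroni: overall level approx. alpha
    return reject, pvals

The Bonferroni threshold alpha/K keeps the overall type I error at most alpha regardless of the dependence among the K projected tests.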
“…In this subsection we deal with the construction of ranks by means of the scalar products of the observations with a vector that is itself a function of these observations. Thus, unlike in Fraiman, Moreno, and Vallejo (2017), where the multiplying vectors are chosen randomly and yield randomized tests, with which different statisticians may reach different conclusions from the same data, the tests presented in this subsection are not randomized.…”
Section: Ranks Constructed By Means Of a Random Vector
mentioning, confidence: 99%
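A hedged sketch of the idea in this excerpt: compute ranks of the scalar products of the observations with a vector that is a deterministic function of the data, so no external randomization enters the procedure. The leading eigenvector of the sample covariance is used here purely as an illustrative choice of data-dependent vector; the quoted text does not specify which vector is used.

import numpy as np

def ranks_along_data_direction(X):
    """Ranks of the scalar products of the rows of X (n x p) with a direction
    that is itself a deterministic function of the data.

    The leading eigenvector of the sample covariance serves as the
    data-dependent vector; this particular choice is an assumption.
    """
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (X.shape[0] - 1)
    _, vecs = np.linalg.eigh(cov)
    direction = vecs[:, -1]                  # eigenvector of the largest eigenvalue
    scores = X @ direction                   # scalar products with the chosen vector
    return scores.argsort().argsort() + 1    # ranks 1..n (assuming no ties)

Because the direction is computed from the data rather than drawn at random, two analysts applying the procedure to the same sample obtain the same ranks.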
“…As a popular and useful dimension-reduction technique in high-dimensional statistical learning, random projections have been of considerable independent interest, and their applications have appeared in machine learning [1], data mining [2], information retrieval [3], classification [4], clustering [5], hypothesis testing [6], regression [7], and similarity search [8], among others. Many papers have been devoted to various aspects of random projections; to name a few, [9]–[17].…”
Section: Introduction
mentioning, confidence: 99%
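Since this excerpt treats random projection as a generic dimension-reduction device, here is a minimal sketch of the basic construction it alludes to, a Gaussian random projection in the spirit of the Johnson-Lindenstrauss lemma. The function name and the 1/sqrt(k) scaling are standard conventions, not details taken from the quoted paper.

import numpy as np

def gaussian_random_projection(X, k, rng=None):
    """Map n x p data X to n x k by multiplying with a random Gaussian matrix.

    The entries of R are i.i.d. N(0, 1/k), so squared pairwise distances are
    preserved in expectation, which is what makes random projection useful in
    the learning, testing, and search applications listed above.
    """
    rng = np.random.default_rng(rng)
    R = rng.standard_normal((X.shape[1], k)) / np.sqrt(k)
    return X @ R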