2021
DOI: 10.1016/j.neunet.2021.01.034

General stochastic separation theorems with optimal bounds


Cited by 18 publications (42 citation statements)
References 27 publications
“…It means that the simple classical techniques like linear Fisher's discriminant become unexpectedly powerful in high dimensions under some assumptions about regularity of probability distributions [12][13][14]. These assumptions can be rather mild and typically include absence of extremely dense lumps that are areas with relatively low volume but unexpectedly high probability (for more detail we refer to [15]). These lumps correspond to the narrow but high peaks of probability density.…”
Section: One- and Few-Shot Learning
confidence: 99%
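The separation effect described in this statement is easy to check numerically. The sketch below is a minimal illustration, not code from the cited paper; the uniform-ball distribution, sample sizes, and dimensions are assumptions chosen for the demo. It estimates how often a sample point x is separated from every other point y by the Fisher-type inequality ⟨x, y⟩ < ⟨x, x⟩; the fraction of separable points approaches 1 as the dimension grows.

```python
import numpy as np

def sample_ball(n_points, dim, rng):
    """Draw points uniformly from the unit ball in R^dim."""
    g = rng.standard_normal((n_points, dim))
    g /= np.linalg.norm(g, axis=1, keepdims=True)   # uniform directions on the sphere
    r = rng.random(n_points) ** (1.0 / dim)         # radii that make the ball uniform
    return g * r[:, None]

def fisher_separable_fraction(points):
    """Fraction of points x with <x, y> < <x, x> for every other sample point y."""
    gram = points @ points.T
    diag = np.diag(gram)
    mask = gram < diag[:, None]      # mask[i, j] is True when <x_i, x_j> < <x_i, x_i>
    np.fill_diagonal(mask, True)     # a point need not be separated from itself
    return mask.all(axis=1).mean()

rng = np.random.default_rng(0)
for dim in (2, 10, 50, 200):
    pts = sample_ball(2000, dim, rng)
    print(f"dim={dim:4d}  Fisher-separable fraction={fisher_separable_fraction(pts):.3f}")
```

With a few thousand points the separable fraction is small in dimension 2 but close to 1 by dimension 200, which is the "unexpected power" of the simple linear discriminant that the quotation refers to.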
“…2. In the situation of 'blessing of dimensionality', with sufficiently regular probability distribution in high dimensions the simple linear (or kernel [18]) one-and few-short methods become effective [7,14,15]. 3.…”
Section: One- and Few-Shot Learning
confidence: 99%
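As a companion illustration of the quoted claim that simple linear one- and few-shot methods become effective in high dimension, the sketch below builds a one-shot linear corrector from a single labelled example: the discriminant direction is the ridge-regularized, whitened mean difference between the prototype and the background cloud. The function names, the ridge value, and the toy Gaussian data are assumptions made for this example and are not taken from the cited works.

```python
import numpy as np

def one_shot_linear_corrector(background, prototype, ridge=1e-3):
    """Linear discriminant separating one prototype from a background feature cloud."""
    mu = background.mean(axis=0)
    cov = np.cov(background, rowvar=False) + ridge * np.eye(background.shape[1])
    w = np.linalg.solve(cov, prototype - mu)   # whitened mean-difference (Fisher) direction
    b = w @ (prototype + mu) / 2.0             # threshold halfway between the two projections
    return w, b

# Toy setup (assumed): 10,000 background features in d = 200 and one example
# of a "new" class drawn from a slightly shifted cloud.
rng = np.random.default_rng(1)
d = 200
background = rng.standard_normal((10_000, d))
prototype = rng.standard_normal(d) + 0.5

w, b = one_shot_linear_corrector(background, prototype)
print("background points flagged:", (background @ w > b).mean())
print("prototype flagged:", bool(prototype @ w > b))
```

In high dimension the background projections concentrate well below the threshold, so a single labelled example is enough to define a reliable linear separator, which is the few-shot regime the quotation describes.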
“…In the last decade, many other examples of blessing-of-dimensionality appeared, both in the general analysis of complex systems (see, e.g., [1,3,4,5,6,7]) and, specifically, in the analysis of neural networks; see, e.g., [8,9,10,11].…”
Section: Introduction: From Curse of Dimensionality to Blessing of Dimensionality
confidence: 99%