2020
DOI: 10.3389/fnbot.2019.00110
Lizard Brain: Tackling Locally Low-Dimensional Yet Globally Complex Organization of Multi-Dimensional Datasets

Abstract: Machine learning deals with datasets characterized by high dimensionality. However, in many cases the intrinsic dimensionality of a dataset is surprisingly low. For example, the dimensionality of a robot's perception space can be large and multi-modal, but its variables can have more or less complex non-linear interdependencies. Thus, multidimensional data point clouds can be effectively located in the vicinity of principal varieties possessing locally small dimensionality but having a globally complicated …
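The abstract's central claim — high ambient dimensionality with low intrinsic dimensionality — can be illustrated with a minimal sketch (not from the paper; the construction and parameters are illustrative): a one-parameter curve embedded in a 20-dimensional space still reveals its low dimensionality through the PCA spectrum.

```python
import numpy as np

rng = np.random.default_rng(0)

# Intrinsically 1-D data: a parabolic curve embedded in a 20-dimensional
# ambient space via a random linear map, plus small isotropic noise.
t = rng.uniform(-1.0, 1.0, size=500)
curve = np.column_stack([t, t ** 2])        # 2 coordinates, 1 degree of freedom
embed = rng.normal(size=(2, 20))            # random embedding into 20-D
X = curve @ embed + 0.01 * rng.normal(size=(500, 20))

# PCA via SVD of the centered cloud: the spectrum collapses after a
# couple of components, revealing the low intrinsic dimensionality.
Xc = X - X.mean(axis=0)
s = np.linalg.svd(Xc, compute_uv=False)
explained = s ** 2 / np.sum(s ** 2)
print(np.sum(explained[:2]))   # close to 1.0: two components capture the cloud
```

The point of the sketch: 20 measured variables, but essentially one degree of freedom — the situation the abstract describes.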

Cited by 19 publications (29 citation statements)
References 72 publications
“…In this particular case, applying non-linear dimensionality reduction (e.g., tSNE) could make the ‘hidden’ branch more visible in 2D at the cost of distorting the underlying data geometry. Nevertheless, this situation can be reproduced with any data dimensionality technique (for examples, see [ 56 ]).…”
Section: Results
Mentioning confidence: 99%
“…In order to compare these algorithms with ElPiGraph, we used previously published LizardBrain generator of noisy branching data point clouds [ 56 ]. Briefly, it generates data points in a unit m -dimensional hypercube around a set of non-linear (e.g., parabolic) branches, such that each next branch starts from a randomly selected point on one of the previously generated branches in a random direction.…”
Section: Results
Mentioning confidence: 99%
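The quoted description of the LizardBrain generator — branches in a unit hypercube, each new branch starting from a random point on a previous one — can be sketched as follows. This is a simplified illustration of the idea, not the published LizardBrain code; the function name and parameters are hypothetical.

```python
import numpy as np

def branching_cloud(n_branches=4, pts_per_branch=200, dim=3,
                    noise=0.02, rng=None):
    """Illustrative generator of a noisy branching point cloud in the
    unit hypercube: each new parabolic branch starts from a randomly
    selected point on a previously generated branch and grows in a
    random direction (a sketch of the idea, not the published code)."""
    rng = rng or np.random.default_rng(0)
    branches = []
    start = rng.uniform(0.3, 0.7, size=dim)      # first branch origin
    for _ in range(n_branches):
        d1 = rng.normal(size=dim)
        d1 /= np.linalg.norm(d1)
        d2 = rng.normal(size=dim)
        d2 /= np.linalg.norm(d2)
        t = rng.uniform(0.0, 0.3, size=(pts_per_branch, 1))
        branches.append(start + t * d1 + 0.5 * t ** 2 * d2)  # parabolic arc
        # Next branch starts from a random point on an existing branch.
        prev = branches[rng.integers(len(branches))]
        start = prev[rng.integers(len(prev))]
    cloud = np.concatenate(branches)
    cloud += noise * rng.normal(size=cloud.shape)
    return np.clip(cloud, 0.0, 1.0)              # stay inside the hypercube

X = branching_cloud()
print(X.shape)   # (800, 3)
```

Such clouds are locally one-dimensional (each branch is a curve) yet globally complex (a branching tree), matching the benchmark setting the citing paper describes.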
“…All these methods use an object embedded in the data space. They are called Injective Methods [ 68 ]. In addition, a family of Projective Methods was developed.…”
Section: Dimension Estimation
Mentioning confidence: 99%
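The distinction quoted above — injective methods fit an object embedded *in* the data space — can be sketched with a 1-D self-organizing map, one classical example of such an embedded approximator. The implementation below is a minimal illustration under assumed hyperparameters, not any specific library's API.

```python
import numpy as np

def som_1d(X, n_nodes=10, n_iter=2000, lr=0.5, sigma=2.0, rng=None):
    """Sketch of an injective method: a chain of nodes living inside
    the data space is bent toward the data, so the approximator itself
    is an object embedded in that space."""
    rng = rng or np.random.default_rng(0)
    nodes = X[rng.choice(len(X), n_nodes, replace=False)].copy()
    grid = np.arange(n_nodes)
    for i in range(n_iter):
        x = X[rng.integers(len(X))]
        best = np.argmin(np.sum((nodes - x) ** 2, axis=1))  # winning node
        # Neighborhood pull along the chain, decaying over iterations.
        decay = 1.0 - i / n_iter
        h = np.exp(-((grid - best) ** 2) / (2 * (sigma * decay + 1e-3) ** 2))
        nodes += (lr * decay) * h[:, None] * (x - nodes)
    return nodes

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 300)
X = np.column_stack([t, t ** 2]) + 0.02 * rng.normal(size=(300, 2))
nodes = som_1d(X)
print(nodes.shape)   # (10, 2)
```

The fitted chain of nodes is a piecewise-linear curve in the data space approximating the noisy parabola — the "object embedded in the data space" the quotation refers to.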
“…These methods do not construct a data approximator, but project the dataspace onto a space of lower dimension with preservation of similarity or dissimilarity of objects. A brief review of modern injective and projective methods can be found in [ 68 ].…”
Section: Dimension Estimation
Mentioning confidence: 99%
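As a counterpart to the injective case, the projective family the quotation describes — mapping the data space to a lower-dimensional one while preserving (dis)similarities — can be sketched with classical multidimensional scaling, one standard member of that family. The implementation is a minimal illustration, not any particular library's routine.

```python
import numpy as np

def classical_mds(X, out_dim=2):
    """Sketch of a projective method: classical MDS maps points to a
    lower-dimensional space, preserving pairwise squared Euclidean
    distances as well as possible via double-centering and an
    eigendecomposition."""
    n = X.shape[0]
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ D2 @ J                    # Gram matrix of centered data
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:out_dim]      # keep the largest eigenvalues
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
Y = classical_mds(X, out_dim=2)
print(Y.shape)   # (100, 2)
```

Unlike the injective sketch, nothing here lives in the original data space: the output is a new low-dimensional configuration of the same objects, which is exactly the projective viewpoint.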