2016
DOI: 10.1016/j.chaos.2016.02.035

Big data naturally rescaled

Cited by 9 publications (8 citation statements)
References 46 publications
“…6b), where the set of all avalanche vectors spans a 59-dimensional "feature space". To see how the avalanches are organized in this space, we used an unsupervised Hebbian learning clustering (HLC) developed by us [48][49][50][51]. This approach finds clusters of arbitrary shapes, without prior knowledge of the number of clusters or requiring a data dimensionality reduction step that generally distorts and biases the distances between data points [52].…”
Section: Discussion
confidence: 99%
“…Hebbian learning clustering (HLC) [48][49][50][51] recasts the data (in our application: the avalanche vectors) as the nodes on a k-nearest neighbor graph (using Euclidean distance). The distances between data points furnish the initial edge weights of the graph.…”
Section: Methods
confidence: 99%
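The excerpt above describes the first step of HLC: each data point becomes a node of a k-nearest-neighbour graph, and the Euclidean distances to its k nearest neighbours furnish the initial edge weights. A minimal sketch of that graph-construction step (plain Python; the function name and data are illustrative, not the authors' implementation):

```python
import math

def knn_graph(points, k):
    """Build a k-nearest-neighbour graph: nodes are data points, edges link
    each point to its k nearest neighbours, and the initial edge weight is
    the Euclidean distance between the endpoints."""
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

    edges = {}
    for i in range(len(points)):
        # sort the other points by distance and keep the k closest
        neighbours = sorted(
            (j for j in range(len(points)) if j != i),
            key=lambda j: dist(points[i], points[j]),
        )[:k]
        for j in neighbours:
            # store each undirected edge once, weighted by distance
            edges[tuple(sorted((i, j)))] = dist(points[i], points[j])
    return edges

# toy example: two well-separated 2-D groups
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
       (5.0, 5.0), (5.1, 5.0), (5.0, 5.1)]
g = knn_graph(pts, k=2)
```

With k=2 on this toy set, every edge stays inside its own group, so the graph already separates the two clusters before any Hebbian weight adaptation takes place.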
“…3) Many clustering algorithms have difficulties in identifying clusters that are highly nonlinear in multiple dimensions. Unsupervised clustering is an active area of research and future advances in clustering methods—especially density-based and neural network-based methods—may alleviate some of these problems [ 36 ]. However, every such algorithm has its own weaknesses and such improvements may come at the cost of other features (e.g.…”
Section: Results and Assessment
confidence: 99%
“…From simple reasoning, most elements of a cluster will be the result of the same generating process, obtained via slightly changed parameters of the generative process. Such generative processes lead to distributions that strongly differ from Gaussians [3,26]: whenever two or more parameters are involved in a nonlinear generative process, a shrimp-shaped domain collects the items that have similar properties (figure 4).…”
Section: (C) Sampling Model Genericity
confidence: 99%
“…In the following, we compare and discuss two promising approaches that are fully based on local k-nearest neighbour information and are therefore promising candidates for unbiased clustering: the Phenograph approach (a recently published leading algorithm in the clustering of mass cytometry data with a view to medical application [29]); and a current implementation of our previously described Hebbian learning clustering (HLC) [26,30]. Shrimp-shaped clusters of similar dynamical behaviour (orange, period 1; blue, period 2; white, higher periodic or divergent behaviour) in the (a, b) parameter space of the Hénon map, which is the prototype for all generic properties of nonlinear processes [18].…”
Section: Searching For Minimally Biased Clusters
confidence: 99%
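The last two excerpts use the Hénon map, x' = 1 − a·x² + y, y' = b·x, as the prototype nonlinear generative process whose (a, b) parameter space contains shrimp-shaped clusters of similar dynamical behaviour. A rough numerical sketch of how one might label a single parameter point as period-1, period-2, or other (higher-periodic/chaotic/divergent); the function name, transient length, and tolerance are illustrative choices, not the classification procedure of the cited works:

```python
def henon_period(a, b, n_transient=1000, n_test=64, tol=1e-6):
    """Iterate the Hénon map x' = 1 - a*x^2 + y, y' = b*x from (0, 0),
    discard a transient, and return the smallest detected period (1 or 2),
    or None for higher-periodic, chaotic, or divergent orbits."""
    x, y = 0.0, 0.0
    for _ in range(n_transient):
        x, y = 1.0 - a * x * x + y, b * x
        if abs(x) > 1e6:            # orbit escapes to infinity
            return None
    orbit = []
    for _ in range(n_test):
        x, y = 1.0 - a * x * x + y, b * x
        if abs(x) > 1e6:
            return None
        orbit.append((x, y))
    for period in (1, 2):
        # a period-p orbit repeats every p steps, within tolerance
        if all(abs(orbit[i][0] - orbit[i - period][0]) < tol
               and abs(orbit[i][1] - orbit[i - period][1]) < tol
               for i in range(period, n_test)):
            return period
    return None
```

Scanning this classifier over a grid of (a, b) values and colouring the result is what produces the period-1 / period-2 / other picture described in the excerpt's figure caption; at b = 0.3, for instance, a = 0.2 lies in the stable fixed-point (period-1) regime, a = 0.5 in the period-2 regime, and a = 1.4 on the classic chaotic attractor.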