2010
DOI: 10.1007/s00521-010-0413-5

Fractal initialization for high-quality mapping with self-organizing maps

Abstract: Initialization of self-organizing maps is typically based on random vectors within the given input space. The implicit problem with random initialization is the overlap (entanglement) of connections between neurons. In this paper, we present a new method of initialization based on a set of self-similar curves known as Hilbert curves. Hilbert curves can be scaled in network size for the number of neurons based on a simple recursive (fractal) technique, implicit in the properties of Hilbert curves. We have shown…
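The recursive (fractal) construction the abstract alludes to can be sketched as follows. This is a minimal illustration, not the authors' code; the function name and the quadrant ordering are our assumptions, following the standard Hilbert-curve recursion:

```python
# Sketch of the recursive construction of a 2-D Hilbert curve.
# Each order quadruples the point count by placing four transformed
# copies of the previous-order curve into the four quadrants.

def hilbert_points(order):
    """Return the 4**order visiting points of a Hilbert curve on the unit square."""
    if order == 0:
        return [(0.5, 0.5)]
    prev = hilbert_points(order - 1)
    pts = []
    # Lower-left quadrant: previous curve transposed (x/y swapped).
    pts += [(y / 2, x / 2) for x, y in prev]
    # Upper-left quadrant: previous curve shifted up.
    pts += [(x / 2, y / 2 + 0.5) for x, y in prev]
    # Upper-right quadrant: previous curve shifted up and right.
    pts += [(x / 2 + 0.5, y / 2 + 0.5) for x, y in prev]
    # Lower-right quadrant: previous curve anti-transposed (reflected), shifted right.
    pts += [((1 - y) / 2 + 0.5, (1 - x) / 2) for x, y in prev]
    return pts
```

Because each refinement step only rearranges copies of the previous curve, the number of points can be matched to the number of neurons simply by choosing the recursion depth — the "scaled in network size" property the abstract mentions.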

Cited by 11 publications (4 citation statements)
References 24 publications
“…However, the Hilbert initialization finishes with almost no knotting unlike its random counterpart. The random initialization and its result are shown in Figure 1c, d. These figures, as well as the work done by Beaton et al [10] clearly show that HCV initialization achieves faster convergence at a higher quality than random initialization. This type of initialization cannot be applied to growing architectures, as neurons are added where needed based on a multitude of parameters [11].…”
Section: Introduction (supporting)
confidence: 71%
See 1 more Smart Citation
“…However, the Hilbert initialization nishes with almost no knotting unlike its random counterpart. The random initialization and its result are shown in Figure 1c, d. These gures, as well as the work done by Beaton et al [10] clearly show that HCV initialization achieves faster convergence at a higher quality than random initialization. This type of initialization cannot be applied to growing architectures, as neurons are added where needed based on a multitude of parameters [11].…”
Section: Introductionsupporting
confidence: 71%
“…Typically, SOM neurons are placed randomly in the input space or start from one point. However, through previous research, and in agreement with the nature of SOM convergence topologies, utilization of space-filling curves for initialization leads to significantly faster convergence [10]. As such, we utilize Hilbert curve vector initialization (HCV initialization).…”
Section: Introduction (mentioning)
confidence: 83%
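A minimal sketch of what HCV initialization could look like for a chain of SOM neurons. The index-to-coordinate conversion is the standard iterative Hilbert-curve algorithm; the function names, the 2-D unit-square input space, and the even spacing along the curve are our assumptions, not the cited papers' code:

```python
import numpy as np

def d2xy(order, d):
    """Map distance d along a Hilbert curve covering a 2**order x 2**order
    grid to (x, y) cell coordinates (standard iterative algorithm)."""
    x = y = 0
    t = d
    s = 1
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:              # rotate the quadrant so sub-curves connect
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def hcv_init(n_neurons, order=5):
    """Spread n_neurons initial weight vectors evenly along the Hilbert
    curve of the unit square, so neighbouring neurons start untangled."""
    side = 1 << order
    cells = side * side
    idx = np.linspace(0, cells - 1, n_neurons).round().astype(int)
    pts = np.array([d2xy(order, int(d)) for d in idx], dtype=float)
    return (pts + 0.5) / side    # cell centres in [0, 1]^2
```

Because consecutive curve positions are adjacent grid cells, neighbouring neurons start close together in the input space, which is why this kind of initialization avoids the knotting seen with random weights.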
“…where K_t is a constant and l_t is the characteristic (average) length of the transport network (dominantly neo-angiogenic) and m_t is the actual mass of a tumor. Due to the dense surface growth of the new vessels, we can consider its asymptotic structure by Hilbert fractal [39], having fractal dimension [40]. This means the linear size of a tumor (L_t) is proportional to l_t, so the linear size of the tumour determines the mass by scaling:…”
Section: Growth Of Cancer (mentioning)
confidence: 99%
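The scaling relation itself is truncated in the excerpt above and cannot be recovered from it. One hypothetical reading, assuming the tumour mass scales with its linear size through the fractal dimension D of the transport network (the value of D is elided in the excerpt), would be:

```latex
% Hypothetical illustration only, not the cited paper's formula;
% D is the fractal dimension referenced via [40].
m_t \propto l_t^{\,D} \propto L_t^{\,D}
```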
“…As the viewing scale increases or decreases, the shape of the object does not change, remaining identical or very similar to the original structure. Fractal structures are currently used in a variety of microwave applications, including compact interferometers based on fractal geometry [10,11], fractal phase shifters [12], and resonators for filter miniaturization [13,14]. Minkowski fractal antennas are becoming increasingly important for exchanging data with other devices.…”
Section: Introduction (mentioning)
confidence: 99%