Artificial Neural Networks 1991
DOI: 10.1016/b978-0-444-89178-5.50092-0
A Neural Network That Learns to Do Hyphenation

Cited by 74 publications (115 citation statements)
References 1 publication
Citing publications span 1999–2023
“…The Growing Neural Gas (GNG) algorithm presented in (Fritzke, 1995) is a vector-based learning method that builds (clusters) the topological relationships present in the input space by applying the Competitive Hebbian Learning (CHL) rule. The neurons in the m-dimensional continuous output space R^m compete for the right to respond to a given input signal ξ ∈ R^n drawn from the discrete training dataset D. A single input signal contains the three coordinates of a point in the input space.…”
Section: Building a Topological Model via the Growing Neural Gas Algorithm
confidence: 99%
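The excerpt describes the competition and the CHL rule only in prose. A minimal sketch of that adaptation step, assuming NumPy and illustrative container names (`W`, `edges`, `ages` are not from the cited paper):

```python
import numpy as np

def chl_step(W, edges, ages, xi, eps_b=0.2, eps_n=0.006, age_max=50):
    """One signal presentation of GNG's Competitive Hebbian Learning step.

    W     : (N, n) array of node positions in R^n
    edges : set of frozenset({i, j}) index pairs (the learned topology)
    ages  : dict mapping each edge to its age
    xi    : input signal, shape (n,)
    """
    # All units compete: nearest (s1) and second-nearest (s2) to xi win.
    order = np.argsort(np.linalg.norm(W - xi, axis=1))
    s1, s2 = int(order[0]), int(order[1])

    # Adapt the winner and its topological neighbours toward the signal,
    # ageing every edge that emanates from the winner.
    W[s1] += eps_b * (xi - W[s1])
    for e in [e for e in edges if s1 in e]:
        ages[e] += 1
        (j,) = e - {s1}
        W[j] += eps_n * (xi - W[j])

    # CHL rule: connect winner and runner-up with a fresh (age-zero) edge.
    e = frozenset((s1, s2))
    edges.add(e)
    ages[e] = 0

    # Drop edges that have not been refreshed for too long.
    for e in [e for e in edges if ages[e] > age_max]:
        edges.remove(e)
        del ages[e]
```

Edge ageing and removal follow Fritzke's formulation; the full algorithm would additionally delete any node left without edges.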
“…This method (Equation 3) saturates N after k steps to avoid too many performance dips and to force the network toward a consistent distribution of the nodes in R^m. The other hyperparameters, which are not listed in the table, are set to the default values from (Fritzke, 1995). Nevertheless, a small increase of t_max is necessary to keep the network error low as the amount of data grows.…”
Section: Hyperparameter Adaptation
confidence: 99%
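Equation 3 itself is not reproduced in the excerpt. Purely as an illustration of the idea, a schedule that ramps the node budget N and holds it constant after k steps might look like the following (the linear ramp, `n_max`, and `n_0` are assumptions, not taken from the cited work):

```python
def node_budget(t, k, n_max, n_0=2):
    """Illustrative saturation schedule: the node budget grows from n_0
    and is held at n_max once t >= k. The actual Equation 3 of the
    cited paper is not reproduced in the excerpt; this is a stand-in.
    """
    if t >= k:
        return n_max
    return n_0 + (n_max - n_0) * t // k
```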
“…With the Growing Neural Gas (GNG) method [10], a growth process starts from a minimal network size, and new neurons are inserted successively using a particular type of vector quantization. To determine where to insert new neurons, local error measures are gathered during the adaptation process, and each new neuron is inserted close to the neuron with the highest accumulated error.…”
Section: GNG Algorithm
confidence: 99%
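A minimal sketch of that insertion step, following Fritzke's formulation (the NumPy containers and variable names are illustrative):

```python
import numpy as np

def insert_node(W, edges, errors, alpha=0.5):
    """GNG insertion step (Fritzke, 1995): place a new unit between the
    unit with the highest accumulated error and its worst neighbour.

    W      : (N, n) array of node positions
    edges  : set of frozenset({i, j}) index pairs
    errors : (N,) array of accumulated errors
    """
    # q: unit with the highest accumulated error (assumed to have
    # at least one neighbour, as in a connected GNG graph).
    q = int(np.argmax(errors))

    # f: neighbour of q with the highest accumulated error.
    neighbours = [next(iter(e - {q})) for e in edges if q in e]
    f = max(neighbours, key=lambda j: errors[j])

    # Insert the new unit r halfway between q and f.
    r = W.shape[0]
    W = np.vstack([W, 0.5 * (W[q] + W[f])])

    # Rewire: replace edge (q, f) by (q, r) and (r, f).
    edges.discard(frozenset((q, f)))
    edges.add(frozenset((q, r)))
    edges.add(frozenset((r, f)))

    # Decay the errors of q and f; the new unit inherits q's error.
    errors[q] *= alpha
    errors[f] *= alpha
    errors = np.append(errors, errors[q])
    return W, edges, errors
```

In the full algorithm this step runs only every λ signal presentations, which is what keeps growth gradual.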
“…However, as the NG has a fixed number of nodes, some a priori information about the input space is needed to pre-establish the size of the network. This model was extended in [10], which proposed the Growing Neural Gas (GNG) network, combining the flexible structure of the NG with a growing strategy. Moreover, the learning adaptation step was slightly modified.…”
Section: Introduction
confidence: 99%
“…We therefore use a growing-when-required (GWR) network [15] to efficiently cluster keypoints that are highly similar. A GWR network is a clustering method very similar to a growing neural gas (GNG) network [7]. Both networks are based on Kohonen's self-organizing maps (SOMs) [10].…”
Section: Keypoint Clustering
confidence: 99%
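The distinguishing feature of GWR over GNG is that nodes are added when required rather than at fixed intervals. A minimal sketch of that growth test, assuming the structure of Marsland et al.'s rule (thresholds `a_T`, `h_T` and all names are illustrative, not from the citing paper):

```python
import numpy as np

def gwr_grow_if_required(W, habituation, x, a_T=0.9, h_T=0.1):
    """Growth test of a growing-when-required (GWR) network.

    A new node is inserted only when the best-matching unit responds
    too weakly to the input (activity below a_T) while already being
    well trained (firing counter below h_T).

    W           : (N, n) array of node positions
    habituation : (N,) firing counters, decaying from 1 toward 0
    x           : input keypoint descriptor, shape (n,)
    """
    d = np.linalg.norm(W - x, axis=1)
    b = int(np.argmin(d))          # best-matching unit
    activity = np.exp(-d[b])       # winner's response to the input

    if activity < a_T and habituation[b] < h_T:
        # Insert a new node halfway between the winner and the input.
        W = np.vstack([W, 0.5 * (W[b] + x)])
        habituation = np.append(habituation, 1.0)
    return W, habituation
```

For keypoint clustering this means a fresh cluster centre appears only when an incoming descriptor is genuinely dissimilar to all well-trained existing nodes.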