1999
DOI: 10.1109/72.774213

Generalization, discrimination, and multiple categorization using adaptive resonance theory

Abstract: The internal competition between categories in the adaptive resonance theory (ART) neural model can be biased by replacing the original choice function by one that contains an attentional tuning parameter under external control. For the same input but different values of the attentional tuning parameter, the network can learn and recall different categories with different degrees of generality, thus permitting the coexistence of both general and specific categorizations of the same set of data. Any nu…
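To make the abstract's mechanism concrete, below is a minimal sketch of how an externally controlled parameter could bias the ART category competition. The standard Fuzzy ART choice function is shown for reference; the paper's exact modified choice function is not reproduced here, so the tuning exponent `lam` on the category-size term is an assumption, one plausible way to shift the competition between general and specific categories.

```python
import numpy as np

def choice_standard(I, w, alpha=0.001):
    """Standard Fuzzy ART choice function: T_j = |I ^ w_j| / (alpha + |w_j|),
    where ^ is the component-wise minimum (fuzzy AND) and |.| the L1 norm."""
    return np.minimum(I, w).sum() / (alpha + w.sum())

def choice_tuned(I, w, lam, alpha=0.001):
    """Hypothetical attentionally tuned choice function (lam and its placement
    are assumptions, not the paper's exact form). With complement coding,
    general categories have small |w_j| and specific ones large |w_j|;
    lam = 1 recovers the standard choice function, while lam > 1 shifts the
    advantage toward small-|w_j| (general) categories."""
    return np.minimum(I, w).sum() / (alpha + w.sum() ** lam)
```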

Cited by 14 publications (3 citation statements)
References 18 publications
“…The second subnetwork is placed at the next layer and usually divides the input space into 16-32 categories, which indicates a slightly more detailed classification of the input space. The last subnetwork in our self-organizing neural network will be placed at the lowest layer and classifies all the input patterns into either a motif or a nonmotif category with one or a few patterns [37]. Typically, the number of output neurons will be large for the last subnetwork and gradually reduced to a small number for the first subnetwork.…”
Section: B. A New Structure of Self-Organizing Neural Network
confidence: 99%
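The subnetworks described in this statement are ART-style modules whose category granularity is set per layer. As a reference point for how one such module works, here is a minimal Fuzzy ART sketch (an illustration, not the cited paper's code); the vigilance parameter rho controls how many categories form, so stacking modules with different rho values yields layers of decreasing category counts.

```python
import numpy as np

class FuzzyART:
    """Minimal Fuzzy ART module (an illustrative sketch). Each committed
    category j is a weight vector w_j; vigilance rho sets the granularity."""

    def __init__(self, rho=0.7, alpha=0.001, beta=1.0):
        self.rho, self.alpha, self.beta = rho, alpha, beta
        self.w = []  # one weight vector per committed category

    def train(self, I):
        """Present complement-coded input I; return the resonant category index."""
        # Rank committed categories by the standard choice function.
        order = sorted(range(len(self.w)),
                       key=lambda j: -np.minimum(I, self.w[j]).sum()
                                     / (self.alpha + self.w[j].sum()))
        for j in order:
            # Vigilance test: the match |I ^ w_j| / |I| must reach rho.
            if np.minimum(I, self.w[j]).sum() / I.sum() >= self.rho:
                # Resonance: move w_j toward I ^ w_j at learning rate beta.
                self.w[j] = (self.beta * np.minimum(I, self.w[j])
                             + (1 - self.beta) * self.w[j])
                return j
        self.w.append(I.copy())  # no category matched: commit a new one
        return len(self.w) - 1

def complement_code(x):
    """Standard ART preprocessing: [x, 1 - x] keeps |I| constant at len(x)."""
    return np.concatenate([x, 1.0 - x])
```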
“…Generally, the hierarchical relationships between ART modules are defined implicitly by the input signal flow, explicitly by enforcing constraints or connections, and/or by setting multiple vigilance parameters to define hierarchies. Alternatively, hierarchies within the same ART can be created by designing custom ART activation functions [40,41] or by analyzing its distributed activation patterns [42]. ART-based hierarchical approaches have been successfully applied, for instance, in text mining [20,43] and robotics [30,39]. Another branch of clustering includes multi-prototype-based methods.…”
confidence: 99%
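One of the routes this statement mentions, multiple vigilance parameters, can be illustrated directly with the FuzzyART sketch above: the same data presented to two modules with different (here hypothetical) rho values yields a coarse and a fine partition, and each fine category tends to nest inside one coarse category, giving an implicit two-level hierarchy.

```python
import numpy as np

coarse = FuzzyART(rho=0.5)  # low vigilance: few, general categories
fine = FuzzyART(rho=0.9)    # high vigilance: many, specific categories

rng = np.random.default_rng(0)
for x in rng.random((200, 4)):
    I = complement_code(x)
    c, f = coarse.train(I), fine.train(I)  # (coarse label, fine label) pair

print(len(coarse.w), "coarse categories vs", len(fine.w), "fine categories")
```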
“…Lavoie et al. [43] proposed modifying the FAM choice function by adding an attentional tuning parameter. Using this parameter, the network can learn and recall different categories with different degrees of generality, hence enabling the same data to be represented by both general and specific categories.…”
Section: FAM-Based New Algorithms
confidence: 99%
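To connect this back to the abstract's mechanism, the tuned choice function sketched earlier can be swept over the (hypothetical) parameter lam for one fixed input: at lam = 1 the standard competition favors the tight-fitting specific category, while a larger lam hands the win to the general one, so the same input is recalled at two levels of abstraction.

```python
import numpy as np

# Complement-coded 2-D input and two hand-built categories (illustrative only):
I = complement_code(np.array([0.2, 0.8]))
specific = np.array([0.2, 0.8, 0.8, 0.2])  # point-sized box around the input
general = np.array([0.1, 0.1, 0.1, 0.1])   # large box covering most of the space

for lam in (1.0, 2.0):
    winner = ("specific" if choice_tuned(I, specific, lam)
              > choice_tuned(I, general, lam) else "general")
    print(f"lam={lam}: winner = {winner}")
# lam=1.0 -> specific (standard Fuzzy ART behavior); lam=2.0 -> general.
```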