Flexible feature discovery and structural information control (2001)
DOI: 10.1080/09540090110108679

Cited by 67 publications (31 citation statements)
References 27 publications
“…We have so far shown that competitive learning as well as self-organizing maps aim to maximize mutual information between input patterns and output neurons [59], [60], [61]. However, little attention has been paid to information content in input neurons.…”
Section: Validity of Methods and Experimental Results (mentioning, confidence: 99%)
“…In particular, we have introduced information-theoretic competitive learning [59], [60], [61]. Contrary to the computational methods so far developed, we have supposed that competitive learning is a realization of mutual information maximization between output neurons and input neurons.…”
Section: A Double Competition (mentioning, confidence: 99%)
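The statement above reads competitive learning as maximizing the mutual information between input patterns and output (competitive) units. As a rough numerical sketch of that quantity, not the cited papers' exact algorithm, one can model soft firing probabilities p(j|x) with a softmax over negative squared distances and average the per-input KL divergence from the marginal firing rates; the name `competitive_mi`, the softmax form of p(j|x), and the sharpness parameter `beta` are all assumptions made for illustration.

```python
import numpy as np

def competitive_mi(X, W, beta=1.0):
    """Estimate mutual information between input patterns X (S x D)
    and M competitive units with weight vectors W (M x D).

    p(j|x) is modelled as a softmax over negative squared distances;
    beta controls how sharp the competition is (an assumption for
    this sketch, not necessarily the cited papers' exact form).
    """
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=-1)  # (S, M) squared distances
    logits = -beta * d2
    logits -= logits.max(axis=1, keepdims=True)                # numerical stability
    p_jx = np.exp(logits)
    p_jx /= p_jx.sum(axis=1, keepdims=True)                    # p(j|x), rows sum to 1
    p_j = p_jx.mean(axis=0)                                    # marginal firing rates p(j)
    ratio = np.clip(p_jx, 1e-12, None) / np.clip(p_j, 1e-12, None)
    # I(X; J) = (1/S) * sum_s sum_j p(j|x_s) * log( p(j|x_s) / p(j) )
    return float((p_jx * np.log(ratio)).sum(axis=1).mean())
```

With beta → 0 every unit fires equally and the estimate is exactly 0; as beta grows the competition approaches winner-take-all, and the estimate is bounded above by log M, the entropy of a uniform winner distribution.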
“…We have so far demonstrated that competitive processes in competitive learning can be described by using the mutual information between competitive units and input patterns (Kamimura & Kamimura, 2000; Kamimura et al., 2001; Kamimura, 2003a, c). In other words, the degree of organization of competitive units can be described by using mutual information between competitive units and input patterns.…”
Section: Information-Theoretic Competitive Learning (mentioning, confidence: 99%)
“…First, information contained in competitive units must be as large as possible, as shown in Figure 1(b1). We have already shown that this information on competitive units, more exactly, mutual information between competitive units and input patterns, represents competitive processes (Kamimura & Kamimura, 2000; Kamimura et al., 2001; Kamimura, 2003a, c). Thus, this information, or more exactly, mutual information, should be as large as possible.…”
Section: Information-Theoretic Approach (mentioning, confidence: 99%)
“…Information maximization methods have long been used to interpret final representations by simplifying network configurations [18,19,20]. In this case, the information content can be stored in a small number of neurons and connection weights.…”
Section: Potential Learning for Interpretation (mentioning, confidence: 99%)
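The interpretability claim above — that information maximization lets the content be stored in a small number of neurons — can be made concrete through the average conditional entropy H(j|x) of the unit responses: when each input drives only a few units strongly, H(j|x) is near zero, and the representation is easy to read off. The helper below is hypothetical, written for illustration rather than taken from [18,19,20].

```python
import numpy as np

def mean_conditional_entropy(p_jx):
    """Average entropy H(j|x) of unit responses over inputs (S x M array
    of firing probabilities, each row summing to 1).

    Low values mean each input is carried by a few strongly responding
    units; high values mean responses are spread diffusely over units.
    """
    p = np.clip(p_jx, 1e-12, 1.0)  # guard log(0)
    return float(-(p * np.log(p)).sum(axis=1).mean())
```

One-hot responses give an entropy of (numerically) zero, while responses spread uniformly over M units give log M, the maximally uninterpretable case.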