1999
DOI: 10.1109/72.809080

Incremental learning methods with retrieving of interfered patterns

Abstract: There are many cases in which a neural-network-based system must memorize new patterns incrementally. However, if the network learns the new patterns only by referring to them, it is likely to forget old memorized patterns, since the parameters in the network usually correlate not only with the old memories but also with the new patterns. A sure way to avoid the loss of memories is to learn the new patterns together with all memorized patterns; this, however, requires a large amount of computational power. To solve this problem, we propose …
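As a rough illustration of the idea sketched in the abstract (not the authors' exact algorithm), the snippet below retrains a small network on new patterns together with a few stored patterns whose outputs are most disturbed by the new data. All function names, the interference measure, and the toy network are assumptions made for illustration only.

```python
import numpy as np

# Minimal sketch of rehearsal with retrieved interfered patterns: when new
# patterns arrive, retrieve the few memorized patterns that the new data
# interferes with and relearn those together with the new ones, instead of
# replaying the entire memory set.  Hypothetical names and toy data.

rng = np.random.default_rng(0)

def forward(W, x):
    """Single-layer network with tanh output (stand-in for the real model)."""
    return np.tanh(x @ W)

def train(W, X, Y, lr=0.1, epochs=200):
    """Plain gradient descent on squared error."""
    for _ in range(epochs):
        out = forward(W, X)
        grad = X.T @ ((out - Y) * (1.0 - out ** 2)) / len(X)
        W = W - lr * grad
    return W

def retrieve_interfered(W_old, W_tentative, X_mem, Y_mem, k=3):
    """Return the k memorized patterns whose outputs would change most if the
    network were updated on the new data alone (a simple interference proxy)."""
    drift = np.linalg.norm(forward(W_tentative, X_mem) - forward(W_old, X_mem), axis=1)
    idx = np.argsort(drift)[-k:]
    return X_mem[idx], Y_mem[idx]

# Old memorized patterns and newly arriving patterns (toy data).
X_old, Y_old = rng.normal(size=(50, 8)), rng.normal(size=(50, 2))
X_new, Y_new = rng.normal(size=(5, 8)), rng.normal(size=(5, 2))

W = train(rng.normal(scale=0.1, size=(8, 2)), X_old, Y_old)

# Tentatively learn the new patterns alone to see which memories they disturb.
W_tent = train(W.copy(), X_new, Y_new, epochs=50)
X_ret, Y_ret = retrieve_interfered(W, W_tent, X_old, Y_old, k=3)

# Relearn the new patterns together with only the retrieved interfered patterns.
W = train(W, np.vstack([X_new, X_ret]), np.vstack([Y_new, Y_ret]))
```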

Cited by 85 publications (30 citation statements)
References 8 publications
“…Fu et al. [19] proposed an incremental backpropagation learning network that employs bounded weight modification and structural adaptation learning rules and applies initial knowledge to constrain the learning process. Yamauchi et al. [4] proposed incremental learning methods for retrieving interfered patterns. In their methods, a neural network learns new patterns together with relearning of a small number of retrieved past patterns that interfere with the new patterns.…”
Section: Related Work (mentioning)
Confidence: 99%
“…Thus, the final chromosome can be encoded as a string consisting of […] characters. According to the above encoding mechanism, each chromosome will consist of […] characters, as given in (4). If all the antecedent elements in a rule are inactive, this rule will be regarded as a noncontributing rule. With this mechanism, our approach has the feature of variable-length GAs so that the number of active rules in a rule set can be flexible.…”
Section: A. Encoding Mechanism (mentioning)
Confidence: 99%
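As a loose illustration of the encoding described in this citing work (the excerpt elides the exact symbol counts), the sketch below treats a chromosome as a fixed-length string of per-rule antecedent characters and counts a rule as contributing only when at least one antecedent is active. The '#' character for an inactive element and the slot size are assumptions, not the cited paper's notation.

```python
# Hypothetical sketch of the variable-length effect of the encoding: the
# chromosome has a fixed number of rule slots, but a rule whose antecedent
# elements are all inactive ('#' here, by assumption) does not contribute,
# so the number of active rules in the rule set is flexible.
ANTECEDENTS_PER_RULE = 4
INACTIVE = "#"

def active_rules(chromosome: str) -> list[str]:
    """Split the chromosome into rule slots and keep only contributing rules."""
    rules = [chromosome[i:i + ANTECEDENTS_PER_RULE]
             for i in range(0, len(chromosome), ANTECEDENTS_PER_RULE)]
    return [r for r in rules if any(c != INACTIVE for c in r)]

# Three encoded rules; the middle one is all-inactive and is dropped.
print(active_rules("a#b#" "####" "12c#"))  # -> ['a#b#', '12c#']
```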
“…Yamauchi et al. 10 proposed incremental learning methods for retrieving interfered patterns. In their methods, a neural network learns new patterns with relearning of a small number of retrieved past patterns that interfere with the new patterns.…”
Section: Introduction (mentioning)
Confidence: 99%