IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)
DOI: 10.1109/ijcnn.1999.836173

Pattern classification by an incremental learning fuzzy neural network

Abstract: To detect and identify defects in machine condition health monitoring, classical neural classifiers, such as Multilayer Perceptron (MLP) neural networks, are proposed to supervise the monitored system. A drawback of classical neural classifiers, which rely on off-line, iterative learning algorithms, is their long training time. In addition, they often become stuck at local minima and are unable to reach the optimum solution. Furthermore, in an operating mode, it is possible that new faults develop while a monitored sys…

Cited by 7 publications (5 citation statements)
References 32 publications

“…The incremental neural learning algorithm was developed based on the theory of boosting. The boosting theory states that a system can be adaptively adjusted to the errors of the weak hypotheses returned by a weak learning algorithm [53] and then by summing the probabilistic predictions of the weak hypotheses [5, 6, 19-21, 44]. Schapire and Freund showed that any weak learning algorithm can be efficiently transformed or "boosted" into a strong algorithm [11, 19, 44] by altering the distribution of samples so as to increase the probability of the "harder" parts of the data space being recognized correctly.…”
Section: An Incremental Neural Learning Framework
Mentioning confidence: 99%
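
The sample-reweighting idea this statement describes can be made concrete with a short sketch. The following is a minimal AdaBoost-style loop, assuming a scikit-learn decision stump as the weak learner; it illustrates the general boosting scheme only and is not the citing paper's incremental neural learning algorithm.

```python
# Minimal AdaBoost-style sketch: reweight the "harder" samples so that
# later weak hypotheses focus on the parts of the data space earlier
# ones got wrong, then combine the weak hypotheses by weighted vote.
# Illustrative only -- not the ILFN or the citing paper's algorithm.
import numpy as np
from sklearn.tree import DecisionTreeClassifier  # weak learner (assumption)

def boost(X, y, n_rounds=10):
    """y must take values in {-1, +1}; returns (hypotheses, alphas)."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # start from a uniform distribution
    hypotheses, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w[pred != y])       # weighted error of the weak hypothesis
        if err >= 0.5:                   # weak-learning assumption violated
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-10))
        w *= np.exp(-alpha * y * pred)   # raise weight on misclassified samples
        w /= w.sum()                     # renormalize to a distribution
        hypotheses.append(stump)
        alphas.append(alpha)
    return hypotheses, alphas

def boosted_predict(X, hypotheses, alphas):
    # Strong hypothesis: sign of the alpha-weighted sum of weak votes.
    votes = sum(a * h.predict(X) for h, a in zip(hypotheses, alphas))
    return np.sign(votes)
```
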
“…Neural learning algorithms such as backpropagation are not incremental, since old instances need to be reexamined to determine whether weights need to be updated or the structure of a neural network system needs to be changed [22]. As Mandziuk and Shastri pointed out, one of the greatest impediments in building large, scalable learning systems based on neural networks is that when a network trained to solve task A is subsequently trained to solve task B, it often "forgets" the solution to task A [31,53]. A comprehensive discussion of incremental neural learning research is presented in Sect.…”
Section: Introduction
Mentioning confidence: 99%
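
The "forgetting" behaviour that Mandziuk and Shastri describe is easy to reproduce. The sketch below is a hypothetical demonstration, assuming scikit-learn's MLPClassifier with partial_fit; the data, network size, and epoch counts are arbitrary choices, not the cited authors' experiment. After the network is trained on task A and then trained incrementally on task B alone, its task-A accuracy collapses.

```python
# Catastrophic-forgetting demo (hypothetical illustration): a
# backprop-trained MLP loses task A after incremental training on
# task B only, because old task-A instances are never re-examined.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Task A: class 0 clustered near (0, 0), class 1 near (4, 4).
XA = np.vstack([rng.normal(0, 0.5, (200, 2)), rng.normal(4, 0.5, (200, 2))])
yA = np.array([0] * 200 + [1] * 200)
# Task B: the same labels, but in a different region of input space.
XB = np.vstack([rng.normal(8, 0.5, (200, 2)), rng.normal(-4, 0.5, (200, 2))])
yB = np.array([0] * 200 + [1] * 200)

net = MLPClassifier(hidden_layer_sizes=(16,), random_state=0)
for _ in range(200):                      # train on task A only
    net.partial_fit(XA, yA, classes=[0, 1])
acc_before = net.score(XA, yA)

for _ in range(200):                      # then train on task B only;
    net.partial_fit(XB, yB)               # task-A data is never revisited
print(f"task A accuracy: {acc_before:.2f} -> {net.score(XA, yA):.2f}")
```
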
“…However, clusters that belong to the same class are grouped together via the pruning module. The details of the learning algorithm can be found in [37], [38]. A trained ILFN does not exhibit a clear interpretation of the knowledge embedded inside its structure.…”
Section: A. Incremental Learning Fuzzy Neural Network (ILFN)
Mentioning confidence: 99%
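
The actual ILFN learning and pruning procedures are given in [37], [38]. As a rough, hypothetical illustration of the grouping idea only, the sketch below merges prototype clusters that share a class label and lie close together; the merge criterion, radius parameter, and all names are assumptions, not the ILFN pruning module.

```python
# Hypothetical sketch of grouping same-class clusters. This is an
# assumption about the general idea; the real pruning module is
# specified in [37], [38].
import numpy as np

def merge_same_class_clusters(centers, labels, counts, radius=1.0):
    """Repeatedly merge pairs of cluster centers that share a class label
    and lie within `radius`, replacing them with their weighted mean."""
    centers, labels, counts = list(centers), list(labels), list(counts)
    merged = True
    while merged:
        merged = False
        for i in range(len(centers)):
            for j in range(i + 1, len(centers)):
                if labels[i] == labels[j] and \
                        np.linalg.norm(centers[i] - centers[j]) < radius:
                    total = counts[i] + counts[j]
                    centers[i] = (counts[i] * centers[i]
                                  + counts[j] * centers[j]) / total
                    counts[i] = total
                    del centers[j], labels[j], counts[j]
                    merged = True
                    break
            if merged:
                break
    return np.array(centers), labels, counts
```
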
“…The comparison study was performed by Robinson [37]. In Robinson's study, the best results, with a correct prediction rate of 56%, were obtained using the nearest neighbor classifier.…”
Section: B. Vowel Recognition Data Set
Mentioning confidence: 99%
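
For context, the nearest neighbor classifier behind Robinson's best result is conceptually simple; a minimal 1-NN sketch follows (an illustration of the baseline method, not Robinson's implementation).

```python
# Minimal 1-nearest-neighbor classifier, the kind of baseline that
# produced the 56% result in Robinson's comparison (illustrative only).
import numpy as np

def predict_1nn(X_train, y_train, X_test):
    """Assign each test point the class of its closest training point
    under Euclidean distance."""
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)  # distance to each sample
        preds.append(y_train[np.argmin(dists)])      # label of the nearest one
    return np.array(preds)
```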