1997
DOI: 10.1017/s0269888997003123

Combining diverse neural nets

Abstract: An appropriate use of neural computing techniques is to apply them to problems such as condition monitoring, fault diagnosis, control and sensing, where conventional solutions can be hard to obtain. However, when neural computing techniques are used, it is important that they are employed so as to maximise their performance and improve their reliability. Their performance is typically assessed in terms of their ability to generalise to a previously unseen test set, although unless the training set is very ca…

Cited by 139 publications (97 citation statements)
References 27 publications
“…Ghosh (1996b, 1999) do not mention the case of negative correlation, although it clearly supports their thesis that the smaller the correlation, the better the ensemble. A negative correlation between the continuous-valued outputs has been sought, predominantly by altering the available training set or parameters of the classifier (Dietterich, 2000a; Hashem, 1999; Krogh & Vedelsby, 1995; Liu & Yao, 1999; Opitz & Shavlik, 1999; Parmanto et al., 1996; Giacinto & Roli, 2001; Rosen, 1996; Sharkey & Sharkey, 1997; Skalak, 1996; Tumer & Ghosh, 1999). When classifiers output class labels, the classification error can be decomposed into bias and variance terms (also called 'spread') (Bauer & Kohavi, 1999; Breiman, 1999; Kohavi & Wolpert, 1996) or into bias and spread terms.…”
Section: Correct/incorrect decision (the oracle output)
confidence: 99%
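The quoted passage turns on the error correlation between ensemble members: averaging helps most when members' errors are weakly or, ideally, negatively correlated. Below is a minimal sketch of measuring that correlation with NumPy; the toy target, member count, and noise model are illustrative assumptions, not anything from the cited works.

```python
# Sketch: pairwise error correlation between ensemble members' outputs.
# Toy setup (assumed): three regressors approximating a sine target,
# each corrupted by independent noise.
import numpy as np

rng = np.random.default_rng(0)

y = np.sin(np.linspace(0, 3, 200))                     # target
preds = np.stack([y + 0.1 * rng.standard_normal(200)   # three members
                  for _ in range(3)])

errors = preds - y                 # per-member residuals
corr = np.corrcoef(errors)         # pairwise error-correlation matrix
ens = preds.mean(axis=0)           # simple averaging combiner

print("pairwise error correlations:\n", np.round(corr, 3))
print("member MSEs:", np.round((errors ** 2).mean(axis=1), 4))
print("ensemble MSE:", round(float(((ens - y) ** 2).mean()), 4))
```

With independent noise the off-diagonal correlations sit near zero and the ensemble MSE falls below every member's; driving those correlations negative, as the quote discusses, would lower it further.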
“…One of the most widespread and useful techniques for avoiding such problems is the ensemble learning scheme [19], [20]. The main idea behind this kind of meta-algorithm is to train several slightly different, simpler classifiers and combine their results to improve on those obtained by a single, usually more complex, one [21].…”
Section: Classifiers and ensembles
confidence: 99%
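A minimal sketch of the scheme that passage describes: several slightly different classifiers, made different here by training decision trees on bootstrap resamples, combined by majority vote. The scikit-learn models and toy dataset are assumptions chosen for illustration; the cited works may vary members differently (e.g., by architecture or feature subsets).

```python
# Sketch: train several slightly different classifiers and majority-vote.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)  # toy binary data
rng = np.random.default_rng(0)

members = []
for seed in range(7):
    idx = rng.integers(0, len(X), len(X))          # bootstrap resample
    clf = DecisionTreeClassifier(random_state=seed).fit(X[idx], y[idx])
    members.append(clf)

votes = np.stack([m.predict(X) for m in members])  # shape (7, n_samples)
majority = (votes.mean(axis=0) > 0.5).astype(int)  # majority vote (binary)
print("majority-vote accuracy on training data:", (majority == y).mean())
```

Evaluating on a held-out split rather than the training set would be the honest measurement; the sketch only shows the combination mechanics.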
“…The underlying reason for increased reliability through the use of ensembles is that different classification algorithms will show different patterns of generalization. More formal explanations of the way ensembles can improve performance may be found in [5,6].…”
Section: Ensemble learning
confidence: 99%
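The claim that different algorithms show different patterns of generalization can be checked directly by comparing their correct/incorrect ("oracle") outputs on a held-out set, echoing the oracle framing in the first quote. Everything below (the data and the two model families) is an illustrative assumption, not the setup of the cited works.

```python
# Sketch: compare the correct/incorrect ("oracle") patterns of two
# different classifier families on a held-out test set.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=600, n_informative=5, random_state=1)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=1)

oracle = {}  # boolean "was this test point classified correctly?" per model
for name, clf in [("knn", KNeighborsClassifier()),
                  ("logreg", LogisticRegression(max_iter=1000))]:
    oracle[name] = clf.fit(Xtr, ytr).predict(Xte) == yte

disagree = (oracle["knn"] != oracle["logreg"]).mean()
both_wrong = (~oracle["knn"] & ~oracle["logreg"]).mean()
print(f"fraction where correctness patterns differ: {disagree:.3f}")
print(f"fraction where both models are wrong:       {both_wrong:.3f}")
```

The first number is the diversity an ensemble can exploit; only the points where both members fail are beyond repair by any combiner.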