2004
DOI: 10.1111/j.1468-0394.2004.00285.x

Neural network ensembles: combining multiple models for enhanced performance using a multistage approach

Abstract: Neural network ensembles (sometimes referred to as committees or classifier ensembles) are effective techniques to improve the generalization of a neural network system. Combining a set of neural network classifiers whose error distributions are diverse can generate better results than any single classifier. In this paper, some methods for creating ensembles are reviewed, including the following approaches: methods of selecting diverse training data from the original source data set, constructing different neu…
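As a rough illustration of the abstract's claim that combining classifiers with diverse error distributions beats any single member, the sketch below majority-votes nine simulated classifiers. It assumes fully independent errors and an arbitrary per-member accuracy of 0.7 — idealizations that real ensemble members only approximate:

```python
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_classifiers = 10_000, 9
true_labels = rng.integers(0, 2, size=n_samples)

# Each simulated classifier is correct with probability 0.7, and its
# errors are independent of the others (a stand-in for "diverse" members).
p_correct = 0.7
correct = rng.random((n_classifiers, n_samples)) < p_correct
predictions = np.where(correct, true_labels, 1 - true_labels)

# Combine by majority vote across the ensemble.
votes = predictions.sum(axis=0)
ensemble_pred = (votes > n_classifiers // 2).astype(int)

individual_acc = correct[0].mean()
ensemble_acc = (ensemble_pred == true_labels).mean()
print(f"single classifier accuracy ~ {individual_acc:.3f}")
print(f"9-member majority vote accuracy ~ {ensemble_acc:.3f}")
```

Under these assumptions the vote's accuracy follows a binomial tail, so nine 70%-accurate members reach roughly 90% together; correlated errors, the usual case in practice, shrink this gain.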

Citing publications: 2006–2024


Cited by 72 publications (38 citation statements)
References: 27 publications
“…The use of multiple neural systems was described in the work of Sharkey (1998). A recent review on neural network ensembles can be found in Yang and Browne (2001).…”
Section: Ensemble Neural Network
confidence: 99%
“…Unlike neural network ensembles and modular neural network approaches, cooperative modular neural networks can automatically decompose a problem and adaptively combine individual neural network models, so that a globally optimal solution of the original problem can be obtained. Reported results show that cooperative modular neural networks can be applied well to classification and pattern recognition (Auda and Kamel 1997a, b, 1998a, b, 1999; Zhang 2000; Lu and Ito 1999; Yang and Browne 2001; Oh and Suen 2002; Melin et al 2005; Fogelman-Soulie 1993; Hodge et al 1999; Kamel 1999; Alexandre et al 2001; Ozawa 1998; Islam et al 2003). In particular, over the recent decade, cooperative recurrent modular neural networks — a special class of cooperative modular neural networks — have been developed and well studied for constrained optimization (Rodríguez-Vázquez et al 1990; Glazos et al 1998; Zhang and Constantinides 1992; He and Sun 2001; Tao and Fang 2000; Xia and Wang 1995a, b, 2001a, b, 2005; Xia 1996a, b, 1997, 2003, 2004; Xia et al 2002a, b, 2004a, b, 2005, 2007; Wang et al 2000; Tan et al 2000; Anguita and Boni 2002; Zhang et al 2003; Feng 2004, 2006; Kamel 2007a, b, c, d, 2008; Tao et al 2001; Leung et al 2001).…”
Section: Introduction
confidence: 99%
“…Ensembles can reduce the variance of estimation errors and improve the overall classification accuracy. Many ensemble-based approaches have been proposed in recent research, including an ANN ensemble for decision support system (Ohlsson, 2004), an ensemble of ANNs for breast cancer and liver disorder prediction (Yang & Browne, 2004), MDSS with an ensemble of several different classifiers for breast diagnosis (West, Mangiameli, Rampal, & West, 2005), and multiple classifier combinations with an evolutionary approach (Kim, Min, & Han, 2006). …”
Section: Related Work
confidence: 99%
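The variance-reduction claim in the citation above can be sketched numerically: averaging K unbiased estimators with independent, equal-variance errors cuts the error variance by a factor of K. The member count and noise level below are arbitrary illustration values, not from the cited works:

```python
import numpy as np

rng = np.random.default_rng(1)

true_value = 2.0
n_members, n_trials = 10, 20_000

# Each member's estimate = truth + independent zero-mean Gaussian noise,
# mimicking diverse regressors with uncorrelated errors.
noise_std = 1.0
estimates = true_value + rng.normal(0.0, noise_std, size=(n_trials, n_members))

single_var = estimates[:, 0].var()          # variance of one member's error
ensemble_var = estimates.mean(axis=1).var()  # variance of the 10-member average

print(f"single-member error variance  ~ {single_var:.3f}")
print(f"10-member average's variance ~ {ensemble_var:.3f}")
```

With correlated member errors the reduction is weaker, which is why ensemble construction methods emphasize diversity among members.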
“…That is, neural networks have the ability to provide flexible mapping between inputs and outputs. Secondly, neural networks are far from being optimal classifiers [21]. Many experimental results have shown that the generalization ability of individual networks is not unique.…”
Section: Introduction
confidence: 99%