2012
DOI: 10.1016/j.ins.2011.06.023

Supervised subspace projections for constructing ensembles of classifiers

Cited by 40 publications (9 citation statements)
References 32 publications
“…One approach to dealing with complex, real-world problems is to combine AI prediction models into an ensemble of predictors that exploits the different local behaviors of the base models to improve the overall prediction system's performance (Masoudnia et al, 2012). The main objective of ensemble learning methods is to simplify a difficult prediction task by dividing it into relatively easy prediction subtasks and formulating a consensus prediction result for the original data (García-Pedrajas et al, 2012). From another perspective, ensemble learning is an approach to enhancing prediction accuracy for complex problems, such as those involving a limited number of instances, high-dimensional feature sets, and highly complex trends and behaviors (Kotsiantis, 2011).…”
Section: Introduction and Literature Review
confidence: 99%
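The divide-and-combine idea in the passage above can be sketched in a few lines of Python. The base models, data, and majority-vote combination rule here are illustrative assumptions, not taken from the cited papers: each "model" looks at one feature, and the ensemble returns the most common prediction.

```python
from collections import Counter

def majority_vote(models, x):
    """Combine base-model predictions into a consensus label."""
    votes = Counter(m(x) for m in models)
    return votes.most_common(1)[0][0]

# Three hypothetical base classifiers, each thresholding one feature.
models = [
    lambda x: 1 if x[0] > 0.5 else 0,
    lambda x: 1 if x[1] > 0.5 else 0,
    lambda x: 1 if x[2] > 0.5 else 0,
]

print(majority_vote(models, [0.9, 0.8, 0.1]))  # two of three vote 1 -> 1
```

Even when individual base models err on part of the input, the consensus can be correct as long as a majority of them behave well locally, which is the intuition behind exploiting their "different local behaviors".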
“…One of the causes of boosting failure is putting too much emphasis on correctly classifying all instances. Outliers, or noisy instances, become too relevant in the training set, undermining the ensemble's performance (García-Pedrajas et al, 2012).…”
Section: Introduction and Literature Review
confidence: 99%
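The failure mode described above can be illustrated numerically. In an AdaBoost-style reweighting scheme, an instance that every round's weak learner misclassifies (e.g. a noisy outlier) has its weight multiplied by exp(alpha) each round while correctly classified instances are down-weighted, so it quickly dominates the training distribution. The instance count, number of rounds, and alpha below are hypothetical, chosen only to show the effect.

```python
import math

weights = [1.0 / 5] * 5   # five training instances, uniform start
alpha = 0.5               # hypothetical weak-learner weight
outlier = 0               # index of the always-misclassified instance

for _ in range(10):                          # ten boosting rounds
    weights[outlier] *= math.exp(alpha)      # misclassified: weight grows
    for i in range(1, 5):
        weights[i] *= math.exp(-alpha)       # correct: weight shrinks
    total = sum(weights)
    weights = [w / total for w in weights]   # renormalize to a distribution

print(weights[outlier])  # close to 1: the outlier dominates training
```

After ten rounds the single noisy instance carries nearly all of the training weight, which is exactly how an outlier can undermine the ensemble.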
“…Wei et al have proposed an ensemble approach that combines the predictions made by Positive Naïve Bayes with a classifier based on positive-example learning for the unlabeled examples [5]. García-Pedrajas et al have used supervised projections of random subspaces to construct an ensemble, which adopts the philosophy of boosting to generate supervised projections based on the misclassified samples, and then trains on all available samples in the space given by the supervised projections [6]. Bock et al have adopted a nonparametric statistical technique, the generalized additive model (GAM), as the ensemble component and proposed three GAM ensemble classifiers for binary classification based on bagging, the random subspace method, and a combination of both [7].…”
Section: Literature Review
confidence: 99%
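The random-subspace ingredient mentioned above, in which each base learner is trained on a random subset of the features, can be sketched as follows. The feature counts and the plain index-selection projection are illustrative assumptions; they are not the supervised projections of [6], which are driven by misclassified samples.

```python
import random

random.seed(0)  # reproducible subspace draws for this sketch

def draw_subspace(n_features, k):
    """Pick k distinct feature indices for one base learner."""
    return sorted(random.sample(range(n_features), k))

def project(x, idx):
    """Restrict a feature vector to the chosen subspace."""
    return [x[i] for i in idx]

n_features, k, n_models = 10, 4, 5
subspaces = [draw_subspace(n_features, k) for _ in range(n_models)]

x = list(range(n_features))               # a toy feature vector
views = [project(x, s) for s in subspaces]
print([len(v) for v in views])            # each base learner sees 4 of 10 features
```

Each base learner would then be trained on its own projected view, and their predictions combined as in bagging; the diversity comes from the differing subspaces.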
“…Face recognition [1][2][3][4][5][6] has been an active research area in the computer vision and pattern recognition communities in recent decades. Since two-dimensional face images are usually transformed into one-dimensional vectors via column-by-column or row-by-row concatenation in face recognition, the original input-image space has a very high dimension, and a dimensionality reduction technique is usually employed to solve the high-dimensionality problem.…”
Section: Introduction
confidence: 99%
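The column-by-column concatenation described above can be shown on a toy 3×2 "image" (the values are made up). A real 100×100 face image flattened this way becomes a 10,000-dimensional vector, which is why dimensionality reduction is needed before classification.

```python
image = [[1, 2],
         [3, 4],
         [5, 6]]   # 3 rows x 2 columns

rows, cols = 3, 2

# Column-by-column concatenation: stack column 0, then column 1.
vector = [image[r][c] for c in range(cols) for r in range(rows)]
print(vector)   # [1, 3, 5, 2, 4, 6]
```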