2010
DOI: 10.1007/978-3-642-12101-2_35

Comparison of Bagging, Boosting and Stacking Ensembles Applied to Real Estate Appraisal

Cited by 103 publications (61 citation statements)
References 20 publications
“…In comparison with other ensemble methods, the bagging ensemble gives more consistent results [26]. The stacking ensemble method, for example, returns highly variable results, in some cases increasing the prediction accuracy and in others giving an inaccurate prediction even if all models in the ensemble have high accuracy.…”
Section: Ensemble Methods (supporting)
confidence: 64%
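The consistency claim above can be checked empirically. The following is a minimal sketch, assuming scikit-learn is available (the source does not name any library): bagging aggregates bootstrap-trained trees, which typically scores better and more consistently across folds than a single high-variance tree.

```python
# Sketch: compare a single decision tree with a bagged ensemble of trees.
# scikit-learn and the synthetic dataset are assumptions for illustration.
from sklearn.datasets import make_regression
from sklearn.ensemble import BaggingRegressor
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=300, n_features=10, noise=10.0, random_state=0)

single = DecisionTreeRegressor(random_state=0)
bagged = BaggingRegressor(DecisionTreeRegressor(), n_estimators=50, random_state=0)

# Cross-validated R^2 scores; bagging averages many bootstrap-trained
# trees, which tends to reduce the variance of the predictions.
single_scores = cross_val_score(single, X, y, cv=5, scoring="r2")
bagged_scores = cross_val_score(bagged, X, y, cv=5, scoring="r2")
```

On noisy data like this, the bagged ensemble usually achieves a higher mean score with less spread across folds, matching the "more consistent" observation in the quoted statement.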
“…The main idea of ensemble methods is to obtain a more precise prediction by means of a vote of diverse models. This has been studied in [25,26], where it was concluded that applying an ensemble method yields a more general prediction model, though not in all applications.…”
Section: Ensemble Methods (mentioning)
confidence: 99%
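The voting idea described in that statement can be sketched as follows, assuming scikit-learn (not named in the source); three diverse base models each predict a class, and the majority vote becomes the ensemble's prediction.

```python
# Sketch of majority voting over diverse models (scikit-learn assumed).
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

vote = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("nb", GaussianNB()),
        ("dt", DecisionTreeClassifier(random_state=0)),
    ],
    voting="hard",  # majority vote over the predicted class labels
)
score = cross_val_score(vote, X, y, cv=5).mean()
```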
“…The misclassified examples are marked and their weights increased so that they have a higher probability of appearing in the training set of the next predictor. This results in different machines becoming specialized in predicting different areas of the dataset [8].…”
Section: Boosting (mentioning)
confidence: 99%
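The reweighting step described above can be sketched with an AdaBoost-style loop (a hypothetical illustration assuming scikit-learn and NumPy; the source does not give code): after each round, misclassified examples receive larger weights, so the next learner focuses on them.

```python
# Illustrative AdaBoost-style reweighting: boost the weights of
# misclassified examples after each round (libraries are assumptions).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, random_state=0)
w = np.full(len(y), 1.0 / len(y))  # start with uniform example weights

for _ in range(3):
    stump = DecisionTreeClassifier(max_depth=1, random_state=0)
    stump.fit(X, y, sample_weight=w)        # train on current weighting
    miss = stump.predict(X) != y            # mark misclassified examples
    err = w[miss].sum()                     # weighted error of this round
    alpha = 0.5 * np.log((1 - err) / err)   # this learner's vote weight
    w *= np.exp(alpha * np.where(miss, 1.0, -1.0))  # raise misses, lower hits
    w /= w.sum()                            # renormalize to a distribution
```

After a few rounds the weight distribution is far from uniform, which is exactly what makes successive learners specialize in the hard regions of the data.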
“…In the new dataset, each instance is related to the real value that it is supposed to predict. That dataset is then used by the stacking model learner (level-1) to provide the final output [8]. For example, the predicted classifications from three base classifiers, naïve Bayes, decision tree and rule induction, can be used as input variables to a nearest-neighbour classifier acting as the stacking model learner, which attempts to learn from the data how to combine the predictions of the different models to achieve the best classification accuracy.…”
Section: Stacking (mentioning)
confidence: 99%
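That scheme can be sketched with scikit-learn (an assumption; the "rule induction" learner named in the statement has no direct scikit-learn counterpart, so logistic regression stands in for it here). The base models' out-of-fold predictions form the level-1 dataset, which a nearest-neighbour meta-learner uses to combine them.

```python
# Sketch of stacking: base-model predictions become the level-1 input
# features for a nearest-neighbour meta-learner (scikit-learn assumed).
from sklearn.datasets import load_iris
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

stack = StackingClassifier(
    estimators=[
        ("nb", GaussianNB()),
        ("dt", DecisionTreeClassifier(random_state=0)),
        ("lr", LogisticRegression(max_iter=1000)),  # stand-in for rule induction
    ],
    final_estimator=KNeighborsClassifier(),  # level-1 (meta) learner
    cv=5,  # out-of-fold base predictions build the level-1 training set
)
score = cross_val_score(stack, X, y, cv=5).mean()
```

The `cv` argument matters: training the meta-learner on out-of-fold predictions rather than in-sample ones is what keeps the level-1 dataset honest about each base model's real accuracy.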
“…genetic fuzzy systems and artificial neural networks, both as single models [7] and as ensembles built using various resampling techniques [8], [9], [10], [11], [12], [13]. Especially good performance was shown by evolving fuzzy models applied to cadastral data [14], [15].…”
Section: Introduction (mentioning)
confidence: 99%