2006
DOI: 10.4304/jcp.1.4.30-37

Local Boosting of Decision Stumps for Regression and Classification Problems

Abstract: Numerous data mining problems involve an investigation of associations between features in heterogeneous datasets, where different prediction models can be more suitable for different regions. We propose a technique of boosting localized weak learners; rather than having constant weights attached to each learner (as in standard…
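The abstract's idea of input-dependent learner weights can be made concrete with a short sketch. The code below is illustrative only, not the authors' method (the abstract is truncated before the mechanism is described): it assumes a binary classification setting, AdaBoost-style stump fitting, and a Gaussian kernel over each stump's "competence region" (the training points it classified correctly); all names and the bandwidth parameter are hypothetical.

```python
import numpy as np

class Stump:
    """One-level decision tree (decision stump) for labels in {-1, +1}."""

    def fit(self, X, y, w):
        best_err = np.inf
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err = err
                        self.j, self.thr, self.sign = j, thr, sign
        return self

    def predict(self, X):
        return self.sign * np.where(X[:, self.j] <= self.thr, 1, -1)

def local_boost(X, y, T=10):
    """AdaBoost-style loop; each stump also stores the training points it
    classified correctly, used later as its 'competence region'."""
    w = np.full(len(y), 1.0 / len(y))
    ensemble = []
    for _ in range(T):
        stump = Stump().fit(X, y, w)
        pred = stump.predict(X)
        err = max(w[pred != y].sum(), 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)
        ensemble.append((stump, alpha, X[pred == y]))
        w *= np.exp(-alpha * y * pred)  # usual exponential reweighting
        w /= w.sum()
    return ensemble

def predict_local(ensemble, x, bandwidth=1.0):
    """Modulate each stump's global weight alpha by a Gaussian kernel on the
    distance from query x to the stump's competence region."""
    score = 0.0
    for stump, alpha, region in ensemble:
        d2 = np.min(np.sum((region - x) ** 2, axis=1))
        score += alpha * np.exp(-d2 / (2 * bandwidth ** 2)) * stump.predict(x[None, :])[0]
    return 1 if score >= 0 else -1
```

Note that a very large bandwidth recovers plain global boosting, since the kernel then weights every stump almost uniformly and only the alpha values matter.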

Cited by 79 publications (70 citation statements)
References 18 publications
“…This classifier is available in the MOA framework [17], supporting massive data streams and several modeling scenarios (statistics, clustering, and classification). f) Decision stump [52]: a one-level decision tree whose split can be updated online by maintaining the appropriate counts for the information criterion; this classifier is also available in the MOA tool. g) IB1 [45], [53]: an instance-based classifier [54] whose reference base keeps growing: for each classification query the whole history is available, rather than single snapshots of the data stream.…”
Section: Discussion
confidence: 99%
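The "updatable by appropriate counts" remark in the excerpt above can be illustrated with a toy streaming stump. This is a hypothetical sketch, not MOA's actual implementation: it fixes a single nominal split feature and only maintains the per-branch class counts that an information criterion would consume.

```python
from collections import Counter, defaultdict

class StreamingStump:
    """Toy updatable stump over one nominal feature: prediction is the
    majority class of the branch matching the feature value."""

    def __init__(self):
        self.counts = defaultdict(Counter)  # branch -> class-label counts
        self.overall = Counter()            # global class counts (fallback)

    def update(self, feature_value, label):
        """Fold one stream instance into the sufficient statistics."""
        self.counts[feature_value][label] += 1
        self.overall[label] += 1

    def predict(self, feature_value):
        """Majority class of the branch; global majority for unseen values."""
        branch = self.counts.get(feature_value)
        source = branch if branch else self.overall
        return source.most_common(1)[0][0] if source else None
```

A full streaming stump would additionally track such counts per candidate feature and re-select the split by an information criterion such as information gain.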
“…Discretization means partitioning a given interval into a set of discrete sub-intervals, splitting a continuous range into two or more sub-intervals that can then be treated as nominal values. A good discretization method should balance the information loss against the number of partitions [36]. Depending on whether or not they use target-variable information, discretization methods are typically organized into two groups: supervised and unsupervised discretization, respectively.…”
Section: Applications In General Sciences
confidence: 99%
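A minimal sketch of the two discretization families mentioned in this excerpt: equal-width binning as the unsupervised case, and a single entropy-minimizing cut point as an illustrative supervised case. Both functions are generic illustrations, not the specific methods of the cited works.

```python
import numpy as np

def equal_width_bins(values, n_bins):
    """Unsupervised: partition the value range into n_bins equal sub-intervals."""
    edges = np.linspace(values.min(), values.max(), n_bins + 1)
    # np.digitize maps each value to the index of its sub-interval (0..n_bins-1).
    return np.digitize(values, edges[1:-1])

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def best_supervised_split(values, labels):
    """Supervised: pick the cut point that minimizes the weighted class
    entropy of the two resulting sub-intervals."""
    order = np.argsort(values)
    v, y = values[order], labels[order]
    best, best_cut = np.inf, None
    for i in range(1, len(v)):
        if v[i] == v[i - 1]:
            continue  # only cut between distinct values
        left, right = y[:i], y[i:]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
        if score < best:
            best, best_cut = score, (v[i - 1] + v[i]) / 2
    return best_cut
```

Applied recursively to each resulting sub-interval, the supervised split yields a multi-interval discretization; stopping early keeps the number of partitions small, which is exactly the information-loss trade-off the excerpt mentions.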
“…Thus, k1 was set to 5 and k2 to 50, since with about this number of instances a simple algorithm can build a precise model [14]. The proposed method is presented in Algorithm 1.…”
Section: The Proposed Algorithm
confidence: 99%