2017
DOI: 10.1007/978-981-10-6620-7_26

Classification of E-commerce Products Using RepTree and K-means Hybrid Approach

Cited by 8 publications (7 citation statements)
References 6 publications
“…We chose J48 [28] and Logistic Model Tree (LMT) [29] from the tree-based family, Decision Table [30], JRip [31], and OneR [32] from rule-based family, IBK [33] and KStar [34] from lazy learner family, Bagging [35] and LogitBoost [36] from metaheuristic family, and Bayes Net [37] from Bayesian family. Rep Tree [38] and Regression are used as base classifiers for Bagging and LogitBoost algorithms.…”
Section: Classification Algorithms
confidence: 99%
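The classifier line-up in the statement above corresponds to the Weka workbench. Below is a minimal sketch, assuming the Weka 3 Java API, of how a tree learner and Bagging with a REPTree base classifier could be instantiated and cross-validated; the file name products.arff and the evaluation settings are illustrative assumptions, not details from the cited study.

```java
// Minimal sketch assuming the Weka 3 Java API (weka.jar on the classpath).
// "products.arff" is a placeholder dataset path.
import java.util.Random;
import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.classifiers.meta.Bagging;
import weka.classifiers.trees.J48;
import weka.classifiers.trees.REPTree;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class ClassifierSetup {
    public static void main(String[] args) throws Exception {
        // Load the data and mark the last attribute as the class label.
        Instances data = DataSource.read("products.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // A tree-based learner used on its own.
        J48 j48 = new J48();

        // Bagging with REPTree as the base classifier, as in the cited passage
        // (LogitBoost would be configured analogously with its own base learner).
        Bagging bagging = new Bagging();
        bagging.setClassifier(new REPTree());

        // 10-fold cross-validation for each classifier.
        for (Classifier clf : new Classifier[]{j48, bagging}) {
            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(clf, data, 10, new Random(1));
            System.out.println(clf.getClass().getSimpleName()
                    + " accuracy: " + eval.pctCorrect() + "%");
        }
    }
}
```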
“…We have considered four well-known data mining algorithms in this article. These four algorithms include MLP, naïve Bayes, SMO, and REPTree [54]. We used different random sizes of data.…”
Section: Case Study
confidence: 99%
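As a rough illustration of comparing MLP, naïve Bayes, SMO, and REPTree on random subsets of different sizes, the following sketch again assumes the Weka 3 Java API; the dataset name, sample percentages, and evaluation protocol are assumptions, not details from the cited article.

```java
// Hedged sketch assuming the Weka 3 Java API; file name and sample
// fractions are illustrative, not taken from the cited study.
import java.util.Random;
import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.classifiers.bayes.NaiveBayes;
import weka.classifiers.functions.MultilayerPerceptron;
import weka.classifiers.functions.SMO;
import weka.classifiers.trees.REPTree;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.Filter;
import weka.filters.unsupervised.instance.Resample;

public class SampleSizeComparison {
    public static void main(String[] args) throws Exception {
        Instances full = DataSource.read("dataset.arff");
        full.setClassIndex(full.numAttributes() - 1);

        Classifier[] algorithms = {
            new MultilayerPerceptron(), new NaiveBayes(), new SMO(), new REPTree()
        };

        // Evaluate every algorithm on random subsets of different sizes.
        for (double pct : new double[]{25, 50, 75, 100}) {
            Resample resample = new Resample();
            resample.setSampleSizePercent(pct);
            resample.setNoReplacement(true);
            resample.setInputFormat(full);
            Instances subset = Filter.useFilter(full, resample);

            for (Classifier clf : algorithms) {
                Evaluation eval = new Evaluation(subset);
                eval.crossValidateModel(clf, subset, 10, new Random(1));
                System.out.printf("%5.0f%% %-22s accuracy=%.2f%%%n",
                        pct, clf.getClass().getSimpleName(), eval.pctCorrect());
            }
        }
    }
}
```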
“…In the pre-pruning process, if the information gain from a split is not significant, the expansion of the tree is terminated [70]. Extreme gradient boosting (XGBoost) is an optimized distributed gradient boosting algorithm with high efficiency and flexibility.…”
Section: Liblinear
confidence: 99%
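The pre-pruning rule described above can be made concrete with a small, self-contained example: stop expanding a node when the information gain of its best split falls below a threshold. The class counts and the threshold below are hypothetical, and the code is a generic illustration rather than the REPTree or XGBoost implementation.

```java
// Generic pre-pruning check: compute the information gain of a candidate
// split and stop growing the tree if it falls below a threshold.
// Class counts and the threshold are hypothetical example values.
public class PrePruning {
    // Shannon entropy (in bits) of a class-count distribution.
    static double entropy(int[] counts) {
        int total = 0;
        for (int c : counts) total += c;
        double h = 0.0;
        for (int c : counts) {
            if (c == 0) continue;
            double p = (double) c / total;
            h -= p * (Math.log(p) / Math.log(2));
        }
        return h;
    }

    // Information gain of splitting a parent node into the given children.
    static double informationGain(int[] parent, int[][] children) {
        int total = 0;
        for (int c : parent) total += c;
        double weighted = 0.0;
        for (int[] child : children) {
            int n = 0;
            for (int c : child) n += c;
            weighted += (double) n / total * entropy(child);
        }
        return entropy(parent) - weighted;
    }

    public static void main(String[] args) {
        int[] parent = {30, 30};             // 60 instances, two classes
        int[][] split = {{25, 5}, {5, 25}};  // candidate binary split
        double gain = informationGain(parent, split);
        double minGain = 0.01;               // pre-pruning threshold
        if (gain < minGain) {
            System.out.println("Gain " + gain + " below threshold: stop expanding.");
        } else {
            System.out.println("Gain " + gain + ": expand the node.");
        }
    }
}
```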