2015 International Conference on Innovations in Information, Embedded and Communication Systems (ICIIECS)
DOI: 10.1109/iciiecs.2015.7193250

Ensemble learning approach in improved K Nearest Neighbor algorithm for Text categorization

Abstract: Due to the tremendous growth of digital content on the World Wide Web (WWW), text categorization has become an important tool for managing and organizing text-related data. This paper proposes an Ensemble Learning approach in an Improved K Nearest Neighbor algorithm for Text Categorization (EINNTC), which consists of single-pass clustering, ensemble learning, and the KNN algorithm. The EINNTC method addresses the issues of the traditional KNN classifier by reducing the heavy text similarity computation and avoids an im…
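To make the single-pass clustering step concrete, the sketch below is a minimal illustration rather than the authors' implementation: each TF-IDF document vector is either assigned to the first existing cluster whose centroid is sufficiently similar or starts a new cluster, so a test document can later be compared against cluster centroids instead of every training document. The function name, the use of cosine similarity, and the threshold value are assumptions for illustration only.

```python
import numpy as np

def single_pass_cluster(doc_vectors, threshold=0.5):
    """Single-pass clustering sketch: assign each vector to the first cluster
    whose centroid has cosine similarity above `threshold`, else open a new one."""
    centroids, members = [], []
    for i, v in enumerate(doc_vectors):
        v = np.asarray(v, dtype=float)
        v = v / (np.linalg.norm(v) + 1e-12)          # normalize so dot product = cosine
        best, best_sim = -1, threshold
        for c, centroid in enumerate(centroids):
            sim = float(v @ centroid) / (np.linalg.norm(centroid) + 1e-12)
            if sim > best_sim:
                best, best_sim = c, sim
        if best == -1:
            centroids.append(v.copy())
            members.append([i])
        else:
            members[best].append(i)
            # running mean keeps the centroid representative of its members
            centroids[best] += (v - centroids[best]) / len(members[best])
    return centroids, members

# Example with toy 3-dimensional "document vectors": the first two documents
# fall into one cluster, the third starts its own.
vecs = [[1, 0, 0], [0.9, 0.1, 0], [0, 1, 0]]
centroids, members = single_pass_cluster(vecs, threshold=0.8)
print(members)  # [[0, 1], [2]]
```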

Cited by 5 publications (2 citation statements)
References 18 publications
“…It identifies the k nearest neighbors of a test document in the training set and assesses class candidates according to their classes. Iswarya and Radha [16] suggested an Ensemble learning strategy for the Improved KNN method for text categorization (EINNTC), which uses one-pass clustering to reduce similarity calculation time and minimize noisy samples. In the first stage, a classification model is developed and updated, and in the second stage, ensemble learning is used to determine the ideal value of the parameter K. In terms of F1 score, the results demonstrate that EINNTC surpasses SVM and conventional KNN.…”
Section: K-Nearest Neighbors (KNN)
confidence: 99%
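As a rough illustration of the second stage described above, the sketch below picks the parameter K by scoring several candidate KNN classifiers with cross-validation. The candidate values, TF-IDF features, cosine metric, and macro-F1 scoring are illustrative assumptions, not necessarily the exact ensemble procedure used in EINNTC.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def select_k(train_texts, train_labels, candidates=(1, 3, 5, 7, 9)):
    """Return the candidate K with the best cross-validated macro-F1 score."""
    # TF-IDF features with a cosine metric, a common setup for KNN text categorization
    X = TfidfVectorizer().fit_transform(train_texts)
    scores = {}
    for k in candidates:
        clf = KNeighborsClassifier(n_neighbors=k, metric="cosine")
        # macro-averaged F1 mirrors the F1-based comparison reported for EINNTC
        scores[k] = np.mean(
            cross_val_score(clf, X, train_labels, cv=3, scoring="f1_macro")
        )
    # keep the K that scores best; ties resolve to the first candidate found
    return max(scores, key=scores.get), scores
```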
“…If the user requires an absolute response, then the document category that represents the most neighbours is returned. [9][10] However, if the user requests a fuzzy response, the percentage for each category is returned.…”
Section: Implementing a Text Classifier
confidence: 99%
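The absolute versus fuzzy response distinction above can be sketched with a small helper; the function name and the example category labels are hypothetical. An absolute response returns the single majority category among the k neighbours, while a fuzzy response returns each category's share of the neighbourhood.

```python
from collections import Counter

def knn_response(neighbor_labels, fuzzy=False):
    """Absolute mode: return the majority category among the k neighbors.
    Fuzzy mode: return the fraction of neighbors belonging to each category."""
    counts = Counter(neighbor_labels)
    if fuzzy:
        total = sum(counts.values())
        return {label: n / total for label, n in counts.items()}
    return counts.most_common(1)[0][0]

# Example: labels of the k = 3 nearest neighbors of a test document
print(knn_response(["sports", "sports", "politics"]))             # 'sports'
print(knn_response(["sports", "sports", "politics"], fuzzy=True)) # {'sports': 0.66..., 'politics': 0.33...}
```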