2016 Eighth International Conference on Information and Knowledge Technology (IKT)
DOI: 10.1109/ikt.2016.7777760
Optimization of the Ho-Kashyap classification algorithm using appropriate learning samples

Cited by 9 publications (3 citation statements) · References 9 publications
“…After feature selection, there would be binary classification (Dezfoulian et al., 2016) for fire / no-fire data, which leads to a labeling task for classification. Data is divided 70% / 30% between the training and testing stages.…”
Section: Fig. 8 LPQ Algorithm Workflow
confidence: 99%
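The 70% / 30% train/test protocol described in the statement above can be sketched as follows (a minimal illustration; the function and variable names are assumptions, not from the cited work):

```python
# Hedged sketch: a seeded 70/30 train/test split of labeled samples,
# as in the citing work's evaluation protocol. Names are illustrative.
import random

def train_test_split(samples, labels, train_frac=0.7, seed=0):
    """Shuffle paired samples/labels, then split into train and test sets."""
    idx = list(range(len(samples)))
    random.Random(seed).shuffle(idx)          # deterministic shuffle
    cut = int(len(idx) * train_frac)          # 70% boundary
    train = [(samples[i], labels[i]) for i in idx[:cut]]
    test = [(samples[i], labels[i]) for i in idx[cut:]]
    return train, test

# Toy binary (fire / no-fire style) labels on 10 samples.
X = [[float(i)] for i in range(10)]
y = [0] * 5 + [1] * 5
train, test = train_test_split(X, y)
print(len(train), len(test))  # prints: 7 3
```

Seeding the shuffle keeps the partition reproducible across runs, which matters when comparing classifiers on the same split.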
“…The authors used the Multi Class Instance Selection (MCIS) algorithm by Chen, Zhang, Xue, and Liu (2013) to obtain the most valuable data from each class and speed up the Widrow-Hoff classification algorithm by Steinbuch and Widrow (1965). A similar method is presented in Dezfoulian, MiriNezhad, Mousavi, Mosleh, and Shalchi (2016). In this case, the authors used MCIS in the first step and the Ho-Kashyap algorithm by Ho and Kashyap (1965) in the classification step.…”
Section: Literature Review
confidence: 99%
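The Ho-Kashyap step referenced above (Ho and Kashyap, 1965) jointly adjusts a weight vector and a positive margin vector until the training data is linearly separated. A minimal sketch, with illustrative names and a toy dataset that are not from the paper:

```python
# Hedged sketch of the classic Ho-Kashyap procedure: find a and b > 0 with
# Y a ≈ b, where rows of Y are augmented samples (class 2 negated).
import numpy as np

def ho_kashyap(X1, X2, rho=0.5, n_iter=500, tol=1e-6):
    """Return (a, b): weight and margin vectors separating X1 from X2."""
    Y1 = np.hstack([X1, np.ones((len(X1), 1))])   # class 1, augmented with bias
    Y2 = -np.hstack([X2, np.ones((len(X2), 1))])  # class 2, sign-flipped
    Y = np.vstack([Y1, Y2])
    Yp = np.linalg.pinv(Y)                        # pseudoinverse, computed once
    b = np.ones(len(Y))                           # start with unit margins
    a = Yp @ b                                    # least-squares weights
    for _ in range(n_iter):
        e = Y @ a - b                             # per-sample error
        b = b + rho * (e + np.abs(e))             # raise margins, never lower them
        a = Yp @ b
        if np.all(np.abs(e) < tol):
            break
    return a, b

# Toy separable data: class 1 near (2, 2), class 2 near (-2, -2).
X1 = np.array([[2.0, 2.0], [3.0, 2.5], [2.5, 3.0]])
X2 = np.array([[-2.0, -2.0], [-3.0, -2.5], [-2.5, -3.0]])
a, b = ho_kashyap(X1, X2)
# For separable data, every row of Y a is positive, i.e. a classifies both classes.
```

The `b + rho * (e + |e|)` update only increases margin components where the error is positive, which is what guarantees convergence on separable data.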
“…Mousavi, MiriNezhad, and Mirmoini (2017) proposed a system using K-means clustering and triangular calculations to find support vectors. Mirinezhad, Dezfoulian, Mosleh, and Mousavi (2016) combined MCIS clustering and Widrow-Hoff classification algorithms to obtain the closest samples from each class and speed up the classification process. Table 1 provides a summary of the strengths and weaknesses of our proposed approach versus these other approaches.…”
Section: Literature Review
confidence: 99%
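The Widrow-Hoff (LMS) rule mentioned in both statements nudges a linear unit's weights toward the targets, w ← w + η(t − w·x)x. A minimal sketch under assumed names and toy data (not from the cited papers):

```python
# Hedged sketch of the Widrow-Hoff / LMS learning rule. Each pass nudges the
# weights proportionally to the residual error on one sample at a time.
def widrow_hoff(samples, targets, eta=0.05, epochs=50):
    """Train a linear unit on (sample, target) pairs; samples carry a bias input."""
    w = [0.0] * len(samples[0])
    for _ in range(epochs):
        for x, t in zip(samples, targets):
            out = sum(wi * xi for wi, xi in zip(w, x))   # linear output w·x
            err = t - out                                # residual error
            w = [wi + eta * err * xi for wi, xi in zip(w, x)]
    return w

# Toy binary task with targets +1 / -1; the last component is the bias input.
X = [[2.0, 1.0], [3.0, 1.0], [-2.0, 1.0], [-3.0, 1.0]]
t = [1.0, 1.0, -1.0, -1.0]
w = widrow_hoff(X, t)
# On this separable toy set, sign(w·x) matches the targets.
```

Because every training sample costs one update per epoch, pre-selecting the most valuable samples (as MCIS does in the cited works) directly shortens training.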