2017
DOI: 10.1016/j.patcog.2017.02.011

Weighted linear loss multiple birth support vector machine based on information granulation for multi-class classification

Cited by 72 publications (20 citation statements)
References 45 publications
“…Thus, parameter selection is a practical problem and should be investigated in the future. Moreover, the extension of the proposed EFTBSVM to multi-class [66]-[68], multi-label [69], [70] and multi-view [71], [72] classification problems are also interesting.…”
Section: Discussion
confidence: 99%
“…It should be pointed out that there are many parameters in our WLPTSVM, so parameter selection is a practical problem and needs to be investigated in the future. In addition, the extension of our WLPTSVM to multiclass classification [46]-[48], multi-label classification [49] and feature selection problems [50], [51] are also interesting. Furthermore, how to use our WLPTSVM to deal with the large-scale classification problems in real world is also under our consideration.…”
Section: Discussion
confidence: 99%
“…Support Vector Machines were introduced [16] as a technique aimed to solve binary classification problems; due to their solid theoretical fundaments, SVMs have been used to answer regression, clustering and multi-classification tasks [17] along with practical applications in several fields, including computer vision [18], text classification [19], and natural language processing [20], among others. It can be defined as a discriminative classifier which works with labelled training data to output an optimal hyperplane used to categorize new examples.…”
Section: Support Vector Machines
confidence: 99%
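The quoted passage describes an SVM as a discriminative classifier that learns a separating hyperplane from labelled training data and uses it to categorize new examples. A minimal illustrative sketch of that idea, not taken from the indexed paper and using scikit-learn with toy data chosen here as an assumption:

```python
# Illustrative sketch only: fit a linear SVM on labelled toy data and
# read off the separating hyperplane w·x + b = 0 it produces.
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Two linearly separable classes of labelled training data (synthetic).
X, y = make_blobs(n_samples=200, centers=2, random_state=0)

# Train a linear SVM; C controls the margin/slack trade-off.
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

w, b = clf.coef_[0], clf.intercept_[0]
print("hyperplane normal w =", w, "bias b =", b)

# A new example is categorized by the side of the hyperplane it falls on.
print("predicted label for [0, 0]:", clf.predict([[0.0, 0.0]])[0])
```

This sketch covers only the binary case the quote starts from; the indexed paper's multiple birth SVM extends the idea to multi-class problems.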