2017
DOI: 10.1002/tee.22479
A geometry‐based two‐step method for nonlinear classification using quasi‐linear support vector machine

Abstract: This paper proposes a two‐step method to construct a nonlinear classifier consisting of multiple local linear classifiers interpolated with a basis function. In the first step, a geometry‐based approach is first introduced to detect local linear partitions and build local linear classifiers. A coarse nonlinear classifier can then be constructed by interpolating the local linear classifiers. In the second step, a support vector machine (SVM) formulation is used to further implicitly optimize the linear paramete…
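The classifier described in the abstract, multiple local linear classifiers interpolated with a basis function, can be sketched as follows. This is an illustrative reading using Gaussian basis functions and hand-picked toy parameters, not the paper's exact formulation:

```python
import math

def gate(x, mu, sigma):
    """Gaussian basis weight of the partition centered at mu with width sigma."""
    d2 = sum((xi - mi) ** 2 for xi, mi in zip(x, mu))
    return math.exp(-d2 / (2.0 * sigma ** 2))

def quasi_linear_decision(x, partitions):
    """Interpolate local linear classifiers f_j(x) = w_j . x + b_j
    with normalized basis weights g_j(x)."""
    weights = [gate(x, p["mu"], p["sigma"]) for p in partitions]
    total = sum(weights)
    score = 0.0
    for g, p in zip(weights, partitions):
        local = sum(wi * xi for wi, xi in zip(p["w"], x)) + p["b"]
        score += (g / total) * local
    return score

# Toy setup: two local partitions with hypothetical, hand-picked parameters.
parts = [
    {"mu": (0.0, 0.0), "sigma": 1.0, "w": (1.0, 0.0), "b": -0.5},
    {"mu": (3.0, 0.0), "sigma": 1.0, "w": (-1.0, 0.0), "b": 2.5},
]
score = quasi_linear_decision((0.0, 0.0), parts)  # negative: the nearby first local model dominates
```

Near a partition center the corresponding local linear classifier dominates; between centers the basis weights blend the local decisions smoothly, yielding a coarse nonlinear boundary.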

Cited by 5 publications (9 citation statements)
References 26 publications
“…Before applying the SVM optimization, we need to determine the parameters μ_j and σ_j. For this purpose, we first introduce a modification to the geometry‐based partitioning algorithm for imbalanced datasets and apply it to divide the input space of the training data into M linearly separable local partitions along the separation boundary.…”
Section: Quasi‐linear SVM classifier
confidence: 99%
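One plausible way to determine μ_j and σ_j from a detected partition is to take its centroid as the center and its mean point-to-centroid distance as the width. The citing paper does not spell out its exact rule in this excerpt, so the following is an assumption:

```python
import math

def partition_params(points):
    """Estimate the basis-function center mu and width sigma for one local
    partition as its centroid and mean point-to-centroid distance.
    One plausible choice; the paper's exact rule may differ."""
    n = len(points)
    dim = len(points[0])
    mu = tuple(sum(p[d] for p in points) / n for d in range(dim))
    sigma = sum(math.dist(p, mu) for p in points) / n or 1.0  # guard degenerate partitions
    return mu, sigma

pts = [(0.0, 0.0), (2.0, 0.0), (1.0, 2.0)]
mu, sigma = partition_params(pts)
```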
“…The geometry‐based algorithm for detecting the local linear partitions can be found in Refs. The modified version for imbalanced datasets is briefly described as follows.…”
Section: Quasi‐linear SVM classifier
confidence: 99%
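A rough approximation of such a partitioning step: locate points near the separation boundary (here, midpoints between each minority-class point and its nearest majority-class neighbor, a crude proxy chosen for an imbalanced dataset) and cluster them into M groups. This is a hypothetical sketch, not the modified algorithm from the citing paper:

```python
import math

def boundary_midpoints(pos, neg):
    """Midpoints between each minority (positive) point and its nearest
    majority (negative) neighbor: a rough proxy for the separation boundary."""
    mids = []
    for p in pos:
        q = min(neg, key=lambda n: math.dist(p, n))
        mids.append(tuple((a + b) / 2 for a, b in zip(p, q)))
    return mids

def kmeans(points, M, iters=20):
    """Plain k-means to split the boundary points into M local partitions."""
    centers = list(points[:M])
    for _ in range(iters):
        groups = [[] for _ in range(M)]
        for pt in points:
            j = min(range(M), key=lambda k: math.dist(pt, centers[k]))
            groups[j].append(pt)
        centers = [
            tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[j]
            for j, g in enumerate(groups)
        ]
    return centers, groups

# Imbalanced toy data: 3 minority points above, 4 majority points below.
pos = [(0.0, 1.0), (1.0, 1.2), (4.0, 1.0)]
neg = [(0.0, -1.0), (1.0, -1.0), (4.0, -1.0), (5.0, -1.0)]
mids = boundary_midpoints(pos, neg)
centers, groups = kmeans(mids, 2)  # M = 2 local partitions along the boundary
```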
“…Another algorithmic solution is to combine several local linear models, since linear models are more efficient to train. Inspired by the theory of locally linear approximation of nonlinear functions, many algorithms solve nonlinear classification problems by composing a set of local linear classifiers. They assume that data instances in a small local region are more linearly separable.…”
Section: Introduction
confidence: 99%
“…Kernel methods such as support vector machines (SVMs) and kernel Fisher discriminant analysis (KFDA) have been successfully applied to a wide variety of machine learning problems. These methods map data points from the input space to a feature space, that is, a higher-dimensional reproducing kernel Hilbert space (RKHS), where even relatively simple algorithms such as linear methods can deliver very impressive performance.…”
Section: Introduction
confidence: 99%
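The implicit mapping these kernel methods rely on can be illustrated with a Gaussian (RBF) kernel, which evaluates inner products in the RKHS without ever constructing the feature map explicitly:

```python
import math

def rbf_kernel(x, y, gamma=1.0):
    """Gaussian (RBF) kernel: an inner product in an implicit RKHS."""
    return math.exp(-gamma * math.dist(x, y) ** 2)

X = [(0.0, 0.0), (1.0, 0.0), (3.0, 0.0)]
gram = [[rbf_kernel(a, b) for b in X] for a in X]
# Similarity decays with distance: gram[0][1] is far larger than gram[0][2].
```

A linear method run on such a Gram matrix (e.g. a kernel SVM) behaves like a nonlinear classifier in the original input space.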