2015
DOI: 10.1016/j.eswa.2015.03.007

Multithreshold Entropy Linear Classifier: Theory and applications

Abstract: Linear classifiers separate the data with a hyperplane. In this paper we focus on a novel method for constructing a multithreshold linear classifier, which separates the data with multiple parallel hyperplanes. The proposed model is based on information-theoretic concepts, namely Rényi's quadratic entropy and the Cauchy-Schwarz divergence. We begin with some general properties, including data scale invariance. Then we prove that our method is a multithreshold large margin classifier, which shows the analogy to the S…
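As a rough illustration of the decision rule sketched in the abstract (several parallel hyperplanes $w^\top x = t_i$ along one projection direction), the toy snippet below shows only how such a classifier would predict once fitted; the function name, the alternating-label convention, and the hard-coded thresholds are illustrative assumptions rather than the paper's actual procedure, which chooses $w$ by optimizing the entropy-based objective.

```python
import numpy as np

def multithreshold_predict(X, w, thresholds, first_label=0):
    """Project points onto w, count how many (sorted) thresholds each
    projection exceeds, and alternate the class label between the
    regions cut out by the parallel hyperplanes w.x = t_i."""
    proj = X @ w                                          # 1-D projections
    crossed = np.searchsorted(np.sort(thresholds), proj)  # thresholds passed
    return (first_label + crossed) % 2                    # alternating labels

# toy usage: two thresholds split the projection line into three regions
X = np.array([[-2.0, 0.0], [0.0, 0.0], [2.0, 0.0]])
w = np.array([1.0, 0.0])
print(multithreshold_predict(X, w, thresholds=[-1.0, 1.0]))  # -> [0 1 0]
```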

Cited by 28 publications (23 citation statements)
References 29 publications
“…SVM aims to find a hyperplane that minimizes the structural risk (Czarnecki and Tabor, 2015) in kernel space. Gaussian radial basis function, linear, and polynomial are several common kernel functions.…”
Section: Methods (mentioning)
confidence: 99%
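The quoted statement names the usual SVM kernels. Purely as a generic illustration (scikit-learn, the synthetic dataset, and the hyperparameters are assumptions, not taken from the citing work), the three kernels can be compared like this:

```python
# Fit SVMs with the kernels mentioned in the quote: RBF, linear, polynomial.
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
for kernel in ("rbf", "linear", "poly"):
    clf = SVC(kernel=kernel, C=1.0).fit(X, y)
    print(kernel, clf.score(X, y))  # training accuracy per kernel
```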
“…In the future, we plan to consider a more general family of entropy functions, including Rényi and Tsallis entropies, which are of great importance in the theory of coding and related problems [6,20,21]. Moreover, there also arises a natural question concerning the compression of n-tuple random variables.…”
Section: Discussion (mentioning)
confidence: 99%
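For readers unfamiliar with the entropies named in the quote, a minimal sketch of the discrete Rényi and Tsallis entropies (the function names and the example distribution are illustrative, not from the cited work; both reduce to Shannon entropy as the parameter tends to 1):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy: H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), alpha != 1."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy: S_q(p) = (1 - sum_i p_i^q) / (q - 1), q != 1."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, alpha=2.0))   # quadratic (collision) entropy, ~0.981
print(tsallis_entropy(p, q=2.0))     # ~0.625
```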
“…where $\Sigma_A = (h_A^{\gamma})^2 \operatorname{cov}_A$ and (for $\gamma$ being a scaling hyperparameter [6]) $h_A^{\gamma} = \gamma \left( \tfrac{4}{k+2} \right)^{1/(k+4)} |A|^{-1/(k+4)}$. Now we need the formula for A B, which is calculated [6] with the use of…”
Section: Closed Form Solution For Objective and Its Gradient (mentioning)
confidence: 99%
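The quoted bandwidth $h_A^{\gamma}$ is Silverman's rule of thumb scaled by the hyperparameter $\gamma$. A minimal sketch of how $\Sigma_A$ could be evaluated from a sample $A$ (the function name is hypothetical, and NumPy's row-per-observation covariance convention is an assumption):

```python
import numpy as np

def scaled_silverman_cov(A, gamma=1.0):
    """Sigma_A = (h_A^gamma)^2 * cov(A), with
    h_A^gamma = gamma * (4 / (k + 2))**(1/(k+4)) * |A|**(-1/(k+4)),
    where k is the dimension and |A| the number of samples in A."""
    A = np.asarray(A, dtype=float)
    n, k = A.shape
    h = gamma * (4.0 / (k + 2.0)) ** (1.0 / (k + 4.0)) * n ** (-1.0 / (k + 4.0))
    return h ** 2 * np.cov(A, rowvar=False)

# toy usage on a 100-sample, 2-D set
rng = np.random.default_rng(0)
print(scaled_silverman_cov(rng.normal(size=(100, 2)), gamma=1.0))
```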