2013
DOI: 10.1016/j.cor.2012.05.015

Supervised classification and mathematical optimization

Abstract: Data Mining techniques often ask for the resolution of optimization problems. Supervised Classification, and, in particular, Support Vector Machines, can be seen as a paradigmatic instance. In this paper, some links between Mathematical Optimization methods and Supervised Classification are emphasized. It is shown that many different areas of Mathematical Optimization play a central role in off-the-shelf Supervised Classification methods. Moreover, Mathematical Optimization turns out to be extremely useful to …



Cited by 124 publications (91 citation statements)
References 201 publications (232 reference statements)
“…For a given set of n-dimensional feature vectors, the SVM classifier finds the maximum-margin hyperplane that separates these vectors. Note that SVM is a powerful tool for solving classification problems, in particular as a two-class classifier (Carrizosa and Morales, 2013). The k-NN classifier directly uses the closest training samples among the feature vectors to classify a new test example.…”
Section: Classification Methods
confidence: 99%
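The k-NN rule described in the excerpt above can be sketched in a few lines of plain Python. This is an illustrative toy implementation, not code from the cited paper; the function name `knn_classify` and its signature are hypothetical.

```python
from collections import Counter
import math

def knn_classify(train, labels, x, k=3):
    """Classify x by majority vote among its k nearest training samples.

    train  : list of training feature vectors (tuples of floats)
    labels : class label of each training vector
    x      : the new test example
    """
    # Euclidean distance from x to every training vector
    dists = [(math.dist(t, x), y) for t, y in zip(train, labels)]
    dists.sort(key=lambda d: d[0])
    # Majority vote over the k closest training samples
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

train = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
labels = ["A", "A", "A", "B", "B", "B"]
print(knn_classify(train, labels, (0.5, 0.5)))  # → A
```

Unlike SVM, which fits a separating hyperplane once and discards the data, k-NN defers all work to query time and consults the training samples directly, exactly as the excerpt notes.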
“…There are many other techniques similar to artificial neural networks, including Convolutional Neural Networks (CNN), simple and oblique decision trees, and Support Vector Machine (SVM) methods (Carrizosa and Morales, 2013). All of these techniques are founded on a similar principle: choosing a structure (for example, an MLP for neural networks; leaves representing class labels and branches representing conjunctions of features for decision trees; and a kernel function for the SVM method).…”
Section: Machine Learning Algorithms
confidence: 99%
“…In all cases, we use the usual techniques, proposed in [22], to linearize the products of binary variables and the absolute values in both the objective function and constraint (7). Thus, we consider the reformulations R1, R2 and R3 stated as follows:…”
confidence: 99%
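The standard linearization alluded to in the excerpt above replaces a product z = x·y of binary variables with the linear constraints z ≤ x, z ≤ y, z ≥ x + y − 1 (and z ≥ 0); the exact formulation in the cited reference [22] may differ in detail. A minimal sketch verifying that these constraints pin z down to the product on all binary inputs:

```python
from itertools import product

def linearized_product(x, y):
    """Return the unique z in {0, 1} satisfying the standard linear
    constraints z <= x, z <= y, z >= x + y - 1 for binary x, y."""
    feasible = [z for z in (0, 1)
                if z <= x and z <= y and z >= x + y - 1]
    assert len(feasible) == 1  # the constraints determine z exactly
    return feasible[0]

# The linearization reproduces x * y on every binary input pair.
for x, y in product((0, 1), repeat=2):
    assert linearized_product(x, y) == x * y
```

Absolute values are handled analogously: a term |e| in a minimized objective is replaced by a fresh variable t with t ≥ e and t ≥ −e, which the minimization drives down to |e|.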
“…The box-connectivity of P r (x) is enforced by imposing that the box generated by each pair of non-adjacent cells of P r (x) (two cells that do not share a common boundary) must also contain cells of P r (x); namely, the intersection between such a box (excluding its two generator cells) and the portion must be nonempty. Finally, the error incurred by approximating the statistical values ω by the areas of the portions is modeled through constraint (7).…”
Section: The Mathematical Optimization Model
confidence: 99%
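The box-connectivity condition quoted above can be checked directly on a grid. The sketch below is an illustration of the stated condition only, not the optimization model itself; the function name `is_box_connected` and the (row, col) cell encoding are assumptions.

```python
def is_box_connected(cells):
    """Check the box-connectivity condition: for every pair of
    non-adjacent cells in the set (cells not sharing a boundary),
    the box they generate, excluding the two generator cells,
    must contain another cell of the set."""
    cells = set(cells)
    for (r1, c1) in cells:
        for (r2, c2) in cells:
            # identical or boundary-sharing cells impose no condition
            if abs(r1 - r2) + abs(c1 - c2) <= 1:
                continue
            # the box generated by the two cells, generators excluded
            box = {(r, c)
                   for r in range(min(r1, r2), max(r1, r2) + 1)
                   for c in range(min(c1, c2), max(c1, c2) + 1)}
            box -= {(r1, c1), (r2, c2)}
            if not (box & cells):
                return False
    return True
```

For instance, the row {(0,0), (0,1), (0,2)} is box-connected, while the pair {(0,0), (0,2)} is not, since the box between them contains only (0,1), which is missing from the set.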