2014
DOI: 10.1016/j.ejor.2012.08.017

Interpretable support vector machines for functional data

Abstract: Support Vector Machines (SVMs) have been shown to be a powerful nonparametric classification technique, even for high-dimensional data. Although predictive ability is important, obtaining an easy-to-interpret classifier is also crucial in many applications. Linear SVM provides a classifier based on a linear score. In the case of functional data, the coefficient function that defines such a linear score usually has many irregular oscillations, making it difficult to interpret. This paper presents a new method, calle…
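To make the abstract's point concrete, the following is a minimal sketch, not the method proposed in the paper: a plain linear SVM is fitted to curves discretized on a grid, and its weight vector is read as the discretized coefficient function β(t) that defines the linear score. Without the interpretability-oriented structure the paper aims for, this vector typically shows the irregular oscillations described above. The synthetic curves, the grid, and scikit-learn's LinearSVC are illustrative assumptions, not part of the original article.

```python
# Minimal sketch, NOT the paper's method: an ordinary linear SVM fitted to
# curves sampled on a common grid. The synthetic data, grid size and the use
# of scikit-learn's LinearSVC are assumptions for illustration only.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)     # common evaluation grid for all curves
n = 200

# Two classes of noisy synthetic curves, stored row-wise
labels = rng.integers(0, 2, size=n)
X = np.array([np.sin(2 * np.pi * t + 0.5 * y) + 0.3 * rng.standard_normal(t.size)
              for y in labels])

clf = LinearSVC(C=1.0, max_iter=10_000)
clf.fit(X, labels)

# The weight vector is the discretized "coefficient function" beta(t):
# the linear score of a curve x is  f(x) = sum_j beta[j] * x[j] + intercept,
# a Riemann-sum analogue of  integral beta(t) x(t) dt + b.
beta = clf.coef_.ravel()
print("training accuracy:", clf.score(X, labels))
print("coefficient 'function' (first 10 grid values):", np.round(beta[:10], 3))
```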

Cited by 56 publications (34 citation statements). References 41 publications.

“…An important parameter in Support Vector Machines (SVM) is the kernel function (Martin-Barragan et al, 2014). The most popular choices are the linear, polynomial, and radial basis (RBF) kernels (Ballings and Van den Poel, 2013a).…”
Section: Support Vector Machines (mentioning)
confidence: 99%
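As a side note to this statement, here is a hedged sketch of the kernel parameter it refers to, written with scikit-learn's SVC. The cited papers do not prescribe this library; the toy data and hyperparameters below are purely illustrative assumptions.

```python
# Illustrative sketch only: the three kernel choices named in the statement
# above, expressed via scikit-learn's SVC. The cited works are not tied to
# this library; the toy data and hyperparameters are assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.standard_normal((150, 20))                       # toy feature matrix
y = (X[:, 0] * X[:, 1] + 0.1 * rng.standard_normal(150) > 0).astype(int)

for kernel in ("linear", "poly", "rbf"):                 # the popular choices
    clf = SVC(kernel=kernel, C=1.0, gamma="scale")       # gamma unused by 'linear'
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{kernel:>6}: 5-fold CV accuracy = {acc:.2f}")
```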
“…Hwang et al [35] developed a novel Lagrangian SVM to address multiclass problems. Lillo and Martin-Barragan [36] presented interpretable SVMs for functional data that provide an interpretable classifier with high predictive power. Milad and Morteza [37,38] presented a multilevel SVM that enhanced the original SVM.…”
Section: Introduction (mentioning)
confidence: 99%
“…Next, the classification procedures are applied to the Tecator dataset previously considered by Ferraty and Vieu (2003), Rossi and Villa (2006), Li and Yu (2008), Alonso et al (2012) and Martin-Barragan et al (2013), among others. The dataset, which consists of 215 near-infrared absorbance spectra of meat samples recorded on a Tecator Infratec Food Analyzer, is available at http://lib.stat.cmu.edu/datasets/tecator.…”
Section: Real Data Study: Tecator Dataset (mentioning)
confidence: 99%
“…Alternatively, Preda et al (2007) used functional PLS regression to obtain the discriminant functions, while Shin (2008) considered an approach based on reproducing kernel Hilbert spaces. In addition, Ferraty and Vieu (2003) have proposed a method based on estimating nonparametrically the posterior probability that the new function χ₀ is of a given class, López-Pintado and Romo (2006), Cuevas et al (2007) and Sguera et al (2012) have proposed classifiers based on the notion of data depth that are well suited for datasets containing outliers, Rossi and Villa (2006) and Martin-Barragan et al (2013) have investigated the use of support vector machines (SVMs) for functional data, Wang et al (2007) have considered classification for functional data by Bayesian modeling with wavelet basis functions, Epifanio (2008) has developed classifiers based on shape descriptors, Araki et al (2009) have considered functional logistic classification, and, finally, Alonso et al (2012) have proposed a weighted distance approach. Note that, when a distance is required, these papers use the L₁, L₂ and L∞ distances, which are well defined in Hilbert spaces.…”
Section: Introduction (mentioning)
confidence: 99%