2007 International Workshop on Anti-Counterfeiting, Security and Identification (ASID)
DOI: 10.1109/iwasid.2007.373728

Classification Performance Comparison between RVM and SVM

Cited by 22 publications (9 citation statements)
References 1 publication
“…In [8], it was also shown that RVMs have a much lower minimum descriptor length, with up to a 28% reduction in the number of decision vectors required across all features used in the experiment. To further support this, experiments in [40] show that RVMs are less susceptible to overfitting than SVMs. This is a desirable characteristic, as we want our pain recognition model to generalize well to real-world data.…”
Section: Pain Intensity Estimation (mentioning)
confidence: 92%
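The sparsity claim in this excerpt, that an RVM keeps fewer decision vectors than an SVM keeps support vectors, can be checked empirically. The sketch below is a minimal, assumed setup: the SVM side uses scikit-learn's SVC, while the RVM side relies on the third-party sklearn_rvm package (its EMRVC class and relevance_vectors_ attribute are assumptions here, not something the cited paper specifies).

```python
# Minimal sketch (assumed setup): compare model sparsity by counting the
# "decision vectors" each classifier keeps -- support vectors for an SVM
# versus relevance vectors for an RVM.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)
print("SVM support vectors:", svm.support_vectors_.shape[0])

try:
    # Assumed third-party RVM implementation; not part of scikit-learn.
    from sklearn_rvm import EMRVC
    rvm = EMRVC(kernel="rbf").fit(X, y)
    print("RVM relevance vectors:", rvm.relevance_vectors_.shape[0])
except ImportError:
    print("Install an RVM implementation to reproduce the sparsity comparison.")
```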
“…Note that w_MP, the maximum a posteriori estimate of w, is obtained by Laplace's method in [24], which approximates the posterior of w by a Normal distribution with mean w_0. [4,15,30] concluded that RVM outperforms SVM in both classification and regression. They also showed that RVM runs faster than SVM.…”
Section: (4) (mentioning)
confidence: 99%
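The Laplace step referred to in this excerpt, finding the posterior mode w_MP of the weights and approximating the posterior by a Gaussian centred there, can be written down compactly. The sketch below is a minimal illustration for a logistic likelihood with a fixed diagonal Gaussian prior (the inner loop of an RVM-style classifier); the design matrix, prior precisions alpha, and toy data are assumptions, not the cited paper's setup.

```python
# Minimal sketch: find w_MP, the mode of the posterior over weights w, for a
# logistic model with a zero-mean Gaussian prior of fixed precisions alpha,
# using Newton (IRLS) iterations, then form the Laplace (Gaussian) approximation.
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def posterior_mode(Phi, t, alpha, n_iter=50, tol=1e-6):
    """Newton iterations maximizing log p(t|w) + log p(w|alpha)."""
    n, m = Phi.shape
    w = np.zeros(m)
    A = np.diag(alpha)
    for _ in range(n_iter):
        y = sigmoid(Phi @ w)
        grad = Phi.T @ (t - y) - A @ w        # gradient of the log posterior
        B = np.diag(y * (1.0 - y))            # likelihood curvature terms
        H = -(Phi.T @ B @ Phi + A)            # Hessian of the log posterior
        step = np.linalg.solve(-H, grad)      # Newton step for a maximization
        w = w + step
        if np.max(np.abs(step)) < tol:
            break
    # Laplace approximation: posterior ~ N(w_MP, (-H)^-1) at the mode.
    return w, np.linalg.inv(-H)

# Toy usage with an assumed design matrix and labels.
rng = np.random.default_rng(0)
Phi = rng.normal(size=(100, 3))
t = (Phi @ np.array([1.5, -2.0, 0.5]) > 0.0).astype(float)
w_mp, Sigma = posterior_mode(Phi, t, alpha=np.ones(3))
print(w_mp)
```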
“…All the variables in X_{-1,j} and X_{1,j} have the class labels -1 (majority) and 1 (minority), respectively. We set five sample-size settings, (n_p, n_n) = (30, 30), (15, 30), (12, 30), (6, 30), and (3, 30), to illustrate the performance of the different algorithms on data of different sizes. b = 1, 2, 2.5, 5, 10 for these five cases, and a larger b indicates a more severely imbalanced dataset.…”
Section: Numeric Studies, A. Simulation Data Studies (mentioning)
confidence: 99%
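To make the simulation design in this excerpt concrete, the sketch below draws the five (n_p, n_n) settings with label -1 for the majority class and +1 for the minority class, and reports the imbalance ratio b = n_n / n_p. The Gaussian class-conditional distributions are placeholders; the cited study's actual generating model is not reproduced here.

```python
# Minimal sketch (assumed distributions) of the imbalanced simulation setup:
# n_n majority samples (label -1) and n_p minority samples (label +1).
import numpy as np

def make_imbalanced(n_p, n_n, d=5, seed=0):
    rng = np.random.default_rng(seed)
    X_neg = rng.normal(loc=0.0, size=(n_n, d))   # majority class, label -1
    X_pos = rng.normal(loc=1.0, size=(n_p, d))   # minority class, label +1
    X = np.vstack([X_neg, X_pos])
    y = np.concatenate([-np.ones(n_n), np.ones(n_p)])
    return X, y

for n_p, n_n in [(30, 30), (15, 30), (12, 30), (6, 30), (3, 30)]:
    X, y = make_imbalanced(n_p, n_n)
    print(f"n_p={n_p:2d}, n_n={n_n}, imbalance ratio b={n_n / n_p:g}")
```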
“…Regarding this, in this paper an authentication approach using the Relevance Vector Machine (RVM) [12] is proposed. Like the SVM, the RVM is a sparse, linearly parameterized model, but it requires fewer relevance vectors [12], [13]; hence, it speeds up the authentication process. This paper presents a method that uses face and fingerprint traits with feature-level fusion, aiming to find effective ways to fuse different characteristics at the feature level and to construct templates from the combined features.…”
Section: Introduction (mentioning)
confidence: 99%
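Feature-level fusion as described in this excerpt amounts to normalizing each modality's feature vector and concatenating them into a single template before classification. The sketch below assumes min-max normalization and arbitrary feature lengths; the cited approach's actual extraction and normalization steps may differ.

```python
# Minimal sketch of feature-level fusion: normalize each modality's feature
# vector separately, then concatenate them into one fused template.
import numpy as np

def minmax(v, eps=1e-12):
    return (v - v.min()) / (v.max() - v.min() + eps)

def fuse_features(face_feat, finger_feat):
    """Concatenate per-modality normalized features into a fused template."""
    return np.concatenate([minmax(face_feat), minmax(finger_feat)])

# Toy usage with random vectors standing in for extracted features.
rng = np.random.default_rng(0)
face = rng.normal(size=128)      # assumed face feature length
finger = rng.normal(size=64)     # assumed fingerprint feature length
template = fuse_features(face, finger)
print(template.shape)            # (192,)
```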