2011
DOI: 10.1109/tkde.2010.193
On the Design and Analysis of the Privacy-Preserving SVM Classifier

Abstract: The support vector machine (SVM) is a widely used tool in classification problems. The SVM trains a classifier by solving an optimization problem to decide which instances of the training data set are support vectors, which are the necessarily informative instances to form the SVM classifier. Since support vectors are intact tuples taken from the training data set, releasing the SVM classifier for public use or shipping the SVM classifier to clients will disclose the private content of support vectors. This vi…

Cited by 140 publications (65 citation statements). References 20 publications.
“…For accurate classification only the support vectors are required, while the rest of the dataset becomes redundant. Mathematically, the SVM can be considered as ( , ) [3]. Hence, the objective function ( ) can be written as Eq. (6), where is the weight matrix.…”
Section: Support Vector Machine (mentioning)
confidence: 99%
“…Hence, the objective function ( ) can be written as Eq. (6), where is the weight matrix. The SVM optimizes the above equation under the objective function of Eq. (7) [3]. The above equation generates the coefficients of the hyper-plane that maximizes the separation between and . This places the feature vectors corresponding to the positive class on one side of the hyper-plane, while the feature vectors corresponding to the negative class reside on the opposite side.…”
Section: Support Vector Machine (mentioning)
confidence: 99%
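The statements above describe the standard linear SVM decision function f(x) = sign(w·x + b), where only the support vectors carry nonzero dual coefficients. A minimal NumPy sketch under stated assumptions — the toy 1-D data and the dual coefficients `alpha` are hand-derived illustrative values, not the output of any solver from the cited works:

```python
import numpy as np

# Toy linearly separable set; in a trained SVM only the support
# vectors (here: the two inner, closest opposite-class points)
# carry nonzero dual coefficients alpha_i.
X = np.array([[-3.0], [-1.0], [1.0], [3.0]])
y = np.array([-1, -1, 1, 1])

# For this symmetric 1-D set the hard-margin solution is known in
# closed form: alpha = 0.5 on the two inner points, 0 elsewhere
# (hypothetical hand-derived values for illustration).
alpha = np.array([0.0, 0.5, 0.5, 0.0])

w = (alpha * y) @ X          # w = sum_i alpha_i * y_i * x_i
b = y[1] - w @ X[1]          # from y_s * (w . x_s + b) = 1 on a support vector

def f(x):
    # SVM decision function: sign of the signed distance to the hyper-plane.
    return np.sign(w @ x + b)
```

Note that `w` and `b` are computed directly from the raw support-vector tuples `X[1]` and `X[2]`; in the kernelized case the classifier must carry those tuples verbatim, which is exactly the privacy leak the paper addresses.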
“…The recent work in [37] discusses the issue of releasing the trained SVM classifier without violating the privacy of the support vectors. Since the Gaussian kernel was considered, a Taylor series was exploited to approximate the infinite-dimensional feature space of the Gaussian kernel with a finite-dimensional one, incurring only negligible performance loss.…”
Section: Related Work (mentioning)
confidence: 99%
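The Taylor-series idea attributed to [37] can be illustrated as follows. This is a hedged sketch of the general technique — factoring the Gaussian kernel and truncating the Taylor series of the cross term, which corresponds to a finite-dimensional feature map — not the paper's exact construction; `gamma` and `degree` are illustrative parameters:

```python
import math
import numpy as np

def gaussian_kernel(x, z, gamma):
    # Exact RBF kernel: exp(-gamma * ||x - z||^2)
    return math.exp(-gamma * float(np.sum((x - z) ** 2)))

def taylor_gaussian_kernel(x, z, gamma, degree):
    # Factor the kernel as
    #   exp(-g||x||^2) * exp(-g||z||^2) * exp(2g <x, z>)
    # and truncate the Taylor series of the last factor at `degree`;
    # each retained monomial term corresponds to one coordinate of a
    # finite-dimensional feature map.
    dot = float(x @ z)
    series = sum((2 * gamma * dot) ** k / math.factorial(k)
                 for k in range(degree + 1))
    return (math.exp(-gamma * float(x @ x))
            * math.exp(-gamma * float(z @ z)) * series)

x = np.array([1.0, 0.0])
z = np.array([0.5, 0.5])
exact = gaussian_kernel(x, z, 0.5)
approx = taylor_gaussian_kernel(x, z, 0.5, degree=10)
```

Because the truncation error of the exponential series shrinks factorially in `degree`, a modest degree already matches the exact kernel closely, which is why the cited work reports negligible performance loss.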
“…Zhou et al. [8] use knowledge-based fuzzy clustering (KBFC), maximum likelihood, and seed growing to identify the tumor region, and analyze each method's accuracy. Support vector machines (SVMs) [9,10] have been widely used in tumor segmentation. Zhang et al. [11] compare the tumor segmentation results of one-class and two-class SVMs.…”
Section: Introduction (mentioning)
confidence: 99%