Learning Theory
DOI: 10.1007/978-3-540-72927-3_13

Transductive Rademacher Complexity and Its Applications

Abstract: We develop a technique for deriving data-dependent error bounds for transductive learning algorithms based on transductive Rademacher complexity. Our technique is based on a novel general error bound for transduction in terms of transductive Rademacher complexity, together with a novel bounding technique for Rademacher averages for particular algorithms, in terms of their "unlabeled-labeled" representation. This technique is relevant to many advanced graph-based transductive algorithms and we demonstrate its e…
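For orientation, the central quantity the abstract refers to can be written, in the transductive setting with m labeled and u unlabeled points, roughly as below; the normalization factor and the role of the parameter p reflect our reading of [8] and should be taken as an assumption rather than a verbatim statement from the paper:

\mathfrak{R}_{m+u}(\mathcal{V}) \;=\; \Big(\tfrac{1}{m} + \tfrac{1}{u}\Big)\, \mathbb{E}_{\sigma}\Big[\, \sup_{v \in \mathcal{V}} \sigma \cdot v \,\Big], \qquad \Pr[\sigma_i = 1] = \Pr[\sigma_i = -1] = p, \quad \Pr[\sigma_i = 0] = 1 - 2p,

where V ⊆ R^{m+u} is the set of candidate output vectors over all m+u points and p ∈ [0, 1/2].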

Cited by 40 publications (75 citation statements)
References 32 publications

“…In this subsection, we derive a generalization error bound for LLGC using the tool of transductive Rademacher complexity for general function classes [8].…”
Section: B. Generalization Error Bound for LLGC
confidence: 99%
“…In particular, we choose Learning with Local and Global Consistency (LLGC) [17] as the classifier on graphs, because it is comparable to or even better than Minimum Cut (MinCut) [4] and GFHF [18]. We present a data-dependent generalization error bound for LLGC using the tool of transductive Rademacher Complexity [8], which is an extension of inductive Rademacher Complexity [2] and measures the richness of a class of real-valued functions with respect to a probability distribution. We show that the empirical transductive Rademacher complexity is a good surrogate for active learning on graphs.…”
Section: Introduction
confidence: 99%
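The excerpt above treats the empirical transductive Rademacher complexity as a quantity one can compute and use as a surrogate for active learning. A minimal Monte Carlo sketch for a finite set of candidate soft-label vectors (e.g., LLGC outputs over all m+u vertices) is given below; the function name, the default choice of p, and the (1/m + 1/u) normalization are our assumptions based on [8], not code from any of the cited papers.

# Monte Carlo sketch of the empirical transductive Rademacher complexity
# for a finite hypothesis set. Hypothetical helper: the (1/m + 1/u)
# normalization and the default p = m*u/(m+u)^2 follow our reading of [8]
# and are assumptions, not verbatim constants from the paper.
import numpy as np

def transductive_rademacher(H, m, u, p=None, n_draws=2000, seed=0):
    """H: (k, m+u) array; each row is one hypothesis' soft outputs on all m+u points."""
    n = m + u
    if p is None:
        p = m * u / float(n * n)  # assumed default; always <= 1/4, hence <= 1/2
    rng = np.random.default_rng(seed)
    # sigma_i = +1 w.p. p, -1 w.p. p, 0 w.p. 1 - 2p, i.i.d. over all m+u points
    sigma = rng.choice([1.0, -1.0, 0.0], size=(n_draws, n), p=[p, p, 1.0 - 2.0 * p])
    # sup over the finite hypothesis set of the correlation sigma . h
    sup_corr = (sigma @ H.T).max(axis=1)
    return (1.0 / m + 1.0 / u) * sup_corr.mean()

# Example: 3 candidate labelings of m + u = 10 points
# H = np.sign(np.random.randn(3, 10)); print(transductive_rademacher(H, m=6, u=4))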
“…In this section, we derive bounds for the GE of the MC in (1), KMC in (2), and KKMCEX in (4) algorithms. There are two approaches to GE analysis, namely the inductive one [24] and the transductive one [25]. In the inductive approach, the GE measures the difference between the expected value of a loss function and the empirical loss over a finite number of samples.…”
Section: Generalization Error in MC
confidence: 99%
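For readers outside this literature, the inductive quantity referred to in the excerpt can be written as follows (notation ours, not necessarily that of [24]): for a hypothesis f, loss \ell, data distribution \mathcal{D} and m training samples,

\mathrm{GE}_{\mathrm{ind}}(f) \;=\; \mathbb{E}_{(x,y)\sim \mathcal{D}}\big[\ell(f(x), y)\big] \;-\; \frac{1}{m}\sum_{i=1}^{m} \ell(f(x_i), y_i).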
“…However, this definition of GE does not fit the MC framework because it assumes that: i) the data distribution is known; and ii) the entries are sampled with replacement. In order to come up with distribution-free claims for MC, one may resort to the transductive GE analysis [25]. In this scenario, we are given a set S_n = S_m ∪ S_u of n data points, comprising the union of the training set S_m and the testing set S_u, where |S_u| = u.…”
Section: Generalization Error in MC
confidence: 99%
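In the transductive scenario just described, the corresponding gap is taken between the average losses on the test and training parts of the fixed sample S_n; written out (again, notation ours rather than a quotation from [25]):

\mathrm{GE}_{\mathrm{trans}}(f) \;=\; \frac{1}{u}\sum_{(x,y)\in S_u} \ell(f(x), y) \;-\; \frac{1}{m}\sum_{(x,y)\in S_m} \ell(f(x), y).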