An efficient kernel matrix evaluation measure (2008)
DOI: 10.1016/j.patcog.2008.04.005

Cited by 51 publications (23 citation statements)
References 3 publications
“…Another technique is the feature space-based kernel matrix (FSM) evaluation measure [37]. Both KTA and FSM, however, may not work well for small sample sizes and have a tendency to overfit [38].…”
Section: Other More General Methods For Estimating σ
confidence: 99%
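For context, KTA (kernel target alignment) is conventionally computed as the Frobenius inner product between the kernel matrix K and the ideal kernel yyᵀ, normalised by their Frobenius norms. A minimal NumPy sketch; the function name and the toy RBF data are illustrative, not taken from the cited papers:

```python
import numpy as np

def kernel_target_alignment(K: np.ndarray, y: np.ndarray) -> float:
    """KTA: A(K, yy^T) = <K, yy^T>_F / (||K||_F * ||yy^T||_F),
    with K an (n, n) kernel matrix and y an (n,) vector of {-1, +1} labels."""
    Y = np.outer(y, y)                                 # ideal kernel yy^T
    return np.sum(K * Y) / (np.linalg.norm(K, "fro") * np.linalg.norm(Y, "fro"))

# Toy check: RBF kernel on two well-separated 1-D clusters (illustrative data only).
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(-2, 0.3, 20), rng.normal(2, 0.3, 20)])
y = np.array([-1] * 20 + [1] * 20)
sigma = 1.0
K = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * sigma ** 2))
print(kernel_target_alignment(K, y))                   # higher = kernel matches the labels better
```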
“…This is because formulating the ideal kernel often requires a large set of labeled examples. Furthermore, it has been shown in [62] that a kernel function can have a low alignment measure on a particular dataset and still perform well on that dataset. Therefore, a new measure of kernel alignment is devised in [62] based on the distribution of the data in the feature space.…”
Section: Kernel Alignment Measure
confidence: 99%
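The quantities a feature-space measure of this kind works with (class centres and within-class spread) can be read off the kernel matrix alone. The sketch below only illustrates that idea; it is not the exact formula of [62], and the function names are placeholders:

```python
import numpy as np

def center_distance_sq(K: np.ndarray, y: np.ndarray) -> float:
    """Squared distance between the two class centres in feature space,
    using kernel values only: ||m_+ - m_-||^2 = mean(K_++) - 2*mean(K_+-) + mean(K_--)."""
    pos, neg = y == 1, y == -1
    return (K[np.ix_(pos, pos)].mean()
            - 2.0 * K[np.ix_(pos, neg)].mean()
            + K[np.ix_(neg, neg)].mean())

def within_class_spread(K: np.ndarray, y: np.ndarray) -> float:
    """Mean squared distance of points to their own class centre:
    ||phi(x_i) - m_c||^2 = K_ii - 2*mean_j K_ij + mean(K_cc), with j restricted to class c."""
    spread = 0.0
    for c in (1, -1):
        idx = np.where(y == c)[0]
        Kcc = K[np.ix_(idx, idx)]
        spread += np.mean(np.diag(Kcc) - 2.0 * Kcc.mean(axis=1) + Kcc.mean())
    return spread / 2.0

# A kernel that keeps each class compact while pushing the class centres apart
# (small within_class_spread relative to center_distance_sq) is considered good.
```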
“…Previous "filter"-based kernel matrix learning algorithms can handle scaling variation (K'_u = t·K_u, t > 0), but do not account for translation variation. KTA is invariant to scaling but, as (Nguyen and Ho, 2008) point out, it is not invariant under data translation in the feature space. (He, Chang and Xie, 2008) normalize the sub-kernel K_u by dividing it by its largest absolute entry.…”
Section: Kernel Normalization
confidence: 99%
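A small numerical check of both points. The toy data, the function name kta, and the constant-shift model of a feature-space translation are all illustrative assumptions, not constructions from the cited papers:

```python
import numpy as np

def kta(K, y):
    """Kernel target alignment, as in the earlier sketch."""
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K, "fro") * np.linalg.norm(Y, "fro"))

rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(-2, 0.3, 20), rng.normal(2, 0.3, 20)])
y = np.array([-1] * 20 + [1] * 20)
K = np.exp(-(X[:, None] - X[None, :]) ** 2 / 2.0)   # illustrative RBF kernel, sigma = 1

print(kta(K, y), kta(5.0 * K, y))   # scaling K'_u = t*K_u (t > 0): alignment is unchanged
print(kta(K + 3.0, y))              # constant shift, i.e. a translation of the embedded data
                                    # in an augmented feature space: alignment changes

# Max-absolute-value normalisation of a sub-kernel, as quoted above:
K_u = 5.0 * K
K_u_norm = K_u / np.max(np.abs(K_u))   # entries now lie in [-1, 1]; KTA itself is unchanged
                                       # by this rescaling, since it is scale-invariant
```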