2020
DOI: 10.1007/s11263-020-01376-1

Beyond Covariance: SICE and Kernel Based Visual Feature Representation

Cited by 22 publications (12 citation statements)
References 89 publications

“…Micchelli's theorem [16] proves that K_i is guaranteed to be nonsingular regardless of the feature dimensionality p and the number of feature vectors hw. In summary, the RBF kernel matrix K_i captures non-linear second-order statistics and does not suffer from the singularity issue; the advantages of K_i over the covariance matrix C_i have been corroborated theoretically and empirically in [66].…”
Section: Kernel Pooling
confidence: 74%
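
To make the nonsingularity claim concrete, here is a minimal numpy sketch, assuming (as in the excerpt's comparison with the p x p covariance C_i) that the RBF kernel is evaluated between the p channel vectors of a feature map with hw locations; the toy dimensions, the mean-distance bandwidth heuristic, and all variable names are illustrative, not the cited paper's exact settings.

```python
import numpy as np

# Toy feature map: p channels observed at hw spatial locations, with p > hw
# on purpose, which is exactly the regime where the covariance would be singular.
rng = np.random.default_rng(0)
p, hw = 256, 49                             # e.g. 256 channels on a 7x7 map
X = rng.standard_normal((p, hw))            # row j = response vector of channel j

# RBF (Gaussian) kernel matrix between channel vectors:
# K[j, k] = exp(-||x_j - x_k||^2 / (2 * sigma^2))
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
sigma2 = np.mean(sq_dists)                  # simple bandwidth heuristic
K = np.exp(-sq_dists / (2.0 * sigma2))

# Micchelli's theorem: for pairwise-distinct channel vectors, K is nonsingular
# no matter how p compares to hw.
print(np.linalg.matrix_rank(K))             # 256, i.e. full rank p
```
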
“…The covariance matrix is, in essence, second-order pooling, which learns the co-occurrence information between each pair of features. However, the covariance matrix C_i is a type of linear kernel [66]; as a result, it is prone to being singular when the feature dimension p is larger than the number of feature vectors hw.…”
Section: Kernel Pooling
confidence: 99%
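
By contrast, the covariance statistic is rank-limited by the number of feature vectors. A minimal numpy sketch of that failure mode follows, again with purely illustrative toy dimensions and names.

```python
import numpy as np

# Same toy regime: p channels, hw spatial locations, deliberately with p > hw.
rng = np.random.default_rng(0)
p, hw = 256, 49
X = rng.standard_normal((p, hw))

# Sample covariance over the hw feature vectors (a linear-kernel statistic):
# C = (1/hw) * Xc @ Xc.T with Xc the mean-centred feature map, a p x p matrix.
Xc = X - X.mean(axis=1, keepdims=True)
C = (Xc @ Xc.T) / hw

# The rank is bounded by the number of samples, so C is necessarily singular
# whenever p > hw (centring costs one more dimension, hence hw - 1 here).
print(np.linalg.matrix_rank(C), "of", p)    # 48 of 256: not invertible
```
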
“…(9) is only in the covariance component. For robust estimation of the covariance, we adopt iSQRT-COV rather than other powerful covariance estimation methods [29]–[32] because of its GPU-friendly performance and because its original version was formulated in the Gaussian setting.…”
Section: Robust Distribution Knowledge Embedding
confidence: 99%
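
For context, iSQRT-COV normalizes the covariance by its matrix square root using coupled Newton-Schulz iterations, which require only matrix multiplications and are therefore GPU-friendly. Below is a minimal numpy sketch of that core computation (pre-normalization, coupled iteration, post-compensation); the function name, toy input, and iteration count are illustrative assumptions, not the cited authors' exact configuration.

```python
import numpy as np

def newton_schulz_sqrt(sigma, num_iter=5):
    """Approximate matrix square root of an SPD covariance via the coupled
    Newton-Schulz iteration, the GPU-friendly core of iSQRT-COV."""
    p = sigma.shape[0]
    eye = np.eye(p)

    # Pre-normalization: dividing by the trace keeps the spectrum within (0, 1],
    # which guarantees the iteration converges.
    trace = np.trace(sigma)
    A = sigma / trace

    # Coupled iteration: Y_k -> A^{1/2}, Z_k -> A^{-1/2}.
    Y, Z = A.copy(), eye.copy()
    for _ in range(num_iter):
        T = 0.5 * (3.0 * eye - Z @ Y)
        Y, Z = Y @ T, T @ Z

    # Post-compensation restores the scale removed by the pre-normalization.
    return np.sqrt(trace) * Y

# Toy SPD covariance built from random features (dimensions are illustrative).
rng = np.random.default_rng(0)
F = rng.standard_normal((64, 200))
sigma = F @ F.T / 200 + 1e-3 * np.eye(64)

S = newton_schulz_sqrt(sigma)
rel_err = np.linalg.norm(S @ S - sigma) / np.linalg.norm(sigma)
print(rel_err)   # modest after 5 iterations; shrinks further as num_iter grows
```
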
“…Covariance descriptors have been extended to many other applications [83,58,87,17], including end-to-end training of CNNs, leading to state-of-the-art results on action recognition, texture classification, scene and fine-grained recognition [34,35,42]. As second-order representations capture correlation patterns of features, they are a powerful tool used in several recognition pipelines [22,68,49,39,42,48,59,41,37,101].…”
Section: Introduction
confidence: 99%