2015
DOI: 10.1016/j.patrec.2015.03.010

Exponential family Fisher vector for image classification

Abstract: One of the fundamental problems in image classification is to devise models that relate images to higher-level semantic concepts in an efficient and reliable way. A widely used approach consists in extracting local descriptors from the images and summarizing them into an image-level representation. Within this framework, the Fisher vector (FV) is one of the most robust signatures to date. In the FV, local descriptors are modeled as samples drawn from a mixture of Gaussian pdfs. …
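As a concrete illustration of the baseline described in the abstract, here is a minimal numpy sketch of Gaussian-mixture FV encoding that keeps only the gradients with respect to the means (a common simplification); the function name, variable names and the diagonal-covariance assumption are illustrative, not taken from the paper.

import numpy as np

def fisher_vector_means(X, weights, means, variances):
    # Minimal FV sketch (illustrative, not the paper's implementation):
    # gradient of the descriptors' log-likelihood w.r.t. the means of a
    # diagonal-covariance GMM, with the usual closed-form normalization.
    # X: (N, D) local descriptors; weights: (K,); means, variances: (K, D).
    N, D = X.shape

    # Soft assignments gamma[n, k] = p(component k | x_n)
    log_comp = -0.5 * (((X[:, None, :] - means) ** 2) / variances
                       + np.log(2.0 * np.pi * variances)).sum(axis=2)
    log_joint = log_comp + np.log(weights)
    gamma = np.exp(log_joint - np.logaddexp.reduce(log_joint, axis=1, keepdims=True))

    # Gradient w.r.t. the means, scaled by the diagonal FIM approximation
    diff = (X[:, None, :] - means) / np.sqrt(variances)          # (N, K, D)
    g_mu = (gamma[:, :, None] * diff).sum(axis=0) / (N * np.sqrt(weights)[:, None])

    fv = g_mu.ravel()
    fv = np.sign(fv) * np.sqrt(np.abs(fv))                       # power-normalization
    return fv / (np.linalg.norm(fv) + 1e-12)                     # L2-normalization

With a K-component GMM fitted on a pool of local descriptors, this yields a K*D-dimensional image-level signature that can be fed to a linear classifier.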

Cited by 16 publications (16 citation statements); references 37 publications (57 reference statements).
“…FV encoding is considered an extension of BoW such that, rather than encoding the relative frequency of the descriptors, it encodes information on the distribution of the descriptors (Perronnin et al, 2010). Fisher kernels have been used to compute FVs via a mechanism that incorporates generative probability models into discriminative classifiers, in applications including classification of protein domains (Jaakkola et al, 1999), action and event recognition (Oneata et al, 2013; Sekma et al, 2015), image classification (Liu et al, 2014; Simonyan et al, 2013; Sánchez and Redolfi, 2015) and 3D object retrieval (Savelonas et al, 2016). FVs have been applied to model effective connectivity of networks using MRI and PET data (Zhou et al, 2016).…”
Section: Introduction (mentioning; confidence: 99%)
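For reference, the Fisher-kernel mechanism this excerpt alludes to (Jaakkola et al, 1999) can be written compactly; the following is the standard textbook form, not notation quoted from the paper:

\[ u_X = \nabla_{\lambda} \log p(X \mid \lambda), \qquad K(X, Y) = u_X^{\top} F_{\lambda}^{-1} u_Y, \qquad F_{\lambda} = \mathbb{E}_{x \sim p(\cdot \mid \lambda)}\!\left[ u_x u_x^{\top} \right], \]

and with a decomposition \( F_{\lambda}^{-1} = L_{\lambda}^{\top} L_{\lambda} \) the Fisher vector of \( X \) is the explicit embedding \( \Phi(X) = L_{\lambda} u_X \), so that \( K(X, Y) = \Phi(X)^{\top} \Phi(Y) \).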
“…FV encoding is considered as an extension of BoW such that, rather than encoding the relative frequency of the descriptors, it encodes the information on distribution of the descriptors (Perronnin et al, 2010). Fisher kernels have been used to compute FVs utilizing a mechanism that incorporates generative probability models into discriminative classifiers in applications, including classification of protein domains (Jaakkola et al, 1999), action and event recognition (Oneata et al, 2013;Sekma et al, 2015), image classification (Liu et al, 2014;Simonyan et al, 2013;Sánchez and Redolfi, 2015) and 3D object retrieval (Savelonas et al, 2016). FVs have been applied to model effective connectivity of the networks using MRI and PET data (Zhou et al, 2016).…”
Section: Introductionmentioning
confidence: 99%
“…Faraki et al. proposed BoW, FV, and VLAD codings on covariance matrices [32], [33]. Sánchez et al. extended Gaussian FV coding to a broader family of distributions, including the Wishart distribution, which is a distribution over covariance matrices [34]. Ilea et al. proposed a Riemannian FV coding based on a mixture model of Riemannian Gaussian distributions [35].…”
Section: Related Work (mentioning; confidence: 99%)
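The extension to a broader family of distributions that this excerpt refers to rests on the canonical exponential-family form; the identities below are the generic textbook ones, assumed here rather than quoted from the paper:

\[ p(x \mid \eta) = h(x) \exp\!\left( \eta^{\top} T(x) - A(\eta) \right), \qquad \nabla_{\eta} \log p(x \mid \eta) = T(x) - \nabla_{\eta} A(\eta), \qquad F_{\eta} = \nabla^2_{\eta} A(\eta), \]

so the score, and hence the FV, only requires the sufficient statistics \( T(x) \) and the derivatives of the log-partition function \( A(\eta) \); choosing a Wishart or Bernoulli member of the family yields the corresponding FV.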
“…An extension of the FV using the BMM has also been carried out in [73,64]. Our approach differs from the one proposed in [73] in the approximation of the square root of the inverse of the FIM (i.e., L_λ).…”
Section: GMM-FV [63] (mentioning; confidence: 99%)
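As a hedged illustration of what a Bernoulli-mixture (BMM) FV involves, the score with respect to the Bernoulli parameters takes the following standard form (a generic derivation, not the specific approximation discussed in the excerpt): for \( p(x \mid \theta) = \sum_k w_k \prod_d p_{kd}^{x_d} (1 - p_{kd})^{1 - x_d} \) with posteriors \( \gamma_k(x) \),

\[ \frac{\partial \log p(x \mid \theta)}{\partial p_{kd}} = \gamma_k(x) \, \frac{x_d - p_{kd}}{p_{kd}(1 - p_{kd})}, \]

and the FV is obtained by whitening this gradient with an approximation of \( L_{\lambda} \), i.e., of the square root of the inverse FIM.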
“…the natural parameters of the Gaussian distribution rather than the mean and variance parameters that are typically used in the literature for the FV representation [56,54,63]. Unfortunately, the authors of [64] did not experimentally compare the FVs obtained with and without the natural parameters.…”
Section: GMM-FV [63] (mentioning; confidence: 99%)
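For context on what "natural parameters" means here, the univariate Gaussian in canonical exponential-family form has (standard identities, assumed rather than quoted from the paper):

\[ \eta = \left( \frac{\mu}{\sigma^2}, \; -\frac{1}{2\sigma^2} \right), \qquad T(x) = (x, \; x^2), \qquad \log p(x \mid \eta) = \eta^{\top} T(x) - A(\eta) + \text{const}, \]

so differentiating the log-likelihood with respect to \( \eta \) instead of \( (\mu, \sigma^2) \) changes the resulting FV coordinates and the associated FIM, even though both parameterizations describe the same model.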