2016
DOI: 10.1080/2150704x.2016.1212418

Feature-based non-parametric estimation of Kullback–Leibler divergence for SAR image change detection

Abstract: In this article, a method based on a non-parametric estimation of the Kullback-Leibler divergence using a local feature space is proposed for synthetic aperture radar (SAR) image change detection. First, local features based on a set of Gabor filters are extracted from both pre- and post-event images. The distribution of these local features from a local neighbourhood is considered as a statistical representation of the local image information. The Kullback-Leibler divergence as a probabilistic distance is used …
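The abstract outlines a three-step pipeline: Gabor-filter feature extraction from both images, local feature distributions taken over a neighbourhood, and a non-parametric Kullback-Leibler distance between those distributions. The sketch below is one plausible reading of that pipeline, not the authors' code: the k-nearest-neighbour KLD estimator, the scikit-image/scikit-learn calls, the symmetrised divergence, and the fixed 7×7 neighbourhood are all assumptions made here for illustration.

```python
# Minimal sketch of a feature-based non-parametric KLD change detector.
# NOT the authors' implementation; estimator, filters, and window size are assumptions.
import numpy as np
from skimage.filters import gabor
from sklearn.neighbors import NearestNeighbors

def gabor_features(img, frequencies=(0.1, 0.2, 0.3), n_orientations=4):
    """Stack Gabor magnitude responses as a per-pixel local feature vector."""
    feats = []
    for f in frequencies:
        for k in range(n_orientations):
            theta = k * np.pi / n_orientations
            real, imag = gabor(img, frequency=f, theta=theta)
            feats.append(np.sqrt(real**2 + imag**2))   # magnitude response
    return np.stack(feats, axis=-1)                    # shape (H, W, D)

def knn_kld(x, y, k=5):
    """k-NN (Wang-Kulkarni-Verdu style) estimate of D(P_x || P_y) from samples."""
    n, m = len(x), len(y)
    d = x.shape[1]
    nn_x = NearestNeighbors(n_neighbors=k + 1).fit(x)  # +1 because the query point is its own 1st NN
    nn_y = NearestNeighbors(n_neighbors=k).fit(y)
    r_x = nn_x.kneighbors(x)[0][:, -1]                 # k-th NN distance of x_i within x
    r_y = nn_y.kneighbors(x)[0][:, -1]                 # k-th NN distance of x_i within y
    eps = 1e-12                                        # guard against zero distances
    return d * np.mean(np.log((r_y + eps) / (r_x + eps))) + np.log(m / (n - 1))

def change_map(pre, post, half_win=3, k=5):
    """Per-pixel symmetrised KLD between local Gabor-feature distributions."""
    f_pre, f_post = gabor_features(pre), gabor_features(post)
    h, w, d = f_pre.shape
    out = np.zeros((h, w))
    for i in range(half_win, h - half_win):
        for j in range(half_win, w - half_win):
            a = f_pre[i - half_win:i + half_win + 1,
                      j - half_win:j + half_win + 1].reshape(-1, d)
            b = f_post[i - half_win:i + half_win + 1,
                       j - half_win:j + half_win + 1].reshape(-1, d)
            out[i, j] = knn_kld(a, b, k) + knn_kld(b, a, k)  # symmetrised divergence
    return out
```

High values in the returned map flag pixels whose local feature distributions differ strongly between the pre- and post-event images; a threshold on this map would yield a binary change mask.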

Cited by 10 publications (3 citation statements)
References 22 publications
“…If the two probability density functions are close to each other, the Kullback-Leibler divergence is small; it is larger when there is a great deviation between them [19]. The KLD is always non-negative and equals zero only if the two distributions are identical.…”
Section: Applying Kullback-Leibler Divergence
confidence: 98%
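The two properties quoted above (non-negativity, and zero only for identical distributions) are easy to verify numerically. The snippet below is illustrative only and is not taken from the article or the citing paper; it uses a plain discrete KLD.

```python
# Numerical illustration of the quoted KLD properties (discrete case).
import numpy as np

def kld(p, q):
    """Discrete Kullback-Leibler divergence D(p || q) in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                      # 0 * log(0/q) = 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.5, 0.3, 0.2])
print(kld(p, p))                      # 0.0  -> identical distributions
print(kld(p, [0.45, 0.35, 0.2]))      # ~0.006, small -> close distributions
print(kld(p, [0.05, 0.05, 0.9]))      # ~1.39, larger -> strong deviation
```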
“…The Kullback-Leibler divergence introduced in [1] is used to quantify the similarity of two probability measures. It plays an important role in various domains such as statistical inference (see, e.g., [2,3]), metric learning [4,5], machine learning [6,7], computer vision [8,9], network security [10], feature selection and classification [11][12][13], physics [14], biology [15], medicine [16,17], and finance [18], among others. It is worth emphasizing that mutual information, widely used in many research directions (see, e.g., [19][20][21][22][23]), is a special case of the Kullback-Leibler divergence for certain measures.…”
Section: Introduction
confidence: 99%
“…The Kullback-Leibler divergence plays an important role in various domains such as statistical inference (see, e.g., [25], [28]), machine learning ([5], [32]), computer vision ([11], [13]), network security ([23], [44]), feature selection and classification ([22], [29], [41]), physics ([17]), biology ([9]), and finance ([45]), among others. Recall that this divergence measure between probabilities P and Q on a space (S, B) is defined by way of…”
Section: Introduction
confidence: 99%
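The quoted sentence is truncated before the formula itself. For reference, the standard definition of the divergence between probability measures P and Q on a measurable space (S, B), when P is absolutely continuous with respect to Q, reads as follows; this is the textbook form and not necessarily the citing paper's exact notation.

```latex
D(P \,\|\, Q) \;=\; \int_{S} \log\!\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right)\,\mathrm{d}P,
\qquad
D(P \,\|\, Q) \;=\; +\infty \quad \text{if } P \not\ll Q .
```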