2010 IEEE International Geoscience and Remote Sensing Symposium
DOI: 10.1109/igarss.2010.5653437

Aspects of multivariate statistical theory with the application to change detection

Abstract: This paper proposes a new method for change detection that covers all SAR imaging modes, such as PolInSAR, partial PolInSAR, and InSAR, in a set of multi-temporal multidimensional SAR images. The method is based on a special case of the Kullback-Leibler divergence (KL-divergence) test known as mutual information. To develop the algorithm, the joint distribution of the PolInSAR data set is first derived from second-order statistics. Such a derivation accounts for the whole multi-temporal SAR i…
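The abstract describes a mutual-information change statistic (a special case of the KL divergence) built from the second-order statistics of the joint multi-temporal data. As a reading aid only, the following is a minimal sketch of such a statistic for two co-registered acquisitions under a circular complex Gaussian model; the function name, the per-pixel windowed interface, and the use of plain sample covariances are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def mutual_information_change_stat(s1, s2):
    """Mutual-information change statistic for one pixel's estimation window.

    s1, s2 : complex arrays of shape (n_looks, p) holding the scattering
    vectors of the same window at the two acquisition times (the windowed
    interface and the names here are illustrative, not the paper's notation).
    """
    n, p = s1.shape
    z = np.concatenate([s1, s2], axis=1)      # joint 2p-dimensional vector per look
    C = z.conj().T @ z / n                    # joint sample covariance (2p x 2p)
    C11, C22 = C[:p, :p], C[p:, p:]           # marginal covariances of each date
    # For jointly circular complex Gaussian vectors,
    #   I(s1; s2) = ln( det(C11) * det(C22) / det(C) ) >= 0.
    _, logdet_C = np.linalg.slogdet(C)
    _, logdet_C11 = np.linalg.slogdet(C11)
    _, logdet_C22 = np.linalg.slogdet(C22)
    return logdet_C11 + logdet_C22 - logdet_C
```

A small value indicates weak statistical dependence (decorrelation) between the two dates, which a mutual-information framework of this kind would typically flag as a candidate change; thresholding and the extension to longer time series are left out of this sketch.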

Cited by 4 publications (11 citation statements).
References 9 publications.
“…) denotes the composite function, and (v, w) denotes a pair of random variables. [29] has shown that the optimization problem (20) includes many important applications in statistical learning and finance, such as reinforcement learning, statistical estimation, dynamic programming, and portfolio management. Let us consider the following version of the Stochastic Composite Gradient Descent (SCGD) algorithm in [29, Algorithm 1], whose iteration takes the form…”
Section: 2 (mentioning)
Confidence: 99%
“…random vectors following some distribution D over the parameter space; f_{v_k}: R^m → R and g_{w_k}: R^n → R^m are functions indexed by the aforementioned random vectors; the vector ∇f_{v_k}(y_{k+1}) is the gradient column m-vector of f_{v_k} evaluated at y_{k+1}, and the matrix ∇g_{w_k}(x_k) is the n×m matrix formed by the gradient column n-vector of each of the m components of g_{w_k} evaluated at x_k. The SCGD algorithm (21) is a provably effective method for solving (20); see the early optimization literature for convergence and rate-of-convergence analysis in [8, 29]. However, the convergence rate of the SCGD algorithm and its variations is not known to be comparable to that of its SGD counterpart [29, 30].…”
Section: 2 (mentioning)
Confidence: 99%
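The quoted iteration itself is elided above. As a reading aid, here is a minimal sketch of the basic two-timescale SCGD update described in the surrounding text: an auxiliary variable tracks the inner expectation, followed by a chain-rule stochastic gradient step. The function names, argument interface, and step-size handling are assumptions for illustration, not the exact Algorithm 1 of [29].

```python
import numpy as np

def scgd(sample_g, sample_g_jac, sample_f_grad, x0, y0, alphas, betas, n_iter):
    """Basic two-timescale SCGD iteration for min_x E_v[ f_v( E_w[ g_w(x) ] ) ].

    sample_g(x)      -> g_w(x) for a fresh sample w        (m-vector)
    sample_g_jac(x)  -> grad g_w(x) for a fresh sample w   (n x m matrix)
    sample_f_grad(y) -> grad f_v(y) for a fresh sample v   (m-vector)
    alphas, betas    -> step-size sequences (assumed given)
    """
    x = np.asarray(x0, dtype=float)
    y = np.asarray(y0, dtype=float)
    for k in range(n_iter):
        # Auxiliary variable tracks the inner expectation E_w[ g_w(x) ].
        y = (1.0 - betas[k]) * y + betas[k] * sample_g(x)
        # Chain-rule stochastic gradient step on the outer variable.
        x = x - alphas[k] * sample_g_jac(x) @ sample_f_grad(y)
    return x
```

The two step-size sequences are what make the method "two-timescale": the auxiliary variable is updated faster than the decision variable so that y stays a reasonable estimate of the inner expectation along the iterate path.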
“…Erten et al. presented a test which allows the compared matrices to be statistically dependent [5]. Other methods by Erten et al., based on information theory, followed in [6, 7]. Marino et al. proposed a test which is reported to suppress intensity information and to perform well in detecting changes in the internal structure of the covariance matrix.…”
Section: Introduction (mentioning)
Confidence: 98%
“…This group is minto. Thus for any optimal estimation of we can use only [43], [44]. Example 5: This example presents an IFD which is not minimally invariant.…”
Citation type: mentioning
Confidence: 98%