2011
DOI: 10.1007/978-3-642-24471-1_4
On a Non-monotonicity Effect of Similarity Measures

Cited by 7 publications (3 citation statements)
References 13 publications
“…The interesting point about this is that, based on Weyl's discrepancy concept, distance measures can be constructed that guarantee the desirable registration properties: (R1) the measure vanishes if and only if the lag vanishes, (R2) the measure increases monotonically with an increasing lag, and (R3) the measure obeys a Lipschitz condition that guarantees smooth changes also for patterns with high frequencies. As proven in [26], properties (R1)-(R3) are not satisfied simultaneously by measures commonly used in this context such as mutual information, the Kullback-Leibler distance or the Jensen-Rényi divergence measure, which are special variants of f-divergence and f-information measures, see, e.g., [4,20,29,30,37]; nor are they satisfied by the standard measures based on p-norms or by the widely used correlation measures due to Pearson or Spearman, see [8,17,34].…”
mentioning
confidence: 77%
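The discrepancy norm referred to in this statement can be sketched in a few lines. This is a minimal illustration, assuming the standard prefix-sum formulation of Weyl's discrepancy (the largest absolute partial sum over all index intervals); the function name is my own, not from the cited papers:

```python
import numpy as np

def discrepancy_norm(x):
    """Weyl-style discrepancy norm of a sequence: the largest absolute
    partial sum over all index intervals,

        D(x) = max_{a <= b} | sum_{i=a..b} x_i |,

    which equals max(P) - min(P) over the prefix sums P
    (with the empty prefix 0 included)."""
    p = np.concatenate(([0.0], np.cumsum(np.asarray(x, dtype=float))))
    return float(p.max() - p.min())

# A rapidly oscillating pattern has a small discrepancy norm even though
# its 1-norm is large: the partial sums cancel instead of accumulating.
print(discrepancy_norm([1, -1, 1, -1]))  # 1.0 (vs. 1-norm 4)
```

The prefix-sum reduction is what makes the norm cheap to evaluate (linear time), which matters when it is used inside a registration cost function.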
“…Such problems are encountered as autocorrelation in signal processing, see [5]. In computer vision they arise particularly in stereo matching as a point correspondence problem, see, e.g., [32] and [26], in template matching, e.g., for the purpose of print inspection, see, e.g., [7] and [23], in superpixel matching [16], and in defect detection in textured images, see [6,25,35]. In these cases, for high-frequency patterns, the discrepancy norm leads to cost functions with fewer local extrema and a more distinctive region of convergence in the neighborhood of the global minimum than commonly used (dis-)similarity measures.…”
mentioning
confidence: 99%
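The claimed behaviour (fewer local extrema and a distinctive basin around the global minimum for high-frequency patterns) can be illustrated with a toy lag search. The tiny test signal and helper below are my own illustration under the prefix-sum definition of the discrepancy norm, not material from the cited papers:

```python
import numpy as np

def discrepancy_norm(x):
    # Max absolute partial sum over all intervals, via prefix sums.
    p = np.concatenate(([0.0], np.cumsum(np.asarray(x, dtype=float))))
    return float(p.max() - p.min())

# A high-frequency, nonnegative template embedded at lag 0 in a longer signal.
pattern = np.array([1.0, 0.0, 1.0, 0.0, 1.0])
signal = np.concatenate([pattern, np.zeros(4)])

n = len(pattern)
lags = range(len(signal) - n + 1)

# Matching cost as a function of the lag, under two measures.
cost_dn = [discrepancy_norm(signal[l:l + n] - pattern) for l in lags]
cost_l2 = [float(np.sum((signal[l:l + n] - pattern) ** 2)) for l in lags]

print(cost_dn)  # [0.0, 1.0, 1.0, 2.0, 2.0] -- nondecreasing away from the match
print(cost_l2)  # [0.0, 5.0, 1.0, 4.0, 2.0] -- oscillates, with a spurious local minimum at lag 2
```

Here the discrepancy-norm cost grows monotonically away from the true lag, while the squared Euclidean cost oscillates with the pattern's frequency, creating the extra local minima that trap gradient-based registration.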
“…Furthermore, it is more robust to Gaussian noise than traditional measures; for examples see Moser et al. [21].…”
Section: Discrepancy Norm for Robust Periodicity Estimation
mentioning
confidence: 98%