2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition
DOI: 10.1109/cvpr.2010.5539979
Total Bregman divergence and its applications to shape retrieval

Abstract: Shape database search is ubiquitous in the world of biometric systems, CAD systems, etc. Shape data in these domains is experiencing explosive growth and usually requires searching whole shape databases to retrieve the best matches accurately and efficiently for a variety of tasks. In this paper, we present a novel divergence measure between any two given points in ℝ^n or two distribution functions. This divergence measures the orthogonal distance between the tangent to the convex function (use…
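The divergence described in the abstract can be made concrete. The snippet below is a minimal sketch (not the paper's reference implementation) of the total Bregman divergence for a differentiable convex function f: the ordinary Bregman divergence is the vertical gap between f(x) and the tangent to f at y, while the total version divides by sqrt(1 + ||∇f(y)||²), turning that vertical gap into the orthogonal distance to the tangent.

```python
import numpy as np

def bregman(x, y, f, grad_f):
    """Ordinary Bregman divergence d_f(x, y) = f(x) - f(y) - <x - y, grad f(y)>."""
    return f(x) - f(y) - np.dot(x - y, grad_f(y))

def total_bregman(x, y, f, grad_f):
    """Total Bregman divergence: the Bregman divergence rescaled by
    1 / sqrt(1 + ||grad f(y)||^2), i.e. the orthogonal (rather than
    vertical) distance between f(x) and the tangent to f at y."""
    g = grad_f(y)
    return bregman(x, y, f, grad_f) / np.sqrt(1.0 + np.dot(g, g))

# Example with the squared Euclidean norm f(v) = ||v||^2 (convex, smooth).
f = lambda v: np.dot(v, v)
grad_f = lambda v: 2.0 * v

x = np.array([1.0, 2.0])
y = np.array([0.0, 1.0])
# For f = ||.||^2 the ordinary Bregman divergence reduces to ||x - y||^2.
```

Because the denominator depends only on the gradient at y, the total divergence down-weights points where the convex generator is steep, which is what gives the measure its robustness in the shape-retrieval setting.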

Cited by 37 publications (38 citation statements)
References 17 publications
“…To deal with these circumstances, S. Yin et al. [3] have proposed a robust one-class SVM technique that modifies the penalty factor 1/(νN) of the traditional OCSVM; the modified factor depends on the maximum distance between a point x_i and the centre of the dataset x_c. The centre x_c is calculated by the tSL-centre technique [6], [7], which is robust to noise and outliers, and it is expressed as…”
Section: B. Robust One-Class SVM
confidence: 99%
“…According to (6), the influence of outliers on the formation of the decision boundary is reduced by applying a small penalty factor to samples that are far from the centre, so that outliers are forced to fall outside the boundary.…”
Section: B. Robust One-Class SVM
confidence: 99%
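The distance-based penalty idea in the quote above can be sketched as follows. This is an illustrative weighting only — the exact formula of Yin et al. [3] is not given in the excerpt — assuming penalties that decay linearly with distance from the centre x_c, so the farthest samples receive near-zero penalty and are pushed outside the boundary.

```python
import numpy as np

def distance_penalties(X, centre, base=1.0):
    """Illustrative per-sample penalty weights for a robust one-class SVM:
    samples far from the centre receive a smaller penalty, so outliers
    contribute less to the decision boundary. The linear decay used here
    is an assumption for illustration, not the formula from Yin et al. [3]."""
    d = np.linalg.norm(X - centre, axis=1)   # distance of each sample to the centre
    return base * (1.0 - d / d.max())        # farthest sample gets zero penalty

X = np.array([[0.0, 0.0], [1.0, 0.0], [3.0, 0.0]])
centre = np.array([0.0, 0.0])               # in practice, the robust tSL-centre
w = distance_penalties(X, centre)
```

In the cited work the centre itself is estimated robustly (the tSL-centre of [6], [7]), which matters: a mean contaminated by outliers would shift toward them and undo the down-weighting.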
“…The aim of a similarity measure s(S_i, S_j) is to reliably define the similarity between the two instances S_i and S_j. In previous works, various measures such as [13,12,24,15,14] have been proposed for shapes. However, most of these only consider the similarity in shape-based features of a pair of shape instances while calculating s(S_i, S_j), which increases the chance that two shape instances will incorrectly be characterized as very similar (or dissimilar).…”
Section: Population Cues
confidence: 99%
“…Specifically, Ling and Jacobs [13] introduce the Inner-Distance Shape Context (IDSC), which further improves the widely known Shape Context descriptor [5], while [14] introduces a framework called Aspect Space that aims to balance deformability and discriminability when describing a shape. Further, Liu et al. [15] apply the Total Bregman Divergence to characterize the similarity between two shapes and classify them into groups of similar shapes, while Lin et al. [12] introduce the Layered Graph Matching algorithm. In addition to these measures, Temlyakov et al. [23] present the "strand" and "stretch" strategies (S&S), which simplify the geometry of a given shape to improve the performance of these shape descriptors and similarity measures.…”
Section: Introduction
confidence: 99%