2012
DOI: 10.1007/978-3-642-34778-8_5

Robust Feature Bundling

Abstract: In this work we present a feature bundling technique that aggregates individual local features with features from their spatial neighborhood into bundles. The resulting bundles carry more information about the underlying image content than single visual words. As an exact search for such bundles is infeasible in practice, we employ a robust approximate similarity search with min-hashing in order to retrieve images containing similar bundles. We demonstrate the benefits of these bundles for small object …

Cited by 9 publications (6 citation statements) · References 10 publications
“…The authors show that PmH is significantly faster than mH and has the same worst-case collision probabilities for sketches as mH, but theoretically better recall and precision when the duplicate image region covers only a small area. However, in [15] we observed that PmH performs worse than mH on the logo dataset.…”
Section: Geometric Min-hashing (GmH)
confidence: 67%
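The collision behavior of min-hash sketches that this statement refers to can be sketched generically as follows. This is a minimal illustration of min-hashing over sets of visual word ids, not the paper's mH/PmH implementation; the hash family and its parameters are illustrative assumptions:

```python
import random

def minhash_signature(word_ids, num_hashes=64, seed=0):
    """Min-hash signature of a set of visual word ids. Each hash is a
    random affine map (a*x + b) mod p; two sets agree on one hash
    value with probability equal to their Jaccard similarity."""
    rng = random.Random(seed)
    p = 2**31 - 1  # large prime for the hash family
    params = [(rng.randrange(1, p), rng.randrange(p)) for _ in range(num_hashes)]
    return [min((a * w + b) % p for w in word_ids) for a, b in params]

def collision_rate(sig_a, sig_b):
    """Fraction of agreeing hash values; estimates Jaccard similarity."""
    return sum(x == y for x, y in zip(sig_a, sig_b)) / len(sig_a)
```

Identical sets collide on every hash; for partially overlapping sets the estimate approaches the true Jaccard similarity as `num_hashes` grows.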
“…In [15] we have already shown that IDF-weighting is always beneficial in the bag-of-words framework, even for large vocabularies greater than 1 million words. Thus tf-idf weighting was used in all cases.…”
Section: Bag-of-words As Baseline
confidence: 93%
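The tf-idf weighting referred to above can be sketched for a bag-of-visual-words setting. This is a minimal sketch using the standard idf = log(N/df) formulation; the citing papers' exact weighting variant is not specified here:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """docs: list of documents, each a list of visual word ids.
    Returns one sparse tf-idf vector per document as {word: weight}."""
    n = len(docs)
    df = Counter(w for doc in docs for w in set(doc))  # document frequency
    vecs = []
    for doc in docs:
        tf = Counter(doc)  # term frequency within the document
        vecs.append({w: c * math.log(n / df[w]) for w, c in tf.items()})
    return vecs
```

A visual word occurring in every image gets idf log(1) = 0, so it contributes no discriminative weight, which is why idf-weighting helps even for very large vocabularies.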
“…3. In [22] we have already shown that IDF-weighting is always beneficial in the bagof-words framework, even for large vocabularies greater than 1 million words. Thus, tf-idf weighting was used in all cases.…”
Section: Discussion
confidence: 92%
“…Inspired by this observation we present a feature bundling technique [22] that builds on visual words. It does not describe each visual word individually but rather aggregates spatial neighboring visual words into feature bundles.…”
confidence: 99%
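The aggregation idea described in this statement can be sketched as follows. The neighborhood radius and neighbor count are hypothetical parameters for illustration; the paper's bundling uses its own neighborhood definition:

```python
def bundle_features(features, radius=50.0, max_neighbors=4):
    """features: list of (x, y, word_id) tuples. For each local feature,
    aggregate its own visual word with the words of its nearest spatial
    neighbors into a bundle (an unordered set of word ids)."""
    bundles = []
    for i, (x, y, w) in enumerate(features):
        # all other features, sorted by spatial distance to feature i
        neighbors = sorted(
            (((x - x2) ** 2 + (y - y2) ** 2) ** 0.5, w2)
            for j, (x2, y2, w2) in enumerate(features) if j != i
        )
        nearby = {w2 for d, w2 in neighbors[:max_neighbors] if d <= radius}
        bundles.append(frozenset({w} | nearby))
    return bundles
```

Because a bundle is a set of words rather than a single word, it is more distinctive than any of its members alone, which is what makes the approximate min-hash search over bundles worthwhile.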