2010
DOI: 10.1007/978-3-642-15555-0_51
Object Classification Using Heterogeneous Co-occurrence Features

Cited by 18 publications (22 citation statements)
References 14 publications

“…The performance of our approach on this dataset (see Table 1) is 80.66%, which outperforms all previously known methods in the literature (some by as much as 4 to 8%) [3,12,17,18]. One important thing to note is that the improvement of our algorithm over our baseline is about 4%, and the only difference between the two is the addition of the proposed segmentation algorithm and the features extracted from the segmented image.…”
Section: Oxford 102 Flower Species Dataset (mentioning)
confidence: 66%
“…Table 1. Classification performance on the Oxford 102 flower dataset.

    Method                               Accuracy (%)
    Our baseline (no segmentation)       76.7
    Nilsback and Zisserman [17]          72.8
    Ito and Kubota [12]                  74.8
    Nilsback and Zisserman [18]          76.3
    Chai, Bicos method [3]               79.4
    Chai, BicosMT method [3]             80.0
    Ours                                 80.66
    Ours: improvement over our baseline  +3.94
…”
Section: Methods (mentioning)
confidence: 99%
“…Although the extracted features, such as color, texture, shape, or motion features, can be quite weak individually, an appropriate combination of them will bring a strong feature which is much more discriminative [31] [29] [3] [16] [2] [9]. There has been a recent trend in mining co-occurrence patterns for visual recognition.…”
Section: Introduction (mentioning)
confidence: 99%
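The claim quoted above, that individually weak cues become much more discriminative when combined, can be made concrete with a small sketch. The toy dataset, the cue names ("reddish"/"roundish"), and the simple AND-style combination below are invented purely for illustration; they are not the method of the reviewed paper or of any citing work.

    # Minimal sketch (not the paper's method): two weak binary cues and their
    # co-occurrence (logical AND) evaluated on a tiny hand-made toy dataset.
    # Samples, labels, and cue names are all invented for illustration.

    samples = [
        # (is_reddish, is_roundish, label); label 1 = target object, 0 = clutter
        (1, 1, 1), (1, 1, 1), (1, 1, 1),   # positives: both cues present
        (1, 0, 0), (1, 0, 0),              # reddish but not round
        (0, 1, 0), (0, 1, 0),              # round but not reddish
        (0, 0, 0),                         # neither cue
    ]

    def accuracy(predict):
        """Fraction of samples whose predicted label matches the true label."""
        return sum(predict(r, s) == y for r, s, y in samples) / len(samples)

    # Each weak cue used alone as a classifier.
    acc_red   = accuracy(lambda r, s: r)
    acc_round = accuracy(lambda r, s: s)

    # Co-occurrence feature: both cues must fire together.
    acc_cooc  = accuracy(lambda r, s: r & s)

    print(f"reddish alone      : {acc_red:.2f}")    # 0.75
    print(f"roundish alone     : {acc_round:.2f}")  # 0.75
    print(f"co-occurrence (AND): {acc_cooc:.2f}")   # 1.00

On this toy data each cue alone gets 75% of the samples right, while their co-occurrence separates the two classes perfectly, which is the intuition behind mining co-occurrence patterns.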
“…In spite of many previous works in mining and integrating co-occurrence patterns [22] [20] [3] [29] [33] [16] [24] [9] [27], none of these methods is targeted at finding the most discriminative co-occurrence pattern with the smallest classification error. Given N binary features, because the co-occurrence pattern can contain an arbitrary number of features (up to N ), the total number of candidates of co-occurrence patterns is exponentially large (e.g.…”
Section: Introduction (mentioning)
confidence: 99%
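The combinatorial point in this last excerpt is easy to verify: if a co-occurrence pattern may involve any non-empty subset of N binary features, there are 2^N - 1 candidate patterns, so exhaustive enumeration is hopeless for realistic N. The snippet below is an illustration only, not code from any of the cited papers; it enumerates the candidates for a small N and prints how the count grows.

    from itertools import combinations

    def candidate_patterns(features):
        """Enumerate every non-empty subset of the given binary features;
        each subset is one candidate co-occurrence pattern."""
        n = len(features)
        for size in range(1, n + 1):
            yield from combinations(features, size)

    features = ["f1", "f2", "f3", "f4"]
    patterns = list(candidate_patterns(features))
    print(len(patterns))   # 15 == 2**4 - 1
    print(patterns[:5])    # [('f1',), ('f2',), ('f3',), ('f4',), ('f1', 'f2')]

    # The candidate count grows exponentially with the number of base features N:
    for n in (10, 20, 30, 100):
        print(n, 2**n - 1)  # e.g. N=100 -> 1267650600228229401496703205375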