2021
DOI: 10.3390/s21020596
Revisiting the CompCars Dataset for Hierarchical Car Classification: New Annotations, Experiments, and Results

Abstract: We address the task of classifying car images at multiple levels of detail, ranging from the top-level car type, down to the specific car make, model, and year. We analyze existing datasets for car classification, and identify the CompCars as an excellent starting point for our task. We show that convolutional neural networks achieve an accuracy above 90% on the finest-level classification task. This high performance, however, is scarcely representative of real-world situations, as it is evaluated on a biased …

Cited by 12 publications (6 citation statements); References 24 publications.
“…• SoftMax: utilized for multi-class classification; sometimes called the normalized exponential function [10]. Normalizes the input into a probability distribution that sums to 1.…”
Section: Theoretical Background 1 — Sigmoid and Softmax Activation Funct...
confidence: 99%
“…Considering hierarchical class taxonomies in the classification problem has been widely studied in the literature [20][21][22][23][24]. The problem of hierarchical novelty detection actually comprises hierarchical classification of the known classes.…”
Section: Hierarchical Classification
confidence: 99%
“…With CoLabel-SMBL, we could reduce model parameters, since we use a single branch. We compare CoLabel-SMBL against CoLabel and HML [5] in Table 8. CoLabel-SMBL sacrifices accuracy with the reduced parameters.…”
Section: Collaborative Learning
confidence: 99%