2023
DOI: 10.1109/tnnls.2021.3089566
Discriminative Fisher Embedding Dictionary Transfer Learning for Object Recognition

Cited by 13 publications (6 citation statements)
References 50 publications
“…Automating image annotation, or at least offering automated assistance in this process, can speed up the work and increase its accuracy and consistency [20]. The application of techniques designed to maximize learning from limited data such as transfer learning [21,22,23,24,25], data augmentation [26,27,28,29] and few-shot learning [30,31,32,33,34] complements this move towards automation. While these methods are valuable for training robust models with sparse annotated datasets, the ultimate goal remains to minimize their need by improving the automation of the annotation process itself.…”
Section: A. Motivation (mentioning)
confidence: 99%
“…Supervised hashing-based multi-modal methods such as semantic correlation maximization (SCM) [16,17] learn hash codes directly from semantic label similarity. The semantic preserving hashing (SePH) technique [5] narrows the semantic gap by constructing a probabilistic affinity matrix and minimizing the KL-divergence between it and the distribution induced by the hash codes. However, owing to its memory overhead and increased computational complexity, SePH cannot handle large-scale datasets.…”
Section: Related Work (mentioning)
confidence: 99%
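The KL-divergence objective mentioned in the statement above can be sketched in a few lines. This is an illustrative toy, not the authors' or SePH's actual code: the affinity construction, toy labels, and the Hamming-based code distribution are all assumptions made for demonstration.

```python
import numpy as np

def affinity_to_prob(sim):
    """Normalize a pairwise similarity matrix into a probability matrix."""
    np.fill_diagonal(sim, 0.0)  # ignore self-similarity
    return sim / sim.sum()

rng = np.random.default_rng(0)
labels = rng.integers(0, 3, size=8)                      # toy class labels
S = (labels[:, None] == labels[None, :]).astype(float)   # semantic similarity
P = affinity_to_prob(S.copy())                           # probabilistic affinity matrix

codes = rng.choice([-1.0, 1.0], size=(8, 16))            # toy 16-bit hash codes
hamming = (16 - codes @ codes.T) / 2                     # pairwise Hamming distances
Q = affinity_to_prob(np.exp(-hamming))                   # closer codes -> higher probability

eps = 1e-12
kl = np.sum(P * np.log((P + eps) / (Q + eps)))           # KL(P || Q); lower means
print(round(float(kl), 4))                               # codes preserve semantics better
```

In SePH this divergence is minimized over the hash codes themselves; here it only scores a fixed random code matrix against the semantic affinities.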
“…Dictionary learning attempts to learn the most basic features as a sparse representation that preserves the information comprehensively. Owing to this ability to distill essential data features into a sparse representation, numerous dictionary-learning strategies [4][5][6] have recently been proposed in the literature. However, technical hurdles remain in leveraging dictionary learning for more accurate and efficient hashing-based cross-modal retrieval.…”
Section: Introduction (mentioning)
confidence: 99%
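The sparse-representation idea in the statement above can be illustrated with a minimal sparse-coding sketch: a signal is approximated as a sparse combination of dictionary atoms. The solver (plain ISTA), the dictionary sizes, and the toy signal are assumptions for illustration, not any cited method.

```python
import numpy as np

def ista(D, x, lam=0.1, n_iter=200):
    """min_a 0.5*||x - D a||^2 + lam*||a||_1 via iterative soft-thresholding."""
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)
        a = a - grad / L
        a = np.sign(a) * np.maximum(np.abs(a) - lam / L, 0.0)  # soft threshold
    return a

rng = np.random.default_rng(1)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)               # unit-norm dictionary atoms
true_a = np.zeros(50)
true_a[[3, 17, 41]] = [1.0, -2.0, 1.5]       # a 3-sparse ground-truth code
x = D @ true_a                               # observed signal
a = ista(D, x, lam=0.05)
print(int(np.sum(np.abs(a) > 0.1)))          # number of significant coefficients
```

Full dictionary learning alternates this sparse-coding step with updates to the dictionary `D` itself; only the coding step is shown here.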
“…Combining these strengths holds significant potential. For instance, hybrid quantum-classical convolutional neural networks have demonstrated better performance compared to classical networks with the same architecture [14,15]. This is because generalized feature maps with variational quantum circuits can explore correlations between adjacent data points in exponentially large linear spaces, potentially allowing our algorithms to more accurately capture patterns within datasets using quantum computers.…”
Section: Introduction (mentioning)
confidence: 99%
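The quantum feature maps mentioned above can be simulated classically at toy scale. The sketch below, entirely an illustrative assumption rather than the cited hybrid architecture, encodes a scalar x as the single-qubit state RY(x)|0⟩ and uses the state fidelity as a kernel between data points.

```python
import numpy as np

def ry_state(theta):
    """State RY(theta)|0> = [cos(theta/2), sin(theta/2)] (real amplitudes)."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def quantum_kernel(x, y):
    """Fidelity |<psi(x)|psi(y)>|^2 between the two encoded states."""
    return float(np.dot(ry_state(x), ry_state(y)) ** 2)

print(round(quantum_kernel(0.3, 0.3), 4))    # identical inputs -> 1.0
print(round(quantum_kernel(0.0, np.pi), 4))  # orthogonal states -> 0.0
```

Real hybrid models replace this single rotation with multi-qubit variational circuits, whose state space grows exponentially with qubit count; that is where the claimed expressivity advantage comes from.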