2021
DOI: 10.1007/978-3-030-69535-4_6

Any-Shot Object Detection

Cited by 15 publications (12 citation statements) · References 36 publications
“…Learning without forgetting using word vectors: word vectors have shown promising success on various computer vision tasks such as zero-shot learning, few-shot learning, image/video captioning, and visual question answering [15,29,39,43,9,46,47,6,5,7]. Lately, some works [25,8,49] have shown that word vectors can likewise be beneficial for learning without forgetting. Rahman et al. [25] used semantic word vectors in the any-shot object detection problem to detect both unseen and few-shot objects simultaneously.…”
Section: Related Work
confidence: 99%
“…Lately, some works [25,8,49] have shown that word vectors can likewise be beneficial for learning without forgetting. Rahman et al. [25] used semantic word vectors in the any-shot object detection problem to detect both unseen and few-shot objects simultaneously. Word vectors helped to reduce the forgetting of seen classes during fine-tuning.…”
Section: Related Work
confidence: 99%
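The statement above describes scoring both seen and unseen (or few-shot) classes against fixed semantic word vectors. A minimal NumPy sketch of that idea, using hypothetical shapes and a plain cosine-similarity head rather than any specific paper's architecture:

```python
import numpy as np

def semantic_logits(features, word_vecs):
    """Score visual features against fixed class word vectors.

    features:  (n, d) visual embeddings projected into word-vector space
    word_vecs: (c, d) one word vector per class (seen, few-shot, or unseen)

    Returns an (n, c) logit matrix. Because every class is represented
    by its word vector rather than a learned weight row, unseen classes
    are scored by the same dot product with no extra training.
    """
    # L2-normalise both sides so each logit is a cosine similarity
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = word_vecs / np.linalg.norm(word_vecs, axis=1, keepdims=True)
    return f @ w.T

# Toy example: 2 region features scored against 3 classes in a 4-d space
rng = np.random.default_rng(0)
feats = rng.normal(size=(2, 4))
vecs = rng.normal(size=(3, 4))
logits = semantic_logits(feats, vecs)
print(logits.shape)  # (2, 3)
```

Keeping the classifier weights tied to word vectors is also what makes the head useful against forgetting: the semantic space is shared across tasks, so old classes remain addressable after new ones are added.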
“…Word embedding for catastrophic forgetting: the use of semantic representations to prevent catastrophic forgetting is relatively new [24,4,42]. Such approaches exploit the semantic relation between old and new classes to reduce the forgetting of old classes while training on new ones.…”
Section: Related Work
confidence: 99%
“…Because of the difficulties of 3D data, this approach exhibits a large amount of forgetting of old classes. To minimize forgetting, we employ semantic word vectors of classes inside the network pipeline [24,4,42]. During both new- and old-task training, the network tries to align point cloud features with their corresponding semantics.…”
Section: Introduction
confidence: 99%
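The alignment described above can be sketched as a simple auxiliary loss that pulls each point-cloud feature toward the word vector of its class. This is an illustrative NumPy version under assumed shapes, not the cited paper's exact formulation:

```python
import numpy as np

def alignment_loss(features, word_vecs, labels):
    """Mean (1 - cosine similarity) between each feature and the word
    vector of its ground-truth class.

    features:  (n, d) point-cloud features
    word_vecs: (c, d) fixed class word vectors
    labels:    (n,)   class index per feature

    Minimising this during both old- and new-task training anchors
    features to the shared semantic space, which mitigates forgetting.
    """
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = word_vecs / np.linalg.norm(word_vecs, axis=1, keepdims=True)
    cos = np.sum(f * w[labels], axis=1)
    return float(np.mean(1.0 - cos))

# Features pointing exactly along their class vectors give zero loss,
# regardless of magnitude (only direction matters after normalisation)
w = np.eye(3)
feats = np.eye(3) * 5.0
loss = alignment_loss(feats, w, np.array([0, 1, 2]))
print(loss)  # 0.0
```

In practice this loss would be added to the detector's usual classification and regression objectives with a weighting coefficient.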