2021
DOI: 10.1016/j.cviu.2021.103229
SID: Incremental learning for anchor-free object detection via Selective and Inter-related Distillation

Cited by 27 publications (22 citation statements). References 8 publications.
“…In addition, we also compared our method with LwF (Li & Hoiem, 2018), RILOD, and SID (Peng et al., 2021). Both Table 1 and Table 2 show that although LwF works well in incremental classification, it achieves even lower AP than direct fine-tuning in detection tasks.…”
Section: Experiments and Discussion
Citation type: mentioning; confidence: 99%
“…Thereafter, some researchers have moved this area forward. Peng et al. (2021) proposed the SID approach for incremental object detection on anchor-free detectors and conducted experiments on FCOS (Tian et al., 2019) and CenterNet (Zhou et al., 2019). Li et al. (2021a) studied class-incremental object detection on the Faster R-CNN detector with emphasis on few-shot scenarios, which is also the focus of the ONCE algorithm (Pérez-Rúa et al., 2020).…”
Section: Related Work
Citation type: mentioning; confidence: 99%
“…They additionally investigated the negative impact that having old-class objects within the new-class images has on the performance of the RPN and concluded that it was not significant, which explains why Faster-RCNN networks generalize better than solutions with external proposals. Peng et al. [91] presented the use of distillation not only on intermediate features, but also on the relations (distances) between features of different samples for anchor-free object detectors. Yang et al. [70] proposed preserving channel-wise, point-wise, and instance-wise correlations between some feature maps of the teacher and student networks in order to maintain performance on the old classes while optimizing for the new ones.…”
Section: Knowledge Distillation
Citation type: mentioning; confidence: 99%
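The statement above describes distilling not only intermediate features but also the relations (distances) between features of different samples. As a rough illustration of that idea, the PyTorch sketch below combines a per-sample feature term with an inter-sample relational term; the function names, pooling choice, and loss weights are illustrative assumptions, not taken from the SID implementation or any of the cited works.

```python
import torch
import torch.nn.functional as F


def feature_distillation_loss(student_feat, teacher_feat):
    """Plain intermediate-feature distillation: match student and teacher
    feature maps (e.g. one FPN level) with an L2 penalty."""
    return F.mse_loss(student_feat, teacher_feat)


def inter_related_distillation_loss(student_feat, teacher_feat):
    """Relational distillation: preserve the pairwise distances between the
    pooled features of different samples in a batch, so the student keeps
    the teacher's inter-sample structure, not only its per-sample values."""
    # Global-average-pool each (B, C, H, W) feature map to (B, C).
    s = student_feat.mean(dim=(2, 3))
    t = teacher_feat.mean(dim=(2, 3))
    # Pairwise Euclidean distance matrices of shape (B, B).
    s_dist = torch.cdist(s, s, p=2)
    t_dist = torch.cdist(t, t, p=2)
    # Normalise by the mean distance so the term is scale-invariant.
    s_dist = s_dist / (s_dist.mean() + 1e-8)
    t_dist = t_dist / (t_dist.mean() + 1e-8)
    return F.smooth_l1_loss(s_dist, t_dist)


def total_distillation_loss(student_feat, teacher_feat,
                            w_feat=1.0, w_rel=1.0):
    """Combine the per-sample and inter-sample terms; the weights w_feat
    and w_rel are hypothetical tuning knobs."""
    return (w_feat * feature_distillation_loss(student_feat, teacher_feat)
            + w_rel * inter_related_distillation_loss(student_feat,
                                                      teacher_feat))
```

In an incremental step, the teacher would be the frozen detector trained on the old classes and the student the network being fine-tuned on new classes, with a loss of this kind added to the standard detection losses.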
“…Knowledge Distillation: a strong baseline with caveats: It is easy to notice from Table 2 that most proposed strategies in the CIOD field use knowledge distillation as their primary mechanism to mitigate the effects of catastrophic forgetting. Comparing the results of the selected papers on the PASCAL VOC 2007 and MS COCO incremental benchmarks, considering the metrics that assess the stability-plasticity of solutions, the differences between a recently proposed distillation technique such as Peng et al. [91] and the first work of Shmelkov et al. [7] are subtle. This either means that researchers might have been overfitting their solutions to the benchmarks or that simple logits and bounding box distillation are a strong baseline.…”
Section: Trends and Research Directions
Citation type: mentioning; confidence: 99%
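The "simple logits and bounding box distillation" baseline referred to above constrains the new detector's old-class classification outputs and box regressions to stay close to those of the frozen teacher. The following is a minimal sketch in that spirit, assuming an (N, num_old_classes) logit layout and (N, 4) box layout; the mean-subtraction of logits follows the general recipe of Shmelkov et al., but the names and weights are assumptions rather than a reproduction of that method.

```python
import torch.nn.functional as F


def logits_and_box_distillation_loss(student_logits, student_boxes,
                                     teacher_logits, teacher_boxes,
                                     w_cls=1.0, w_box=1.0):
    """Baseline distillation: keep the student's old-class logits and box
    regressions close to the frozen teacher's on the same proposals.

    Assumed shapes: logits (N, num_old_classes), boxes (N, 4).
    """
    # Subtract each row's mean logit before comparing, so only the
    # relative scores over the old classes are constrained.
    s = student_logits - student_logits.mean(dim=1, keepdim=True)
    t = teacher_logits - teacher_logits.mean(dim=1, keepdim=True)
    cls_loss = F.mse_loss(s, t)

    # L2 penalty on the box regression outputs for the old classes.
    box_loss = F.mse_loss(student_boxes, teacher_boxes)

    return w_cls * cls_loss + w_box * box_loss
```

That such a term already performs close to more elaborate feature- and relation-level schemes on the standard benchmarks is precisely the observation made in the quoted passage.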