Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 2017
DOI: 10.1145/3097983.3098135
Learning from Multiple Teacher Networks

Cited by 313 publications (158 citation statements)
References 11 publications
“…Motivated by ensemble learning methods, You et al. [22] simultaneously utilized multiple teacher networks to learn a better student network. Moreover, several algorithms have been developed to investigate the constraints between the teacher and the student.…”
Section: Knowledge Distillation (mentioning)
confidence: 99%
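The statements above describe distilling from several teachers at once. As a rough illustration only, the following is a minimal NumPy sketch of Hinton-style distillation extended to multiple teachers by averaging their temperature-softened outputs; the cited paper additionally exploits intermediate-layer hints and a dissimilarity constraint among examples, which this sketch omits. All function names, the temperature `T`, and the mixing weight `alpha` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax, numerically stabilized.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multi_teacher_soft_targets(teacher_logits_list, T=4.0):
    # Average the softened class distributions of all teachers
    # (one simple way to combine an ensemble of teachers).
    probs = np.stack([softmax(l, T) for l in teacher_logits_list])
    return probs.mean(axis=0)

def distillation_loss(student_logits, teacher_logits_list, labels,
                      T=4.0, alpha=0.5):
    # Weighted sum of hard-label cross-entropy and KL divergence
    # from the averaged teacher targets to the student's softened output.
    p_student = softmax(student_logits, T)
    p_teacher = multi_teacher_soft_targets(teacher_logits_list, T)
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=-1)
    hard_probs = softmax(student_logits)[np.arange(len(labels)), labels]
    hard = -np.log(hard_probs + 1e-12)
    # T**2 rescaling keeps gradient magnitudes comparable across temperatures.
    return alpha * hard.mean() + (1 - alpha) * (T ** 2) * kl.mean()
```

In this framing, the student fits both the ground-truth labels and the consensus of the teacher ensemble; both terms are non-negative, so the combined loss is as well.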
See 1 more Smart Citation
“…Motivated by ensemble learning methods, You et.al. [22] simultaneously utilized multiple teacher networks to learn a better student network. Moreover, several algorithms have been developed to investigate the restriction between teacher and student.…”
Section: Knowledge Distillationmentioning
confidence: 99%
“…You et al. [22] simultaneously utilized multiple teacher networks for learning a more accurate student network. Zagoruyko et al.…”
Section: A Teacher-Student Interactions (mentioning)
confidence: 99%
“…Lee et al. [17] studied the performance of different ensemble methods under the framework of multi-task learning. You et al. [29] presented a method to train a thin deep network by incorporating knowledge from the intermediate layers and imposing a constraint on the dissimilarity among examples. Wu et al. [27] proposed a multi-teacher knowledge distillation framework to compress a model for compressed-video action recognition.…”
Section: Multi-task Learning (mentioning)
confidence: 99%
“…Transfer learning is proposed to transfer knowledge from a source domain to a target domain, reducing the data required in the target domain [24]. It contains two main research directions: cross-domain transfer learning [22,12,10,4] and cross-task transfer learning [9,3,5,35]. In cross-domain transfer learning, the source-domain and target-domain datasets differ in domain but share the same categories.…”
Section: Related Work (mentioning)
confidence: 99%