Interspeech 2019
DOI: 10.21437/interspeech.2019-1989

Acoustic Scene Classification Using Teacher-Student Learning with Soft-Labels

Abstract: Acoustic scene classification assigns an input audio segment to one of a set of pre-defined classes using spectral information. The spectral information of acoustic scenes may not be mutually exclusive, owing to acoustic properties shared across classes, such as the babble noise present in both airports and shopping malls. However, the conventional training procedure based on one-hot labels does not consider the similarities between different acoustic scenes. We exploit teacher-student learning to …
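To make the soft-label idea concrete: a teacher network's temperature-softened output distribution can supplement the one-hot scene labels, so acoustically similar scenes receive graded targets. The sketch below is a minimal PyTorch illustration under assumed hyperparameters; the temperature T, the weight alpha, and the exact loss composition are illustrative assumptions, not the paper's published setup.

```python
import torch.nn.functional as F

def soft_label_distillation_loss(student_logits, teacher_logits,
                                 hard_labels, T=2.0, alpha=0.5):
    """Blend hard-label cross-entropy with a soft-label term.

    The soft term is the KL divergence between the student's and
    teacher's temperature-softened class distributions, so scenes
    that share acoustic properties (e.g. airport and shopping mall)
    contribute graded rather than all-or-nothing targets.
    """
    # Standard cross-entropy against the one-hot scene labels.
    hard_loss = F.cross_entropy(student_logits, hard_labels)

    # KL divergence between softened distributions; the T**2 factor
    # keeps the soft-term gradients on a comparable scale across
    # temperatures.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T ** 2)

    return alpha * hard_loss + (1.0 - alpha) * soft_loss
```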

Cited by 13 publications (15 citation statements)
References 13 publications
“…Knowledge distillation was first devised in two studies conducted by Hinton et al. and Li et al. [25], [26] and was introduced for the ASC task in [24]. Although the two papers convey similar methodologies, Hinton et al. focused on the transfer of knowledge between DNNs, which is why the approach is referred to as knowledge distillation.…”
Section: A. Overview (mentioning)
Confidence: 99%
“…The KD framework itself has been validated to be effective for different input features (e.g., Mel-spectrogram) in previous works [24], [27]. Table 3 reports comparison results for different compositions of the loss functions used in knowledge distillation.…”
Section: B. Experimental Configurations (mentioning)
Confidence: 99%
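The loss-composition comparison described in this citation can be pictured as sweeping the weighting between the hard and soft terms in the sketch above; the variants and weights below are illustrative assumptions, not the configurations of the cited Table 3.

```python
# Hypothetical loss compositions to compare (reusing the sketch above);
# alpha=1.0 keeps only the hard one-hot term, alpha=0.0 only the soft
# teacher term, and intermediate values blend the two.
compositions = {
    "hard_only": lambda s, t, y: soft_label_distillation_loss(s, t, y, alpha=1.0),
    "soft_only": lambda s, t, y: soft_label_distillation_loss(s, t, y, alpha=0.0),
    "hard+soft": lambda s, t, y: soft_label_distillation_loss(s, t, y, alpha=0.5),
}
```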