2023 31st European Signal Processing Conference (EUSIPCO)
DOI: 10.23919/eusipco58844.2023.10289800
On Data Sampling Strategies for Training Neural Network Speech Separation Models

William Ravenscroft, Stefan Goetze, Thomas Hain

Abstract: This is a repository copy of "On Data Sampling Strategies for Training Neural Network Speech Separation Models."

Cited by 3 publications (2 citation statements)
References 23 publications
“…Training signal lengths (TSLs) are limited to 4 s and randomly sampled from the original training example [23]. The feature dimension of the conformer layers is the same as in the TD-Conformer-XL model in [9], i.e.…”
Section: Training Configuration
Confidence: 99%
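The TSL strategy quoted above (randomly cropping each training example to at most 4 s) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the 4 s limit, and the 8 kHz sample rate are assumptions.

```python
import random

def sample_tsl(signal, max_len_s=4.0, sample_rate=8000):
    """Randomly crop a training example to at most `max_len_s` seconds.

    Hypothetical helper illustrating random TSL sampling: if the signal
    is longer than the limit, a contiguous chunk of the maximum length
    is taken from a uniformly random start position; shorter signals
    are returned unchanged.
    """
    max_len = int(max_len_s * sample_rate)
    if len(signal) <= max_len:
        return signal
    start = random.randint(0, len(signal) - max_len)
    return signal[start:start + max_len]
```

Cropping rather than zero-padding keeps every training sample filled with real speech, while the random start position exposes the model to different portions of long utterances across epochs.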
“…The computational complexity of models is assessed using multiply-accumulate operations (MACs). MACs are computed on a signal length of 5.79 s, equal to the mean signal length in the WHAMR and WSJ0-2Mix corpora [23]. Model size is reported in number of parameters.…”
Section: Evaluation Metrics
Confidence: 99%
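For a single 1-D convolution the MAC count can be derived analytically, which is how per-layer costs in such complexity reports are typically obtained. A minimal sketch, assuming an 8 kHz sample rate (standard for WSJ0-2Mix/WHAMR) and hypothetical layer dimensions; none of these numbers come from the paper itself:

```python
def conv1d_macs(in_ch, out_ch, kernel, n_samples, stride=1):
    """Analytic multiply-accumulate count for one 1-D convolution layer.

    Each of the `out_len` output frames needs in_ch * kernel multiplies
    per output channel, so total MACs = out_ch * in_ch * kernel * out_len.
    """
    out_len = (n_samples - kernel) // stride + 1
    return out_ch * in_ch * kernel * out_len

sample_rate = 8000                       # assumed 8 kHz sample rate
n_samples = int(5.79 * sample_rate)      # 5.79 s evaluation length -> 46320 samples
# hypothetical encoder: 1 input channel, 256 filters, kernel 16, stride 8
macs = conv1d_macs(1, 256, 16, n_samples, stride=8)
```

Summing such per-layer counts over a whole network (or using a profiling tool) yields the model-level MAC figure quoted above; evaluating every model at the same 5.79 s input makes those figures directly comparable.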