2017 16th IEEE International Conference on Machine Learning and Applications (ICMLA)
DOI: 10.1109/icmla.2017.0-151

Deep Transductive Nonnegative Matrix Factorization for Speech Separation

Cited by 3 publications (5 citation statements) | References 14 publications
“…Nonnegative matrix factorization (NMF) preserves the non-negativity of the magnitude spectrogram of the speech signal [15]. Given a time-domain signal v(t), its spectrogram is expressed by formula (1)…”
Section: NMF for Single-Channel Speech Separation (mentioning)
Confidence: 99%
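As a concrete illustration of the factorization referenced in the statement above, the following is a minimal NumPy sketch (not code from the cited paper): magnitude_spectrogram() and nmf() are hypothetical helper names, the STFT parameters are arbitrary, and the multiplicative updates minimize the Euclidean cost ||V - WH||^2 while keeping both factors non-negative.

import numpy as np

def magnitude_spectrogram(v, n_fft=512, hop=128):
    # v: 1-D NumPy array of samples. Frame, window, and take |FFT| per frame.
    frames = [v[i:i + n_fft] * np.hanning(n_fft)
              for i in range(0, len(v) - n_fft, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1)).T  # shape (freq, time)

def nmf(V, rank, n_iter=200, eps=1e-9):
    # Multiplicative updates for ||V - W H||^2; non-negativity is preserved
    # because W and H are initialized non-negative and only ever multiplied.
    F, T = V.shape
    rng = np.random.default_rng(0)
    W = rng.random((F, rank)) + eps
    H = rng.random((rank, T)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

For speech separation, a typical use is to learn one set of bases per source from training spectrograms and then explain a mixture spectrogram with the concatenated bases.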
“…Because it is based on a family of statistical models [10], NMF is easy to extend in many ways, for example with sparsity [11], dynamics across time [12], [13], and phase awareness [14]. However, these extensions are still limited to single-layer structures, which makes it difficult to extract non-linear features from speech [15]. Following the great success of deep neural networks in speech recognition, hierarchical features have been shown to represent speech signals well [16].…”
Section: Introduction (mentioning)
Confidence: 99%
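The extensions listed in that statement each modify the basic NMF objective. As one hedged illustration (assuming an L1 penalty on the activations under the Euclidean cost, not necessarily the exact formulation of [11]), sparsity changes only the denominator of the H update; sparse_nmf() below is a hypothetical name.

import numpy as np

def sparse_nmf(V, rank, lam=0.1, n_iter=200, eps=1e-9):
    # Minimizes ||V - W H||^2 + lam * sum(H); lam > 0 encourages sparse activations.
    F, T = V.shape
    rng = np.random.default_rng(0)
    W = rng.random((F, rank)) + eps
    H = rng.random((rank, T)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + lam + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
        # Fix the scale of W so the penalty cannot be evaded by shrinking H
        # and inflating W; H is rescaled to keep the product W @ H unchanged.
        norms = W.sum(axis=0, keepdims=True) + eps
        W /= norms
        H *= norms.T
    return W, H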
“…The algorithm also directly factorizes the basis images of each layer. To obtain the final factorization form in (24), it is necessary to solve the following regularized optimization problem by hierarchically optimizing the basis image matrix of each layer.…”
Section: The Regularized Deep Nonlinear Non-Negative Basis Matrix Factorization (mentioning)
Confidence: 99%
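One way to read the hierarchical scheme described in that statement is sketched below. This is a hypothetical NumPy illustration, not the cited paper's regularized algorithm (its regularization terms and exact per-layer objective are omitted): hierarchical_basis_nmf() is an assumed name, it reuses the nmf() helper from the first sketch, and each layer's basis matrix is itself factorized, so the data end up approximated by a product of layer-wise non-negative factors.

def hierarchical_basis_nmf(V, ranks, n_iter=200):
    # ranks, e.g. [80, 40]: successive layer widths; nmf() as sketched earlier.
    W, H = nmf(V, ranks[0], n_iter)              # layer 1: V ≈ W H
    layers = [W]
    for r in ranks[1:]:
        B, C = nmf(layers.pop(), r, n_iter)      # factorize the current basis matrix: W ≈ B C
        layers += [B, C]                         # now V ≈ B @ C @ ... @ H
    return layers, H                             # reconstruct via np.linalg.multi_dot(layers + [H])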
“…Deep-learning-based methods have recently been applied to many practical data-analysis problems [21], [22]. Consequently, researchers have focused largely on multilayered NMF algorithms in recent years [23], [24], [25], [26], [27], [28]. Ahn et al. proposed a multiple NMF (MNMF) network structure in which the deep feature of a sample under the basis images is obtained by repeatedly applying non-negative factorization to the coefficient matrix [29].…”
Section: Introduction (mentioning)
Confidence: 99%
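The MNMF structure described above can also be sketched briefly, again as a hedged illustration rather than Ahn et al.'s implementation: multilayer_nmf() is an assumed name and it reuses the nmf() helper from the first sketch. The coefficient matrix of each layer is factorized again, so the deepest coefficient matrix serves as the deep feature of the sample.

def multilayer_nmf(V, ranks, n_iter=200):
    # ranks, e.g. [100, 50, 25]: layer widths; nmf() as sketched earlier.
    Ws, H = [], V
    for r in ranks:
        W, H = nmf(H, r, n_iter)   # factorize the previous layer's coefficient matrix
        Ws.append(W)
    return Ws, H                   # V ≈ Ws[0] @ Ws[1] @ ... @ H; H is the deep feature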