ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2022
DOI: 10.1109/icassp43922.2022.9746039

Deep Adaptive AEC: Hybrid of Deep Learning and Adaptive Acoustic Echo Cancellation

Cited by 22 publications (15 citation statements) · References 26 publications
“…The PESQ is improved by 0.05 and the WER is relatively improved by 11.8%. The results presented in [8] show that NLM-SNet substantially outperforms the fully DNN-based method, which differs from what we obtained here. The main reason is that the signals used for training and testing in [8] were recorded in scenarios with continuously changing echo paths, which highlights the benefits of hybrid methods.…”
Section: Nonlinear Transition Function T(·) (contrasting)
confidence: 99%
“…Such methods usually treat AEC as a source-separation problem and directly estimate the near-end signal from the microphone and far-end reference signals. While achieving good performance in general, DNN-based methods have shown limited utility in dealing with continuously changing echo paths [8]. In recent AEC challenges [9], two-stage hybrid systems [10,11,12] that use a DNN as a nonlinear post-processor for a DSP-based adaptive filtering algorithm have shown promising results.…”
Section: Introduction (mentioning)
confidence: 99%
“…Moreover, we deploy learned optimizers to solve AF tasks as the end goal and do not use them to train downstream neural networks. We also note recent work that uses a supervised DNN to control the step size of a D-KF for AEC [61] and another that uses a supervised DNN to predict both the step size and a nonlinear reference signal for AEC [62]. Compared to these, we replace the entire update with a neural network, do not need supervisory signals, and investigate many tasks.…”
Section: E. Related Work (mentioning)
confidence: 99%
“…In contrast, a small number of works more tightly couple neural networks and AFs, using DNNs for optimal control of AFs. Recently, it was shown that DNNs can estimate statistics to control step sizes [61], [62] or estimate entire updates [63] for single-channel AEC. Similarly, past work has used DNNs to predict updates for the internal statistics of multi-channel beamformers [64] and to learn source models for multi-channel source separation [65].…”
Section: Introduction (mentioning)
confidence: 99%
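The citation statements above revolve around one idea: a classical adaptive filter (NLMS or Kalman) does the echo cancellation, while a DNN controls its step size. A minimal sketch of that interface is a sample-wise NLMS canceller with a pluggable step-size hook; the function name `nlms_aec` and the hook signature are illustrative assumptions, not the APIs of the cited papers (which operate on frames and use learned frame-level networks).

```python
import numpy as np

def nlms_aec(mic, far, num_taps=64, eps=1e-8, step_size_fn=None):
    """NLMS echo canceller with an optional step-size controller.

    step_size_fn(n, e) -> mu is the hook where a hybrid system could
    let a DNN predict the step size (hypothetical interface; a fixed
    mu = 0.5 is used when no controller is given).
    """
    w = np.zeros(num_taps)        # adaptive estimate of the echo path
    buf = np.zeros(num_taps)      # far-end delay line, buf[k] = far[n-k]
    out = np.zeros_like(mic)      # error signal = near-end estimate
    for n in range(len(mic)):
        buf = np.roll(buf, 1)
        buf[0] = far[n]
        e = mic[n] - w @ buf                      # residual after echo removal
        mu = step_size_fn(n, e) if step_size_fn else 0.5
        w += mu * e * buf / (buf @ buf + eps)     # normalized LMS update
        out[n] = e
    return out, w

# Usage: synthetic echo-only scene; the residual should shrink as w converges.
rng = np.random.default_rng(0)
far = rng.standard_normal(4000)
true_path = rng.standard_normal(64) * np.exp(-0.1 * np.arange(64))
mic = np.convolve(far, true_path)[: len(far)]     # pure echo, no near-end talk
residual, w_hat = nlms_aec(mic, far)
erle = 10 * np.log10(np.mean(mic[2000:] ** 2) / np.mean(residual[2000:] ** 2))
```

With a continuously changing echo path, a fixed `mu` trades tracking speed against steady-state error; letting a network set `mu` per frame is precisely the control problem the statements from [61], [62] describe.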