2023
DOI: 10.2139/ssrn.4596210
Preprint

What is Behind the Meta-Learning Initialization of Adaptive Filter? -- A Naive Method for Accelerating Convergence of Adaptive Multichannel Active Noise Control

Dongyuan Shi,
Woon-Seng Gan,
Xiaoyi Shen,
et al.
Cited by 1 publication (5 citation statements)
References 39 publications
“…Introduction of adaptive filtering layer: Unlike traditional graph convolutional networks that use shared weight matrices, the FEGNS framework designs adaptive filtering weights for each node, enabling a more sensitive handling of local features (Shi et al., 2024).…”
Section: Methods
Mentioning confidence: 99%
“…To accurately predict output locations while reducing computational demands, we introduce an adaptive filtering graph convolution network (AF-GCN). The AF-GCN processes input features from each node, X(i), and adaptively predicts output features, Y(i), as per equation (5) (Shi et al., 2024). The GNN process involves aggregating node vectors to amalgamate features from a node’s neighborhood, N(i), through a weighted sum with attentive weights, α(i, j), encapsulating local interactions. Dense layers elevate aggregated features into abstract representations, aligning with hierarchical feature learning.…”
Section: Methods
Mentioning confidence: 99%
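The statements above describe the citing work's adaptive filtering graph convolution layer only in prose: per-node adaptive filter weights instead of one shared weight matrix, attention-weighted aggregation over the neighborhood N(i), and dense layers on top. Since equation (5) of Shi et al. (2024) is not reproduced on this page, the NumPy sketch below is only an illustrative reading of that description; the class name AttentiveAdaptiveFilterLayer, the softmax attention scorer, and the tanh dense transform are assumptions, not the cited AF-GCN itself.

# Illustrative sketch only: a generic attentive aggregation layer in the spirit of
# the AF-GCN description quoted above. Names, shapes, the softmax attention, and
# the tanh dense transform are assumptions, not the authors' equation (5).
import numpy as np

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

class AttentiveAdaptiveFilterLayer:
    """Per-node adaptive filtering with attention-weighted neighborhood aggregation.

    For each node i:
      1. attention weights alpha(i, j) are computed over neighbors j in N(i),
      2. neighbor features are combined by the weighted sum sum_j alpha(i, j) * X(j),
      3. a per-node dense transform maps the aggregate to output features Y(i).
    """

    def __init__(self, in_dim, out_dim, num_nodes, seed=None):
        rng = np.random.default_rng(seed)
        # Per-node weight matrices instead of one shared matrix (the "adaptive
        # filtering weights for each node" mentioned in the citation statement).
        self.W = rng.normal(scale=0.1, size=(num_nodes, in_dim, out_dim))
        # Shared attention scoring vector over concatenated (center, neighbor) features.
        self.a = rng.normal(scale=0.1, size=2 * in_dim)

    def forward(self, X, neighbors):
        """X: (num_nodes, in_dim) node features; neighbors: dict node -> list of neighbor ids."""
        num_nodes, _ = X.shape
        Y = np.zeros((num_nodes, self.W.shape[2]))
        for i in range(num_nodes):
            nbrs = neighbors.get(i, []) or [i]  # fall back to a self-loop if isolated
            # Attention logits from concatenated (center, neighbor) feature pairs.
            logits = np.array([self.a @ np.concatenate([X[i], X[j]]) for j in nbrs])
            alpha = softmax(logits)             # alpha(i, j), sums to 1 over N(i)
            agg = sum(a_ij * X[j] for a_ij, j in zip(alpha, nbrs))
            # Per-node dense transform ("dense layers elevate aggregated features").
            Y[i] = np.tanh(agg @ self.W[i])
        return Y

if __name__ == "__main__":
    X = np.random.default_rng(0).normal(size=(4, 3))        # 4 nodes, 3 features each
    neighbors = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}
    layer = AttentiveAdaptiveFilterLayer(in_dim=3, out_dim=2, num_nodes=4, seed=0)
    print(layer.forward(X, neighbors))                       # (4, 2) output features Y(i)

In a full implementation the per-node weights would more likely be generated from node features (e.g. by a small hypernetwork) rather than stored explicitly, but the quoted statements do not specify this, so the explicit parameterization above is kept only for clarity.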