2020 International Conference on Wireless Communications and Signal Processing (WCSP)
DOI: 10.1109/wcsp49889.2020.9299821
Channel Estimation Method Based on Transformer in High Dynamic Environment

Cited by 18 publications (9 citation statements)
References 16 publications
“…We have also implemented a transformer-based method similar to [9] with 31,829 parameters for comparison. The output of TR is interpolated by the bilinear method to estimate the channel matrix of the whole slot.…”
Section: Transformer-Based Methods (TR) for Comparison
confidence: 99%
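The statement above describes upsampling channel estimates from pilot positions to the full slot with bilinear interpolation. The cited work's exact pilot layout and slot dimensions are not given here, so the sketch below is a generic bilinear upsampler over an illustrative (subcarrier × OFDM-symbol) grid; the function name and shapes are assumptions, not the authors' implementation.

```python
import numpy as np

def bilinear_interpolate_channel(h_pilot, out_shape):
    """Bilinearly interpolate complex channel estimates from a coarse pilot
    grid (pilot subcarriers x pilot symbols) to the full slot grid.

    h_pilot: complex array, shape (P_f, P_t) -- estimates at pilot positions
    out_shape: (N_f, N_t) -- full slot dimensions (illustrative)
    """
    pf, pt = h_pilot.shape
    nf, nt = out_shape
    # Map each output coordinate to a fractional position on the pilot grid.
    rows = np.linspace(0, pf - 1, nf)
    cols = np.linspace(0, pt - 1, nt)
    r0 = np.floor(rows).astype(int)
    c0 = np.floor(cols).astype(int)
    r1 = np.minimum(r0 + 1, pf - 1)
    c1 = np.minimum(c0 + 1, pt - 1)
    wr = (rows - r0)[:, None]  # fractional row weights, shape (N_f, 1)
    wc = (cols - c0)[None, :]  # fractional column weights, shape (1, N_t)
    # Weighted combination of the four surrounding pilot estimates.
    top = (1 - wc) * h_pilot[np.ix_(r0, c0)] + wc * h_pilot[np.ix_(r0, c1)]
    bot = (1 - wc) * h_pilot[np.ix_(r1, c0)] + wc * h_pilot[np.ix_(r1, c1)]
    return (1 - wr) * top + wr * bot
```

Bilinear interpolation reproduces any channel that varies linearly across the pilot grid exactly, which is why it is a common low-cost baseline for spreading a network's pilot-position outputs over the whole slot.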
“…A graph attention network has also been utilized for channel estimation [8]. The papers [9], [10], [11] also deploy the attention mechanism to improve the performance.…”
Section: Introduction
confidence: 99%
“…Its remarkable success in the fields of NLP and CV has inspired communication researchers to investigate its applications to various communication problems. Not surprisingly, transformers have also shown remarkable success in certain communication tasks, specifically, for channel estimation and semantic communication [7], [8].…”
Section: B. Self-Attention and Transformer
confidence: 99%
“…Self-attention mechanism and transformer network [16] have great successes in various NLP applications, for example, Devlin and others [17], by its powerful feature extractions, and also for various image processing applications [18]. Several works [19][20][21][22] presented channel estimations based on transformer network or attention mechanisms. A transformer network can be designed to process variable lengths of sequences as inputs and outputs, which is beneficial for designing scalable channel estimators.…”
Section: Related Work
confidence: 99%
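The last statement hinges on the fact that self-attention applies the same projection weights at every sequence position, so one trained model can handle different numbers of subcarriers or pilots. The following minimal single-head sketch illustrates that length-agnostic property; the dimensions and feature layout (e.g. stacking real/imaginary parts per subcarrier) are illustrative assumptions, not any cited paper's architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, wq, wk, wv):
    """Scaled dot-product self-attention over a sequence of feature vectors.

    x: (seq_len, d) -- e.g. per-subcarrier features (real/imag parts stacked).
    The projection weights wq, wk, wv are shape (d, d) and independent of
    seq_len, so the same layer processes sequences of any length.
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])  # (seq_len, seq_len) attention logits
    return softmax(scores, axis=-1) @ v      # (seq_len, d) attended features

rng = np.random.default_rng(0)
d = 8
wq, wk, wv = (rng.standard_normal((d, d)) for _ in range(3))
# The same weights handle different sequence lengths (e.g. 24 or 48 positions).
out_short = self_attention(rng.standard_normal((24, d)), wq, wk, wv)
out_long = self_attention(rng.standard_normal((48, d)), wq, wk, wv)
```

Contrast this with a fully connected estimator, whose input dimension is baked into the first weight matrix: changing the number of subcarriers there requires retraining, whereas the attention layer above is reusable as-is.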