2022
DOI: 10.1016/j.icte.2021.11.007

Machine learning-based adaptive CSI feedback interval

Cited by 4 publications (4 citation statements)
References 11 publications
“…Multi-domain correlations have been adopted to leverage the geometrical wireless propagation information embedded within the CSI. For instance, the prediction neural network in [24], which is shared by both the UE and the BS, generates the current CSI from the previous CSI sequence by capitalizing on time correlation. Furthermore, the DualNet-MAG architecture in [25] exploits the correlation between the magnitudes of the bidirectional channels, with the uplink channel serving as a feedback channel for the quantized CSI phase and a neural network-based encoder compressing the CSI magnitude.…”
Section: Related Work (mentioning)
confidence: 99%
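The prediction-network idea summarized above can be illustrated with a minimal sketch: a small feedforward network maps a window of past CSI coefficients to an estimate of the current CSI, and identical copies of the weights would be kept at both the UE and the BS. The class name, window length, and layer sizes below are illustrative assumptions, not the architecture of [24].

```python
# Minimal sketch (assumed dimensions and layer sizes, not the architecture of [24]):
# a feedforward predictor mapping the most recent CSI snapshots to the current CSI.
import torch
import torch.nn as nn

class CsiPredictor(nn.Module):
    """Predict the current (flattened, real-valued) CSI from the past `window` snapshots."""
    def __init__(self, csi_dim: int = 64, window: int = 4, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(csi_dim * window, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, csi_dim),
        )

    def forward(self, past_csi: torch.Tensor) -> torch.Tensor:
        # past_csi: (batch, window, csi_dim) -> flatten the time window
        return self.net(past_csi.flatten(start_dim=1))

# Both the UE and the BS would hold the same trained weights, so the BS can
# reproduce the UE-side prediction without any extra feedback.
predictor = CsiPredictor()
past = torch.randn(1, 4, 64)          # illustrative past CSI sequence
current_estimate = predictor(past)    # shape (1, 64)
```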
“…The other 𝑇 − 1 CSI "frames" are compressed by low-CR encoders, and an LSTM refines the reconstructed CSI with information extracted from the former CSI. In RecCsiNet [86], the LSTM at the encoder compresses the CSI based on the current and previous CSI matrices. In [87], feedback overhead is reduced by dynamically adjusting the feedback interval of the time-varying channel; feedback is not needed if prediction errors are tolerable.…”
Section: GAN and VAE (mentioning)
confidence: 99%
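The LSTM-based compression idea quoted above can be sketched as an encoder that emits a low-dimensional codeword per CSI frame while carrying information from previous frames in its hidden state. The layer sizes and the single linear compression head are assumptions for illustration, not the RecCsiNet [86] architecture.

```python
# Minimal sketch (assumed sizes, not the RecCsiNet architecture): an LSTM encoder
# producing a low-dimensional codeword for each CSI frame, where each step depends
# on the current and all previous CSI matrices through the LSTM state.
import torch
import torch.nn as nn

class LstmCsiEncoder(nn.Module):
    def __init__(self, csi_dim: int = 64, hidden: int = 128, code_dim: int = 16):
        super().__init__()
        self.lstm = nn.LSTM(input_size=csi_dim, hidden_size=hidden, batch_first=True)
        self.compress = nn.Linear(hidden, code_dim)   # low-CR compression head

    def forward(self, csi_seq: torch.Tensor) -> torch.Tensor:
        # csi_seq: (batch, T, csi_dim)
        hidden_seq, _ = self.lstm(csi_seq)
        return self.compress(hidden_seq)              # (batch, T, code_dim)

encoder = LstmCsiEncoder()
codewords = encoder(torch.randn(2, 5, 64))            # 5 CSI frames per sample
```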
“…Unlike [85], [86], [97], the feedback overhead is reduced in [87] by dynamically adjusting the feedback interval of the time-varying channel instead of reducing the overhead of each CSI feedback. A prediction NN in [87], which produces the current CSI based on the knowledge of the past CSI sequence, is shared by the user and the BS. If prediction errors are tolerable, the user does not need to feed back the current CSI, and the BS directly uses the CSI produced by the shared prediction NN.…”
Section: … (mentioning)
confidence: 99%
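The feedback-skipping rule described for [87] can be sketched as a simple threshold test on the prediction error: if the shared predictor's output is close enough to the measured CSI, the user stays silent and the BS falls back on the prediction. The NMSE metric and the tolerance value below are illustrative assumptions, not parameters from [87].

```python
# Minimal sketch (assumed NMSE metric and threshold): the user feeds back the CSI
# only when the shared predictor's error exceeds a tolerance.
import numpy as np

def nmse(predicted: np.ndarray, measured: np.ndarray) -> float:
    """Normalized mean-squared error between predicted and measured CSI."""
    return float(np.sum(np.abs(predicted - measured) ** 2)
                 / np.sum(np.abs(measured) ** 2))

def feedback_decision(predicted: np.ndarray, measured: np.ndarray,
                      tol: float = 0.05):
    """Return (send_feedback, csi_used_at_bs)."""
    if nmse(predicted, measured) <= tol:
        # Prediction error is tolerable: skip feedback; the BS directly uses
        # the output of the shared prediction network.
        return False, predicted
    # Otherwise the user feeds back the measured CSI as usual.
    return True, measured

measured_csi = np.random.randn(64) + 1j * np.random.randn(64)
predicted_csi = measured_csi + 0.01 * (np.random.randn(64) + 1j * np.random.randn(64))
send, csi_at_bs = feedback_decision(predicted_csi, measured_csi)
```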
“…Moreover, the complexity of ML is too high to be implemented in lightweight IoT devices. The recent work of [33] proposed aperiodic CSI feedback based on deep neural network (DNN)-based channel prediction, where the terminal decides whether or not to feed back its CSI by relying on the DNN-based channel prediction. However, the work of [33] did not take the feedback delay into account and, moreover, applied a classical DNN rather than an LSTM model for time-series forecasting.…”
Section: Introduction (mentioning)
confidence: 99%
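The citing work contrasts the classical DNN used in [33] with an LSTM for time-series forecasting. A minimal sketch of such an LSTM forecaster is given below; the dimensions are assumptions, and this is not the model of [33] or of the citing work.

```python
# Minimal sketch (assumed dimensions): an LSTM forecaster that predicts the next
# CSI snapshot from a past sequence, in contrast to a plain feedforward DNN.
import torch
import torch.nn as nn

class LstmCsiForecaster(nn.Module):
    def __init__(self, csi_dim: int = 64, hidden: int = 128):
        super().__init__()
        self.lstm = nn.LSTM(input_size=csi_dim, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, csi_dim)

    def forward(self, past_csi: torch.Tensor) -> torch.Tensor:
        # past_csi: (batch, L, csi_dim); the forecast uses the final LSTM state.
        _, (h_n, _) = self.lstm(past_csi)
        return self.head(h_n[-1])                      # (batch, csi_dim)

forecaster = LstmCsiForecaster()
next_csi = forecaster(torch.randn(1, 8, 64))
```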