Recently, much research has focused on employing deep learning (DL) algorithms to perform channel estimation in the upcoming 6G communication systems. However, these DL algorithms are usually computationally demanding and require a large number of training samples. Hence, this work investigates the feasibility of designing efficient machine learning (ML) algorithms that can effectively estimate and track time-varying, frequency-selective channels. The proposed algorithm is integrated with orthogonal frequency-division multiplexing (OFDM) to eliminate the intersymbol interference (ISI) induced by the frequency-selective multipath channel, and it is compared with the well-known least squares (LS) and linear minimum mean square error (LMMSE) channel estimation algorithms. The obtained results demonstrate that even when a small number of pilot samples, N_P, is inserted before the N-subcarrier OFDM symbol, the introduced ML-based channel estimation is superior to the LS and LMMSE algorithms. This superiority is reflected in the bit-error-rate (BER) performance of the proposed algorithm, which attains a gain of 2.5 dB and 5.5 dB over the LMMSE and LS algorithms, respectively, when N_P = N/8. Furthermore, the BER performance of the proposed algorithm is shown to degrade by only 0.2 dB when the maximum Doppler frequency is randomly varied. Finally, the number of iterations required by the proposed algorithm to converge to the smallest achievable mean-squared error (MSE) is thoroughly examined for various signal-to-noise ratio (SNR) levels.
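As a point of reference for the LS and LMMSE baselines against which the proposed algorithm is compared, the sketch below outlines how the two estimators are conventionally computed at the pilot subcarriers of an OFDM symbol. This is a minimal illustration rather than the paper's implementation: the function names (ls_estimate, lmmse_estimate), the comb-type pilot pattern, the parameter values (N = 64, N_P = 8, 10 dB SNR), and the identity placeholder for the channel correlation matrix R_hh are assumptions made for demonstration only.

```python
import numpy as np

def ls_estimate(y_pilots, x_pilots):
    """Least-squares (LS) channel estimate at the pilot subcarriers: H_LS = Y / X."""
    return y_pilots / x_pilots

def lmmse_estimate(h_ls, R_hh, snr_linear, beta=1.0):
    """LMMSE refinement of the LS estimate.

    R_hh       : channel frequency-correlation matrix at the pilot subcarriers (assumed known)
    snr_linear : average per-subcarrier SNR on a linear scale
    beta       : constellation-dependent constant (1 for constant-modulus pilots)
    """
    n_p = R_hh.shape[0]
    W = R_hh @ np.linalg.inv(R_hh + (beta / snr_linear) * np.eye(n_p))
    return W @ h_ls

# Toy comb-type pilot setup (hypothetical parameters): N = 64 subcarriers, N_P = N/8 = 8 pilots.
rng = np.random.default_rng(0)
N, N_P = 64, 8
pilot_idx = np.arange(0, N, N // N_P)
h_true = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
x_pilots = np.ones(N_P, dtype=complex)          # known unit-power pilot symbols
snr_db = 10.0
noise = (rng.standard_normal(N_P) + 1j * rng.standard_normal(N_P)) / np.sqrt(2)
y_pilots = h_true[pilot_idx] * x_pilots + 10 ** (-snr_db / 20) * noise

h_ls = ls_estimate(y_pilots, x_pilots)
# Identity correlation is only a placeholder; in practice R_hh is derived from the power-delay profile.
h_lmmse = lmmse_estimate(h_ls, np.eye(N_P, dtype=complex), 10 ** (snr_db / 10))

# Interpolate the pilot estimates to all N subcarriers (real and imaginary parts separately).
h_full = np.interp(np.arange(N), pilot_idx, h_lmmse.real) \
       + 1j * np.interp(np.arange(N), pilot_idx, h_lmmse.imag)
```

Note that the LMMSE estimator requires knowledge of the channel correlation and the noise level, which is precisely the kind of prior information the ML-based approach studied in this work seeks to avoid.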
INDEX TERMS Deep learning (DL), machine learning (ML), orthogonal frequency-division multiplexing (OFDM), intersymbol interference (ISI).

DL algorithms impose high computational requirements, which, in turn, increase the complexity of communication systems. Moreover, since DL networks usually contain a large number of neurons distributed among many layers, they require a large number of training samples to converge to the desired output. Unlike DL algorithms, shallow ML algorithms exhibit low computational requirements, which makes them suitable for implementing fast signal processing schemes that can fulfil the increasing data traffic requirements imposed by the new generation of wireless communication systems. Consequently, immense attention has shifted towards exploiting ML algorithms to revolutionize modern wireless communication systems [6]. In essence, conventional communication systems are generally based on well-established mathematical models. Hence, channel estimation algorithms such as the minimum mean square error (MMSE) algorithm