Summary
Cost-effective satellite terminals (STs) designed for large-scale deployment in Internet of Things (IoT) applications are constrained, on the one hand, by the consumer-grade local oscillators (LOs) adopted to lower costs and, on the other hand, by limited equivalent isotropic radiated power (EIRP) due to low peak transmit power and/or low antenna gain. To close the link budget under the low-EIRP constraint, STs can adopt robust forward error correction (FEC) schemes and/or reduce the transmission baud rate, thereby increasing the link margin with respect to the required receiver threshold. In general, the LOs at the terminal side inject multiplicative phase distortions into the transmitted signal, known as phase noise (PN). The PN caused by low-cost LOs can be significant, especially at low baud rates, degrading demodulator performance. Random access (RA) techniques are particularly attractive on the return link in networks with low-cost terminals and bursty, low-duty-cycle traffic. State-of-the-art time-slotted RA techniques are typically evaluated assuming high EIRP at the terminal side and high baud rates. This paper investigates the impact of PN on time-slotted RA techniques and optimizes different carrier phase estimation (CPE) algorithms. Furthermore, it evaluates the impact of CPE errors on RA performance for a wide range of baud rates and typical ST phase noise masks.
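To make the multiplicative nature of PN concrete, the following is a minimal toy sketch (not from the paper): it applies a simple Wiener (random-walk) phase process to QPSK symbols, where the received signal is y[k] = x[k]·exp(jφ[k]). The parameter values (`baud_rate`, `pn_linewidth`) are illustrative assumptions; real ST oscillators are characterized by a full PN mask, which this toy model does not capture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative parameters, not taken from the paper.
n_sym = 1000
baud_rate = 10e3       # symbols/s (low-rate IoT burst, assumed)
pn_linewidth = 100.0   # Hz, hypothetical LO linewidth

# QPSK symbols on the unit circle.
bits = rng.integers(0, 4, n_sym)
x = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))

# Wiener PN: per-symbol phase-increment variance 2*pi*linewidth/baud_rate,
# so a lower baud rate means more phase drift per symbol.
sigma2 = 2 * np.pi * pn_linewidth / baud_rate
phi = np.cumsum(rng.normal(0.0, np.sqrt(sigma2), n_sym))

# PN is multiplicative: it rotates each symbol without changing its magnitude.
y = x * np.exp(1j * phi)
```

Because the accumulated drift per symbol scales inversely with the baud rate, the same consumer-grade LO degrades a low-rate IoT burst far more than a high-rate one, which is why CPE becomes critical in this regime.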