Over the last two decades, Radio Frequency Identification (RFID) technology has seen substantial performance improvements and has been recognized as one of the key enablers of the Internet of Things (IoT). In parallel, the extensive use of Machine Learning (ML) algorithms across diverse IoT areas has yielded numerous advantages that broaden their successful application in different scenarios. This paper presents a use-case feasibility analysis of implementing ML algorithms for the estimation of the ALOHA-based frame size in the RFID Gen2 system. The findings indicate that the examined ML algorithms can be deployed on modern, state-of-the-art resource-constrained microcontrollers, enhancing system throughput. In addition, such a deployment can satisfy latency requirements, since the execution time is short enough to meet the protocol's needs.
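For context on the estimation problem, a minimal sketch (an illustration of standard framed slotted ALOHA theory, not the paper's ML method) is given below: with n tags replying in a frame of L slots, the expected fraction of successful slots is n/L · (1 − 1/L)^(n−1), which is maximized when the frame size matches the tag population (L ≈ n), yielding roughly 1/e ≈ 0.368 throughput. This is why accurate tag-count (and hence frame-size) estimation directly improves Gen2 read throughput.

```python
def expected_throughput(n: int, L: int) -> float:
    """Expected per-slot success probability in framed slotted ALOHA:
    each of n tags picks one of L slots uniformly at random; a slot
    succeeds if exactly one tag chose it."""
    return n * (1 / L) * (1 - 1 / L) ** (n - 1)

# For an assumed population of 64 tags, search candidate frame sizes.
n_tags = 64
best_L = max(range(1, 257), key=lambda L: expected_throughput(n_tags, L))
print(best_L)                              # optimum is at L = n
print(expected_throughput(n_tags, best_L)) # close to 1/e
```

Note that Gen2 actually restricts frame sizes to powers of two (L = 2^Q), so a practical estimator maps the tag-count estimate to the nearest valid Q value.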