This article introduces a novel approach, termed the Asymmetric Probabilistic Tsetlin (APT) Machine, which integrates the Stochastic Point Location (SPL) algorithm, combined with the Asymmetric Steps technique, into the Tsetlin Machine (TM). APT introduces stochasticity into the state transitions of Tsetlin Automata (TA) by leveraging the SPL algorithm, thereby enhancing pattern recognition capabilities. To improve the random search process, we incorporate a decaying normal distribution into the procedure. In addition, the Asymmetric Steps approach biases state transition probabilities towards specific input patterns, further improving learning efficiency.
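To make the mechanism concrete, the sketch below shows one possible form of a two-action Tsetlin Automaton whose updates use stochastic step sizes drawn from a normal distribution with decaying standard deviation, together with asymmetric probabilities for reward versus penalty transitions. The class name, parameter values, and update schedule are illustrative assumptions for exposition, not the paper's implementation.

```python
import numpy as np

class StochasticAsymmetricTA:
    """Hypothetical two-action Tsetlin Automaton with stochastic,
    asymmetric state transitions (illustrative sketch only)."""

    def __init__(self, n_states=100, p_reward=0.9, p_penalty=0.7,
                 sigma0=10.0, decay=0.99, rng=None):
        self.n_states = n_states                  # states per action
        self.state = n_states                     # start at the decision boundary
        self.p_reward = p_reward                  # asymmetric transition probabilities
        self.p_penalty = p_penalty
        self.sigma = sigma0                       # spread of the stochastic step
        self.decay = decay                        # decay factor for the normal distribution
        self.rng = rng or np.random.default_rng()

    def action(self):
        # Action 1 (e.g. "include literal") in the upper half of states, else action 0.
        return int(self.state > self.n_states)

    def update(self, reward):
        # Stochastic step size drawn from a normal distribution whose
        # standard deviation decays over time (SPL-style random search).
        step = max(1, int(round(abs(self.rng.normal(0.0, self.sigma)))))
        self.sigma *= self.decay

        if reward:
            # Reward: move deeper into the current action with probability p_reward.
            if self.rng.random() < self.p_reward:
                self.state += step if self.action() == 1 else -step
        else:
            # Penalty: move towards the opposite action with probability p_penalty.
            if self.rng.random() < self.p_penalty:
                self.state += -step if self.action() == 1 else step

        # Keep the state inside the valid range [1, 2 * n_states].
        self.state = min(max(self.state, 1), 2 * self.n_states)
```

In this sketch, the asymmetry comes from using different probabilities for reward-driven and penalty-driven transitions, while the decaying normal distribution gradually shrinks the step size as learning progresses.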
The efficacy of the proposed approach is assessed on diverse benchmark datasets for classification tasks. The performance of APT is compared with traditional machine learning algorithms and other Tsetlin Machine models, including the Asymmetric Tsetlin (AT) Machine, which applies deterministic rules for asymmetric transitions, and the Classical Tsetlin (CT) Machine, which applies deterministic rules for symmetric transitions. The proposed APT methodology delivers highly competitive results relative to established machine learning methods. Both APT and AT achieve state-of-the-art performance and surpass the Classical Tsetlin Machine, underscoring the effectiveness of asymmetric models. Moreover, APT outperforms AT, particularly on complex datasets.