This work analyzes the use of the K-means clustering algorithm to mitigate nonlinear phase noise in single-span coherent systems, such as long-reach passive optical networks (LR-PONs). Simulations revealed that for a 100-km LR-PON employing 16-ary quadrature amplitude modulation (QAM) and considering a 1:64 splitting ratio, the adoption of K-means with K-means++ initialization achieves an optimum bit error ratio (BER) of 6.3 × 10⁻⁴, whereas maximum-likelihood detection yields a BER of 10⁻³. We also show that K-means requires only 2,000 symbols to achieve this performance improvement in 90% of cases.
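As a rough illustration of the decision strategy summarized above, the following Python sketch clusters simulated 16-QAM samples with scikit-learn's KMeans (which uses K-means++ seeding by default) and decides each symbol by its nearest learned centroid rather than the ideal constellation grid. The channel model, noise level, and phase-rotation coefficient here are illustrative assumptions, not the paper's simulation setup.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Ideal 16-QAM constellation (I/Q levels -3, -1, 1, 3).
levels = np.array([-3, -1, 1, 3])
ideal = np.array([i + 1j * q for i in levels for q in levels])

# Simulate 2,000 received symbols: AWGN plus a power-dependent phase
# rotation as a crude stand-in for nonlinear phase noise (assumed model).
n = 2000
tx = rng.choice(ideal, size=n)
noise = 0.2 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
rx = (tx + noise) * np.exp(1j * 0.02 * np.abs(tx) ** 2)

# Cluster the received samples in the IQ plane; init="k-means++"
# is scikit-learn's default K-means++ seeding.
samples = np.column_stack([rx.real, rx.imag])
km = KMeans(n_clusters=16, init="k-means++", n_init=10, random_state=0)
km.fit(samples)

# Decide each symbol by its learned cluster centroid, which tracks the
# distorted constellation that a fixed decision grid cannot.
centroids = km.cluster_centers_[:, 0] + 1j * km.cluster_centers_[:, 1]
decisions = centroids[km.labels_]
```

Because the centroids are re-estimated from the received data, the decision regions follow the rotated, power-dependent shape of the distorted constellation, which is the intuition behind the BER gain reported above.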