Distortions in gridless wavelength-division multiplexing (WDM) systems, caused by nonlinear impairments of the optical fiber and by linear interchannel interference (ICI), degrade the signal and reduce transmission quality. Overcoming these effects is challenging when the source of the distortion is unknown. In this work, we propose two asymmetric demodulation methods based on decision tree (DT) algorithms to mitigate distortions associated with linear ICI in gridless WDM systems, even when optical channels are spectrally overlapped. The first method adapts the conventional DT and random forest (RDF) algorithms to create asymmetric decision thresholds in m-QAM digital demodulation. The second method uses the density-based spatial clustering of applications with noise (DBSCAN) algorithm, combined with a K-dimensional tree (K-D tree) to handle symbols near decision boundaries. Both methods were experimentally validated in a 3×16 GBd gridless Nyquist WDM system modulated in 16-QAM with different channel spacings. DT-based demodulation, including RDF, achieved gains of up to ∼1.6 dB at the FEC limit of 3.8×10−3, while demodulation based on DBSCAN plus the K-D tree achieved gains of up to ∼1.2 dB compared with conventional demodulation. Additionally, we present a brief latency analysis against previous machine-learning-based demodulation methods: the DT and RDF latencies were as low as ∼0.3% and ∼32% of the DBSCAN + K-D tree latency, respectively. Finally, the proposed asymmetric demodulation methods can improve the performance of future elastic optical networks by offering an easily interpretable digital demodulation process and the possibility of adapting to any m-QAM modulation format, while remaining agnostic to the source of signal distortion.
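The DBSCAN + K-D tree demodulation idea can be sketched as follows. This is an illustrative toy example, not the authors' implementation: it uses a noisy QPSK constellation (standing in for the paper's 16-QAM symbols), and the `eps` and `min_samples` parameters are assumptions chosen for this synthetic data.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

# Toy received constellation: QPSK symbols plus Gaussian noise
# (a stand-in for the paper's experimental 16-QAM symbols).
ideal = np.array([[1.0, 1.0], [1.0, -1.0], [-1.0, 1.0], [-1.0, -1.0]])
tx = rng.integers(0, 4, 2000)
rx = ideal[tx] + 0.15 * rng.normal(size=(2000, 2))

# Step 1: density-based clustering discovers the (possibly distorted,
# asymmetric) constellation clusters; label -1 marks "noise" points,
# i.e. symbols lying near decision boundaries.
db = DBSCAN(eps=0.2, min_samples=10).fit(rx)

# Step 2: the decision regions follow the empirical cluster centroids
# rather than fixed symmetric thresholds.
clusters = sorted(c for c in set(db.labels_) if c != -1)
centroids = np.array([rx[db.labels_ == c].mean(axis=0) for c in clusters])

# Step 3: a K-D tree assigns each boundary (noise) symbol to the
# nearest cluster centroid, completing the demodulation decision.
tree = cKDTree(centroids)
_, nearest = tree.query(rx[db.labels_ == -1])
decided = db.labels_.copy()
decided[db.labels_ == -1] = np.array(clusters)[nearest]
```

In this sketch the asymmetry arises naturally: centroids shift with the ICI-induced distortion of each cluster, so decisions adapt without any prior knowledge of the distortion source.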