Measuring the optical signal-to-noise ratio (OSNR) at specific network points is essential for failure handling, as well as for both per-connection and global network optimization. Estimating the OSNR is inherently difficult in dense wavelength-routed networks, where connections accumulate noise over different paths and tight filtering prevents observing the noise level at the sides of the signal spectrum. We propose an in-band OSNR estimation process that relies on a machine learning (ML) method, in particular Gaussian process (GP) or support vector machine (SVM) regression. Using an experimental setup with a Brillouin optical spectrum analyzer (BOSA), we acquired high-resolution optical spectra, applied our method to them, and obtained excellent estimation accuracy. We also verified the accuracy of this approach for various resolution scenarios. To further validate it, we generated spectral data for different configurations and resolutions through simulations. This second validation confirmed the estimation quality of the proposed approach.
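To illustrate the kind of regression setup the abstract refers to, the following is a minimal sketch, not the paper's actual pipeline: it trains scikit-learn GP and SVM regressors to map sampled spectra to OSNR values. The synthetic spectrum model, feature layout, kernel choices, and hyperparameters are all assumptions made purely for illustration.

```python
# Illustrative sketch only: toy OSNR regression on synthetic spectra using
# scikit-learn's GP and SVM regressors. The spectral model, features, and
# hyperparameters are assumptions, not the authors' actual method.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synthetic_spectrum(osnr_db, n_points=64):
    """Toy spectrum: Gaussian-shaped signal on a flat noise floor,
    truncated by a tight optical filter (hypothetical model)."""
    freq = np.linspace(-25e9, 25e9, n_points)          # Hz, relative to carrier
    signal = np.exp(-0.5 * (freq / 10e9) ** 2)         # normalized signal shape
    noise_floor = signal.sum() / (n_points * 10 ** (osnr_db / 10))
    spectrum = signal + noise_floor
    spectrum = spectrum * (np.abs(freq) < 20e9)        # tight filter cuts the edges
    spectrum += rng.normal(0, 1e-4, n_points)          # measurement noise
    return 10 * np.log10(np.maximum(spectrum, 1e-12))  # dB-scale feature vector

# Build a labeled dataset of (spectrum, OSNR) pairs.
osnr_labels = rng.uniform(10, 30, 500)                 # dB
X = np.array([synthetic_spectrum(o) for o in osnr_labels])
X_tr, X_te, y_tr, y_te = train_test_split(X, osnr_labels,
                                           test_size=0.25, random_state=0)

# Gaussian process regression with an RBF + white-noise kernel (assumed choice).
gp = GaussianProcessRegressor(kernel=RBF(length_scale=10.0) + WhiteKernel(),
                              normalize_y=True)
gp.fit(X_tr, y_tr)

# Support vector regression as the alternative estimator.
svm = SVR(kernel="rbf", C=10.0, epsilon=0.1)
svm.fit(X_tr, y_tr)

for name, model in [("GP", gp), ("SVM", svm)]:
    err = np.abs(model.predict(X_te) - y_te)
    print(f"{name} mean abs OSNR error: {err.mean():.2f} dB")
```

In practice, the features would come from the measured BOSA spectra (or simulated spectra at the studied resolutions) rather than this toy generator, with the regression target being the reference OSNR.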