The simulation of magnetic bearings involves highly non-linear physics that depends strongly on input variations. Such simulations are time consuming and, with classical computation methods, cannot run within the computation time required for control purposes. On the other hand, classical model reduction techniques fail to achieve the required precision within the allowed computation window. To address this complexity, this work combines physics-based computing methods, model reduction techniques, and machine learning algorithms. The physical model representing the magnetic bearing is the classical Cauer Ladder Network method, while the model reduction technique is applied to the error of the physical model's solution. A machine learning algorithm is then used in the latent space to predict the evolution of the correction. The results show an improvement of the solution without sacrificing computation time: the solution is computed in almost real time (a few milliseconds) and compared against a finite element reference solution.
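To make the hybrid pipeline concrete, the sketch below illustrates one plausible realization of the three steps: a fast physical solve, model reduction applied to its error, and a latent-space predictor of the correction. The abstract does not specify which reduction technique or learning algorithm is used, so POD via SVD and a ridge regression are assumptions here, and all names (`X_cln`, `X_fem`, `corrected_solution`, the synthetic data) are hypothetical placeholders rather than the authors' actual code.

```python
import numpy as np

# --- Hypothetical offline data (placeholders for CLN and FEM solvers) ---
# Each row is one snapshot of the field solution on n_dof degrees of freedom.
rng = np.random.default_rng(0)
n_snap, n_dof, n_modes = 200, 1000, 10
X_cln = rng.standard_normal((n_snap, n_dof))                 # fast Cauer Ladder Network solutions
X_fem = X_cln + 0.1 * rng.standard_normal((n_snap, n_dof))   # finite element reference solutions
params = rng.standard_normal((n_snap, 3))                    # operating inputs, e.g. current, gap

# --- Step 1: model reduction applied to the ERROR of the physical model ---
E = X_fem - X_cln                                            # error snapshots
E_mean = E.mean(axis=0)
U, s, Vt = np.linalg.svd(E - E_mean, full_matrices=False)
basis = Vt[:n_modes].T                                       # POD basis of the error (n_dof x n_modes)
Z = (E - E_mean) @ basis                                     # latent coordinates of each error snapshot

# --- Step 2: machine learning in the latent space (ridge regression as a stand-in) ---
A = np.hstack([params, np.ones((n_snap, 1))])                # linear features plus bias
W = np.linalg.solve(A.T @ A + 1e-6 * np.eye(A.shape[1]), A.T @ Z)

def corrected_solution(x_cln, p):
    """Correct a fast CLN solution with the predicted latent-space error."""
    z_pred = np.append(p, 1.0) @ W                           # predicted latent correction
    return x_cln + E_mean + basis @ z_pred                   # lift correction back to full space

# Online use: one cheap CLN solve plus a matrix-vector product (millisecond scale).
x_corr = corrected_solution(X_cln[0], params[0])
print(np.linalg.norm(X_fem[0] - x_corr) / np.linalg.norm(X_fem[0]))
```

The design intent is that the expensive finite element solver appears only offline, to build the error snapshots; online, the corrected solution costs one Cauer Ladder Network evaluation plus a small matrix-vector product, which is consistent with the millisecond-scale timing claimed in the abstract.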