“…In particular, in [1,6,14,23,34,35,39] an $L^p$-error rate of at least $1/2$ has been proven for approximating $X_1$ by explicit Euler-type methods, e.g., tamed, projected, or truncated Euler schemes, for suitable ranges of $p$ and for subclasses of such SDEs whose coefficients satisfy at least a monotone-type condition and a coercivity condition and are locally Lipschitz continuous with a polynomially growing (local) Lipschitz constant. We add that important applications of these results are emerging in areas of intense interest, due to their central role in Data Science and AI, such as MCMC sampling algorithms, see [3,36], and stochastic optimizers for fine-tuning (artificial) neural networks and, more broadly, for solving non-convex stochastic optimization problems, see [21,20].…”
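
For concreteness, one standard representative of such explicit Euler-type methods is the drift-tamed Euler scheme; the following display is an illustrative sketch with generic coefficients $\mu$, $\sigma$, step size $h = 1/N$, and Brownian increments $\Delta W_n$, and is not claimed to be the exact scheme analysed in [1,6,14,23,34,35,39]:
\[
  Y_0 = x_0, \qquad
  Y_{n+1} = Y_n + \frac{h\,\mu(Y_n)}{1 + h\,\|\mu(Y_n)\|} + \sigma(Y_n)\,\Delta W_n,
  \qquad n = 0, \dots, N-1,
\]
and an $L^p$-error rate of at least $1/2$ for approximating $X_1$ then means that
\[
  \bigl(\mathbb{E}\bigl[\|X_1 - Y_N\|^p\bigr]\bigr)^{1/p} \le C\, N^{-1/2}
  \qquad \text{for all } N \in \mathbb{N}
\]
for some constant $C \in (0,\infty)$ independent of $N$.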