“…Furthermore, (Nedich, 2011) and (Nedich and Necoara, 2019) establish sublinear convergence rates for convex and strongly convex deterministic objective functions, respectively, while in this paper we prove (sub)linear rates for an expected composite objective function that is either convex or satisfies relaxed strong convexity conditions. Moreover, (Nedich, 2011; Nedich and Necoara, 2019) present the convergence analyses for smooth and nonsmooth objectives separately, while in this paper we give a unified convergence analysis covering both cases through the so-called stochastic bounded gradient condition. Hence, since we deal with stochastic composite objective functions, smooth or nonsmooth, under relaxed strong convexity assumptions, and since we consider a stochastic proximal gradient method with new stepsize rules, our convergence analysis requires additional insights beyond those of (Nedich, 2011; Nedich and Necoara, 2019).…”
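To make the setting concrete, the sketch below shows a generic stochastic proximal gradient iteration for an expected composite objective E[f(x; ξ)] + h(x): a stochastic gradient step on the smooth part followed by the proximal operator of the nonsmooth part. The 1/k diminishing stepsize, the choice h = ||·||_1, and all function names are illustrative assumptions for this sketch, not the specific stepsize rules or problem class analyzed in the paper.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding); stands in here
    for any proximable nonsmooth term h."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_proximal_gradient(grad_f_sample, x0, n_samples, steps, c=1.0):
    """Generic loop for minimizing E[f(x; xi)] + h(x).

    grad_f_sample(x, i) -- stochastic gradient of f at x for sample index i.
    The c/k stepsize and the l1 prox are illustrative choices, not the
    authors' scheme.
    """
    x = x0.copy()
    for k in range(1, steps + 1):
        i = np.random.randint(n_samples)   # draw a random sample xi_k
        alpha = c / k                      # diminishing stepsize (illustrative)
        g = grad_f_sample(x, i)            # stochastic gradient step ...
        x = prox_l1(x - alpha * g, alpha)  # ... then the prox of h
    return x

# Example: sparse least squares, f(x; i) = 0.5*(a_i^T x - b_i)^2, h = ||.||_1
rng = np.random.default_rng(0)
A, b = rng.standard_normal((100, 20)), rng.standard_normal(100)
grad = lambda x, i: (A[i] @ x - b[i]) * A[i]
x_hat = stochastic_proximal_gradient(grad, np.zeros(20), n_samples=100, steps=5000)
```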