In this paper we consider iterative methods for stochastic variational inequalities (s.v.i.) with monotone operators. Our basic assumption is that the operator possesses both smooth and nonsmooth components. Further, only noisy observations of the problem data are available. We develop a novel Stochastic Mirror-Prox (SMP) algorithm for solving s.v.i. and show that, with a suitable stepsize strategy, it attains the optimal rates of convergence with respect to the problem parameters. We apply the SMP algorithm to Stochastic composite minimization and describe particular applications to the Stochastic Semidefinite Feasibility problem and deterministic Eigenvalue minimization.

1. Introduction. Variational inequalities with monotone operators form a convenient framework for the unified treatment (including algorithmic design) of problems with "convex structure", such as convex minimization, convex-concave saddle point problems, and convex Nash equilibrium problems. In this paper we utilize this framework to develop first order algorithms for stochastic versions of the outlined problems, where the precise first order information is replaced with its unbiased stochastic estimates. This situation arises naturally in convex Stochastic Programming, where precise first order information is unavailable (see the examples in Section 4). In other situations, e.g. those considered in [4, Section 3.3] and in Section 4.4, passing from available, but relatively computationally expensive, precise first order information to its cheap stochastic estimates allows one to accelerate the solution process, with the gain from randomization growing progressively with the problem size.

Our "unifying framework" is as follows. Let $Z$ be a convex compact set in a Euclidean space $E$ with inner product $\langle\cdot,\cdot\rangle$, and let $\|\cdot\|$ be a norm on $E$ (not