In this paper, we study a class of stochastic optimization problems, referred to as Conditional Stochastic Optimization (CSO), of the form $\min_{x \in \mathcal{X}} \mathbb{E}_{\xi}\, f_{\xi}\big(\mathbb{E}_{\eta \mid \xi}[g_{\eta}(x, \xi)]\big)$. CSO finds a wide spectrum of applications, including portfolio selection, reinforcement learning, and robust and invariant learning. We establish the sample complexity of the sample average approximation (SAA) for CSO under a variety of structural assumptions, such as Lipschitz continuity, smoothness, and error bound conditions. We show that the total sample complexity improves from $\mathcal{O}(d/\epsilon^4)$ to $\mathcal{O}(d/\epsilon^3)$ when assuming smoothness of the outer function, and further to $\mathcal{O}(1/\epsilon^2)$ when the empirical function satisfies the quadratic growth condition. We also establish the sample complexity of a modified SAA when $\xi$ and $\eta$ are independent. Our numerical results from several experiments further support our theoretical findings.
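As a concrete illustration of the objective above, the standard SAA construction for CSO replaces both expectations with nested empirical averages (the sample sizes $n$ and $m$ below are generic notation introduced here, not taken from this abstract):

```latex
\min_{x \in \mathcal{X}} \; \frac{1}{n} \sum_{i=1}^{n}
  f_{\xi_i}\!\left( \frac{1}{m} \sum_{j=1}^{m} g_{\eta_{ij}}(x, \xi_i) \right),
```

where $\xi_1, \dots, \xi_n$ are i.i.d. samples of $\xi$ and, for each $i$, the inner samples $\eta_{i1}, \dots, \eta_{im}$ are drawn i.i.d. from the conditional distribution of $\eta$ given $\xi_i$, so the total sample count is $nm$; the stated complexity bounds refer to this total.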