2019
DOI: 10.48550/arxiv.1905.11957
Preprint

Sample Complexity of Sample Average Approximation for Conditional Stochastic Optimization

Abstract: In this paper, we study a class of stochastic optimization problems, referred to as Conditional Stochastic Optimization (CSO), of the form $\min_{x \in \mathcal{X}} \mathbb{E}_{\xi}\left[ f_{\xi}\left( \mathbb{E}_{\eta \mid \xi}\left[ g_{\eta}(x, \xi) \right] \right) \right]$. CSO finds a wide spectrum of applications, including portfolio selection, reinforcement learning, and robust and invariant learning. We establish the sample complexity of the sample average approximation (SAA) for CSO under a variety of structural assumptions, such as Lipschitz continuity, smoothness, and error bound conditions. We sh…
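
To make the nested structure concrete, here is a minimal numerical sketch of the SAA estimator for CSO: draw $n$ outer samples $\xi_i$, draw $m$ inner samples $\eta_{ij} \sim P(\eta \mid \xi_i)$ for each, and minimize the plug-in objective $\hat{F}(x) = \frac{1}{n}\sum_i f_{\xi_i}\big(\frac{1}{m}\sum_j g_{\eta_{ij}}(x, \xi_i)\big)$. The toy choices below — $f_\xi(y) = (y - \xi)^2$, $g_\eta(x, \xi) = x + \eta$, with $\xi \sim N(0,1)$ and $\eta \mid \xi \sim N(0,1)$ — are illustrative assumptions, not taken from the paper.

```python
# Minimal SAA sketch for a toy CSO instance (assumed f, g, and distributions;
# not the paper's setup). Minimizes the nested empirical objective F_hat(x).
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
n, m = 200, 50                       # outer / inner sample sizes
xis = rng.normal(size=n)             # outer samples xi_1..xi_n
etas = rng.normal(size=(n, m))       # inner samples eta_{i1}..eta_{im} given xi_i

def saa_objective(x):
    # F_hat(x) = (1/n) sum_i f_{xi_i}( (1/m) sum_j g_{eta_ij}(x, xi_i) )
    inner = (x + etas).mean(axis=1)          # inner empirical mean per xi_i
    return np.mean((inner - xis) ** 2)       # outer empirical mean of f

res = minimize_scalar(saa_objective, bounds=(-5.0, 5.0), method="bounded")
print("SAA solution:", res.x)
```

For this toy instance the population objective is minimized at $x = \mathbb{E}[\xi] = 0$, so the SAA solution should concentrate near zero as $n$ and $m$ grow.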

Cited by 1 publication (6 citation statements) · References 35 publications

“…This improves to $\tilde{O}(\epsilon^{-2})$ if both conditions hold, where we use $\tilde{O}(\cdot)$ to represent the rate hiding logarithmic factors. In contrast to the SAA results in Hu et al. [2019], these sample complexities are independent of the problem's dimension. Furthermore, we show that, for weakly convex CSO problems (which are not necessarily smooth or convex), BSGD requires a total sample complexity of $\tilde{O}(\epsilon^{-8})$ to achieve an $\epsilon$-stationary point.…”
Section: Our Contributions (mentioning)
confidence: 65%
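
The BSGD method this statement refers to estimates the gradient by drawing one outer sample $\xi$ and an inner mini-batch of $m$ samples $\eta \mid \xi$, then applying the chain rule through the inner empirical mean; the resulting estimator is biased, with bias shrinking as $m$ grows. Below is a hedged single-variable sketch of one such step, reusing the toy $f$, $g$, and distributions assumed above; the step size and batch size are also illustrative assumptions, not the authors' settings.

```python
# Hedged sketch of one biased stochastic gradient step for the toy CSO
# instance f_xi(y) = (y - xi)^2, g_eta(x, xi) = x + eta (assumptions).
import numpy as np

rng = np.random.default_rng(1)

def bsgd_step(x, m=16, lr=0.01):
    xi = rng.normal()                  # one outer sample xi
    etas = rng.normal(size=m)          # inner mini-batch eta_1..eta_m ~ P(eta | xi)
    y_hat = np.mean(x + etas)          # inner empirical mean of g_eta(x, xi)
    grad_f = 2.0 * (y_hat - xi)        # f_xi'(y) evaluated at the inner estimate
    grad_g = 1.0                       # d/dx g_eta(x, xi) = 1 for the toy g
    return x - lr * grad_f * grad_g    # chain rule through the inner mean (biased in m)

x = 0.0
for _ in range(2000):
    x = bsgd_step(x)
print("BSGD iterate:", x)              # should approach the minimizer x = 0
```
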
“…The above result indicates that smoothness conditions make a difference in the total sample complexity of BSGD when solving CSO. It is worth pointing out that the sample complexity of BSGD matches that of ERM for strongly convex objectives established in Hu et al. [2019].…”
Section: Convergence for Strongly Convex Objectives (mentioning)
confidence: 66%