2017
DOI: 10.48550/arxiv.1705.00822
Preprint
Sample average approximation with heavier tails I: non-asymptotic bounds with weak assumptions and stochastic constraints

Abstract: We derive exponential finite-sample non-asymptotic deviation inequalities for the sample average approximation (SAA) estimator's near-optimal solution set and optimal value. In that respect, we make three contributions. First, our bounds do not require sub-Gaussian assumptions, as in the previous stochastic optimization (SO) literature. Instead, we assume only random Hölder continuity and a heavy-tailed distribution with finite second moments, a framework better suited to risk-averse portfolio optimization. Second, w…
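The SAA estimator the abstract refers to replaces an expectation by an empirical average and minimizes that instead. A minimal illustrative sketch (a toy quadratic problem, not the paper's Hölder-continuous, stochastically constrained setting):

```python
import random

def saa_solve(samples):
    # SAA for min_x E[(x - xi)^2]: the sample-average objective
    # (1/N) * sum_i (x - xi_i)^2 is minimized exactly at the sample mean.
    return sum(samples) / len(samples)

random.seed(0)
true_min = 1.0  # E[xi] for the chosen distribution, the true minimizer
samples = [random.gauss(true_min, 2.0) for _ in range(10_000)]
x_hat = saa_solve(samples)
# |x_hat - true_min| is the deviation the paper's inequalities bound;
# it shrinks as the sample size N grows.
```

The paper's results quantify, non-asymptotically, how fast such deviations concentrate even when the underlying distribution is heavy-tailed.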

Cited by 3 publications (5 citation statements) | References 42 publications
“…min_{x∈X} f(x)) by Guigues, Juditsky, and Nemirovski in [9]. The two recent papers by Oliveira and Thompson [28,29] which have been mentioned at the end of Section 2 contain results that are closest to ours.…”
Section: Assumption 2.5 (Integrability of the Hessian), supporting
confidence: 63%
“…N in detail, let us compare the outcome of Theorem 2.9 with the current state of the art. Our focus is on recent non-asymptotic results by Oliveira and Thompson [28,29]; a general literature review will be presented in Section 2.5.…”
Section: Assumption 2.5 (Integrability of the Hessian), mentioning
confidence: 99%
“…It is clear from the proof of the theorem that the assumptions about independent sampling and sub-Gaussian random variables can be relaxed. We only needed that the error in R_m(w) relative to R(w) can be bounded for a finite number of w; see [5,23] for possible extensions.…”
Section: Theorem (Rate of Convergence in DRM), mentioning
confidence: 99%
“…Traditional approaches for solving SOECs can lead to large data complexity. For instance, consider the popular strategy of replacing each expectation in (1) by a sample average approximation (SAA; Shapiro 2013, Oliveira and Thompson 2017) and solving the resulting model using a deterministic iterative method (see, e.g., Nesterov 2004, Soheili and Pena 2012, and references therein). If the number of samples used to construct SAAs is small, the solution from the deterministic approximation may be highly infeasible to the original SOEC, in addition to being suboptimal (Shapiro 2013, Oliveira and Thompson 2017).…”
Section: Introduction, mentioning
confidence: 99%
“…For instance, consider the popular strategy of replacing each expectation in (1) by a sample average approximation (SAA; Shapiro 2013, Oliveira and Thompson 2017) and solving the resulting model using a deterministic iterative method (see, e.g., Nesterov 2004, Soheili and Pena 2012, and references therein). If the number of samples used to construct SAAs is small, the solution from the deterministic approximation may be highly infeasible to the original SOEC, in addition to being suboptimal (Shapiro 2013, Oliveira and Thompson 2017). Instead, if a large number of samples are used in each SAA, then the data complexity becomes large because the gradient or objective function evaluation at each iteration requires using a significant portion of each of the data sets.…”
Section: Introduction, mentioning
confidence: 99%
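The data-complexity point in the excerpt above can be made concrete: solving an SAA model with a deterministic gradient method touches all N samples at every iteration. A hypothetical sketch on a toy quadratic objective (function and parameter names are illustrative, not from the cited papers):

```python
import random

def saa_gradient_descent(samples, steps=200, lr=0.1):
    # Deterministic gradient descent on the SAA objective
    # (1/N) * sum_i (x - xi_i)^2; each step sweeps the full sample.
    x, n = 0.0, len(samples)
    for _ in range(steps):
        grad = sum(2.0 * (x - xi) for xi in samples) / n  # full pass: O(N) per iteration
        x -= lr * grad
    return x

random.seed(1)
samples = [random.gauss(0.5, 1.0) for _ in range(1000)]
x_hat = saa_gradient_descent(samples)
# x_hat converges to the sample mean, the exact SAA minimizer here;
# the cost per iteration, not the accuracy, is what grows with N.
```

This is the trade-off the excerpt describes: few samples make the SAA solution unreliable, while many samples make each deterministic iteration expensive.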