2016
DOI: 10.1287/moor.2015.0776

Robust Sensitivity Analysis for Stochastic Systems

Abstract: We study a worst-case approach to measure the sensitivity to model misspecification in the performance analysis of stochastic systems. The situation of interest is when only minimal parametric information is available on the form of the true model. Under this setting, we pose optimization programs that compute the worst-case performance measures, subject to constraints on the amount of model misspecification measured by Kullback-Leibler (KL) divergence. Our main contribution is the development of infinitesimal …
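The worst-case programs described in the abstract admit a well-known convex dual: for a KL ball of radius eta around the baseline model P, sup_{Q: KL(Q||P) <= eta} E_Q[h(X)] = inf_{alpha > 0} { alpha * log E_P[exp(h(X)/alpha)] + alpha * eta }. The sketch below is not from the paper itself; it is a minimal Monte Carlo illustration of that identity, with the exponential baseline, sample size, and ball radius all chosen arbitrarily for the example.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def worst_case_mean(h_samples, eta):
    """Dual estimate of sup_{Q: KL(Q||P) <= eta} E_Q[h(X)],
    given draws of h(X) under the baseline model P."""
    def dual(alpha):
        # alpha * log E_P[exp(h/alpha)] + alpha * eta,
        # with a log-sum-exp shift for numerical stability
        m = h_samples / alpha
        shift = m.max()
        return alpha * (np.log(np.mean(np.exp(m - shift))) + shift) + alpha * eta

    res = minimize_scalar(dual, bounds=(1e-6, 1e3), method="bounded")
    return res.fun

# Illustrative baseline: h(X) are exponential(1) performance draws.
rng = np.random.default_rng(0)
h = rng.exponential(scale=1.0, size=100_000)
print(worst_case_mean(h, eta=0.05))  # worst-case mean over a 0.05-KL ball
```

As eta shrinks to zero the value collapses to the baseline mean E_P[h], which appears to be the regime that the paper's "infinitesimal" development targets.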

Cited by 130 publications (132 citation statements); references 58 publications.
“…In recent years there has been increasing interest in robust analysis of queueing systems. We consider uncertainty in the diffusion scale of the QCP in a way that is often referred to as model uncertainty or Knightian uncertainty, see, e.g., [35, 26, 25, 9]; in the context of queueing systems see [28, 13, 34], and see also [38] in a discrete-time setup. This is not to be confused with the terminology robust queueing theory, which often refers to optimization-based performance analysis, see, e.g., [7, 44].…”
Section: Introduction (citation type: mentioning; confidence: 99%)
“…For example, Hu and Hong [19] suggest removing ambiguity via the Lagrangian dual when the divergence measure associated with the ambiguity set is the Kullback-Leibler (KL) divergence. Alternatively, one can remove ambiguity by considering a series expansion, similar to Henry Lam's approach [23] for the KL divergence and the approach used by Gotoh et al. [17] for more general divergence measures on the objective function with a penalty term.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
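A first-order version of this series-expansion idea is classical: for small ball radius eta, the KL worst-case mean behaves as E_P[h] + sqrt(2 * eta) * sd_P(h) + O(eta), where sd_P(h) is the standard deviation under the baseline. The sketch below is an illustrative check of that leading term under an arbitrary exponential baseline, not code from any of the cited papers; higher-order correction terms are omitted.

```python
import numpy as np

def first_order_worst_case(h_samples, eta):
    """Leading-order expansion of sup_{KL(Q||P) <= eta} E_Q[h]:
    baseline mean plus sqrt(2 * eta) times baseline std."""
    return h_samples.mean() + np.sqrt(2.0 * eta) * h_samples.std()

# Illustrative baseline draws; the expansion tightens as eta -> 0.
rng = np.random.default_rng(1)
h = rng.exponential(scale=1.0, size=100_000)
for eta in (0.001, 0.01, 0.05):
    print(eta, first_order_worst_case(h, eta))
```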
“…Qian (2016, 2017) study the use of empirical likelihood, and Xie et al. (2018) study nonparametric Bayesian methods to construct CIs. Glasserman and Xu (2014), Hu et al. (2012), Lam (2016b), and Ghosh and Lam (2015) study input uncertainty from a robust-optimization viewpoint. In the parametric regime, Barton et al. (2013) and Xie et al. (2016) investigate the basic bootstrap with a metamodel built in advance, a technique known as the metamodel-assisted bootstrap.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
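The basic bootstrap referenced here has a simple loop structure: resample the observed input data, refit the input model, rerun the simulation at the refitted parameters, and take percentile quantiles of the outputs as a CI. The sketch below is a minimal illustration under assumed choices (an exponential input model and a toy one-line "simulator"), not the cited authors' implementations.

```python
import numpy as np

def simulate(rate, n_reps, rng):
    # Toy stand-in for a stochastic simulation driven by the input model:
    # the mean of n_reps exponential(rate) service times.
    return rng.exponential(scale=1.0 / rate, size=n_reps).mean()

def bootstrap_ci(data, n_boot=1000, n_reps=1000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for the simulation output,
    propagating input uncertainty in the fitted rate."""
    rng = np.random.default_rng(seed)
    outputs = []
    for _ in range(n_boot):
        resample = rng.choice(data, size=data.size, replace=True)
        rate_hat = 1.0 / resample.mean()  # refit the input model
        outputs.append(simulate(rate_hat, n_reps, rng))
    lo, hi = np.quantile(outputs, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Illustrative observed input data from an exponential with rate 2.
data = np.random.default_rng(42).exponential(scale=0.5, size=200)
print(bootstrap_ci(data))
```

The metamodel-assisted variant replaces the inner simulation call with a cheap metamodel fitted in advance, which is what makes the bootstrap loop affordable.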