2017
DOI: 10.1080/10556788.2017.1333613
Algorithmic differentiation for piecewise smooth functions: a case study for robust optimization


Cited by 10 publications (5 citation statements)
References 10 publications
“…Here, we provide examples of nonlinear robust optimization problems considered in the literature. We emphasize cases with infinite-cardinality uncertainty sets, but we note that finite-cardinality uncertainty set examples are also commonplace; see, for example, the minimax regret and test problems in Fiege et al (2018), Hare and Macklem (2013), and Hare and Nutini (2013). The illustrative examples in Section 3.2 are used in later sections to demonstrate solution techniques and available software.…”
Section: Applications and Illustrative Examples (mentioning)
confidence: 99%
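
To make the finite-cardinality case mentioned in the excerpt concrete, the following is a minimal, hypothetical sketch (not taken from Fiege et al. (2018) or the other cited papers): a minimax problem over a three-scenario uncertainty set, recast through the standard epigraph reformulation and handed to SciPy's SLSQP solver. The scenario set U and the objective f are illustrative placeholders.

```python
# Hedged sketch: finite-cardinality robust (minimax) problem
#     min_x  max_{u in U} f(x, u)
# recast as the epigraph problem  min_{x,t} t  s.t.  f(x, u) <= t  for all u in U.
import numpy as np
from scipy.optimize import minimize

U = [-1.0, 0.0, 2.0]            # hypothetical finite uncertainty set

def f(x, u):
    return (x - u) ** 2         # hypothetical scenario objective

# Decision vector z = (x, t): minimize t subject to one constraint per scenario.
def objective(z):
    return z[1]

constraints = [
    {"type": "ineq", "fun": (lambda z, u=u: z[1] - f(z[0], u))} for u in U
]

res = minimize(objective, x0=np.array([0.0, 10.0]), method="SLSQP",
               constraints=constraints)
print(f"robust x = {res.x[0]:.4f}, worst-case value = {res.x[1]:.4f}")
```

The epigraph reformulation turns the nonsmooth max into finitely many smooth constraints, which is why an off-the-shelf NLP solver suffices; with an infinite-cardinality uncertainty set no such finite reformulation is available, which is what motivates specialized robust-optimization techniques.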
“…Obviously, (28) involves $\prod_{i=1}^{m} k_i$ and … Proof. We will consider the representations (27), from which (26) can be directly obtained in the form (28). Firstly, the independent variables $x_j$ are linear functions of themselves with gradient $a = e_j$ and inhomogeneity $\alpha = 0$.…”
Section: The Two-term Polyhedral Decomposition (mentioning)
confidence: 99%
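
For orientation, the kind of representation discussed in this excerpt can be written generically as a max-min (two-term polyhedral) form, with the independent variables providing the trivial base case (gradient $e_j$, inhomogeneity $0$). The index sets and symbols below are generic placeholders, not the paper's exact notation from (26)-(28).

```latex
% Generic max-min (polyhedral) representation of a piecewise linear p,
% and the trivial base case for a coordinate function x_j.
\[
  p(x) \;=\; \max_{1 \le i \le k} \; \min_{j \in J_i} \bigl( a_{ij}^{\top} x + \alpha_{ij} \bigr),
  \qquad
  x_j \;=\; e_j^{\top} x + 0 .
\]
```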
“…Finally, one should always keep in mind that the task of minimizing a piecewise linear function will most likely occur as an inner problem in the optimization of a piecewise smooth and nonlinear function. As we have shown in [27], the local piecewise linear model problem can be obtained easily by a slight generalization of automatic or algorithmic differentiation, e.g., ADOL-C [28] and Tapenade [29]. … as LIKQ in [7], and obviously requires that no more than $n$ switches are active at $\bar{x}$.…”
(mentioning)
confidence: 99%
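
The "slight generalization of algorithmic differentiation" referred to above can be illustrated with plain operator overloading: alongside each intermediate value at the reference point one propagates an increment that uses the tangent for smooth elementals but keeps abs exact, which yields the local piecewise linear model. The sketch below is only an illustration of that idea under my own naming; it is not code from [27], ADOL-C, or Tapenade, and the test function f and point x0 are made up.

```python
# Hedged sketch of piecewise linearization: propagate (reference value, increment),
# replacing smooth elementals by their tangents while keeping abs exact.
import math

class PL:
    """Pair (ref, inc): value at the reference point and the piecewise
    linear increment produced by a given input increment."""
    def __init__(self, ref, inc):
        self.ref, self.inc = ref, inc
    def __add__(self, other):
        other = other if isinstance(other, PL) else PL(other, 0.0)
        return PL(self.ref + other.ref, self.inc + other.inc)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, PL) else PL(other, 0.0)
        # tangent (product rule) of the smooth elemental u*v
        return PL(self.ref * other.ref,
                  self.ref * other.inc + other.ref * self.inc)
    __rmul__ = __mul__

def pl_abs(u):
    # abs is kept exact: increment |u_ref + du| - |u_ref|
    return PL(abs(u.ref), abs(u.ref + u.inc) - abs(u.ref))

def pl_sin(u):
    # smooth elemental: replaced by its tangent at the reference point
    return PL(math.sin(u.ref), math.cos(u.ref) * u.inc)

def f(x):                      # made-up test function with one kink source
    return pl_abs(pl_sin(x)) + 0.5 * x

out = f(PL(1.0, -0.3))         # reference point x0 = 1.0, increment dx = -0.3
print("f(x0) =", out.ref, "  piecewise linear model increment:", out.inc)
```

For a smooth function the propagated increment reduces to the usual tangent; it is the exact treatment of abs that makes the resulting local model piecewise linear rather than linear.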
“…The theory of analytical differentiation has been fruitfully applied to many areas of computer science, like deep learning [Baydin et al. 2017] (choosing weights in neural nets by gradient descent optimisation methods), algorithmic differentiation [Fiege et al. 2018; …] (providing techniques for efficient differentiation via source code transformation) and, recently, programming languages with first-class differentiation have begun to appear, see e.g. [Walter and Lehmann 2013].…”
Section: Introduction (mentioning)
confidence: 99%