2017
DOI: 10.1007/s10601-016-9267-5

Domain reduction techniques for global NLP and MINLP optimization

Abstract: Optimization solvers routinely utilize presolve techniques, including model simplification, reformulation and domain reduction techniques. Domain reduction techniques are especially important in speeding up convergence to the global optimum for challenging nonconvex nonlinear programming (NLP) and mixed-integer nonlinear programming (MINLP) optimization problems. In this work, we survey the various techniques used for domain reduction of NLP and MINLP optimization problems. We also present a computational analy…

Cited by 55 publications (39 citation statements). References 180 publications (223 reference statements).
“…Preprocessing ideas for gas network problems can be found in [38], including more complex procedures like pressure and flow propagation heuristics. For a broader overview of different preprocessing ideas for mixed-integer nonlinear programs, see the review article [41]. However, we cannot use most of the mentioned ideas as is since they are tailored towards nominal problems without uncertainty.…”
Section: Appendix A: Reducing Model Size By Preprocessing
confidence: 99%
“…Since the problems in (A5) are LPs, we also call this the LP-based relaxation preprocessing approach. In the literature, this type of preprocessing strategy is also known as feasibility-based bounds tightening; see [41]. We remark that in the case of demand uncertainty, the problems (A5) contain a description of the uncertainty set.…”
Section: Bounds Due To Linear Relaxations
confidence: 99%
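To make the feasibility-based bounds tightening named in this excerpt concrete, here is a minimal sketch of its cheapest form, which propagates bounds analytically through a single linear constraint a^T x <= b rather than solving LPs as the excerpt's (A5) does. The function name, constraint, and data below are illustrative assumptions, not taken from the surveyed paper or the citing work.

```python
# Hedged sketch: feasibility-based bounds tightening (FBBT) on one linear
# constraint a^T x <= b over the box [lower, upper]. All data are illustrative.
import numpy as np

def fbbt_linear(a, b, lower, upper):
    """Propagate a^T x <= b through the current box and return tightened bounds."""
    lower, upper = lower.copy(), upper.copy()
    # Minimal activity of each term a_i * x_i over its current domain.
    term_min = np.minimum(a * lower, a * upper)
    for j in range(len(a)):
        if a[j] == 0.0:
            continue
        # Slack available to x_j once every other term takes its minimal value.
        residual = b - (term_min.sum() - term_min[j])
        if a[j] > 0.0:
            upper[j] = min(upper[j], residual / a[j])
        else:
            lower[j] = max(lower[j], residual / a[j])
    return lower, upper

# Example: 2*x0 + x1 <= 4 with x0, x1 in [0, 10] tightens to x0 <= 2, x1 <= 4.
lo, up = fbbt_linear(np.array([2.0, 1.0]), 4.0,
                     np.array([0.0, 0.0]), np.array([10.0, 10.0]))
print(lo, up)
```

In practice such passes are iterated over all constraints until the bounds stop improving or a work limit is reached; the LP-based variant in the excerpt instead minimizes and maximizes each variable over the whole linear relaxation, which is tighter but more expensive.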
“…The application of GO algorithms using deterministic mathematical models allows one to obtain a solution for a given global optimality tolerance. As is well known, from a computational point of view the calculation of good lower and upper bounds is crucial for the success of any GO algorithm [23]. Sherali et al. [24] proposed an improved method to develop tight linear relaxations to calculate global lower bounds for a design problem associated with the water distribution network.…”
Section: Cost Item
confidence: 99%
“…OBBT solves a sequence of convex minimization and maximization problems over the variables that appear in nonconvex terms. The solutions to these problems tighten the domains of those variables and, in turn, the relaxation of the nonconvex terms [42,4,16,37]. Recent work has observed the effectiveness of applying OBBT in various applications [13,54,39].…”
Section: Introduction
confidence: 99%
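The following is a minimal sketch of one optimization-based bound tightening (OBBT) pass, assuming a single bilinear term w = x*y relaxed with McCormick envelopes plus one assumed model constraint w >= 6. It uses SciPy's linprog; every name, constraint, and number is illustrative rather than drawn from the surveyed paper or the citing works.

```python
# Hedged sketch: one OBBT sweep over the variables x and y that appear in the
# nonconvex term w = x*y, using an LP relaxation built from McCormick envelopes.
import numpy as np
from scipy.optimize import linprog

def mccormick_rows(xl, xu, yl, yu):
    """McCormick envelope of w = x*y over [xl, xu] x [yl, yu] as A z <= b, z = (x, y, w)."""
    A = np.array([
        [ yl,  xl, -1.0],   # w >= yl*x + xl*y - xl*yl
        [ yu,  xu, -1.0],   # w >= yu*x + xu*y - xu*yu
        [-yl, -xu,  1.0],   # w <= yl*x + xu*y - xu*yl
        [-yu, -xl,  1.0],   # w <= yu*x + xl*y - xl*yu
    ])
    b = np.array([xl * yl, xu * yu, -xu * yl, -xl * yu])
    return A, b

def obbt_pass(bounds, extra_A, extra_b):
    """Minimize and maximize x and y over the relaxation; return tightened bounds."""
    (xl, xu), (yl, yu), _ = bounds
    A_mc, b_mc = mccormick_rows(xl, xu, yl, yu)
    A = np.vstack([A_mc, extra_A])
    b = np.concatenate([b_mc, extra_b])
    new = list(bounds)
    for i in (0, 1):                       # tighten only x and y, not the auxiliary w
        c = np.zeros(3)
        c[i] = 1.0
        lo = linprog(c, A_ub=A, b_ub=b, bounds=new)    # min of variable i
        hi = linprog(-c, A_ub=A, b_ub=b, bounds=new)   # max of variable i
        if lo.success and hi.success:
            new[i] = (max(new[i][0], lo.fun), min(new[i][1], -hi.fun))
    return new

# Toy model: x, y in [0, 4], w in [0, 16], plus an assumed constraint w >= 6.
extra_A = np.array([[0.0, 0.0, -1.0]])
extra_b = np.array([-6.0])
print(obbt_pass([(0.0, 4.0), (0.0, 4.0), (0.0, 16.0)], extra_A, extra_b))
# The lower bounds of x and y rise from 0 to 1.5; rebuilding the McCormick
# envelope with these tighter domains then yields a tighter relaxation.
```

This mirrors the pattern the excerpt describes: the min/max subproblems tighten the variable domains, and the tighter domains feed back into a stronger relaxation of the nonconvex terms on the next pass.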