2018
DOI: 10.1007/s10957-018-1396-0
Deterministic Global Optimization with Artificial Neural Networks Embedded

Abstract: Artificial neural networks (ANNs) are used in various applications for data-driven black-box modeling and subsequent optimization. Herein, we present an efficient method for deterministic global optimization of ANN embedded optimization problems. The proposed method is based on relaxations of algorithms using McCormick relaxations in a reduced-space [SIOPT, 20 (2009), pp. 573-601] including the convex and concave envelopes of the nonlinear activation function of ANNs. The optimization problem is solv…
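The envelope construction the abstract refers to can be illustrated for tanh: since tanh is convex for x ≤ 0 and concave for x ≥ 0, its convex envelope on an interval follows tanh up to a tangency point and then a straight line to the upper endpoint. The sketch below is an illustrative reconstruction of that standard tangent-line construction, not the authors' implementation; the function name and the bisection tolerance are my own choices.

```python
import math

def tanh_convex_envelope(x_lo, x_hi):
    """Convex envelope of tanh on [x_lo, x_hi], returned as a callable.

    For x_lo < 0 < x_hi the envelope follows tanh up to a tangency point
    xt <= 0, then the tangent line from (xt, tanh(xt)) to (x_hi, tanh(x_hi)).
    """
    def dtanh(x):
        return 1.0 - math.tanh(x) ** 2

    def gap(xt):
        # tangency condition: tanh'(xt) equals the chord slope from xt to x_hi
        return dtanh(xt) - (math.tanh(x_hi) - math.tanh(xt)) / (x_hi - xt)

    if x_hi <= 0:
        return math.tanh                      # tanh itself is convex here
    if x_lo >= 0 or gap(x_lo) >= 0:
        # purely concave region, or no interior tangency: secant is the envelope
        slope = (math.tanh(x_hi) - math.tanh(x_lo)) / (x_hi - x_lo)
        return lambda x: math.tanh(x_lo) + slope * (x - x_lo)

    # bisection for the tangency point: gap(x_lo) < 0 and gap(0) > 0
    lo, hi = x_lo, 0.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if gap(mid) < 0:
            lo = mid
        else:
            hi = mid
    xt = 0.5 * (lo + hi)
    slope = dtanh(xt)
    return lambda x: math.tanh(x) if x <= xt else math.tanh(xt) + slope * (x - xt)
```

The concave envelope follows by symmetry (tanh is odd); such envelopes are what make the relaxations tight compared with generic interval bounds.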

Cited by 172 publications (130 citation statements)
References 70 publications
“…We are thus compelled to search for a good tradeoff between model depth and optimization efficiency. The same observation was reported in [55,36] for global optimization of DNNs, supporting the claim that deeper networks are more challenging to optimize. To conclude, we observe, as others have before us and as theory predicts, that the feasibility of using the MILP formulation quickly fades with increasing network sizes.…”
Section: Discussion (supporting)
confidence: 81%
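The MILP formulations this excerpt refers to typically encode each ReLU neuron with one binary variable and big-M constraints, which is why tractability fades as networks deepen: the binary count grows with the neuron count. A minimal sketch of the standard big-M encoding (function names are placeholders, not from the cited works); with a valid bound M ≥ |a| on the pre-activation a, the constraints pin the neuron output y to max(0, a):

```python
def relu_bigM_feasible_y(a, z, M):
    """Feasible interval for y under the standard big-M ReLU constraints:
    y >= a, y >= 0, y <= a + M*(1 - z), y <= M*z, with z binary.
    Returns (lo, hi), or None if infeasible for this z."""
    lo = max(a, 0.0)
    hi = min(a + M * (1 - z), M * z)
    return (lo, hi) if lo <= hi + 1e-12 else None

def relu_from_encoding(a, M):
    """Enumerate z in {0, 1}; for valid M the feasible interval collapses
    to the single point max(0, a)."""
    values = []
    for z in (0, 1):
        box = relu_bigM_feasible_y(a, z, M)
        if box is not None:
            values.append(box[0])  # lo == hi when M is a valid bound
    return values
```

Loose M values weaken the LP relaxation, which is one reason the bound-tightening quality matters as much as the encoding itself.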
“…We impose the annual operation cost C_operation as a constraint and minimize the permeate concentration. [55] which is favorable for flowsheet problems [38,56] and optimization problems with ANNs embedded [20,21,35,36,57]. We also considered a full-space formulation with the solver BARON (version 18.5.8 using default options) [51] in GAMS [58] (version 25.1.1).…”
Section: Numerical Optimization Approach (mentioning)
confidence: 99%
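The reduced-space formulation contrasted here with the full-space BARON model keeps only the ANN inputs as decision variables and propagates bounds through the network, instead of introducing one variable and one constraint per neuron. The toy sketch below propagates interval bounds through a small tanh network; it illustrates the idea only (McCormick relaxations would be tighter) and is not the solver setup used in the cited work — all names are mine.

```python
import math

def interval_affine(W, b, boxes):
    """Propagate input intervals through y = W x + b via interval arithmetic."""
    out = []
    for row, bias in zip(W, b):
        lo = hi = bias
        for w, (xl, xu) in zip(row, boxes):
            lo += min(w * xl, w * xu)
            hi += max(w * xl, w * xu)
        out.append((lo, hi))
    return out

def interval_tanh(boxes):
    """tanh is monotone increasing, so endpoint evaluation is exact."""
    return [(math.tanh(lo), math.tanh(hi)) for lo, hi in boxes]

def ann_output_bounds(layers, input_box):
    """Bounds on a tanh network's outputs over an input box.

    layers: list of (W, b) pairs; tanh follows every layer but the last.
    """
    boxes = input_box
    for i, (W, b) in enumerate(layers):
        boxes = interval_affine(W, b, boxes)
        if i < len(layers) - 1:
            boxes = interval_tanh(boxes)
    return boxes
```

Only the input box appears as degrees of freedom; everything downstream is evaluated, which is what shrinks the branch-and-bound search space in reduced-space methods.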
“…For layer-by-layer (LbL) nanofiltration membranes, we demonstrated in our recent work that an artificial neural network (ANN) approach can predict ion separation properties and pure water flux directly based on synthesis protocols, i.e., the fabrication parameters of the membrane [20]. Contrary to conventional screening procedures or educated guess experimental design, we developed a deterministic global optimization method [20,21] that is applied in this work to enable optimal tailoring of synthesis protocols towards desired membrane retention and permeability in process optimization. Moreover, this framework is versatile enough to compute the resulting trade-off boundary (Pareto front) between permeability and retention.…”
Section: Introduction (mentioning)
confidence: 99%
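The trade-off boundary (Pareto front) mentioned in this excerpt can be traced with a standard scalarization such as the epsilon-constraint method: minimize one objective while bounding the other, sweep the bound, and discard dominated points. A generic sketch on a finite candidate set (the objectives and names below are placeholders, not the membrane model from the cited work):

```python
def pareto_front_eps(f1, f2, candidates, n_eps=20):
    """Epsilon-constraint scalarization over a finite candidate set:
    minimize f1(x) subject to f2(x) <= eps, sweeping eps over the
    range of f2, then filter out dominated points."""
    v2 = [f2(x) for x in candidates]
    eps_grid = [min(v2) + (max(v2) - min(v2)) * i / (n_eps - 1)
                for i in range(n_eps)]
    pts = set()
    for eps in eps_grid:
        feas = [x for x in candidates if f2(x) <= eps + 1e-12]
        if feas:
            best = min(feas, key=f1)
            pts.add((round(f1(best), 12), round(f2(best), 12)))
    # keep only nondominated (Pareto-optimal) points
    front = [p for p in pts
             if not any(q != p and q[0] <= p[0] and q[1] <= p[1] for q in pts)]
    return sorted(front)
```

In the deterministic-global setting, each epsilon-constrained subproblem would itself be solved to global optimality, so every front point carries a rigorous optimality certificate.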
“…Together with other classes of approximators, both these kinds of networks are applied extensively by the so-called Extended Ritz Method [43], [44], [45], which yields suboptimal solutions to functional optimization problems with guarantees on their performance. Like Gaussian RBF networks, feedforward neural networks with sigmoidal computational units can also be applied to perform surrogate optimization [46]; d) in order to use a coarser grid while still allowing a precise evaluation of the band-gap amplitude, machine-learning methods could be applied for dispersion-curve identification in the presence of curve intersections [47].…”
Section: Conclusion and Further Research Directions (mentioning)
confidence: 99%