2004
DOI: 10.1023/b:jogo.0000044768.75992.10
Computational Experience with a New Class of Convex Underestimators: Box-constrained NLP Problems

Cited by 55 publications (39 citation statements)
References 21 publications
“…Thus, an optimal function P (r) can be determined as the one for which the transition from the old probabilities (3) to the new probabilities (4), (6) can be described by the (fractionally linear) Bayes formula (8).…”
Section: Main Idea
confidence: 99%
“…There are numerous effective global optimization techniques that reduce the general global optimization problems to convex ones; see, e.g., [17,38]. Empirically, among these techniques, the best are αBB method [4,5,17,27] and its modifications recently proposed in [6,7]. It turns out [18] that this empirical optimality can also be explained via shift-and scale-invariance.…”
Section: Other Examples
confidence: 99%
“…Adjiman et al [5] presented the detailed implementation of the alpha BB approach and computational studies in process design problems such as heat exchanger networks, reactor-separator networks, and batch design under uncertainty. Akrotirianakis and Floudas [6] presented computational results of the new class of convex underestimators embedded in a branch-and-bound framework for box-constrained NLPs. They also proposed a hybrid global optimization method that includes the random-linkage stochastic approach with the aim of improving the computational performance.…”
Section: Introduction
confidence: 99%
“…In other words, the optimal generalized αBB scheme is either the original αBB, or the scheme with exponential functions described in [3,4]. Thus, we have answers to both above questions:…”
Section: Case Study: Selecting Convex Underestimators
confidence: 99%
“…In [3,4], several different non-linear functions have been tried, and it turned out that among the tested functions, the best results were achieved for the exponential functions g i (x i ) = exp(γ i · x i ) and h i (x i ) = − exp(−γ i · x i ). For these functions, the expression (2) can be somewhat simplified: indeed,…”
Section: Case Study: Selecting Convex Underestimators
confidence: 99%
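The exponential underestimators quoted above, built from g_i(x_i) = exp(γ_i·x_i) and h_i(x_i) = −exp(−γ_i·x_i), can be contrasted with the classical αBB quadratic term in a minimal one-dimensional sketch. This is an illustrative assumption-laden example, not the implementation from the cited papers: the test function sin(x), the box [0, 2π], and the parameter values α = 0.5 and γ = 0.8 are chosen here only so that both underestimators are convex on the box (f″ ≥ −1, and 2α ≥ 1, 2γ² ≥ 1).

```python
import math

def alpha_bb(f, alpha, xL, xU):
    """Classical alphaBB underestimator (1-D sketch): adds the quadratic
    alpha*(xL - x)*(xU - x), which is non-positive on [xL, xU], so the
    result underestimates f there; a large enough alpha makes it convex."""
    return lambda x: f(x) + alpha * (xL - x) * (xU - x)

def exp_bb(f, gamma, xL, xU):
    """Exponential-type underestimator (1-D sketch): subtracts the product
    (1 - e^{gamma*(x - xL)}) * (1 - e^{gamma*(xU - x)}), which is
    non-negative on [xL, xU], so the result underestimates f there."""
    return lambda x: f(x) - (1.0 - math.exp(gamma * (x - xL))) \
                          * (1.0 - math.exp(gamma * (xU - x)))

if __name__ == "__main__":
    f = math.sin                      # illustrative test function
    xL, xU = 0.0, 2.0 * math.pi      # illustrative box
    La = alpha_bb(f, 0.5, xL, xU)    # f'' >= -1, so alpha = 0.5 suffices
    Le = exp_bb(f, 0.8, xL, xU)      # 2 * 0.8**2 >= 1, so gamma = 0.8 suffices
    xs = [xL + i * (xU - xL) / 200 for i in range(201)]
    # Both constructions stay at or below f everywhere on the box.
    assert all(La(x) <= f(x) + 1e-12 for x in xs)
    assert all(Le(x) <= f(x) + 1e-12 for x in xs)
```

Both terms vanish at the box endpoints, so each underestimator matches f there; the convexity conditions on α and γ are what make the underestimators usable inside a branch-and-bound scheme.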