2019
DOI: 10.1007/s10898-019-00746-5
A proximal method for solving nonlinear minmax location problems with perturbed minimal time functions via conjugate duality

Abstract: We investigate via a conjugate duality approach general nonlinear minmax location problems formulated by means of an extended perturbed minimal time function, necessary and sufficient optimality conditions being delivered together with characterizations of the optimal solutions in some particular instances. A parallel splitting proximal point method is employed in order to numerically solve such problems and their duals. We present the computational results obtained in MATLAB on concrete examples, successfully…

Cited by 6 publications (5 citation statements)
References 27 publications
“…In this section, we apply Theorem 3.1 to the optimality conditions of a constrained convex minmax location problem with perturbed minimal time functions and set-up costs. Such problems were recently investigated via conjugate duality approach; see [10] for more details.…”
Section: Applications
confidence: 99%
“…Remark 4.1. Note that the decomposition of the objective function of problem (M L P) is completely different from that of [10], that is, the special construction of g, h (1) Let (x 1 , ...,…”
Section: Where
confidence: 99%
“…The other two applications, one in Medical Imaging (more precisely in Tomography) and one in Machine Learning (Support Vector Machines) were discussed in [10], too, and we compare the performance of our algorithm to the stochastic version of the method introduced there. We use the proximal points of the smoothed objective functions instead of their subgradients, motivated also by the fact (noted, for instance, in [18]) that proximal point algorithms tend to solve certain optimization problems faster and cheaper than subgradient methods. To this end we smooth the involved functions in the second and third application with the Moreau-envelope, in the first application with Nesterov's smoothing approach.…”
Section: Applications
confidence: 99%
“…Multi-composed optimization problems deal with optimization models whose objective functions are written as the compositions of more than two functions (see [1,2,3]). In fact, the study of multi-composed optimization problems has been a subject matter of great interest because this new class of mathematical optimization models can be applied to many practical problems that arise in different fields of modern research, such as deep learning [4], facility location theory [5,6], fractional programming problems, and entropy optimization [2], etc.…”
Section: Introduction
confidence: 99%