We identify a class of root-searching methods that, perhaps surprisingly, outperform the bisection method in average performance while retaining minmax optimality. The improvement in average performance holds under any continuous distributional hypothesis. We also pinpoint one specific method within the class and show that, under mild initial conditions, it can attain an order of convergence of up to 1.618, i.e., the same as the secant method. Hence, we attain both improved average performance and an improved order of convergence at no cost to the minmax optimality of the bisection method. Numerical experiments show that, on regular functions, the proposed method requires a number of function evaluations similar to current state-of-the-art methods, about 24% to 37% of the evaluations required by the bisection procedure. On problems with non-regular functions, the proposed method performs significantly better than the state of the art, requiring on average 82% of the total evaluations required by the bisection method, while the other methods were outperformed by bisection. In the worst case, while current state-of-the-art commercial solvers required two to three times the number of function evaluations of bisection, our proposed method remained within the minmax bounds of the bisection method.
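The key idea described above — combining interpolation with a safeguard that preserves bisection's worst-case guarantee — can be illustrated with a minimal sketch. This is not the paper's actual method; it is a hypothetical illustration in which a secant (regula falsi) estimate is clipped to a neighborhood of the midpoint, so the bracket is guaranteed to contract by a fixed factor each iteration while interpolation accelerates typical cases:

```python
import math

def safeguarded_bracketing(f, a, b, eps=1e-8, max_iter=200):
    """Illustrative sketch (not the paper's method): try a secant
    estimate from the bracket endpoints, but clip it to within a
    quarter-interval of the midpoint.  The surviving bracket then
    shrinks by at least a factor of 0.75 per iteration, giving a
    bisection-style worst-case bound."""
    fa, fb = f(a), f(b)
    assert fa * fb <= 0, "the root must be bracketed: f(a)*f(b) <= 0"
    for _ in range(max_iter):
        if b - a <= eps:
            break
        mid = 0.5 * (a + b)
        # secant (regula falsi) estimate through (a, fa) and (b, fb)
        x = (a * fb - b * fa) / (fb - fa)
        # safeguard: keep the trial point near the midpoint
        delta = 0.25 * (b - a)
        x = min(max(x, mid - delta), mid + delta)
        fx = f(x)
        if fx == 0.0:
            return x
        # keep the half-bracket that still contains the sign change
        if fa * fx < 0:
            b, fb = x, fx
        else:
            a, fa = x, fx
    return 0.5 * (a + b)
```

On a smooth function the clipped secant step typically lands close to the root well before the worst-case bound is reached, which is the behavior the abstract's class of methods exploits.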
Current state-of-the-art multi-objective optimization solvers, by computing the gradients of all m objective functions per iteration, produce after n iterations a measure of proximity to critical conditions that is upper-bounded by O(1/√n) when the objective functions are assumed to have L-Lipschitz continuous gradients; i.e., they require O(m/ε²) gradient and function computations to bring a measure of proximity to critical conditions below some target ε. We reduce this to O(1/ε²) with a method that requires only a constant number of gradient and function computations per iteration; thus, we obtain for the first time a multi-objective descent-type method with a query-complexity cost that is unaffected by increasing values of m. For this, a brand-new multi-objective descent direction is identified, which we name the central descent direction, and an incremental approach is proposed. Robustness properties of the central descent direction are established, measures of proximity to critical conditions are derived, and the incremental strategy for finding solutions to the multi-objective problem is shown to attain convergence properties unattained by previous methods. To the best of our knowledge, this is the first method to achieve this with no additional a-priori information on the structure of the problem, such as that used by scalarizing techniques, and with no pre-known information on the regularity of the objective functions other than Lipschitz continuity of the gradients.
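To see why the classical per-iteration cost scales with m, consider the standard common-descent direction used by earlier methods (the min-norm point in the convex hull of all m gradients, in the style of Fliege–Svaiter/MGDA — shown here for contrast, not the paper's central descent direction). Even in the closed-form two-objective case, every iteration consumes both gradients:

```python
import numpy as np

def common_descent_direction(g1, g2):
    """Classical min-norm common-descent direction for m = 2 objectives
    (closed form): minimize ||t*g1 + (1-t)*g2|| over t in [0, 1] and
    step against the minimizer.  Illustrates that classical methods
    need all m gradients at every iteration."""
    diff = g2 - g1
    denom = float(diff @ diff)
    # optimal convex weight; midpoint if the gradients coincide
    t = 0.5 if denom == 0.0 else float(np.clip((g2 @ diff) / denom, 0.0, 1.0))
    d = t * g1 + (1.0 - t) * g2
    return -d  # descent direction common to both objectives
```

With m objectives the analogous min-norm problem is a quadratic program over the m-simplex, so the gradient and per-iteration query cost both grow with m — exactly the dependence the abstract's incremental method removes.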
Backtracking is an inexact line-search procedure that selects the first value in the sequence x₀, x₀β, x₀β², … that satisfies f(x) ≤ 0, for f defined on ℝ⁺ with f(x) ≤ 0 iff x ≤ x*. This procedure is widely used in descent-direction optimization algorithms with Armijo-type conditions. It both returns an estimate in (βx*, x*] and enjoys an upper bound of ⌈log_β(ε/x₀)⌉ on the number of function evaluations to terminate, with ε a lower bound on x*. The basic bracketing mechanism employed in several root-searching methods is adapted here for the purpose of performing inexact line searches, leading to a new class of inexact line-search procedures. The traditional bisection algorithm for root searching is transposed into a very simple method that completes the same inexact line search in at most ⌈log₂ log_β(ε/x₀)⌉ function evaluations. A recent bracketing algorithm for root searching, which presents both minmax function-evaluation cost (as the bisection algorithm) and superlinear convergence, is also transposed, asymptotically requiring ∼ log log log(ε/x₀) function evaluations for sufficiently smooth functions. Other bracketing algorithms for root searching can be adapted in the same way. Numerical experiments suggest time savings of 50% to 80% in each call to the inexact search procedure.

CCS Concepts: • Mathematics of computing → Solvers; Numerical analysis.
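The transposition described above can be sketched concretely: classical backtracking scans the exponent k linearly, while the bisection-based variant brackets k and halves the bracket, replacing O(k*) evaluations with O(log k*). This is an illustrative sketch under the stated assumption f(x) ≤ 0 iff x ≤ x*; the exponent cap `kmax` is a hypothetical stand-in for the bound derived from ε and x₀:

```python
def backtracking(f, x0, beta=0.5):
    """Classical backtracking: return x0 * beta**k for the first
    exponent k with f(x0 * beta**k) <= 0 (one evaluation per step)."""
    k = 0
    while f(x0 * beta**k) > 0:
        k += 1
    return x0 * beta**k

def bisected_backtracking(f, x0, beta=0.5, kmax=60):
    """Sketch of the transposed search: bisect on the exponent k
    rather than scanning it.  Assumes f(x0 * beta**kmax) <= 0, i.e.
    kmax plays the role of the bound log_beta(eps/x0)."""
    if f(x0) <= 0:          # accept the initial step immediately
        return x0
    lo, hi = 0, kmax        # invariant: f at exponent lo > 0, at hi <= 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if f(x0 * beta**mid) <= 0:
            hi = mid
        else:
            lo = mid
    return x0 * beta**hi    # first exponent in the sequence that passes
```

Both procedures return the same point; the bisected version simply reaches it in logarithmically many evaluations of f, which is the source of the reported 50% to 80% savings per call.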