This paper proposes and develops a new Newton-type algorithm to solve subdifferential inclusions defined by subgradients of extended-real-valued prox-regular functions. The proposed algorithm is formulated in terms of the second-order subdifferential of such functions, which enjoys extensive calculus rules and can be efficiently computed for broad classes of extended-real-valued functions. Based on this and on metric regularity and subregularity properties of subgradient mappings, we establish verifiable conditions ensuring the well-posedness of the proposed algorithm and its local superlinear convergence. The obtained results are also new for the class of equations defined by continuously differentiable functions with Lipschitzian gradients (C^{1,1} functions), which is the underlying case of our consideration. The developed algorithm for prox-regular functions is formulated in terms of proximal mappings related to Moreau envelopes. Besides numerous illustrative examples and comparisons with known algorithms for C^{1,1} functions and generalized equations, the paper presents applications of the proposed algorithm to the practically important class of Lasso problems arising in statistics and machine learning.
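To make the two objects named in this abstract concrete, the following is a minimal numerical sketch (not the paper's algorithm) of the proximal mapping and the Moreau envelope for the simplest prox-regular example f(x) = |x|; the helper names prox_abs and moreau_env_abs are ours for illustration. For this f, the proximal mapping is the classical soft-thresholding operator and the envelope is the Huber function, a C^{1,1} smoothing of |x|, which illustrates why the C^{1,1} case is the underlying one.

```python
import numpy as np

def prox_abs(x, gamma):
    """Proximal mapping of f(y) = |y|:
    prox_{gamma f}(x) = argmin_y { |y| + (1/(2*gamma)) * (y - x)**2 },
    which is the classical soft-thresholding operator."""
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

def moreau_env_abs(x, gamma):
    """Moreau envelope e_{gamma} f(x) = min_y { |y| + (1/(2*gamma)) * (y - x)**2 },
    evaluated at the proximal point; for f = |.| this is the Huber function,
    a C^{1,1} smoothing of |x|."""
    p = prox_abs(x, gamma)
    return np.abs(p) + (p - x) ** 2 / (2.0 * gamma)

x = np.linspace(-2.0, 2.0, 5)
print(prox_abs(x, 0.5))        # soft-thresholded values
print(moreau_env_abs(x, 0.5))  # smooth approximation of |x|
```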
This paper proposes and justifies two new globally convergent Newton-type methods to solve unconstrained and constrained problems of nonsmooth optimization by using tools of variational analysis and generalized differentiation. Both methods are coderivative-based and employ generalized Hessians (coderivatives of subgradient mappings) associated with objective functions, which are either of class C^{1,1} or are represented in the form of convex composite optimization, where one of the terms may be extended-real-valued. The proposed globally convergent algorithms are of two types. The first one extends the damped Newton method and requires positive-definiteness of the generalized Hessians for its well-posedness and efficient performance, while the other algorithm is of the Levenberg-Marquardt type and is well-defined when the generalized Hessians are merely positive-semidefinite. The obtained convergence rates for both methods are at least linear but become superlinear under the so-called semismooth* property of subgradient mappings. Problems of convex composite optimization are investigated with and without the strong convexity assumption on the smooth parts of objective functions by implementing the machinery of forward-backward envelopes. Numerical experiments are conducted for a basic class of Lasso problems, providing performance comparisons of the new algorithms with some other first-order and second-order methods that are highly recognized in nonsmooth optimization.
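As a hedged illustration of the two algorithm types described in this abstract, the sketch below implements a generic damped Newton iteration with Armijo backtracking; a positive lm_reg adds a Levenberg-Marquardt-style shift H + lm_reg*I that keeps the linear system solvable when the Hessian is only positive-semidefinite. It uses the ordinary Hessian of a smooth test function in place of the paper's generalized Hessians (coderivatives of subgradient mappings); the function names, parameters, and test problem are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def damped_newton(f, grad, hess, x, tol=1e-8, max_iter=100,
                  sigma=1e-4, beta=0.5, lm_reg=0.0):
    """Generic damped Newton method with Armijo backtracking line search.
    hess(x) stands in for a generalized Hessian; lm_reg > 0 gives a
    Levenberg-Marquardt-style safeguard for the positive-semidefinite case."""
    x = np.asarray(x, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        H = hess(x) + lm_reg * np.eye(x.size)
        d = np.linalg.solve(H, -g)  # (regularized) Newton direction
        t = 1.0
        # Armijo backtracking: damp the step until sufficient decrease holds
        while f(x + t * d) > f(x) + sigma * t * g.dot(d):
            t *= beta
        x = x + t * d
    return x

# Illustrative smooth test problem with minimizer at the origin.
f    = lambda x: 0.5 * x.dot(x) + 0.25 * x[0] ** 4
grad = lambda x: x + np.array([x[0] ** 3, 0.0])
hess = lambda x: np.eye(2) + np.diag([3.0 * x[0] ** 2, 0.0])
print(damped_newton(f, grad, hess, np.array([2.0, -1.0])))
```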
The paper proposes and develops new globally convergent algorithms of the generalized damped Newton type for solving important classes of nonsmooth optimization problems. These algorithms are based on the theory and calculations of second-order subdifferentials of nonsmooth functions, employing the machinery of second-order variational analysis and generalized differentiation. First, we develop a globally superlinearly convergent damped Newton-type algorithm for the class of continuously differentiable functions with Lipschitzian gradients, which are nonsmooth of second order. Then we design such a globally convergent algorithm to solve a class of nonsmooth convex composite problems with extended-real-valued cost functions, which typically arise in machine learning and statistics. Finally, the obtained algorithmic developments and justifications are applied to solving a major class of Lasso problems with detailed numerical implementations. We present the results of numerical experiments and compare the performance of our main algorithm applied to Lasso problems with those achieved by other first-order and second-order methods.
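For context on the Lasso experiments mentioned above, here is a minimal sketch of one standard first-order baseline that such performance comparisons typically include: proximal gradient descent (ISTA) with soft-thresholding for min_x 0.5*||Ax - b||^2 + lam*||x||_1. This is a textbook method, not the paper's generalized damped Newton algorithm, and the data, names, and parameters below are illustrative.

```python
import numpy as np

def ista_lasso(A, b, lam, max_iter=500, tol=1e-8):
    """Proximal gradient (ISTA) for the Lasso problem
    min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A.T @ A, 2)  # 1/L, L = Lipschitz constant of the gradient
    for _ in range(max_iter):
        z = x - step * (A.T @ (A @ x - b))   # forward (gradient) step
        x_new = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # backward (prox) step
        if np.linalg.norm(x_new - x) <= tol:
            return x_new
        x = x_new
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.concatenate([np.ones(3), np.zeros(47)])
b = A @ x_true + 0.01 * rng.standard_normal(20)
print(np.count_nonzero(ista_lasso(A, b, lam=0.5)))  # recovers a sparse solution
```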
This paper proposes and develops a new Newton-type algorithm to solve subdifferential inclusions defined by subgradients of extended-real-valued prox-regular functions. The proposed algorithm is formulated in terms of the second-order subdifferential of such functions, which enjoys extensive calculus rules and can be efficiently computed for broad classes of extended-real-valued functions. Based on this and on the metric regularity and subregularity properties of subgradient mappings, we establish verifiable conditions ensuring the well-posedness of the proposed algorithm and its local superlinear convergence. The obtained results are also new for the class of equations defined by continuously differentiable functions with Lipschitzian gradients (C^{1,1} functions), which is the underlying case of our consideration. The developed algorithms for prox-regular functions and their extension to a structured class of composite functions are formulated in terms of proximal mappings and forward-backward envelopes. Besides numerous illustrative examples and comparisons with known algorithms for C^{1,1} functions and generalized equations, the paper presents applications of the proposed algorithms to regularized least-squares problems arising in statistics, machine learning, and related disciplines. Funding: Research of P. D. Khanh is funded by Ho Chi Minh City University of Education Foundation for Science and Technology [Grant CS.2022.19.20TD]. Research of B. Mordukhovich and V. T. Phat was partly supported by the U.S. National Science Foundation [Grants DMS-1808978 and DMS-2204519]. The research of B. Mordukhovich was also supported by the Air Force Office of Scientific Research [Grant 15RT0462] and the Australian Research Council under Discovery Project DP-190100555. Supplemental Material: The e-companion is available at https://doi.org/10.1287/moor.2022.1320.
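Since this version of the abstract also brings in forward-backward envelopes, the sketch below evaluates that envelope in the Lasso setting directly from its definition as the minimum of a partially linearized model of g + h; the minimizer of that model is exactly the forward-backward (proximal gradient) step. The function name and setting are our illustrative assumptions; minimizing this everywhere-finite envelope is one standard route for applying second-order methods to such composite problems.

```python
import numpy as np

def fbe_lasso(A, b, lam, x, gamma):
    """Forward-backward envelope of phi = g + h with g(x) = 0.5*||Ax - b||^2
    and h(x) = lam*||x||_1:
      phi_gamma(x) = min_y { g(x) + <grad g(x), y - x>
                             + (1/(2*gamma)) * ||y - x||^2 + h(y) },
    where the minimizing y is the forward-backward step
    y = prox_{gamma h}(x - gamma * grad g(x))."""
    g_val  = 0.5 * np.sum((A @ x - b) ** 2)
    g_grad = A.T @ (A @ x - b)
    z = x - gamma * g_grad
    y = np.sign(z) * np.maximum(np.abs(z) - gamma * lam, 0.0)  # prox step
    return (g_val + g_grad.dot(y - x)
            + np.sum((y - x) ** 2) / (2.0 * gamma)
            + lam * np.sum(np.abs(y)))

rng = np.random.default_rng(1)
A = rng.standard_normal((10, 25))
b = rng.standard_normal(10)
print(fbe_lasso(A, b, lam=0.1, x=np.zeros(25), gamma=0.5))
```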