2022
DOI: 10.1007/s10107-021-01742-y

Stochastic first-order methods for convex and nonconvex functional constrained optimization

Abstract: The monotone Variational Inequality (VI) is a general model that has important applications in various engineering and scientific domains. In numerous instances, the VI problems are accompanied by function constraints which can possibly be data-driven, making the usual projection operator challenging to compute. In this paper, we present novel first-order methods for the function constrained VI (FCVI) problem under various settings, including smooth or nonsmooth problems with stochastic operator and/or stochas…

Citation Types: 1 supporting, 31 mentioning, 0 contrasting

Cited by 34 publications (32 citation statements); References 53 publications

“…Consider first the case in which $z_k$ is the exact solution of (5). Indeed, in this case, it follows from (6) and the optimality condition of (5) that $0 \in \nabla f(z_k) + \partial h(z_k) + A^{*} p_k + (z_k - z_{k-1})/\lambda$, and hence that $(\hat z_k, \hat v_k, \hat p_k) := (z_k, (z_{k-1} - z_k)/\lambda, p_k)$ satisfies the above inclusion. Assume now that $z_k$ is an approximate solution of (5) in the sense that there exists a residual pair $(v_k, \varepsilon_k)$ which together with $z_k$ satisfies the approximation criterion (32) below.…”
mentioning, confidence: 96%
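
The subproblem (5) and relation (6) referenced in this excerpt are not reproduced in the report. The following is a minimal sketch of the reasoning, assuming (5) is a proximal step with the multiplier $p_k$ held fixed; this assumed form is hypothetical and labeled as such in the comments.

```latex
% Assumed form of subproblem (5): a proximal step with multiplier p_k held fixed
% (an assumption; the excerpt does not reproduce (5) or (6)).
\[
  z_k \;=\; \operatorname*{arg\,min}_{z}\;
      f(z) + h(z) + \langle p_k, A z \rangle
      + \tfrac{1}{2\lambda}\,\|z - z_{k-1}\|^2 .
\]
% Its first-order optimality condition is exactly the inclusion quoted above:
\[
  0 \;\in\; \nabla f(z_k) + \partial h(z_k) + A^{*} p_k
      + \tfrac{1}{\lambda}\,(z_k - z_{k-1}),
\]
% and rearranging gives
\[
  \hat v_k := \tfrac{1}{\lambda}\,(z_{k-1} - z_k)
  \;\in\; \nabla f(\hat z_k) + \partial h(\hat z_k) + A^{*} \hat p_k ,
\]
% which is presumably the "above inclusion" that the triple
% (\hat z_k, \hat v_k, \hat p_k) is said to satisfy.
```
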
“…Finally, the third one [6] studies a primal-dual proximal-point-type method for computing an approximate stationary solution to a constrained smooth nonconvex composite optimization problem, and establishes its iteration-complexity bounds under different sets of assumptions.…”
mentioning, confidence: 99%
“…Our main contributions are briefly summarized as follows. Firstly, inspired by the constraint-extrapolation (ConEx) method for function-constrained convex optimization in [4], we develop a novel constraint-extrapolated conditional gradient (CoexCG) method for solving problem (1.1). While both methods are single-loop primal-dual type methods for solving convex optimization problems with function constraints, CoexCG only requires us to minimize a linear function, rather than to perform a projection, over X.…”
mentioning, confidence: 99%
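
To illustrate the distinction this excerpt draws, here is a minimal Python sketch of a generic conditional-gradient (Frank-Wolfe) step over the probability simplex; it is not the CoexCG method itself, and the oracle, feasible set, and step-size rule are illustrative assumptions.

```python
import numpy as np

def lmo_simplex(g):
    """Linear minimization oracle over the probability simplex:
    argmin over the simplex of <g, x> is attained at a vertex."""
    e = np.zeros_like(g, dtype=float)
    e[np.argmin(g)] = 1.0
    return e

def conditional_gradient_step(x, grad, k):
    """One Frank-Wolfe-style update: a single LMO call, no projection."""
    s = lmo_simplex(grad)        # cheap: pick the minimizing vertex
    gamma = 2.0 / (k + 2.0)      # standard open-loop step size
    return (1.0 - gamma) * x + gamma * s

# By contrast, a projected-gradient step needs a Euclidean projection onto X;
# when X is cut out by data-driven function constraints, that projection may
# itself be an expensive optimization problem.
```
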
“…We also extend CoexDurCG for solving the aforementioned structured nonsmooth problems, and demonstrate that it is not necessary to explicitly define the smooth approximation problem. We note that this technique of adding a diminishing regularization term can be applied to problems with either an unbounded primal feasible region (e.g., stochastic subgradient descent [29] and stochastic accelerated gradient descent [19]) or an unbounded dual feasible region (e.g., ConEx [4]), for which one often requires the number of iterations to be fixed in advance.…”
mentioning, confidence: 99%
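
As a rough illustration of the diminishing-regularization idea mentioned here (a sketch under stated assumptions, not the CoexDurCG update itself): one can add a vanishing quadratic to the dual prox step so that the dual iterates stay bounded even when the dual feasible region is unbounded. In the sketch below, $\tilde g(x_k)$ stands for some estimate of the constraint values, and $\tau_k$, $\eta_k$ are hypothetical prox and regularization parameters.

```latex
% Hypothetical regularized dual update (illustrative only):
\[
  y_{k+1} \;=\; \operatorname*{arg\,max}_{y \ge 0}\;
      \langle y,\, \tilde g(x_k) \rangle
      \;-\; \tfrac{\tau_k}{2}\,\|y - y_k\|^2
      \;-\; \tfrac{\eta_k}{2}\,\|y\|^2 ,
  \qquad \eta_k \downarrow 0 .
\]
% The vanishing term (eta_k/2)||y||^2 keeps the dual iterates bounded without
% tying the step-size choice to a total iteration count fixed in advance.
```
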