2015
DOI: 10.1007/s40314-015-0228-1
Supermemory gradient methods for monotone nonlinear equations with convex constraints

Cited by 5 publications (9 citation statements)
References 21 publications
“…It should be pointed out that, for the existing derivative-free projection methods for solving problem (1), Assumption A2 or A2' can always be verified to be satisfied; see [13,1,12,7,17,25,27,11,3,28,2,16] for example.…”
Section: Algorithm Model (AM)
Citation type: mentioning
Confidence: 99%
“…Since derivative-free projection methods do not involve any matrix computation or storage, they are particularly effective for solving large-scale nonlinear systems of equations. Therefore, this kind of method has attracted much attention and many numerical methods have been proposed; see for example [13,1,12,7,17,25,27,11,3,28] and the references therein.…”
Citation type: mentioning
Confidence: 99%
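The statement above captures the practical appeal of derivative-free projection methods: each iteration only evaluates the residual mapping and a projection onto the feasible set, so no Jacobian is formed or stored. As a rough illustration of that general framework (a minimal Solodov-Svaiter-style sketch, not the supermemory gradient scheme of the indexed paper), the Python snippet below shows one such iteration; the names derivative_free_projection and proj_C, and the toy monotone mapping F(x) = x + sin(x) on the nonnegative orthant, are illustrative assumptions rather than anything taken from the source.

```python
import numpy as np

def derivative_free_projection(F, proj_C, x0, sigma=1e-4, beta=0.5,
                               tol=1e-6, max_iter=1000):
    """Generic derivative-free projection iteration (Solodov-Svaiter type)
    for finding x in a convex set C with F(x) = 0, F monotone.
    Only evaluations of F and the projection proj_C are used; no Jacobian
    is formed or stored, which is why such methods scale to large problems.
    (Illustrative sketch; not the supermemory gradient method itself.)"""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            return x, k
        d = -Fx  # basic residual direction; a (super)memory variant would
                 # also blend in directions from earlier iterations
        # derivative-free line search: shrink t until
        #   -F(x + t*d)^T d >= sigma * t * ||d||^2
        t = 1.0
        while -F(x + t * d) @ d < sigma * t * (d @ d):
            t *= beta
            if t < 1e-12:
                break
        z = x + t * d          # trial point on the search ray
        Fz = F(z)
        if np.linalg.norm(Fz) <= tol:
            return z, k
        # project x onto the hyperplane separating x from the solution set,
        # then back onto the feasible set C
        x = proj_C(x - (Fz @ (x - z)) / (Fz @ Fz) * Fz)
    return x, max_iter

# Toy usage: F(x) = x + sin(x) is monotone (componentwise nondecreasing);
# with C = {x >= 0}, the unique constrained zero is x* = 0.
if __name__ == "__main__":
    F = lambda x: x + np.sin(x)
    proj_C = lambda x: np.maximum(x, 0.0)   # projection onto the nonnegative orthant
    x_star, iters = derivative_free_projection(F, proj_C, x0=np.ones(1000))
    print(iters, np.linalg.norm(F(x_star)))
```

Roughly speaking, the hyperplane projection step is what yields global convergence for merely monotone F without derivative information; supermemory-type variants differ mainly in how the search direction is built by reusing information from previous iterations.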