2023
DOI: 10.1109/tsipn.2023.3277591
Proportionate Adaptive Graph Signal Recovery

Cited by 8 publications (2 citation statements)
References 29 publications
“…As mentioned in [7], for the saddle point problem, the optimistic gradient descent-ascent (OGDA) method updates the variable via the difference of the preceding two gradients. Besides, for the least mean squares (LMS) estimation of graph signals, the extended LMS algorithm in [8] and proportionate-type graph LMS algorithm in [9] update the variable via the weighted sum of the preceding several gradients. Inspired by the above works, we here propose an extended gradient method, in which the variables are updated along the direction of the sum of the gradients of the preceding two iterates.…”
Section: Introduction
confidence: 99%
“…As mentioned in [7,8], the optimistic gradient descent-ascent (OGDA) method and the inertial gradient algorithm with Hessian damping (IGAHD) update the variable via the difference of the gradients of the preceding two iterates. Besides, for the least mean squares (LMS) estimation of graph signals, the extended LMS algorithm in [9] and the proportionate-type graph LMS algorithm in [10] update the variable via the weighted sum of the gradients of the preceding several iterates. Further, the Anderson acceleration gradient descent method (AA-PGA) in [11] updates the variable by employing a convex combination of the gradients of the preceding several iterates.…”
Section: Introduction
confidence: 99%
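The extended gradient update described in the citation statements above — stepping along the sum of the gradients of the preceding two iterates rather than the current gradient alone — can be sketched as follows. This is a minimal illustration on a hypothetical quadratic objective, not the cited papers' algorithm; the objective matrix `A`, vector `b`, step size `mu`, and iteration count are all assumptions chosen for the example.

```python
import numpy as np

# Hypothetical quadratic objective f(x) = 0.5 * x^T A x - b^T x,
# whose gradient is A x - b (A symmetric positive definite).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(x):
    return A @ x - b

# Extended gradient update: move along the sum of the gradients
# of the preceding two iterates (step size mu is an assumption).
mu = 0.1
x_prev = np.zeros(2)
x = np.zeros(2)
for _ in range(200):
    x, x_prev = x - mu * (grad(x) + grad(x_prev)), x

# Closed-form minimizer for comparison.
x_star = np.linalg.solve(A, b)
```

For this well-conditioned quadratic, the iterate `x` approaches the minimizer `x_star`; the lagged-gradient term changes the stability region relative to plain gradient descent, which is why the step size must be chosen with the two-gradient sum in mind.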