2021
DOI: 10.1016/j.sigpro.2021.108201

Multi-stage stochastic gradient method with momentum acceleration

Cited by 6 publications (2 citation statements); references 10 publications.
“…To accelerate the SG algorithm, modifying step size and modifying gradient are two common methods.[18-20] Among these acceleration methods, the multi-innovation (MI) technology has been extensively employed in recent years.[21-23] For example, a three-stage MI stochastic gradient algorithm was developed for time-series models,21 an MI fractional-order adaptive algorithm was presented for nonlinear systems,22 and a correlation analysis-based MI algorithm was proposed for linear errors-in-variables systems.…”
Section: Introduction
confidence: 99%
“…But this algorithm converges slowly. To accelerate the SG algorithm, modifying step size and modifying gradient are two common methods.18-20 Among these acceleration methods, the multi-innovation (MI) technology has been extensively employed in recent years.21-23…”
Section: Introduction
confidence: 99%
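The citation statements above describe accelerating the stochastic gradient (SG) algorithm by modifying the gradient, e.g. with a momentum term. The following is a minimal, generic sketch of SG identification with heavy-ball momentum on a linear regression model; it is an illustration of the general idea only, not the paper's multi-stage algorithm, and all names and parameter values (`mu`, `beta`, the toy model) are assumptions for this example.

```python
import numpy as np

def sg_momentum(phi, y, mu=0.05, beta=0.9, epochs=1, seed=0):
    """Stochastic gradient with a heavy-ball momentum term.

    A generic illustration of "modifying the gradient" to speed up SG;
    the multi-stage scheme of the cited paper is not reproduced here.
    phi : (N, n) regressor matrix, y : (N,) observations.
    """
    rng = np.random.default_rng(seed)
    theta = np.zeros(phi.shape[1])   # parameter estimate
    v = np.zeros_like(theta)         # momentum (velocity) buffer
    for _ in range(epochs):
        for k in rng.permutation(len(y)):
            e = y[k] - phi[k] @ theta    # innovation (prediction error)
            g = -phi[k] * e              # instantaneous gradient of 0.5*e**2
            v = beta * v - mu * g        # accumulate momentum
            theta = theta + v            # update the estimate
    return theta

# Toy identification problem: y = phi @ theta_true + small noise
rng = np.random.default_rng(1)
theta_true = np.array([1.5, -0.7])
phi = rng.standard_normal((500, 2))
y = phi @ theta_true + 0.01 * rng.standard_normal(500)
est = sg_momentum(phi, y, epochs=20)
```

Compared with plain SG (`beta=0`), the momentum buffer `v` smooths the noisy per-sample gradients and typically reaches the neighborhood of the true parameters in fewer passes over the data.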