2019
DOI: 10.1609/aaai.v33i01.33019348

Calibrated Stochastic Gradient Descent for Convolutional Neural Networks

Abstract: In stochastic gradient descent (SGD) and its variants, the optimized gradient estimators may be as expensive to compute as the true gradient in many scenarios. This paper introduces a calibrated stochastic gradient descent (CSGD) algorithm for deep neural network optimization. A theorem is developed to prove that an unbiased estimator for the network variables can be obtained in a probabilistic way based on the Lipschitz hypothesis. Our work is significantly distinct from existing gradient optimization methods…
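The abstract is truncated, so the CSGD update itself is not reproduced here. As a generic point of reference only (not the authors' algorithm), the sketch below shows one standard way an unbiased estimator can be obtained "in a probabilistic way": keep each coordinate with some probability and rescale by its inverse, so the expectation matches the true value. The function name `probabilistic_unbiased_estimate` and the parameter `keep_prob` are illustrative, not taken from the paper.

```python
import numpy as np

def probabilistic_unbiased_estimate(grad, keep_prob=0.5, rng=None):
    """Keep each coordinate of `grad` with probability `keep_prob` and rescale
    the kept coordinates by 1/keep_prob; the expectation of the returned sparse
    estimate then equals `grad`, i.e. the estimator is unbiased."""
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(grad.shape) < keep_prob
    return grad * mask / keep_prob

# Sanity check: averaging many estimates recovers the true gradient.
g = np.array([0.3, -1.2, 0.7])
est = np.mean([probabilistic_unbiased_estimate(g, 0.5) for _ in range(20000)], axis=0)
print(np.allclose(est, g, atol=0.05))  # should print True, up to sampling noise
```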

Cited by 4 publications (1 citation statement)
References 11 publications
Citation types: 0 supporting, 1 mentioning, 0 contrasting
Year published (citing works): 2021, 2021, 2023, 2023
“…Unbiased random noise is applied in the output of skip connects [52]. This not only suppresses unfair competition but also helps the training of deep models [53]. Thus, a small and unbiased noise is introduced, which has zero mean and small variance.…”
Section: Adding Noise in Skip-connection (mentioning)
confidence: 99%
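The citing work describes this mechanism only in prose. Below is a minimal sketch of what "zero-mean, small-variance noise on the skip-connection output" could look like, assuming a PyTorch-style residual block; `NoisySkipBlock` and `noise_std` are illustrative names and values, not taken from either paper.

```python
import torch
import torch.nn as nn

class NoisySkipBlock(nn.Module):
    """Residual block whose skip-connection output is perturbed, during
    training only, by zero-mean Gaussian noise with a small standard
    deviation (noise_std is a hypothetical hyperparameter)."""

    def __init__(self, channels, noise_std=1e-2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        self.noise_std = noise_std

    def forward(self, x):
        skip = x
        if self.training and self.noise_std > 0:
            # Zero-mean noise leaves the skip path unchanged in expectation.
            skip = skip + self.noise_std * torch.randn_like(skip)
        return self.body(x) + skip
```

Because the noise has zero mean, the expected output of the skip path is unchanged; only a small variance is added during training, which is the property the citation statement attributes to [52] and [53].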