2021 25th International Conference on Methods and Models in Automation and Robotics (MMAR)
DOI: 10.1109/mmar49549.2021.9528460
Weight Perturbation as a Method for Improving Performance of Deep Neural Networks

Cited by 2 publications (1 citation statement)
References 7 publications
“…The optimizer learning rate was decreased from 10⁻³ to 10⁻⁴, then to 10⁻⁵, and finally to 10⁻⁶ during this phase. We found it beneficial to apply additive, normally distributed noise to the network weights during this phase of training (approach closer examined in [22]); 3.…”
Section: Training Process
Confidence: 99%
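The citing passage combines two ingredients: a stepped learning-rate schedule (10⁻³ → 10⁻⁴ → 10⁻⁵ → 10⁻⁶) and additive, zero-mean Gaussian noise applied to the network weights during training. A minimal sketch of that combination is shown below; the function names, the noise scale `sigma`, and the epoch budget are illustrative assumptions, not details taken from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_weights(weights, sigma=1e-3, rng=rng):
    """Return copies of the weight arrays with additive, zero-mean
    Gaussian noise (the weight-perturbation idea from the quote).
    `sigma` is an assumed, illustrative noise scale."""
    return [w + rng.normal(0.0, sigma, size=w.shape) for w in weights]

def lr_schedule(epoch, total_epochs=40):
    """Stepped decay matching the quoted sequence 1e-3 -> 1e-4 -> 1e-5 -> 1e-6.
    The split into four equal phases over `total_epochs` is an assumption."""
    steps = [1e-3, 1e-4, 1e-5, 1e-6]
    idx = min(epoch * len(steps) // total_epochs, len(steps) - 1)
    return steps[idx]

# Toy "network": one weight matrix and one bias vector.
weights = [rng.normal(size=(4, 4)), rng.normal(size=(4,))]
noisy = perturb_weights(weights, sigma=1e-3)
```

In an actual training loop, the perturbation would be applied to the live parameters between optimizer steps during the late, low-learning-rate phase described in the quote.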