2024
DOI: 10.31237/osf.io/4u32w
Preprint
Gradient Update using Layer-based Importance Ratio in Deep Neural Networks

Aditya Shah,
Ioan Hughes

Abstract: Our study examines the widespread use of Deep Neural Networks (DNNs) across diverse domains and their notable successes. We evaluate the effectiveness of our approach on well-established network architectures such as AlexNet, VGG16, and ResNet50, using the widely adopted image-classification datasets CIFAR-10 and CIFAR-100. Our findings show significant improvements in both accuracy and compression ratio compared with existing methodologies. Notably, our method achieves an …
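The abstract describes scaling gradient updates by a per-layer importance ratio, but the preprint's exact formulation is not reproduced here. The sketch below is therefore a hypothetical illustration, not the authors' method: it assigns each layer a first-order saliency score (mean |w·g|), normalizes the scores into ratios that sum to one, and rescales each layer's SGD step by its ratio relative to a uniform split. The function names and the choice of saliency measure are assumptions.

```python
import numpy as np

def layer_importance_ratios(weights, grads):
    """Hypothetical per-layer importance: mean |w * g| (a first-order
    saliency score), normalized so the ratios sum to 1 across layers."""
    scores = np.array([np.mean(np.abs(w * g)) for w, g in zip(weights, grads)])
    total = scores.sum()
    if total == 0.0:
        # Degenerate case: fall back to a uniform split across layers.
        return np.full(len(scores), 1.0 / len(scores))
    return scores / total

def importance_scaled_update(weights, grads, lr=0.1):
    """One SGD step where each layer's learning rate is rescaled by its
    importance ratio relative to a uniform 1/n share."""
    ratios = layer_importance_ratios(weights, grads)
    n = len(weights)
    return [w - lr * (r * n) * g for w, g, r in zip(weights, grads, ratios)]
```

Under this scheme a layer whose weights and gradients interact strongly takes a larger step than a low-saliency layer, which is one plausible way a layer-wise ratio could steer both training and subsequent compression decisions.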

Cited by 0 publications
References 21 publications