2020 IEEE 30th International Workshop on Machine Learning for Signal Processing (MLSP)
DOI: 10.1109/mlsp49062.2020.9231550
Self-Compression in Bayesian Neural Networks

Abstract: Machine learning models have achieved human-level performance on various tasks. This success comes at a high cost of computation and storage overhead, which makes machine learning algorithms difficult to deploy on edge devices. Typically, one has to partially sacrifice accuracy in favor of improved efficiency, quantified in terms of reduced memory usage and energy consumption. Current methods compress networks by reducing the precision of the parameters or by eliminating redundant ones. In this paper, …
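The two conventional compression routes the abstract mentions — reducing parameter precision (quantization) and eliminating redundant parameters (pruning) — can be sketched as follows. This is a minimal illustration under my own assumptions, not the method proposed in the paper; the function names and parameters are hypothetical.

```python
import numpy as np

def quantize(weights, bits=8):
    """Symmetric uniform quantization: snap weights to a reduced-precision grid."""
    scale = np.max(np.abs(weights)) / (2 ** (bits - 1) - 1)
    levels = np.round(weights / scale)          # integer codes at the target bit width
    return levels * scale                       # dequantized, reduced-precision values

def prune(weights, sparsity=0.5):
    """Magnitude pruning: zero out the smallest-magnitude fraction of weights."""
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) < threshold, 0.0, weights)

rng = np.random.default_rng(0)
w = rng.normal(size=1000)
w_q = quantize(w, bits=4)    # at most 15 distinct values survive
w_p = prune(w, sparsity=0.5) # roughly half the weights are zeroed
```

In practice both steps are followed by fine-tuning to recover lost accuracy; the point of the sketch is only the accuracy-versus-footprint trade the abstract describes.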

Cited by 4 publications (4 citation statements). References 19 publications.
“…In Table 4 , we also note that the performance of BDNN is better than its deterministic counterpart under all types and levels of adversarial attacks. This performance trend shows that BDNN models are robust to adversarial attacks as already demonstrated in our previous work ( 10 , 13 , 14 , 30 ).…”
Section: Discussion (supporting)
confidence: 85%
“…The accuracy drop for OrganAMNIST BDNN is from 90.22% → 22.87%, which is close to that of its deterministic counterpart, 90.12% → 20.05%. This performance comparison between BDNNs and deterministic models shows that BDNNs are generally more robust, as already established in our previous work (10, 13, 14, 30). In other experiments we observe a similar accuracy trend between BDNN and deterministic models under speckle additive noise.…”
Section: Predictive Variance Under Noise (supporting)
confidence: 88%