2020 IEEE International Symposium on Circuits and Systems (ISCAS)
DOI: 10.1109/iscas45731.2020.9180701
MC2RAM: Markov Chain Monte Carlo Sampling in SRAM for Fast Bayesian Inference

Cited by 23 publications (7 citation statements)
References 11 publications
“…However, evaluation of likelihood models based on GMM is computationally intensive. A dimension of each mixture function in GMM requires subtractions, multiplications, additions (to compute the exponent of a Gaussian function) as well as look-ups (to add exponents using a log-ADD table [33]-[35]). Since these operations are repeated on all dimensions, mixture functions as well as the hypotheses,…”
Section: Ultra-low-power Probabilistic Localization
confidence: 99%
“…However, evaluation of likelihood models based on GMM is computationally intensive. A dimension of each mixture function in GMM requires subtractions, multiplications, additions (to compute the exponent of a Gaussian function) as well as look-ups (to add exponents using a log-ADD table [28]-[30]). Since these operations are repeated on all dimensions, mixture functions as well as the hypotheses, likelihood evaluation of a typical depth map may require tens of thousands of arithmetic operations and hundreds of memory look-ups to evaluate the likelihood of a single measurement.…”
Section: Ultra-low-power Probabilistic Localization
confidence: 99%
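The quoted passages count, per GMM dimension and per mixture component, one subtraction, one multiplication, and one addition to form the Gaussian exponent, plus log-ADD look-ups to combine components in the log domain. A minimal Python sketch of that computation is below; it assumes a diagonal-covariance GMM and substitutes a standard logsumexp for the hardware log-ADD table, so the function name and interface are illustrative, not from the cited work.

```python
import math

def gmm_log_likelihood(x, means, variances, log_weights):
    """Log-likelihood of point x under a diagonal-covariance GMM.

    Per dimension of each component this performs the operations the
    quoted passage counts: a subtraction (x - mu), a multiplication
    (squaring / scaling by the variance), and an addition into the
    exponent accumulator. Component log-terms are then combined with
    a log-add (stable logsumexp), the software analogue of the
    log-ADD table look-up.
    """
    log_terms = []
    for mu, var, lw in zip(means, variances, log_weights):
        exponent = 0.0
        log_norm = 0.0
        for d in range(len(x)):
            diff = x[d] - mu[d]                        # subtraction
            exponent += diff * diff / (2.0 * var[d])   # multiply + add
            log_norm += 0.5 * math.log(2.0 * math.pi * var[d])
        log_terms.append(lw - exponent - log_norm)
    # log-ADD step: combine component log-probabilities stably
    m = max(log_terms)
    return m + math.log(sum(math.exp(t - m) for t in log_terms))
```

Since the inner loop runs over every dimension of every component for every hypothesis, the total operation count grows multiplicatively, which is the cost the citing papers flag as prohibitive for ultra-low-power localization.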
“…The new inference operators can further benefit from custom-designed computing modes. For example, multiplication-free and binary weight operators are quite suited for compute-in-memory [8], [13], [14]. A deep-shift operator can be more efficiently implemented using digital shifters [10].…”
Section: Introduction
confidence: 99%
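The last citation statement contrasts binary-weight operators (multiplications collapse to sign flips and additions) with shift-based operators (weights constrained to powers of two, realized by digital shifters). A small Python sketch of both ideas, using illustrative function names not taken from the cited papers:

```python
def binary_weight_dot(activations, weight_signs):
    """Dot product with weights restricted to {-1, +1}.

    No multiplier is needed: each product is just the activation,
    possibly negated, accumulated by addition.
    """
    return sum(a if s > 0 else -a for a, s in zip(activations, weight_signs))

def shift_mul(x, p):
    """Multiply integer x by 2**p using only a shift.

    Models a deep-shift-style operator: a weight constrained to a
    power of two is applied with a digital shifter (left shift for
    p >= 0, arithmetic right shift otherwise).
    """
    return (x << p) if p >= 0 else (x >> -p)
```

In hardware, both replace a full multiplier array with far cheaper sign/add or shifter logic, which is why the citing paper calls them well suited to compute-in-memory.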