2021
DOI: 10.1016/bs.adcom.2020.11.005
Hardware accelerator systems for artificial intelligence and machine learning

Cited by 16 publications (8 citation statements)
References 35 publications
“…To test the efficacy of our proposed dynamic inference method, DynaTran, we evaluate encoder-only models (because of their high parallelization capabilities [17]) on different tasks. We use BERT-Tiny [26] and BERT-Base [1], two commonly used pre-trained models.…”
Section: A. Evaluation Models and Datasets (confidence: 99%)
“…For mobile platforms, we compare the inference of BERT-Tiny on AccelTran-Edge with off-the-shelf platforms that include the Raspberry Pi 4 Model-B [47], which has the Broadcom BCM2711 ARM SoC; the Intel Neural Compute Stick (NCS) v2 [48] with its neural processing unit (NPU); and the Apple M1 ARM SoC [49] with an 8-core CPU, an 8-core GPU, and 16 GB of unified memory on an iPad (for easier evaluation, we performed experiments on a MacBook Pro laptop with the same SoC instead). For server-side platforms, we compare the inference of BERT-Base on AccelTran-Server with a modern NVIDIA A100 GPU (40 GB of video RAM) and previously proposed accelerators, namely OPTIMUS [17], SpAtten [15], and Energon [16]. We chose the maximum batch size possible for each platform, based on its memory capacity.…”
Section: Evaluation Baselines (confidence: 99%)