ALPINE: Analog In-Memory Acceleration with Tight Processor Integration for Deep Learning
Preprint, 2022
DOI: 10.48550/arxiv.2205.10042

Abstract: Analog in-memory computing (AIMC) cores offer significant performance and energy benefits for neural network inference with respect to digital logic (e.g., CPUs). AIMCs accelerate matrix-vector multiplications, which dominate these applications' run-time. However, AIMC-centric platforms lack the flexibility of general-purpose systems, as they often have hardcoded data flows and can only support a limited set of processing functions. With the goal of bridging this gap in flexibility, we present a novel system …
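To make the abstract's central claim concrete, here is a minimal sketch (not from the paper) of the operation an AIMC crossbar accelerates: a matrix-vector multiplication computed in one analog step, with additive Gaussian read noise standing in as a first-order model of device non-idealities. The sizes, noise level, and function names are illustrative assumptions.

```python
# Hypothetical first-order model of an AIMC crossbar MVM; not the paper's simulator.
import numpy as np

rng = np.random.default_rng(0)

def aimc_mvm(weights: np.ndarray, x: np.ndarray, noise_std: float = 0.02) -> np.ndarray:
    """Ideal matrix-vector product plus Gaussian read noise (assumed noise model)."""
    ideal = weights @ x                       # O(n*m) arithmetic, done in O(1) analog time
    noise = rng.normal(0.0, noise_std, size=ideal.shape)
    return ideal + noise * np.abs(ideal).max()  # noise scaled to the output range

W = rng.standard_normal((256, 256))  # layer weights, stored as device conductances
x = rng.standard_normal(256)         # input activations, applied as voltages
y = aimc_mvm(W, x)
print(float(np.linalg.norm(y - W @ x) / np.linalg.norm(W @ x)))  # relative output error
```

Under this toy model the relative error stays small, which is why inference (tolerant of such perturbations) is the target workload rather than exact computation.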

Cited by 3 publications (3 citation statements); references 37 publications.
“…Their solution employs a multiplicity of crossbars interfaced to content-addressable memories. A tightly-coupled solution for AIMC integration is introduced in ALPINE [8]. The authors of that paper focus on different applications than ours: multi-layer perceptrons, recurrent neural networks, and convolutional neural networks, where matrix-vector multiplications (as opposed to GEMMs) are the main computational bottleneck.…”
Section: Related Work
confidence: 99%
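The MVM-versus-GEMM distinction drawn in this citation statement can be shown in a few lines. The sketch below (assumed, not from either paper) contrasts single-input MLP-style inference, a matrix-vector product that maps to one crossbar read, with a batched workload that becomes a general matrix-matrix product; shapes are illustrative.

```python
# Illustrative contrast between the two bottleneck operations; assumed sizes.
import numpy as np

W = np.random.randn(512, 512)   # layer weights
x = np.random.randn(512)        # one input  -> MVM: the AIMC-friendly case
X = np.random.randn(512, 64)    # 64 inputs  -> GEMM: a batch of MVMs

y_mvm = W @ x        # shape (512,): one analog crossbar operation per layer
Y_gemm = W @ X       # shape (512, 64): column-by-column MVMs, or a digital GEMM
assert np.allclose(Y_gemm[:, 0], W @ X[:, 0])  # each GEMM column is just an MVM
```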
“…With the advent of deep networks for artificial intelligence [1], and the increasing need for special-purpose low-power devices that can complement general-purpose, power-hungry computers in 'edge computing' applications [2, 3], several types of event-based approaches for implementing spiking neural networks (SNNs) in dedicated hardware have been proposed [4–9]. While many of these approaches focus on supporting the simulation of large-scale SNNs [4, 6, 9], on converting rate-based artificial neural networks (ANNs) into their spike-based equivalent networks [10–12], or on processing digitally stored data with digital hardware implementations [13–17], the original neuromorphic engineering approach, first introduced in the early '90s, proposed to implement biologically plausible SNNs by exploiting the physics of subthreshold analog complementary metal-oxide-semiconductor (CMOS) circuits to directly emulate the bio-physics of biological neurons and synapses [18, 19].…”
Section: Introduction
confidence: 99%