2022
DOI: 10.1016/j.neucom.2022.02.043

MemTorch: An Open-source Simulation Framework for Memristive Deep Learning Systems

Cited by 41 publications (20 citation statements)
References 47 publications
“…This is mainly due to unavailability of experimental data, which resulted in developing an empirical, rather than a physics-based model. Additionally, while this work only focuses on endurance and retention and their impact on memristive deep learning networks performance, future improvements of our model can account for modelling a finite number of conductance states and other device nonidealities [32,36].…”
Section: Discussion
confidence: 99%
“…V_stop dependence is validated in log(Cycles to Failure) plots. Crossbar architectures contain several modular crossbar tiles connected using a shared bus. (D) Modular crossbar tiles consist of crossbar arrays with supporting peripheral circuitry, and can represent weights using a dual-array scheme (as depicted), a dual-row scheme, where double the number of rows is required, or a current-mirror scheme, which is capable of operating using a single device to represent each weight [32].…”
Section: Model Validation
confidence: 99%
“…Memristive devices can be arranged within crossbar architectures to perform Vector Matrix Multiplications (VMMs) in-memory, in O(1) [14], which are used extensively in forward and backward propagations within Convolutional Neural Networks (CNNs) to compute the output of fully connected and unrolled convolutional layers. Scaled weight matrices can either be represented using two crossbars per layer, g_pos and g_neg, to represent positive and negative weights, respectively, or using a singular crossbar per layer with current mirrors, so that the effective conductance of each device is offset by a fixed value, g_m, that can be determined using (1) [15]…”
Section: B. Memristive DL Systems
confidence: 99%
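The dual-crossbar scheme quoted above can be sketched in a few lines: signed weights are split across two non-negative conductance matrices, g_pos and g_neg, and each VMM output is read as a differential column current. This is a minimal illustration only; the conductance bounds g_min/g_max, the scaling w_max, and the helper names are assumptions for the sketch, not MemTorch's actual API or values from the cited papers.

```python
# Sketch of dual-crossbar weight representation and differential VMM read.
# g_min, g_max, and w_max are illustrative assumptions (not from MemTorch).

g_min, g_max = 1.0e-6, 1.0e-4  # assumed device conductance range (siemens)


def map_weights(W, w_max):
    """Map a signed weight matrix onto two non-negative conductance matrices."""
    g_pos, g_neg = [], []
    for row in W:
        p_row, n_row = [], []
        for w in row:
            g = g_min + abs(w) / w_max * (g_max - g_min)
            if w >= 0:
                p_row.append(g)       # positive weight lives in g_pos
                n_row.append(g_min)
            else:
                p_row.append(g_min)
                n_row.append(g)       # negative weight lives in g_neg
        g_pos.append(p_row)
        g_neg.append(n_row)
    return g_pos, g_neg


def vmm(v, g_pos, g_neg, w_max):
    """Differential read: column current I_j = sum_i v_i * (g_pos[i][j] - g_neg[i][j]),
    rescaled back into the weight domain."""
    cols = len(g_pos[0])
    out = []
    for j in range(cols):
        i_j = sum(v[i] * (g_pos[i][j] - g_neg[i][j]) for i in range(len(v)))
        out.append(i_j * w_max / (g_max - g_min))
    return out


W = [[1.0, -2.0], [0.5, 0.0]]                # signed weights (rows = inputs)
g_pos, g_neg = map_weights(W, w_max=2.0)
y = vmm([1.0, 1.0], g_pos, g_neg, w_max=2.0)  # ideal result: [1.5, -2.0]
```

In the current-mirror alternative the quote also mentions, a single crossbar would be used and every device conductance offset by a fixed g_m instead of maintaining a second array.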
“…The MemTorch [15] simulation framework was used to simulate RRAM devices during inference using the VTEAM [21] model. Performance metrics for our trained conventional and equivalent MDLS are reported in Table II.…”
Section: Performance Evaluation
confidence: 99%