2023
DOI: 10.36227/techrxiv.21982208.v1
Preprint

Quantized Magnetic Domain Wall Synapse for Efficient Deep Neural Networks

Abstract: Quantization of synaptic weights using emerging non-volatile memory devices has emerged as a promising solution for implementing computationally efficient neural networks on resource-constrained hardware. However, the practical implementation of such synaptic weights is hampered by imperfect memory characteristics, specifically the availability of only a limited number of quantized states and the large intrinsic device variation and stochasticity involved in writing the synaptic state. This arti…
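The two non-idealities the abstract names (a limited number of quantized states, plus stochastic write variation) can be illustrated with a minimal NumPy sketch. This is an illustrative model only, not the paper's device characterization; the function name, the Gaussian write-noise model, and the parameter values are all assumptions for demonstration.

```python
import numpy as np

def quantize_weights(w, n_states=8, sigma=0.05, rng=None):
    """Map continuous weights onto a limited set of evenly spaced
    conductance levels, then add Gaussian write noise to mimic
    stochastic programming of the device state (hypothetical model)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    w_min, w_max = w.min(), w.max()
    levels = np.linspace(w_min, w_max, n_states)
    # snap each weight to its nearest available quantized state
    idx = np.abs(w[..., None] - levels).argmin(axis=-1)
    w_q = levels[idx]
    # stochastic write: device-to-device variation around the target state
    noise = rng.normal(0.0, sigma * (w_max - w_min), size=w.shape)
    return w_q + noise

w = np.random.default_rng(1).uniform(-1.0, 1.0, size=(4, 4))
w_written = quantize_weights(w, n_states=4, sigma=0.02)
```

With `sigma=0` the output collapses onto at most `n_states` distinct values, which makes the quantization constraint easy to inspect before adding noise.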

Cited by 1 publication (1 citation statement)
References 0 publications
“…A negative input voltage is applied to G_parallel to accommodate both positive and negative linear weight updates. The synapse conductance can be calculated using Kirchhoff's current law for a single column [55]:…”
Section: Symbol (mentioning)
Confidence: 99%
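The cited scheme (a negated input driving a shared parallel conductance so that effective weights can be positive or negative) can be sketched with Kirchhoff's current law for one crossbar column. This is a simplified illustration of the idea, not the cited paper's circuit; the function name and the ideal-resistor assumptions are mine.

```python
import numpy as np

def column_current(v_in, g_dev, g_parallel):
    """Kirchhoff's current law at one column node: currents from the
    device conductances (driven by +V) sum with the current from the
    shared parallel conductance (driven by -V), so each effective
    weight is (g_dev_i - g_parallel) and can be either sign."""
    v_in = np.asarray(v_in, dtype=float)
    g_dev = np.asarray(g_dev, dtype=float)
    i_devices = np.dot(v_in, g_dev)            # +V branch
    i_parallel = np.dot(v_in, np.full_like(g_dev, g_parallel))  # -V branch
    return float(i_devices - i_parallel)

# Example: devices above g_parallel contribute positive weight,
# devices below it contribute negative weight.
i_out = column_current([1.0, 1.0], [0.5, 0.1], g_parallel=0.2)
```

Here the second device (0.1 S < 0.2 S) pulls the column current down, showing how a single unipolar conductance plus the negated parallel branch realizes signed weights.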