2018 IEEE 18th International Conference on Nanotechnology (IEEE-NANO)
DOI: 10.1109/nano.2018.8626224
Binary Weighted Memristive Analog Deep Neural Network for Near-Sensor Edge Processing

Abstract: The memristive crossbar aims to implement analog weighted neural networks; however, realistic implementation of such crossbar arrays is not possible due to the limited switching states of memristive devices. In this work, we propose the design of an analog deep neural network with binary weight update through the backpropagation algorithm using binary-state memristive devices. We show that such networks can be successfully used for image processing tasks and have the advantage of lower power consumption and small on-…
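The abstract describes binary weight updates trained with backpropagation on binary-state devices. A minimal, hypothetical sketch of that general idea (not the authors' exact method) is a BinaryConnect-style training loop: real-valued latent weights are kept for gradient accumulation, while the forward pass uses only their sign, so two device states suffice. The layer sizes, learning rate, and toy data below are illustrative assumptions.

```python
# Hypothetical sketch: backpropagation with binarized (+1/-1) weights,
# in the spirit of BinaryConnect; not the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)

def binarize(w):
    # Map latent real-valued weights to the two available device states.
    return np.where(w >= 0, 1.0, -1.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy dimensions (assumed): 16 inputs, 8 hidden units, 4 outputs.
W1 = rng.normal(0, 0.1, (16, 8))   # latent weights, kept in full precision
W2 = rng.normal(0, 0.1, (8, 4))
lr = 0.1

def train_step(x, y):
    global W1, W2
    B1, B2 = binarize(W1), binarize(W2)   # weights actually "programmed"
    h = sigmoid(x @ B1)                   # forward pass uses binary weights
    out = sigmoid(h @ B2)
    # Backward pass (mean squared error); gradients are computed through the
    # binarized weights but the updates are applied to the latent weights
    # (straight-through estimator).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ B2.T) * h * (1 - h)
    W2 -= lr * np.outer(h, d_out)
    W1 -= lr * np.outer(x, d_h)
    return np.mean((out - y) ** 2)

x = rng.random(16)
y = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(100):
    loss = train_step(x, y)
print("final loss:", loss)
```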

Cited by 28 publications (17 citation statements)
References 12 publications
“…The summary of these architectures is shown in Table I. Also, there are several other memristive architectures proposed in the recent years, which are less common and not considered in this paper, such as Probabilistic Neural Networks [88], [89] and Binarized Neural Networks [90]. 1) One layer neural network with learning: The structure of one-layer ANN with learning is similar to the feed-forward neural network but contains the learning phase.…”
Section: Neuromorphic Architectures, A. Neural Network Architectures (mentioning)
confidence: 99%
“…A neural network which uses any combination of binary weight or hard threshold activation functions is typically known as BNN. There have been several successful implementations of BNN algorithms in software [26], [27], [28], [29] and an attempt to implement it on hardware [30], [31], [32]. The analog hardware implementation of BNN system with learning remains as an open problem [33], [32].…”
Section: Background, A. Learning Algorithms and Biologically Inspired… (mentioning)
confidence: 99%
“…There have been several successful implementations of BNN algorithms in software [26], [27], [28], [29] and an attempt to implement it on hardware [30], [31], [32]. The analog hardware implementation of BNN system with learning remains as an open problem [33], [32]. MNN is a systematic arrangement of the artificial neural networks that can process information from different data sources to perform data fusion and classification.…”
Section: Background, A. Learning Algorithms and Biologically Inspired… (mentioning)
confidence: 99%
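The excerpt above distinguishes binary weights from hard-threshold activations. As a point of reference, a hard-threshold (sign) activation is non-differentiable, so software BNN implementations typically approximate its gradient with a straight-through estimator. The following is a minimal illustrative sketch of that standard practice, not code from the cited papers; the clipping region |x| <= 1 is an assumption.

```python
# Illustrative sketch of a hard-threshold activation with a
# straight-through gradient estimator (standard BNN practice).
import numpy as np

def hard_threshold(x):
    # Forward: binarize the pre-activation to +1/-1.
    return np.where(x >= 0, 1.0, -1.0)

def hard_threshold_grad(x, upstream_grad):
    # Backward: the true derivative is zero almost everywhere, so the
    # upstream gradient is passed through, clipped to the region |x| <= 1.
    return upstream_grad * (np.abs(x) <= 1.0)

x = np.array([-1.7, -0.3, 0.2, 2.4])
print(hard_threshold(x))                        # [-1. -1.  1.  1.]
print(hard_threshold_grad(x, np.ones_like(x)))  # [0. 1. 1. 0.]
```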
“…Sometimes it is used in a 1 transistor 1 RRAM (1T1R) configuration, to avoid unwanted or sneak current paths. In [15], authors have presented a memristor-based implementation of a BNN able to achieve both high accuracy on MNIST and IRIS dataset and low power consumption. In some others, improvements in memristor architectures have been proposed that enable multiple bits per cell.…”
Section: A Quick Overview (mentioning)
confidence: 99%
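The excerpt above refers to memristor crossbars, optionally in a 1T1R configuration to suppress sneak-path currents. The analog multiply-accumulate such a crossbar performs can be sketched as a vector-matrix product between input voltages and a conductance matrix restricted to two states; the conductance values and the differential-pair mapping below are assumptions for illustration, not figures from [15].

```python
# Illustrative sketch of an analog crossbar multiply-accumulate with
# binary-state devices; conductance values are assumed, not measured.
import numpy as np

G_ON, G_OFF = 1e-4, 1e-6          # assumed high/low conductance states (S)

def crossbar_mac(v_in, w_binary):
    # Signed +1/-1 weights mapped onto a differential pair of columns:
    # +1 -> (G_ON, G_OFF), -1 -> (G_OFF, G_ON).
    g_pos = np.where(w_binary > 0, G_ON, G_OFF)
    g_neg = np.where(w_binary > 0, G_OFF, G_ON)
    # Each column current is the sum of V*G contributions (Kirchhoff's law);
    # the output is sensed as the difference of the paired column currents.
    i_pos = v_in @ g_pos
    i_neg = v_in @ g_neg
    return i_pos - i_neg

v = np.array([0.2, 0.0, 0.2, 0.2])             # input voltages (V)
w = np.array([[+1, -1],
              [-1, +1],
              [+1, +1],
              [-1, -1]])                        # 4x2 binary weight matrix
print(crossbar_mac(v, w))                       # differential output currents (A)
```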
“…Some of them are considering the binary approximations, choosing an implementation based on emerging technologies. Some works [12,13,26,27] are based on MTJ technology while [15][16][17][18]28,29] have used RRAM. In each of these works the resistive element is used to perform simple logical operations based on current sensing technique.…”
Section: NN Implementations Based on LiM Concept (mentioning)
confidence: 99%