2012 IEEE International Symposium on Circuits and Systems (ISCAS)
DOI: 10.1109/iscas.2012.6271922
Architecture and implementation of an associative memory using sparse clustered networks

Cited by 24 publications (20 citation statements)
References 6 publications
“…
- Architectures based on the original GBNN model [Jarollahi et al 2012] are referred to as V0;
- architectures based on the fully binary model we proposed are referred to as V1.0;
- architectures based on the binary model and triangular synaptic weight matrices are referred to as V1.1;
- architectures based on the binary model and cluster-based serialization are referred to as V1.2;
- architectures based on the binary model and neuron-based serialization are referred to as V1.3.…”
Section: Methods
confidence: 99%
“…A fully parallel GBNN implementation, as proposed in Jarollahi et al. [2012], requires a huge number of wires to connect all the neurons and a large amount of computing logic to process data concurrently. Indeed, in the GBNN model, a neuron n_{i,j} from a given cluster j must be connected to all other neurons of all distant clusters.…”
Section: Serialized Communication
confidence: 99%
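To make that wiring cost concrete, here is a minimal Python sketch (an illustration, not from the cited work): assuming c clusters of l binary neurons each, every neuron fans out to the (c-1)*l neurons of the distant clusters, so a fully parallel design needs on the order of c*l*(c-1)*l/2 distinct wires.

```python
def gbnn_wire_count(c: int, l: int) -> int:
    """Count unique neuron-to-neuron wires in a fully parallel GBNN.

    Assumes c clusters of l neurons each, with every neuron wired to
    all neurons of every other cluster; connections are symmetric,
    so each pair is counted once. Illustrative sketch only.
    """
    neurons = c * l
    fan_out = (c - 1) * l          # wires leaving one neuron
    return neurons * fan_out // 2  # each wire is shared by two neurons

# Example: 8 clusters of 256 neurons already need ~1.8 million wires.
print(gbnn_wire_count(8, 256))  # 1835008
```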
“…In this process, the value of a neuron in other clusters does not affect that of a neuron in the decoding cluster, hence the name. The conventional mapping process, as described and implemented in [3]-[5], employs matrix multiplication to compute a score for each neuron, followed by finding the maximum score and applying the winner-take-all rule.…”
Section: B. Message Retrieval
confidence: 99%
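A hedged NumPy sketch of that conventional mapping step (names and shapes are assumptions for illustration, not taken from [3]-[5]): the binary connection matrix is multiplied by the current activation vector to score every neuron, and within each cluster only the maximally scored neurons remain active.

```python
import numpy as np

def wta_retrieval_step(W, v, c, l):
    """One retrieval iteration: matrix-multiply scores, then per-cluster
    winner-take-all. W is a (c*l, c*l) binary connection matrix and v a
    (c*l,) binary activation vector; both are illustrative assumptions.
    """
    scores = W @ v                          # score for each neuron
    out = np.zeros_like(v)
    for j in range(c):
        seg = scores[j * l:(j + 1) * l]
        winners = seg == seg.max()          # ties keep all maximal neurons
        out[j * l:(j + 1) * l] = winners.astype(v.dtype)
    return out
```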
“…with good message retrieval ability (the probability of retrieving a stored message). Based on binary units, binary connections, and a simple decoding algorithm, this associative network model allows efficient fully parallel hardware implementations [7,8,3]. However, as in other models of associative networks [9], diversity strongly depends on the distribution of stored messages.…”
Section: Introduction
confidence: 99%
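For context, a minimal sketch of the binary storage rule that makes such implementations possible (assumed from the standard GBNN description; function and variable names are hypothetical): each stored message activates one neuron per cluster, and binary connections are set between every pair of those neurons, forming a clique.

```python
import numpy as np

def store_messages(messages, c, l):
    """Build a binary GBNN connection matrix by storing message cliques.

    messages: iterable of length-c tuples; entry j in range(l) names the
    active neuron of cluster j. Illustrative sketch only.
    """
    W = np.zeros((c * l, c * l), dtype=np.uint8)
    for msg in messages:
        idx = [j * l + s for j, s in enumerate(msg)]  # one neuron per cluster
        for a in idx:
            for b in idx:
                if a != b:
                    W[a, b] = 1               # binary clique connection
    return W

# Example: two clusters of four neurons, one stored message (neurons 1 and 3).
W = store_messages([(1, 3)], c=2, l=4)
```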