2007
DOI: 10.1016/j.neunet.2006.05.045
Capacity analysis for a two-level decoupled Hamming network for associative memory under a noisy environment

Cited by 13 publications (6 citation statements)

References 14 publications
“…each cell contributes to the decision by simply "voting" for one of the stored patterns. Electoral College / regional matching is always more stable than Direct Popular Vote / national matching: for Electoral College / regional matching, the larger the number of regions in a partition, the more noise can be accommodated without changing the original voting result, up to a certain limit, after which the amount of noise that can be accommodated decreases as the number of regions increases further. Chen et al (2007) show the same behavior in the context of Associative Memory. An associative memory is a memory that is addressed using its contents.…”
Section: Electoral College Framework and Stability
confidence: 54%
“…The associative memory should be able to recall a stored pattern that is similar to the memory key, so that noise-polluted inputs can also be recognized. Chen et al (2007) prove that the "Two-Level Hamming Associative memory" is more stable than the "One-Level Hamming Associative memory". That is, the "Electoral College" type of voting memory is again more stable than the "Popular Vote" type of memory; and the stability increases as the window size decreases, up to a certain limit, after which it starts to decrease.…”
Section: Electoral College Framework and Stability
confidence: 83%
“…the "Electoral College" is more stable than the Paper [5] further shows the same behavior "Direct Popular Vote". That is: when an input in the context of Associative Memory.…”
confidence: 72%
“…Some of the most successful deep learning methods have been implemented in artificial neural networks. In [5], it was concluded that "the two-level decoupled Hamming network with middle-sized windows should be a more elegant associative memory model than the one-level Hamming associative memory in all the senses of efficiency, hardware implementation and capacity." [5] The whole process of how the closest memory pattern is selected to the two-level decoupled…”
Section: Deep Learning for AI
confidence: 99%
“…In each sub-memory set, the closest local sub-memory pattern to the sub-memory key is determined by running the Hamming associative memory operations. After all of the sub-memory sets obtain their closest sub-memory patterns to their sub-memory keys, the second level of the two-level decoupled Hamming memory (the decision network) applies the voting mechanism to all sets of local sub-memory patterns and generates the closest memory pattern for the two-level decoupled Hamming memory network [5]. In [1], it is explained that it is important to break down AI problems (such as machine vision or natural language processing) into smaller problems and different levels of distributed representations for the purpose of understanding higher levels of abstraction (in computer science, "abstraction" refers to a process of representing data and programs in a form very similar to their meaning, which also reduces the programmer's engagement with tedious implementation details [11]).…”
Section: Deep Learning for AI
confidence: 99%
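The recall procedure quoted above (local Hamming matching within each sub-memory window, followed by voting in the decision network) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the binary coding, 16-bit patterns, and window size of 4 are assumptions chosen for clarity.

```python
import numpy as np

def hamming_match(key, patterns):
    """Winner-take-all step of a one-level Hamming memory: return the
    index of the stored pattern closest to the key in Hamming distance."""
    dists = [np.count_nonzero(key != p) for p in patterns]
    return int(np.argmin(dists))

def two_level_recall(key, patterns, window):
    """Two-level decoupled Hamming recall: split the key into windows,
    let each window "vote" for its locally closest stored pattern, and
    return the pattern that wins the vote (the decision network)."""
    votes = np.zeros(len(patterns), dtype=int)
    for start in range(0, len(key), window):
        sub_key = key[start:start + window]
        sub_patterns = [p[start:start + window] for p in patterns]
        votes[hamming_match(sub_key, sub_patterns)] += 1
    return int(np.argmax(votes))

# Illustrative usage: two 16-bit stored patterns and a noisy key.
patterns = [np.zeros(16, dtype=int), np.ones(16, dtype=int)]
noisy = patterns[0].copy()
noisy[[1, 5, 9]] = 1  # flip 3 of 16 bits
recalled = two_level_recall(noisy, patterns, window=4)
# recalled == 0: the noisy key still recalls the first stored pattern
```

The windowed vote is what the citing papers call the "Electoral College" step: a window corrupted by noise can at worst misdirect its single vote, rather than swing the global Hamming distance, which is the stability property the statements above attribute to [5].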