ISCAS'99. Proceedings of the 1999 IEEE International Symposium on Circuits and Systems VLSI (Cat. No.99CH36349)
DOI: 10.1109/iscas.1999.777582
Design of bidirectional associative memories based on the perceptron training technique

Cited by 6 publications (4 citation statements) · References 2 publications
“…Inspired by the perceptron learning algorithm, an optimal learning scheme for a class of BAMs is advanced in [11], which has superior convergence and stability properties. In [12], the synthesis problem of bidirectional associative memories is formulated as a set of linear inequalities that can be solved using the perceptron training algorithm. A multilayer recursive neural network with symmetrical interconnections is introduced in [14] for improved recognition performance under noisy conditions and for increased storage capacity.…”
Section: Related Work
confidence: 99%
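The formulation cited above, casting BAM synthesis as a set of linear inequalities solvable by perceptron training, can be sketched as follows. This is an illustrative reconstruction under our own assumptions (bipolar patterns stored as columns, function and parameter names ours), not the cited papers' exact algorithm: each recall condition sign(W x_k) = y_k and sign(Wᵀ y_k) = x_k is a linear inequality in the entries of W, so standard perceptron updates apply.

```python
import numpy as np

def perceptron_train_bam(X, Y, epochs=100, eta=1.0):
    """Find W such that sign(W @ x_k) = y_k and sign(W.T @ y_k) = x_k
    for every stored bipolar pair (columns of X and Y).  Each recall
    condition is a linear inequality in the entries of W, so plain
    perceptron updates apply to rows (forward) and columns (backward).
    Illustrative sketch only -- names and defaults are assumptions."""
    n, m = X.shape[0], Y.shape[0]
    W = np.zeros((m, n))
    for _ in range(epochs):
        converged = True
        for k in range(X.shape[1]):
            x, y = X[:, k], Y[:, k]
            for i in range(m):            # forward: y_i * (row_i . x) > 0
                if y[i] * (W[i] @ x) <= 0:
                    W[i] += eta * y[i] * x
                    converged = False
            for j in range(n):            # backward: x_j * (col_j . y) > 0
                if x[j] * (W[:, j] @ y) <= 0:
                    W[:, j] += eta * x[j] * y
                    converged = False
        if converged:                     # all inequalities strictly satisfied
            return W
    return W
```

By the perceptron convergence theorem, if any weight matrix satisfies all the inequalities with positive margin, this loop terminates with exact bidirectional recall of the stored pairs.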
“…A weighted-pattern learning algorithm for BAM based on global minimization is described in [11]. Inspired by the perceptron learning algorithm, an optimal learning scheme for a class of BAMs is advanced in [12]–[13], which has superior convergence and stability properties. In [14], the synthesis problem of bidirectional associative memories is formulated as a set of linear inequalities that can be solved using the perceptron training algorithm.…”
Section: Related Work
confidence: 99%
“…Due to the nature of the Hebbian learning rule, a Hebb-rule-trained BAM has limited storage and mapping capability. Studies (such as [8]–[13]) have been carried out to address this problem. In the high-order BAM [8], a high-order nonlinearity is applied to the forward and backward information flows to increase the memory capacity and to improve the error-correction capability.…”
Section: Introduction
confidence: 99%
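The Hebbian (correlation) encoding whose capacity limits this excerpt discusses can be sketched minimally as follows, together with the bidirectional recall dynamics. This is a generic textbook-style sketch with bipolar patterns, assumed names, and no claim to match any cited paper's exact formulation:

```python
import numpy as np

def hebbian_bam(X, Y):
    """Correlation (Hebbian) encoding: W = sum_k y_k x_k^T, with
    patterns stored as columns of X and Y.  Single-shot and simple,
    but crosstalk between non-orthogonal patterns is what limits
    storage and mapping capability."""
    return Y @ X.T

def bam_recall(W, x, max_iters=20):
    """Bidirectional recall: bounce activations between the two
    layers until the (x, y) pair stops changing."""
    y = np.sign(W @ x)
    for _ in range(max_iters):
        x_new = np.sign(W.T @ y)
        y_new = np.sign(W @ x_new)
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break
        x, y = x_new, y_new
    return x, y
```

For mutually orthogonal stored patterns the crosstalk terms vanish and each pair is a fixed point of the recall dynamics; as more non-orthogonal pairs are stored, crosstalk grows and recall degrades, which is the limitation the cited studies address.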
“…Inspired by the perceptron learning algorithm, an optimal learning scheme for a class of BAMs is advanced in [12], which has superior convergence and stability properties. In [13], the synthesis problem of bidirectional associative memories is formulated as a set of linear inequalities that can be solved using the perceptron training algorithm. In this paper, three-layer bidirectional symmetrical and asymmetrical associative memories are presented, and a least-mean-square (LMS) delta-rule learning algorithm [14] is used to train the networks.…”
Section: Introduction
confidence: 99%
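The LMS delta-rule training this excerpt refers to can be illustrated in miniature. The sketch below is a generic delta-rule update for a single linear layer, not the cited paper's three-layer architecture; every name and the learning rate are illustrative assumptions:

```python
import numpy as np

def delta_rule_step(W, x, target, eta=0.1):
    """One LMS (delta-rule) update: gradient descent on the squared
    error ||target - W @ x||^2 with respect to the weights W."""
    error = target - W @ x              # the "delta" term
    return W + eta * np.outer(error, x)
```

Repeated over the training pairs, each step shrinks the output error geometrically (for a suitable eta relative to the input norm), which is the usual convergence behavior of LMS training.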