1990
DOI: 10.1364/ao.29.001591

Generalization of the backpropagation neural network learning algorithm to permit complex weights

Abstract: The backpropagation neural network learning algorithm is generalized to include complex-valued interconnections for possible optical implementations.
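The abstract only states that backpropagation is generalized to complex-valued interconnections; the sketch below illustrates one common way such a generalization is formulated (a split-complex tanh activation with a squared-error loss, errors propagated back through the conjugate transpose of the weights), not necessarily the exact derivation in this paper. All function names, layer sizes, and the learning rate are illustrative.

```python
import numpy as np

def split_tanh(z):
    # Split-complex activation: tanh applied separately to real and imaginary parts.
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def split_tanh_grad(z):
    # Derivatives of split_tanh with respect to Re(z) and Im(z).
    return 1.0 - np.tanh(z.real) ** 2, 1.0 - np.tanh(z.imag) ** 2

def train_step(x, t, W1, W2, lr=0.05):
    # Forward pass; every array here is complex-valued.
    a1 = W1 @ x
    h = split_tanh(a1)
    a2 = W2 @ h
    y = split_tanh(a2)

    # Backward pass: deltas combine the real/imaginary error components,
    # and the error is carried back through the conjugate transpose of the weights.
    e = y - t
    gR2, gI2 = split_tanh_grad(a2)
    d2 = e.real * gR2 + 1j * (e.imag * gI2)
    eh = W2.conj().T @ d2
    gR1, gI1 = split_tanh_grad(a1)
    d1 = eh.real * gR1 + 1j * (eh.imag * gI1)

    # Gradient-descent updates: outer products with the conjugated layer inputs.
    W2 -= lr * np.outer(d2, h.conj())
    W1 -= lr * np.outer(d1, x.conj())
    return 0.5 * np.sum(np.abs(e) ** 2)

# Toy usage: fit one complex input/target pair with a 4-8-2 network.
rng = np.random.default_rng(0)
W1 = 0.1 * (rng.standard_normal((8, 4)) + 1j * rng.standard_normal((8, 4)))
W2 = 0.1 * (rng.standard_normal((2, 8)) + 1j * rng.standard_normal((2, 8)))
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
t = np.array([0.3 + 0.2j, -0.1 + 0.4j])
for _ in range(200):
    loss = train_step(x, t, W1, W2)
print(f"squared error after training: {loss:.6f}")
```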

Cited by 22 publications (4 citation statements)
References 4 publications
“…Other studies in the area of complex-valued networks include a complex backpropagation algorithm to train multilayered feed-forward networks. Details of this work can be found in Little et al. (1990), among others. Recently we have proposed discrete and continuous versions of a complex associative memory and have made capacity estimates for the discrete model (Chakravarthy and Ghosh 1994).…”
Section: The Complex Neuron Model (mentioning)
confidence: 99%
“…g1(|V(r, t)|) = a1|V(r, t)|, (5) where a1 is a positive constant.…”
Section: PCM (mentioning)
confidence: 99%
“…In all-optical neural networks implemented using coherent light [1, 2, 3], the states of neurons and the synaptic weights are represented, respectively, by complex optical fields and complex amplitude transmission functions, which carry both amplitude and phase information [4, 5]. In addition, the complex neural fields in such networks have dynamics that are continuous in both time and space.…”
Section: Introduction (mentioning)
confidence: 99%
“…Another interesting extension would be to study a network where all the network parameters are complex numbers. Details of this work can be found in [13] and [14], among others. Recently, a complex version of Hopfield's continuous model has been proposed by Hirose [7].…”
Section: Introduction (mentioning)
confidence: 99%
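Several of the citing statements above refer to complex-valued associative memories and complex Hopfield-type models. As a generic illustration only (not the specific formulations of Chakravarthy and Ghosh or Hirose), the sketch below stores unit-modulus complex patterns with a Hermitian outer-product rule and recalls them by repeatedly applying the complex weights and re-quantizing the phases; all names, sizes, and parameters are hypothetical.

```python
import numpy as np

def store(patterns):
    # Hermitian outer-product (Hebbian-style) weights for unit-modulus complex patterns.
    n = patterns.shape[1]
    W = sum(np.outer(p, p.conj()) for p in patterns) / n
    np.fill_diagonal(W, 0)  # no self-coupling
    return W

def quantize(z, K):
    # Snap each complex value to the nearest of K equally spaced phases on the unit circle.
    k = np.round(np.angle(z) * K / (2 * np.pi)) % K
    return np.exp(2j * np.pi * k / K)

def recall(W, probe, K, iters=20):
    # Synchronous recall: apply the complex weights, then re-quantize the phases.
    s = probe.copy()
    for _ in range(iters):
        s = quantize(W @ s, K)
    return s

# Toy usage: store two 16-neuron, 4-phase patterns and recall one from a phase-jittered probe.
rng = np.random.default_rng(1)
K, N = 4, 16
patterns = np.exp(2j * np.pi * rng.integers(0, K, size=(2, N)) / K)
W = store(patterns)
noisy = patterns[0] * np.exp(1j * 0.2 * rng.standard_normal(N))
print(np.allclose(recall(W, noisy, K), patterns[0]))
```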