2023
DOI: 10.1016/j.amc.2023.127864
A note on the complex and bicomplex valued neural networks


Cited by 5 publications (16 citation statements)
References 40 publications
“…Moreover, proceeding similarly as in the proof of Theorem 3.3 of [4], we have the following result: Theorem 4.2. Let T ⊂ C²(i) be a bounded domain, f : T → C a threshold function, and (w₀, 0) a weighting vector of f(z₁, z₂).…”
Section: Bicomplex Convolutional Neural Network
confidence: 70%
“…In the context of bicomplex convolutional neural networks, some activation functions involving bicomplex numbers have been proposed in the literature. For example, in [4] the authors considered the activation function P(z) = εˡ on T ⊂ Cⁿ, where ε = exp(2πi/k) is the root of unity of order k and the index l is chosen so that 2πl/k ≤ arg(z) < 2π(l+1)/k.…”
Section: Bicomplex Algebra
confidence: 99%
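
As a rough illustration of the sector-based (multi-valued threshold) activation quoted above, the following NumPy sketch maps each input to the k-th root of unity whose angular sector contains arg(z). The function name sector_activation, the vectorised handling of inputs, and the normalisation of arg(z) to [0, 2π) are illustrative assumptions, not details taken from [4].

```python
import numpy as np

def sector_activation(z, k):
    """Sector-based (multi-valued threshold) activation.

    Maps each complex z to eps**l, where eps = exp(2*pi*1j/k) is the
    root of unity of order k and l is the index of the angular sector
    containing arg(z), i.e. 2*pi*l/k <= arg(z) < 2*pi*(l+1)/k.
    """
    eps = np.exp(2j * np.pi / k)           # primitive k-th root of unity
    theta = np.angle(z) % (2 * np.pi)      # arg(z) normalised to [0, 2*pi)
    l = np.floor(k * theta / (2 * np.pi))  # sector index 0, 1, ..., k-1
    return eps ** l

# Example: with k = 4 the activation snaps inputs to {1, i, -1, -i}
print(sector_activation(np.array([1 + 0.1j, -2 + 2j, -1j]), k=4))
```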
“…These new approaches have been especially successful in using particular examples of hypercomplex numbers, such as bicomplex numbers, quaternions, and Clifford numbers, to develop different machine learning techniques; see, for example, [18–20, 23, 25] and references therein. The authors have also extended the complex perceptron algorithm to the bicomplex case in [5, 6]. As we have already discussed, the LMS algorithm discovered by Widrow and Hoff was first extended to the complex domain in [37], where the gradient descent technique was derived with respect to the real and imaginary parts.…”
Section: Introduction
confidence: 99%
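
To make the last point concrete, here is a minimal complex LMS sketch in the spirit of the Widrow–Hoff extension mentioned in the quote: differentiating the squared error with respect to the real and imaginary parts of the weights yields the update w ← w + μ·e·conj(x). The function name complex_lms, the choice of output y = wᵀx, and the fixed step size μ are assumptions made for illustration, not details from [37].

```python
import numpy as np

def complex_lms(X, d, mu=0.01):
    """One pass of complex LMS over the samples in X.

    X  : (n_samples, n_features) complex-valued inputs
    d  : (n_samples,) complex-valued desired outputs
    mu : real step size

    Taking the gradient of |e|^2 with respect to the real and imaginary
    parts of w yields the update w <- w + mu * e * conj(x).
    """
    w = np.zeros(X.shape[1], dtype=complex)
    for x, target in zip(X, d):
        y = np.dot(w, x)             # linear combiner output y = w^T x
        e = target - y               # complex error signal
        w = w + mu * e * np.conj(x)  # gradient-descent (LMS) step
    return w

# Example: recover a fixed complex weight vector from noisy data
rng = np.random.default_rng(0)
w_true = np.array([1 + 2j, -0.5j])
X = rng.standard_normal((500, 2)) + 1j * rng.standard_normal((500, 2))
d = X @ w_true + 0.01 * (rng.standard_normal(500) + 1j * rng.standard_normal(500))
w_hat = complex_lms(X, d, mu=0.05)
print(np.round(w_hat, 2))  # should be close to w_true
```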