2017
DOI: 10.1007/s00521-017-3134-1
Distributed cooperative learning algorithms using wavelet neural network

Cited by 13 publications (8 citation statements). References 39 publications.
“…In this work, the WNN algorithm [ 21 ] was used to classify and discriminate real and fake blood because it combines the wavelet transform and the artificial neural network. It has the great capacities of strong learning, self-adaptability and fault tolerance because it avoids nonlinear optimization problems, including the blindness of structure design and local optimization of BPNN [ 22 ].…”
Section: Theories (confidence: 99%)
“…However, the computational resources of traditional centralized WNNs tend to saturate as the amount of data increases. Note that the distributed cooperative learning algorithm for the WNN model [20] uses the communication structure of a graph G(υ, ε, Λ), where each non-empty node set υ_i in υ jointly maintains the coefficient matrix Λ. Furthermore, the existence of the edge ε(i, j) represents the connection of two adjacent nodes i and j, with weight ω_ij > 0 when the edge exists and ω_ij = 0 otherwise.…”
Section: Problem Formulation (confidence: 99%)
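The communication structure quoted above (a weight ω_ij > 0 on each edge of the graph, 0 between non-adjacent nodes) can be sketched as follows. This is an illustrative assumption, not code from the cited paper; the function name, the uniform edge weight, and the ring topology are all invented for the example.

```python
def make_weight_matrix(n_nodes, edges):
    """Build a symmetric weight matrix: positive weight on each edge,
    zero between non-adjacent nodes (the w_ij convention quoted above)."""
    w = [[0.0] * n_nodes for _ in range(n_nodes)]
    for i, j in edges:
        w[i][j] = w[j][i] = 1.0  # any positive value marks a connection
    return w

ring = [(0, 1), (1, 2), (2, 3), (3, 0)]  # 4 nodes connected in a ring
W = make_weight_matrix(4, ring)
print(W[0][1] > 0, W[0][2] == 0)  # adjacent pair vs. non-adjacent pair
```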
“…Inspired by [7,14], the optimization function and training data can thus be formed in a distributed learning structure, which is of great interest for WNNs. In response to this, a distributed cooperative learning (DCL) algorithm based on WNNs was developed in [20], which shares parameters with neighbors when computing the estimates. However, a large amount of weight information must be communicated during the sharing process, and the approach does not work well in high-dimensional scenarios.…”
Section: Introduction (confidence: 99%)
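The neighbor-sharing step described above can be sketched as a weighted consensus average: each node replaces its parameter estimate with a mix of its own and its neighbors' estimates over the communication graph. This is a minimal assumed sketch of the general idea, not the exact update rule of the DCL algorithm in [20].

```python
def consensus_step(params, w):
    """One round of weighted averaging over the communication graph.

    params: list of parameter vectors, one per node
    w: edge-weight matrix (w[i][j] > 0 iff nodes i and j are connected);
       each node also keeps unit weight on its own estimate
    """
    n = len(params)
    new_params = []
    for i in range(n):
        total = sum(w[i][j] for j in range(n)) + 1.0  # neighbors + self
        mixed = [
            sum(w[i][j] * params[j][k] for j in range(n)) + params[i][k]
            for k in range(len(params[i]))
        ]
        new_params.append([v / total for v in mixed])
    return new_params

# Two connected nodes converge toward a common estimate:
w = [[0.0, 1.0], [1.0, 0.0]]
p = [[0.0], [4.0]]
for _ in range(10):
    p = consensus_step(p, w)
print(round(p[0][0], 3), round(p[1][0], 3))  # → 2.0 2.0
```

Note that every round communicates each node's full parameter vector to its neighbors, which is exactly the high-dimensional communication cost the quoted passage points out.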