2021
DOI: 10.1109/tnnls.2020.2978857
Evolving Deep Neural Networks via Cooperative Coevolution With Backpropagation

Cited by 25 publications (13 citation statements). References 33 publications.
“…According to Lemma 11, V_1(t) = 0 when t > t*, which illustrates that network (2) achieves FTS under controller (5). ▪ Now, we discuss a special case of network (2), that is, when the feedback gain ε_1 = 0, the state feedback controller (5) is reduced to…”
Section: Fixed Coupling Weights
confidence: 99%
“…In recent years, neural networks have received considerable attention because of their extensive applications, and a number of results have been reported in the literature on, for example, convolutional neural networks 1 and backpropagation neural networks. 2 Coupled neural networks (CNNs), a special case made up of multiple interconnected neural networks, can be applied to model a wide range of real networks, including biological networks 3 and the Internet. 4 Moreover, CNNs exhibit more sophisticated dynamics than regular neural networks.…”
Section: Introduction
confidence: 99%
“…3) The final approach uses the EA and the gradient-based method alternately, where the output of one method serves as the starting point for the other, and this procedure is iterated until some stopping criterion is met. Existing works mainly differ in the order in which the two methods are applied in the iterative procedure and in which individuals are chosen to be further optimized by the gradient-based method [38], [41]-[43], [47]. For example, in [42], the EA is applied first, and the top 10% of individuals with the highest fitness in the final population of the EA are further optimized by the gradient-based method.…”
Section: B. Taxonomy and Survey of Existing EA-Based Approaches, 1) Taxonomy
confidence: 99%
“…For example, in [42], the EA is applied first, and the top 10% of individuals with the highest fitness in the final population of the EA are further optimized by the gradient-based method. In [38], the model is trained by using the gradient-based method until the performance improvement is below a certain threshold. Then, the EA is applied to further optimize the model parameters.…”
Section: B. Taxonomy and Survey of Existing EA-Based Approaches, 1) Taxonomy
confidence: 99%
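The alternating scheme described in these excerpts is easy to make concrete. Below is a minimal, self-contained Python sketch, not the paper's implementation: the toy regression problem, population size, mutation scale, and stopping thresholds are all illustrative assumptions. An EA phase and a gradient-descent phase repeatedly seed each other, as in the third hybrid category above.

```python
# Minimal sketch of the alternating EA + gradient hybrid (assumed setup,
# not the surveyed papers' code). Each phase starts from the other's output.
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: fit y = X @ w_true with squared error.
X = rng.normal(size=(100, 5))
w_true = rng.normal(size=5)
y = X @ w_true

def loss(w):
    r = X @ w - y
    return float(r @ r) / len(y)

def grad(w):
    return 2.0 * X.T @ (X @ w - y) / len(y)

def gradient_phase(w, steps=50, lr=0.05, tol=1e-6):
    """Plain gradient descent until the improvement falls below tol."""
    prev = loss(w)
    for _ in range(steps):
        w = w - lr * grad(w)
        cur = loss(w)
        if prev - cur < tol:
            break
        prev = cur
    return w

def ea_phase(w, pop_size=20, sigma=0.1, generations=10):
    """(1+lambda)-style EA: mutate the incumbent, keep the best mutant."""
    best, best_f = w, loss(w)
    for _ in range(generations):
        pop = best + sigma * rng.normal(size=(pop_size, best.size))
        fits = np.array([loss(p) for p in pop])
        i = fits.argmin()
        if fits[i] < best_f:
            best, best_f = pop[i], fits[i]
    return best

# Alternate the two phases; the output of one seeds the other.
w = rng.normal(size=5)
for round_ in range(5):
    w = ea_phase(w)        # global exploration
    w = gradient_phase(w)  # local refinement
    print(f"round {round_}: loss = {loss(w):.6f}")
```

Swapping the order of the two phases, or restricting the gradient phase to only the fittest individuals, recovers the kinds of variants the survey attributes to [38], [41]-[43], [47].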
“…Likewise, Mandal et al. [12] also used a BP ANN to simulate As(III) removal, with R² above 0.97 for both the training and validation processes. However, the drawback of the BP ANN that it tends to be trapped in local optima due to severe initialization sensitivity is often pointed out by researchers [13]. Apart from that, the high computational complexity and memory requirements of some intrinsic BP ANN algorithms, such as Levenberg-Marquardt, also deserve proper attention [14].…”
Section: Application of the ANN Model on Modelling the Water
confidence: 99%
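The initialization sensitivity this excerpt refers to can be demonstrated directly. The following is a small hypothetical Python sketch (an assumed setup, not from [12]-[14]): the same backpropagation network trained on XOR from different random initializations settles at very different final losses, with some seeds stuck in poor local optima.

```python
# Sketch of BP initialization sensitivity (illustrative assumption):
# train an identical 2-2-1 sigmoid network on XOR from several seeds
# and compare the final mean squared errors.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(seed, hidden=2, lr=1.0, epochs=5000):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(size=(2, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(size=(hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # Backward pass for the squared-error loss.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)
    return float(((out - y) ** 2).mean())

for seed in range(5):
    print(f"seed {seed}: final MSE = {train(seed):.4f}")
```

Runs that reach an MSE near zero have learned XOR; runs that plateau near 0.25 are stuck in a local optimum, which is exactly the failure mode the quote attributes to BP ANNs.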