2021 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn52387.2021.9533744
BGADAM: Boosting based Genetic-Evolutionary ADAM for Neural Network Optimization

Cited by 2 publications (2 citation statements)
References 12 publications
“…Another graph neural network (GNN) based approach is to train the GNN using subgraphs as mini-batches instead of training a GNN on the full original graph. This method achieved robust performance with less training time and memory resource requirements [29].…”
Section: Literature Review
confidence: 99%
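A minimal NumPy sketch of the idea this statement cites: sample a node-induced subgraph and run one GNN layer over it as a mini-batch, rather than over the full graph. The toy graph, the helper names (sample_subgraph, gcn_layer), and the mean-aggregation layer are illustrative assumptions, not the cited paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy full graph: 100 nodes, sparse undirected adjacency, 16-dim features.
n, d = 100, 16
A = (rng.random((n, n)) < 0.05).astype(float)
A = np.maximum(A, A.T)                    # symmetrize (undirected)
X = rng.standard_normal((n, d))
W = rng.standard_normal((d, d)) * 0.1     # shared layer weight

def sample_subgraph(num_nodes):
    """Sample a node-induced subgraph to serve as one mini-batch."""
    idx = rng.choice(n, size=num_nodes, replace=False)
    return A[np.ix_(idx, idx)], X[idx]

def gcn_layer(A_sub, X_sub, W):
    """One mean-aggregation GCN-style layer computed on the subgraph only."""
    A_hat = A_sub + np.eye(len(A_sub))            # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)        # >= 1, so no div by zero
    return np.maximum(0.0, (A_hat / deg) @ X_sub @ W)  # ReLU activation

# One mini-batch forward pass touches only the 20-node subgraph, which is
# where the training-time and memory savings over the full graph come from.
A_sub, X_sub = sample_subgraph(20)
H = gcn_layer(A_sub, X_sub, W)
print(H.shape)  # (20, 16)
```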
“…These methods are referred to as Adam-type algorithms since adaptive learning rates are employed. Further, Adam has attained the widest application in many deep learning training tasks, such as optimization of convolutional neural networks and recurrent neural networks [19, 20]. Despite its popularity, Adam incurs convergence issues.…”
Section: Introduction
confidence: 99%
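For reference, a minimal sketch of the Adam update the statement refers to: per-parameter adaptive learning rates from bias-corrected first- and second-moment estimates of the gradient (Kingma & Ba). The quadratic toy objective and step count are illustrative assumptions.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * grad           # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad**2        # second-moment estimate
    m_hat = m / (1 - b1**t)                # bias correction (t starts at 1)
    v_hat = v / (1 - b2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)  # adaptive step
    return theta, m, v

# Toy use: minimize f(theta) = ||theta||^2, whose gradient is 2 * theta.
theta = np.array([1.0, -2.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 1001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.05)
print(theta)  # converges close to [0, 0]
```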