The 2010 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2010.5596519
Enhanced Two-Phase method in fast learning algorithms

Cited by 14 publications (3 citation statements)
References 17 publications
“…It takes 500,000 epochs. In [23] the error was limited to less than 3 × 10⁻⁴. The convergence rate was very slow.…”
Section: Implementation of the DBBPLR Algorithm (mentioning)
confidence: 91%
“…The main drawback of the SBP algorithm is its slow training: it often takes a long time to learn and reach the desired results. Moreover, the network can get stuck at a local minimum when o_r, the training output of the output layer, approaches the extremes of 1 or 0 [5].…”
Section: Introduction (mentioning)
confidence: 99%
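The saturation effect described in that statement is easy to see numerically: with a sigmoid activation, the standard backpropagation delta for an output unit carries the factor o_r(1 − o_r), which vanishes as the output approaches 0 or 1, so weight updates become negligible even while the error stays large. A minimal sketch with hypothetical net-input values (not from the paper):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Standard backpropagation (SBP) delta for an output unit:
#   delta = (target - output) * output * (1 - output)
# where output * (1 - output) is the sigmoid derivative.
target = 0.0  # hypothetical desired output
for net_input in [0.0, 2.0, 6.0, 12.0]:
    o_r = sigmoid(net_input)
    delta = (target - o_r) * o_r * (1.0 - o_r)
    print(f"o_r = {o_r:.6f}  sigmoid' = {o_r * (1.0 - o_r):.6f}  delta = {delta:.6f}")

# As o_r -> 1 the derivative o_r*(1-o_r) -> 0, so delta -> 0 even though
# the error (target - o_r) is near its maximum: the unit is saturated and
# learning effectively stalls -- the stuck-at-a-local-minimum symptom
# described in the citation above.
```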
“…The Enhanced Two-Phase Method (E2P) is proposed in [22] to solve the local minimum problem and the error overshooting problem. E2P can effectively identify which of the two problems is present, and hence assign a suitable fast learning algorithm to speed up the learning process with better global convergence capability.…”
(mentioning)
confidence: 99%
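The citing text gives only the control flow, not E2P's actual detection criteria. The following is a hypothetical sketch of that flow, not the algorithm from [22]: the thresholds, the saturation test, and the remedy callables (normal_step, overshoot_step, local_min_step) are invented placeholders for whichever fast learning algorithms the method would assign.

```python
def two_phase_step(errors, outputs, normal_step, overshoot_step, local_min_step,
                   window=10, stall_tol=1e-6, saturation_tol=1e-3):
    """Pick a training step based on the recent error trajectory.

    Hypothetical E2P-style dispatch: classify the current failure mode
    (error overshooting vs. local minimum) and hand control to a
    suitable remedy routine. All criteria here are placeholders.
    """
    if len(errors) < window:
        return normal_step()

    recent = errors[-window:]
    if recent[-1] > recent[0]:
        # Error is growing over the window: treat as error overshooting
        # and dispatch an algorithm that damps the step size (placeholder).
        return overshoot_step()

    stalled = max(recent) - min(recent) < stall_tol
    saturated = all(o < saturation_tol or o > 1.0 - saturation_tol for o in outputs)
    if stalled and saturated:
        # Flat error with saturated outputs: treat as a local minimum and
        # dispatch an algorithm that can escape it (placeholder).
        return local_min_step()

    return normal_step()
```

In use, each remedy would be a closure over the network's weights, e.g. `two_phase_step(error_history, layer_outputs, sbp_update, damped_update, escape_update)`; the point of the sketch is only the identify-then-assign structure the citation attributes to E2P.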