2006
DOI: 10.1142/s0218127406015805
Adaptive Algorithms for Neural Network Supervised Learning: A Deterministic Optimization Approach

Abstract: Networks of neurons can perform computations that even modern computers find very difficult to simulate. Most of the existing artificial neurons and artificial neural networks are considered biologically unrealistic; nevertheless, the practical success of the backpropagation algorithm and the powerful capabilities of feedforward neural networks have made neural computing very popular in several application areas. A challenging issue in this context is learning internal representations by adjusting the weights o…

Cited by 20 publications (11 citation statements); citing years 2007–2024.
References 78 publications (136 reference statements).
“…Since ANNs traditionally learn internal representations by adjusting the weights of the network connections, the inclusion of an adaptive algorithm improves backpropagation learning through non-linear optimisation theory. This approach is useful when the learning process of an ANN is affected by accuracy flaws in the numerical computation and by environmental changes that cause unpredictable deviations of the parameter values from the original configuration [83].…”
Section: Results (mentioning, confidence: 99%)
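The adaptive-algorithm idea quoted above can be illustrated with a minimal sketch: backpropagation on a tiny feedforward network whose learning rate adapts with a "bold driver"-style heuristic (grow the step when the loss drops, shrink it when the loss rises). All names, sizes, and the XOR-like target below are illustrative assumptions, not taken from the paper under review.

```python
import numpy as np

# Minimal sketch (assumed setup, not the paper's algorithm): one-hidden-layer
# network trained by backpropagation with an adaptive learning rate.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 2))
y = (X[:, :1] * X[:, 1:2] > 0).astype(float)   # XOR-like synthetic target

W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)                   # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2))) # sigmoid output
    return h, out

def loss(out):
    return float(np.mean((out - y) ** 2))      # mean squared error

lr = 0.1
init = prev = loss(forward(X)[1])
for _ in range(500):
    h, out = forward(X)
    # backpropagated gradients of the squared-error loss
    d_out = 2 * (out - y) / len(y) * out * (1 - out)
    gW2 = h.T @ d_out; gb2 = d_out.sum(0)
    d_h = d_out @ W2.T * (1 - h ** 2)
    gW1 = X.T @ d_h; gb1 = d_h.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
    cur = loss(forward(X)[1])
    # simplified bold-driver rule: grow the step on success, shrink on failure
    lr = lr * 1.05 if cur < prev else lr * 0.5
    prev = cur

print(round(init, 3), round(prev, 3))
```

The adaptive rule here only rescales the step size; the deterministic-optimization schemes the paper studies are more principled, but the closed-loop idea of letting training progress steer the step size is the same.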
“…With the large-scale acceptance of ANNs as efficient feature extractors, they have been applied to fields like occupational stress analysis and prediction 27 and prediction of the inside air temperature of pillar coolers based on parameters like outside temperature, watering, and airing. 28 Neural networks depend on various factors for good classification results, 29 one of the most important being the training algorithm used to train the network. Choosing the best-suited and optimized training algorithm is a crucial step, since it affects the time required to train the model, its accuracy, precision, and the computing power required.…”
Section: Related Work (mentioning, confidence: 99%)
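The point quoted above, that the choice of training algorithm affects training time, can be seen in a small synthetic comparison: plain gradient descent versus classical momentum on an ill-conditioned quadratic loss. The problem and constants below are assumptions for illustration only, not from the cited papers.

```python
import numpy as np

# Synthetic illustration (assumed problem): two training algorithms on the
# same badly conditioned quadratic loss 0.5 * w^T A w.
A = np.diag([1.0, 50.0])          # condition number 50

def grad(w):
    return A @ w

w_gd = np.array([1.0, 1.0])       # plain gradient descent iterate
w_mo = w_gd.copy(); v = np.zeros(2)  # momentum iterate and velocity
lr, mu = 0.019, 0.9               # step just under the 2/50 stability limit

for _ in range(200):
    w_gd = w_gd - lr * grad(w_gd)
    v = mu * v - lr * grad(w_mo)  # momentum accumulates past gradients
    w_mo = w_mo + v

print(np.linalg.norm(w_gd), np.linalg.norm(w_mo))
```

After the same number of iterations, the momentum iterate is much closer to the minimizer: with identical per-step cost, the algorithm choice alone changes how much training time a given accuracy requires.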
“…The challenge is then to combine single-user QoE optimization with a QoE training mechanism in a closed-loop manner to progressively learn the value of W*. To do so, we develop two sub-frameworks and make them interact within a closed-loop framework: one for QoE optimization and the other for QoE training (see Fig. 4 and Fig. 5); 2) QoE training tool: To compute W*, we use a simple neural network [17], where the training samples are pairs of QoE metrics and user feedback. We define the training dataset as {(Φ*_ru, F_ru)}_{1≤u≤U}, where Φ*_ru is the vector of QoE metrics delivered by (10) under throughput r_u and vector W, and F_ru is the corresponding feedback.…”
Section: B. Practical Solution (mentioning, confidence: 99%)
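The supervised step described in this excerpt, fitting a weight vector from (QoE-metric vector, user feedback) training pairs, can be sketched as follows. For brevity a least-squares fit stands in for the "simple neural network" of [17], and all data, dimensions, and the weight vector are synthetic assumptions.

```python
import numpy as np

# Hedged sketch of the QoE training step: learn W* from training pairs
# {(Φ*_ru, F_ru)}. Everything here is synthetic illustration.
rng = np.random.default_rng(1)
U = 50                                    # number of users
Phi = rng.uniform(size=(U, 3))            # QoE metric vectors Φ*_ru
W_true = np.array([0.5, 1.5, -0.8])       # hypothetical ground-truth W*
F = Phi @ W_true + rng.normal(scale=0.01, size=U)  # feedback F_ru

# least-squares estimate of W* from the training dataset
W_hat, *_ = np.linalg.lstsq(Phi, F, rcond=None)
print(np.round(W_hat, 2))
```

In the quoted closed-loop framework the learned estimate would then be fed back into the QoE optimization sub-framework, and fresh feedback would refine it further.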