2019
DOI: 10.15866/irecap.v9i1.15330
Investigation and Comparison of Generalization Ability of Multi-Layer Perceptron and Radial Basis Function Artificial Neural Networks for Signal Power Loss Prediction

Cited by 6 publications (11 citation statements) · References 0 publications
“…Understanding the rudiments of the algorithm and the processes involved in the computational simulation of learning within neural network models is essential. Artificial neural networks such as VNN are the basic building blocks of AI technology and the foundation of ML models, simulating learning processes similar to those in the human brain [27], [28].…”
Section: Gradient Descent and Back-propagation In Machine Learning Al...
confidence: 99%
“…Selecting a model adequate for a given problem's complexity and tuning suitable parameters require a thorough understanding of the problem domain, and the algorithms must be carefully considered to ensure correct application of the models and interpretation of the results. The gradient descent algorithm speeds up an ANN's learning from the dataset, in which the network's parameter values are modified through operations involving the data points and the neural network [28].…”
Section: Gradient Descent and Back-propagation In Machine Learning Al...
confidence: 99%
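The gradient-descent parameter update described in the statement above can be sketched as follows. This is a minimal illustration, not the cited paper's implementation; the toy dataset, squared-error loss, and learning rate are all assumptions:

```python
import numpy as np

# Fit a single weight w so that y ≈ w * x, by repeatedly nudging w
# in the direction opposite the loss gradient computed from the data.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x          # target relationship (true weight is 2.0)

w = 0.0              # initial parameter value
lr = 0.02            # learning rate (step size), an assumed value

for _ in range(500):
    pred = w * x
    grad = np.mean(2 * (pred - y) * x)  # d(MSE)/dw over the data points
    w -= lr * grad                      # parameter-value modification

print(round(w, 3))   # w converges toward 2.0
```

Each iteration modifies the network's parameter value using operations over the data points, which is the mechanism the quoted passage attributes to gradient descent.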
“…The processing elements are known as neurons, which are processors that operate on the inputs fed to them through the connections. These processors are massively parallel and distributed, comprising simple processing units with a natural tendency to store experiential knowledge and make it available for use [18]. Artificial neural networks acquire knowledge from the environment through a learning process, and the acquired knowledge is stored in the network in the inter-neuron connection strengths called synaptic weights [17], [18].…”
Section: II.3 Concepts Of Artificial Neural Network
confidence: 99%
“…These processors are massively parallel and distributed, comprising simple processing units with a natural tendency to store experiential knowledge and make it available for use [18]. Artificial neural networks acquire knowledge from the environment through a learning process, and the acquired knowledge is stored in the network in the inter-neuron connection strengths called synaptic weights [17], [18]. A network derives its computing power from its architecture and from its ability to learn and thereby generalize well [18].…”
Section: II.3 Concepts Of Artificial Neural Network
confidence: 99%
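The neuron described in the statements above, with knowledge stored as synaptic weights applied to the inputs, can be sketched as follows. The specific weights, bias, and sigmoid activation are illustrative assumptions, not values from the cited work:

```python
import numpy as np

def neuron(inputs, weights, bias):
    """A single processing element: weighted sum of inputs
    (the synaptic weights are the stored knowledge),
    passed through a sigmoid activation."""
    z = np.dot(weights, inputs) + bias   # inter-neuron connection strengths
    return 1.0 / (1.0 + np.exp(-z))     # sigmoid squashes output into (0, 1)

x = np.array([0.5, -1.0, 2.0])          # input signals fed through connections
w = np.array([0.8, 0.2, -0.5])          # synaptic weights (learned knowledge)
b = 0.1                                 # bias term
out = neuron(x, w, b)
```

During learning, a training algorithm would adjust `w` and `b`; the knowledge the network acquires resides entirely in these connection strengths, as the quoted passage states.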