2012
DOI: 10.2478/v10006-012-0034-5
Backpropagation generalized delta rule for the selective attention Sigma-if artificial neural network

Abstract: In this paper the Sigma-if artificial neural network model is considered, which is a generalization of an MLP network with sigmoidal neurons. It was found to be a potentially universal tool for automatic creation of distributed classification and selective attention systems. To overcome the high nonlinearity of the aggregation function of Sigma-if neurons, the training process o…
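The abstract points to the highly nonlinear aggregation function of Sigma-if neurons as the obstacle the modified training process has to overcome. The sketch below is one rough illustration of the conditional-aggregation idea: inputs are scanned group by group and aggregation stops once a threshold is crossed, after which a sigmoid is applied. The function names, the grouping vector `groups`, and the threshold handling are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def sigmoid(z):
    """Logistic activation used by sigmoidal MLP neurons."""
    return 1.0 / (1.0 + np.exp(-z))

def sigma_if_neuron(x, w, groups, theta):
    """Sketch of a conditionally aggregating neuron.

    Inputs are scanned group by group (group 0 first). Aggregation
    stops as soon as the accumulated weighted sum reaches `theta`,
    so inputs in later groups are never read: the neuron attends
    only to as much of the input as it needed.
    """
    acc = 0.0
    for g in range(int(groups.max()) + 1):
        mask = groups == g
        acc += float(np.dot(w[mask], x[mask]))
        if acc >= theta:        # conditional aggregation: stop early
            break
    return sigmoid(acc)

# Toy call: four inputs split into two groups, threshold 0.5.
x = np.array([1.0, 0.2, 0.7, 0.1])
w = np.array([0.9, 0.3, -0.4, 0.6])
groups = np.array([0, 0, 1, 1])
print(sigma_if_neuron(x, w, groups, theta=0.5))
```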

Cited by 31 publications (12 citation statements)
References 28 publications
“…The key element of our findings is a neuron model that can explore selective attention strategies based on our algorithm for selecting its parameter values [27]. Our study shows that the feature selection problem can be solved by using the selective attention neural network model that we have developed.…”
Section: Related Work (mentioning)
confidence: 79%
“…The proposed Sigma-if neural network system is a type of fully connected, synchronous multilayer perceptron (MLP) network with selective attention abilities [19], [27]. Such neural networks do not need any separate centralised attention guidance modules.…”
Section: The Proposed System (mentioning)
confidence: 99%
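The statement above stresses that such networks need no separate centralised attention-guidance module. A hedged sketch of that idea, reusing the hypothetical neuron from the earlier example: the set of inputs a neuron actually reads is decided locally by its own stopping condition, and can be inspected afterwards as a crude attention or feature-selection signal. All names here are assumptions for illustration.

```python
import numpy as np

def attended_inputs(x, w, groups, theta):
    """Indices of the inputs a conditionally aggregating neuron
    actually reads for a given example (same toy model as above).

    The decision to skip the remaining groups is made locally by the
    neuron itself, so no separate attention-guidance module is needed.
    """
    used, acc = [], 0.0
    for g in range(int(groups.max()) + 1):
        idx = np.flatnonzero(groups == g)
        acc += float(np.dot(w[idx], x[idx]))
        used.extend(idx.tolist())
        if acc >= theta:        # remaining groups are never inspected
            break
    return used

x = np.array([1.0, 0.2, 0.7, 0.1])
w = np.array([0.9, 0.3, -0.4, 0.6])
groups = np.array([0, 0, 1, 1])
print(attended_inputs(x, w, groups, theta=0.5))   # -> [0, 1]
```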
“…In the task of predicting the corrections, the sigmoidal function is applied. The task of neural network training is to determine the weight values [15]. After the training process, the neural network output values should be consistent with the expected values given in the training set.…”
Section: Artificial Neural Network (mentioning)
confidence: 99%
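The statement above summarises training as finding weights so that the network output matches the expected values in the training set. Below is a minimal sketch of that idea for a single sigmoidal neuron, using plain gradient-descent delta-rule updates (not the generalized delta rule derived in the cited paper); the function names, hyperparameters, and toy data are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_sigmoid_neuron(X, t, lr=1.0, epochs=5000, seed=0):
    """Delta-rule training of one sigmoidal neuron: adjust the weights
    so the output agrees with the expected values `t` of the training
    set (X, t)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        y = sigmoid(X @ w + b)
        err = y - t                      # output minus expected value
        grad = err * y * (1.0 - y)       # chain rule through the sigmoid
        w -= lr * (X.T @ grad) / len(t)
        b -= lr * grad.mean()
    return w, b

# Toy training set: logical AND.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0, 0, 0, 1], dtype=float)
w, b = train_sigmoid_neuron(X, t)
print(np.round(sigmoid(X @ w + b), 2))   # should approach [0, 0, 0, 1]
```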
“…They all stimulate the need for more precise and general solutions when the tools have some technical limitations. From another standpoint, some data are valuable only when considered in the context of other information [14], [15]. Unstructured text resources are a representative example of this [16], [17].…”
Section: Introduction (mentioning)
confidence: 99%