2009
DOI: 10.1016/j.neunet.2009.06.033
Comparison of a spiking neural network and an MLP for robust identification of generator dynamics in a multimachine power system

Cited by 16 publications (7 citation statements)
References 7 publications
“…[26] So far, different types of neural network architectures and their performances have been studied. [27][28][29][30][31] These include multilayer perceptrons (MLPs), radial basis functions (RBFs), recurrent neural networks (RNNs), and echo-state networks (ESNs). In this work, an MLP neural network was used.…”
Section: Methods, Artificial Neural Network (mentioning)
confidence: 99%
“…A spiking neuron receiving a stronger signal fires sooner than the same neuron receiving a weaker one. This idea, supported by [24][25][26], is called delay coding and has been used in numerous research studies [3,6,10]. In this study the following formula is employed for encoding input variables into spike times…”
Section: Coding and Decoding (mentioning)
confidence: 99%
“…Some effort has been made to take the ease of inputting data into first-generation networks and still capture the power of an SNN. The model in [7] and the application thereof in [8] utilize spiking probabilities as a complicated activation function. It is demonstrated in [8] that this is a more powerful function approximator than an MLP, but it remains merely inspired by SNNs, rather than truly operating on spike propagation and recursion.…”
Section: Introduction (mentioning)
confidence: 99%
“…Using the center of each neuron/Gaussian function and its width, the magnitude of firing f(x) for each sample point of the input x is calculated using (8).…”
(mentioning)
confidence: 99%
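Equation (8) itself is not reproduced in the excerpt, but a Gaussian receptive field with a given center and width is conventionally an unnormalized Gaussian bump. The sketch below shows that standard form under that assumption; the cited paper's Eq. (8) may differ in normalization or parameterization.

```python
import math

def firing_magnitude(x, center, width):
    """Illustrative Gaussian receptive field: firing magnitude f(x)
    for a neuron with the given center and width.

    Uses the standard unnormalized Gaussian; peaks at 1 when the
    input x sits exactly on the neuron's center, and falls off
    symmetrically with distance.
    """
    return math.exp(-((x - center) ** 2) / (2.0 * width ** 2))
```

Under this form, each neuron in a population responds most strongly to inputs near its own center, so a single input value produces a graded firing magnitude across the population.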