2008 IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
DOI: 10.1109/ijcnn.2008.4634085
Evolving spiking neural networks for taste recognition

Cited by 33 publications (30 citation statements)
References 21 publications
“…Other studies have utilised eSNN as a general classification method, e.g. in the context of classifying water and wine samples [70]. The first eSNNs were based on Thorpe's neural model [78], in which the importance of early spikes (those arriving soon after the onset of a stimulus) is boosted, a scheme known as rank-order coding and learning.…”
Section: Evolving Spiking Neural Network
confidence: 99%
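The rank-order coding mentioned above can be sketched as follows: each input neuron is ranked by the arrival time of its first spike, and its connection weight decays geometrically with that rank, so earlier spikes dominate. This is a minimal illustrative sketch; the modulation factor `mod=0.9` is an assumed example value, not taken from the cited papers.

```python
import numpy as np

def rank_order_weights(spike_times, mod=0.9):
    """Thorpe-style rank-order coding: earlier spikes get larger weights.

    spike_times: array of first-spike times, one per input neuron.
    mod: modulation factor in (0, 1); weight = mod ** rank.
    (mod=0.9 is an illustrative value, not from the paper.)
    """
    order = np.argsort(spike_times)        # indices sorted earliest-first
    ranks = np.empty_like(order)
    ranks[order] = np.arange(len(spike_times))
    return mod ** ranks

# the neuron spiking first (index 1) receives the largest weight
w = rank_order_weights(np.array([3.0, 1.0, 2.0]))
```

Because only spike order matters, this encoding is invariant to a uniform shift or rescaling of the spike times.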
“…The classification performance of eSNN was experimentally explored [70,69] using water and wine samples collected from [8] and [58]. The topology of the model consists of two layers.…”
Section: Taste Recognition
confidence: 99%
“…However, since the algorithm was based on an ES, it was extremely time-consuming to train. More recently, an online evolving SNN was implemented [9], which adds a new neuron to the output layer for each sample. If the weights of the new neuron are similar to those of any previously added neuron, the two neurons are merged.…”
Section: A. Training Algorithms
confidence: 99%
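The add-or-merge rule described in this excerpt can be sketched as below. The Euclidean-distance cutoff `sim_threshold` and the running-average merge are illustrative assumptions; the cited work's exact similarity measure is not specified here.

```python
import numpy as np

def add_or_merge(neurons, w_new, sim_threshold=0.1):
    """Online eSNN output layer: one neuron per training sample,
    merged into an existing neuron when the weight vectors are close.

    neurons: list of dicts with 'w' (weights) and 'n' (merge count).
    sim_threshold: illustrative Euclidean-distance cutoff (assumption).
    """
    for neuron in neurons:
        if np.linalg.norm(neuron['w'] - w_new) < sim_threshold:
            # merge: running average over all samples folded into this neuron
            neuron['w'] = (neuron['w'] * neuron['n'] + w_new) / (neuron['n'] + 1)
            neuron['n'] += 1
            return neurons
    neurons.append({'w': w_new, 'n': 1})
    return neurons

reservoir = add_or_merge([], np.array([1.0, 0.0]))
reservoir = add_or_merge(reservoir, np.array([1.0, 0.05]))   # close: merged
reservoir = add_or_merge(reservoir, np.array([0.0, 1.0]))    # distant: new neuron
```

Merging keeps the output layer from growing linearly with the training set, which is what makes the one-pass scheme practical online.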
“…where τ + and τ − were set to 15 ms. The second window (Bi and Poo) was asymmetrical about the time axis and can also be described by (9) and (10), where the parameters τ + and τ − were set to 16.8 and 33.7 ms, respectively [13], [25]. The final window (Gerstner) [49] was an asymmetrically skewed, "sinusoidal"-shaped window in which τ + and τ − were set to 12 and 24 ms, respectively.…”
Section: B. WBC Dataset
confidence: 99%
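Since equations (9) and (10) are not reproduced in this excerpt, here is a sketch of the standard double-exponential STDP window that such parameter pairs usually describe: potentiation for pre-before-post (Δt ≥ 0), depression otherwise. The time constants follow the Bi and Poo values quoted above (16.8 / 33.7 ms); the amplitudes `a_plus`/`a_minus` are assumed placeholders.

```python
import numpy as np

def stdp_window(dt, a_plus=1.0, a_minus=1.0, tau_plus=16.8, tau_minus=33.7):
    """Exponential STDP learning window W(dt), dt = t_post - t_pre in ms.

    dt >= 0 (pre fires before post) -> potentiation, decaying with tau_plus;
    dt < 0 -> depression, decaying with tau_minus. Amplitudes are
    illustrative; tau defaults follow the Bi & Poo fit quoted in the text.
    """
    dt = np.asarray(dt, dtype=float)
    return np.where(dt >= 0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))
```

With τ + ≠ τ −, the window is asymmetrical about the time axis, which is exactly the property the excerpt attributes to the Bi and Poo window.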
“…In [5] an evolving SNN (ESNN) was introduced and applied to pattern recognition problems; this work was later extended to speaker authentication tasks and even to audio-visual pattern recognition [10]. A similar spiking neural model was analyzed in [7], where a classification problem for taste recognition was addressed. Based on a simple but efficient neural model, these approaches used the ESNN architecture, which was trained by a fast one-pass learning algorithm.…”
Section: Introduction
confidence: 99%
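The "fast one-pass learning algorithm" mentioned here can be sketched end to end: each training sample is visited once, its rank-order weights become a new output neuron, and the firing threshold is set to a fraction of the neuron's maximal postsynaptic potential. The modulation factor `mod=0.9` and threshold fraction `c=0.7` are assumed example values, not parameters reported in the cited papers.

```python
import numpy as np

def esnn_train_one_pass(samples, labels, mod=0.9, c=0.7):
    """One-pass eSNN training sketch: one output neuron per sample.

    samples: first-spike-time vectors (earlier spike = higher rank).
    Weights follow rank-order coding (mod ** rank); the threshold is a
    fraction c of the maximum PSP, i.e. the response the neuron gives to
    its own training pattern. mod and c are illustrative assumptions.
    """
    repository = []
    for x, y in zip(samples, labels):
        order = np.argsort(x)
        ranks = np.empty_like(order)
        ranks[order] = np.arange(len(x))
        w = mod ** ranks
        psp_max = float(np.sum(w * w))   # PSP when the same rank order recurs
        repository.append({'w': w, 'theta': c * psp_max, 'label': y})
    return repository

rep = esnn_train_one_pass([np.array([1.0, 2.0, 3.0])], ['wine'])
```

Each sample is seen exactly once, so training cost grows linearly with the data set, in contrast with the iterative ES-based scheme criticised in the earlier excerpt.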