2010
DOI: 10.1007/978-3-642-15193-4_56

Multi-objective Evolutionary Algorithms to Investigate Neurocomputational Issues: The Case Study of Basal Ganglia Models

Abstract: The basal ganglia (BG) are a set of subcortical nuclei involved in action selection processes. We explore here the automatic parameterization of two models of the basal ganglia (the GPR and the CBG) using multi-objective evolutionary algorithms. We define two objective functions characterizing the supposed winner-takes-all functionality of the BG and obtain a set of solutions lying on the Pareto front for each model. We show that the CBG architecture leads to solutions dominating the GPR ones, this h…
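The abstract's key notion — one solution set "dominating" another on the Pareto front — can be sketched minimally. The code below is an illustration of Pareto dominance for two minimized objectives, not the paper's actual implementation; the objective values are invented for the example.

```python
# Minimal sketch of Pareto dominance for minimized objectives.
# The solution tuples below are illustrative values, not data from the paper.

def dominates(a, b):
    """True if a dominates b: a is no worse on every objective
    and strictly better on at least one (minimization)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(solutions):
    """Keep only the solutions not dominated by any other."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

if __name__ == "__main__":
    # Hypothetical (objective1, objective2) pairs for candidate parameterizations.
    pts = [(0.2, 0.9), (0.5, 0.5), (0.9, 0.1), (0.6, 0.6)]
    # (0.6, 0.6) is dominated by (0.5, 0.5) and is filtered out.
    print(pareto_front(pts))
```

Comparing two architectures then amounts to checking whether solutions from one front dominate solutions from the other, which is the sense in which the abstract reports CBG solutions dominating GPR ones.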

Cited by 4 publications (6 citation statements)
References 23 publications
“…Most real world optimization problems involve the simultaneous minimization of several objectives [ 69 ]. Thus when comparing different model architectures, it may be helpful to consider trade-offs separately along different dimensions [ 36 , 70 , 71 ]. The current study focused in particular on the trade-off between model prediction accuracy and parameter count.…”
Section: Discussion (confidence: 99%)
“…To position our methodology among previous studies, it is also relevant to study how other works have used stochastic optimization methods to constrain the parameter choice of BG models. In Wang et al (2007); Liénard et al (2010) the aim of the optimization was to obtain a given functionality, namely selection. But the neuron models used were very abstract leaky integrators, and their main parameters were synaptic connection strengths coded as a numerical value in an arbitrary range.…”
Section: Related Studies (confidence: 99%)
“…In order to gain a better understanding of the mechanisms governing it, various neuro-computational models have been proposed (for a review, see [72]). Among them are the recent CBG model [1] and the more classical GPR model [2]; the parameters of both have been evolved with MOEA in a previous work [73].…”
Section: Discussion (confidence: 99%)
“…Each connection parameter is to be evolved in the range [0.05, 1], as setting a connection weight to zero would be equivalent to deleting the connection. For more details on each model, we refer to the original articles [1,2] or to our previous work [73].…”
Section: Description of the Models (confidence: 99%)
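The constraint quoted above — evolving connection weights within [0.05, 1] so that mutation can weaken but never delete a connection — can be sketched as follows. This is a hypothetical illustration of such a bounded mutation operator, not the operator used in the cited works; `W_MIN`, `W_MAX`, and the Gaussian step are assumptions for the example.

```python
import random

# Assumed bounds from the quoted passage: a weight of 0 would delete
# the connection, so the lower bound is kept strictly positive.
W_MIN, W_MAX = 0.05, 1.0

def random_weight(rng=random):
    """Draw an initial connection weight uniformly within the allowed range."""
    return rng.uniform(W_MIN, W_MAX)

def mutate_weight(w, sigma=0.1, rng=random):
    """Gaussian mutation clipped back into [W_MIN, W_MAX],
    so no connection is ever effectively removed."""
    return min(W_MAX, max(W_MIN, w + rng.gauss(0.0, sigma)))
```

Clipping (rather than resampling) is one simple way to honor the bound; an evolutionary toolkit would typically let the bounds be declared per-parameter instead.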