2022
DOI: 10.3389/fncom.2022.885207

Exploring Parameter and Hyper-Parameter Spaces of Neuroscience Models on High Performance Computers With Learning to Learn

Abstract: Neuroscience models commonly have a high number of degrees of freedom, and only specific regions within the parameter space are able to produce dynamics of interest. This makes the development of tools and strategies to efficiently find these regions of high importance for advancing brain research. Exploring the high-dimensional parameter space using numerical simulations has been a frequently used technique in recent years in many areas of computational neuroscience. Today, high performance computing (HPC) can …

Cited by 5 publications (4 citation statements)
References 49 publications

Citation statements (ordered by relevance):
“…In this regard, the utilization of specialized tools becomes pressing. One such tool that holds promise in navigating these immense data-sets is "Learning to Learn" (L2L) [37]. This automated machine learning framework is purpose-built for high-performance computing environments and is adept at employing gradient or evolutionary strategies to traverse expansive data generated by our framework.…”
Section: Discussion (mentioning)
confidence: 99%
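
For orientation, the sketch below shows the generic shape of the evolutionary strategies that a framework like L2L automates on HPC systems. This is not the L2L API; the fitness function, parameter names, and bounds are hypothetical placeholders standing in for a real neuroscience simulation.

```python
# Minimal sketch of an evolutionary hyper-parameter search of the kind
# frameworks such as L2L automate (NOT the L2L API; fitness function and
# parameter bounds below are hypothetical placeholders).
import random

BOUNDS = {"g": (0.5, 8.0), "eta": (0.5, 4.0)}  # hypothetical model parameters

def fitness(params):
    # Placeholder: in practice this would run a simulation (e.g., a spiking
    # network) and score how close its dynamics are to a target regime.
    return -((params["g"] - 5.0) ** 2 + (params["eta"] - 2.0) ** 2)

def mutate(params, sigma=0.3):
    # Gaussian perturbation, clamped to the parameter bounds.
    return {k: min(hi, max(lo, params[k] + random.gauss(0.0, sigma)))
            for k, (lo, hi) in BOUNDS.items()}

# (mu, lambda) evolution: keep the best `mu` parents, spawn `lam` children.
mu, lam, generations = 4, 16, 30
population = [{k: random.uniform(*b) for k, b in BOUNDS.items()} for _ in range(lam)]
for _ in range(generations):
    parents = sorted(population, key=fitness, reverse=True)[:mu]
    population = [mutate(random.choice(parents)) for _ in range(lam)]

print("best parameters found:", max(population, key=fitness))
```

In an HPC setting, the point of such frameworks is that the fitness evaluations within each generation are independent and can be farmed out to compute nodes in parallel.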
“…Traditionally, hyper-parameter optimization has been done manually because it is too costly to search for hyper-parameters automatically [25]. Simple grid search and random search are inadequate for hyper-parameter search in deep learning models [26].…”
Section: Methods (mentioning)
confidence: 99%
“…Traditionally, hyper-parameter optimization has been done manually because it is too costly to search for hyper-parameters automatically [25]. Simple grid search and random search are inadequate for hyper-parameter search in deep learning models [26]. Modern Bayesian optimization can use the parameter configurations already evaluated to choose the next search direction, improving search efficiency.…”
Section: Tree-Structured Parzen Estimator Approach (TPE) (mentioning)
confidence: 99%
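
As a concrete illustration of TPE-driven Bayesian search, here is a minimal sketch using Optuna, an open-source library whose default sampler implements the Tree-structured Parzen Estimator. The objective function and hyper-parameter names are hypothetical placeholders for training and validating a real model.

```python
# Sketch of TPE-based hyper-parameter search with Optuna; the objective
# below is a hypothetical stand-in for a real train/validate cycle.
import optuna

def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)  # learning rate
    layers = trial.suggest_int("layers", 1, 4)            # network depth
    # Placeholder score; a real study would train a model with these
    # hyper-parameters and return its validation loss.
    return (lr - 1e-3) ** 2 + 0.01 * (layers - 2) ** 2

study = optuna.create_study(direction="minimize",
                            sampler=optuna.samplers.TPESampler(seed=0))
study.optimize(objective, n_trials=50)
print(study.best_params)
```

Unlike grid or random search, the TPE sampler conditions each new trial on the scores of previous trials, concentrating evaluations in promising regions of the space.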
“…Thus, both in scientific projects and for the development of real-world applications of SNNs, model tuning through parameter search typically creates the highest demand in computing time. Currently available methods (Feurer and Hutter, 2019) such as grid search or random search (LaValle et al, 2004;Bergstra and Bengio, 2012), Bayesian optimization (Parsa et al, 2019), and machine learning approaches (Carlson et al, 2014;Yegenoglu et al, 2022) require extensive sampling of the parameter space. We therefore suggest to include the parameter search in benchmarking approaches to the efficient simulation of large-scale SNNs.…”
Section: Benchmarking With Grid Search (mentioning)
confidence: 99%
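
To make the sampling-cost contrast concrete, the sketch below runs a grid search and a random search over a hypothetical two-parameter space; the score function stands in for a full network simulation, and all names and bounds are illustrative assumptions.

```python
# Minimal sketch contrasting grid search and random search over a
# two-parameter space; score() is a hypothetical placeholder for a
# full network simulation.
import itertools, random

def score(g, eta):
    # Placeholder for simulating a network and rating its dynamics.
    return -((g - 5.0) ** 2 + (eta - 2.0) ** 2)

# Grid search: the number of evaluations grows exponentially with the
# number of parameters (here 16 x 16 = 256 simulations for just 2 axes).
g_values = [0.5 * i for i in range(1, 17)]
eta_values = [0.25 * i for i in range(1, 17)]
best_grid = max(itertools.product(g_values, eta_values),
                key=lambda p: score(*p))

# Random search: a fixed budget of samples, often competitive in practice
# (Bergstra and Bengio, 2012).
samples = [(random.uniform(0.5, 8.0), random.uniform(0.25, 4.0))
           for _ in range(256)]
best_random = max(samples, key=lambda p: score(*p))

print("grid:", best_grid, "random:", best_random)
```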