2023
DOI: 10.1109/tnnls.2021.3102378
SpaRCe: Improved Learning of Reservoir Computing Systems Through Sparse Representations

Abstract: "Sparse" neural networks, in which relatively few neurons or connections are active, are common in both machine learning and neuroscience. While, in machine learning, "sparsity" is related to a penalty term that leads to some connecting weights becoming small or zero, in biological brains, sparsity is often created when high spiking thresholds prevent neuronal activity. Here, we introduce sparsity into a reservoir computing network via neuron-specific learnable thresholds of activity, allowing neurons with low …
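The mechanism the abstract describes, neuron-specific learnable thresholds that gate reservoir activity, can be illustrated with a minimal NumPy sketch. This is one plausible reading of the abstract, assuming a gate of the form sign(h) * ReLU(|h| - theta) with per-neuron thresholds initialized from a percentile of the activity distribution; the function names, array shapes, and numbers are all hypothetical, not taken from the paper.

```python
import numpy as np

def threshold_gate(activity, thresholds):
    """Neuron-specific threshold gate: a unit's output is zero unless its
    activity magnitude exceeds its own (learnable) threshold; the sign of
    the activity is preserved."""
    return np.sign(activity) * np.maximum(np.abs(activity) - thresholds, 0.0)

# Hypothetical reservoir states: (timesteps, neurons).
rng = np.random.default_rng(0)
states = rng.normal(size=(1000, 50))

# Initialize each neuron's threshold at the 90th percentile of its own
# absolute activity, so roughly 10% of inputs drive it above threshold.
theta = np.percentile(np.abs(states), 90, axis=0)

sparse = threshold_gate(states, theta)
active_fraction = (sparse != 0).mean()  # close to 0.1 by construction
```

In a full system the thresholds would then be trained alongside the readout weights, which is what makes them "learnable" rather than fixed hyperparameters.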

Cited by 21 publications (8 citation statements) | References 59 publications
“…Finally, activity-dependent compensation may provide useful techniques for machine learning. For example, we found that performance of a reservoir computing network could be improved if thresholds of individual neurons are initialized to achieve a particular activity probability given the distribution of input activities ( 65 ).…”
Section: Discussion
confidence: 99%
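The statement above refers to initializing each neuron's threshold so that, given the distribution of input activities, the neuron is active with a chosen probability. A natural way to do this is the inverse-CDF (quantile) approach sketched below; the activity distribution, target probability, and function name are illustrative assumptions, not details from the cited work.

```python
import numpy as np

def init_thresholds(sample_activity, p_active):
    """Per-neuron thresholds chosen so each neuron exceeds its threshold
    with probability ~p_active under the sampled input distribution."""
    # The (1 - p_active) quantile of each neuron's activity: values above
    # it occur with probability p_active by construction.
    return np.quantile(sample_activity, 1.0 - p_active, axis=0)

# Hypothetical per-neuron input-driven activities: (samples, neurons).
rng = np.random.default_rng(1)
acts = rng.lognormal(mean=0.0, sigma=1.0, size=(5000, 100))

theta = init_thresholds(acts, p_active=0.2)
observed = (acts > theta).mean()  # close to the 0.2 target
```

Because the quantile is computed per neuron (axis=0), the same activity probability is achieved even when neurons receive inputs with very different scales.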
“…Adding recurrent connections to a layer in a network can enable reverberation of activity, which can be a form of memory in the network [Manneschi et al., 2019, 2021]. Recurrent pathways can transform the kinds of classification problems the network is able to solve [Cope et al., 2018].…”
Section: Patterns Of Connectivity: Reverberation and Processing Acros...
confidence: 99%
“…Here we employ short training-datasets (225 data points) to mimic real-world applications with strict data-collection time and energy requirements. To avoid overfitting, we employ a sophisticated feature-selection algorithm (each FMR output channel is considered a single computational 'feature' 62,76 ) with 10-fold cross validation (see Methods). Performance is evaluated via the mean squared error (MSE) between the reservoir-prediction and target.…”
Section: /24
confidence: 99%
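The procedure quoted above, treating each output channel as one computational feature and selecting features under 10-fold cross-validation with an MSE criterion, could be sketched as greedy forward selection over channels. The synthetic data, channel count, and the use of plain least squares for the readout are assumptions for illustration; the cited work's actual algorithm may differ.

```python
import numpy as np

def kfold_mse(X, y, k=10):
    """Mean squared error of a least-squares readout under k-fold CV."""
    idx = np.arange(len(y))
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        w, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        errs.append(np.mean((X[fold] @ w - y[fold]) ** 2))
    return float(np.mean(errs))

def forward_select(X, y, max_feats=5, k=10):
    """Greedily add the channel that most reduces CV MSE; stop when no
    remaining channel improves the score (guards against overfitting)."""
    selected, remaining, best_mse = [], list(range(X.shape[1])), np.inf
    while remaining and len(selected) < max_feats:
        scores = {j: kfold_mse(X[:, selected + [j]], y, k) for j in remaining}
        j_best = min(scores, key=scores.get)
        if scores[j_best] >= best_mse:
            break
        best_mse = scores[j_best]
        selected.append(j_best)
        remaining.remove(j_best)
    return selected, best_mse

# Hypothetical reservoir outputs: 225 samples (as in the quote), 12
# channels, of which only channels 3 and 7 carry signal.
rng = np.random.default_rng(2)
X = rng.normal(size=(225, 12))
y = 1.5 * X[:, 3] - 0.8 * X[:, 7] + 0.1 * rng.normal(size=225)

feats, mse = forward_select(X, y)
```

With only 225 samples, scoring candidates by cross-validated rather than training MSE is what lets the selection stop before absorbing noise channels.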