2022
DOI: 10.1007/s10994-022-06266-w

A brain-inspired algorithm for training highly sparse neural networks

Abstract: Sparse neural networks attract increasing interest as they exhibit comparable performance to their dense counterparts while being computationally efficient. Pruning dense neural networks is among the most widely used methods to obtain a sparse neural network. Driven by the high training cost of such methods, which can be unaffordable for a low-resource device, training sparse neural networks sparsely from scratch has recently gained attention. However, existing sparse training algorithms suffer from various …

Cited by 6 publications (4 citation statements) | References 44 publications
“…It considers statistics of the input and hidden node outputs to sparsify a network [23]. Hebbian learning inspired a number of correlation-based pruning algorithms [23], [37], [41], [42]. If two neurons fire simultaneously (high correlation between their firing rates), the strength of their synaptic connection is increased [43].…”
Section: Correlation-Based Pruning (mentioning)
confidence: 99%
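As a rough illustration of the Hebbian principle quoted above (connections between co-active neurons are strengthened), the following minimal NumPy sketch updates a weight matrix in proportion to the average co-activation of pre- and post-synaptic units. The function name, the learning rate eta, and the toy activations are hypothetical and are not taken from the cited paper.

```python
import numpy as np

def hebbian_update(W, pre, post, eta=0.01):
    """Strengthen weights between units that fire together.

    W    : (n_post, n_pre) weight matrix
    pre  : (n_samples, n_pre) pre-synaptic activations
    post : (n_samples, n_post) post-synaptic activations
    eta  : learning rate (illustrative value, an assumption)
    """
    # Average co-activation over the batch is a proxy for the correlation
    # between firing rates: co-active pairs receive larger updates.
    co_activation = post.T @ pre / pre.shape[0]
    return W + eta * co_activation

# Toy usage with random activations.
rng = np.random.default_rng(0)
pre = rng.random((32, 10))     # 32 samples, 10 pre-synaptic units
post = rng.random((32, 5))     # 32 samples, 5 post-synaptic units
W = hebbian_update(rng.normal(size=(5, 10)), pre, post)
```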
“…Then, it removes connections between weakly correlated neurons and keeps those with strong correlations [41]. The same idea is also tested using the cosine similarity metric [42]. Weight matrices are sparsified in sequence, starting from the last layer and moving back through the earlier layers [41].…”
Section: Correlation-Based Pruning (mentioning)
confidence: 99%
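The pruning procedure summarised in these statements (keep connections between strongly correlated neurons, drop the rest, optionally measuring similarity with the cosine metric, and process weight matrices from the last layer backwards) might look roughly like the sketch below. The keep ratio, the quantile-based threshold, and the function names are illustrative assumptions rather than the exact algorithm of the cited paper.

```python
import numpy as np

def cosine_similarity(a, b, eps=1e-8):
    """Cosine similarity between the activation vectors of two neurons."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + eps))

def prune_layer(W, in_acts, out_acts, keep_ratio=0.2):
    """Keep only connections whose neuron pairs are strongly similar.

    W        : (n_out, n_in) weight matrix of one layer
    in_acts  : (n_samples, n_in) activations entering the layer
    out_acts : (n_samples, n_out) activations produced by the layer
    """
    sim = np.empty_like(W)
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            sim[i, j] = cosine_similarity(out_acts[:, i], in_acts[:, j])
    # Connections between weakly similar neurons are removed; only the most
    # similar keep_ratio fraction of connections survives.
    threshold = np.quantile(np.abs(sim), 1.0 - keep_ratio)
    return W * (np.abs(sim) >= threshold)

# Per the statements, layers would be sparsified from the last one backwards,
# e.g. iterating over reversed(layers) and applying prune_layer to each weight
# matrix in turn (illustrative loop, not the cited paper's exact procedure).
```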
“…Feature selection helps avoid overfitting by removing such features (Vlasic et al., 2023). Enhances Interpretability: By selecting meaningful features, the model becomes more interpretable, allowing users to understand how the predictions are made (Atashgahi et al., 2023). Identifies Correlated Features: Feature selection identifies correlated features and removes them, reducing redundancy in the dataset (Das et al., 2022).…”
Section: Feature Selection (mentioning)
confidence: 99%
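As a hedged illustration of the "identifies correlated features and removes them" point in the quoted passage, a common filter-style approach drops one feature from every highly correlated pair. The 0.95 threshold and the pandas-based helper below are generic assumptions, not the procedure of any of the cited works.

```python
import numpy as np
import pandas as pd

def drop_correlated_features(df, threshold=0.95):
    """Drop one feature from each pair whose absolute Pearson correlation
    exceeds `threshold` (illustrative filter-style feature selection)."""
    corr = df.corr().abs()
    # Inspect only the upper triangle so every pair is considered once.
    upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
    to_drop = [col for col in upper.columns if (upper[col] > threshold).any()]
    return df.drop(columns=to_drop)

# Toy usage: f2 is an almost exact copy of f1, so it is removed.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
df = pd.DataFrame({"f1": x,
                   "f2": x + 1e-3 * rng.normal(size=200),
                   "f3": rng.normal(size=200)})
reduced = drop_correlated_features(df)   # keeps f1 and f3
```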