2012
DOI: 10.1016/j.neucom.2011.10.017

Sparse and silent coding in neural circuits

Abstract: Sparse coding algorithms seek a linear basis in which signals can be represented by a small number of active (non-zero) coefficients. Such coding has many applications in science and engineering and is believed to play an important role in neural information processing. However, due to the computational complexity of the task, only approximate solutions provide the required efficiency (in terms of time). As new results show, under particular conditions there exist efficient solutions by minimizing…
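
For readers unfamiliar with the setup, the sketch below illustrates the standard l1-relaxed sparse coding problem with a simple iterative soft-thresholding (ISTA) solver. It is a generic illustration of the problem class, not the algorithm proposed in the paper; all names and parameter values are illustrative.

```python
# Minimal sketch: sparse coding of a signal x in an overcomplete dictionary D
# by solving  min_a 0.5*||x - D a||^2 + lam*||a||_1  with ISTA.
# Generic illustration only; NOT the paper's method.
import numpy as np

def ista(D, x, lam=0.1, n_iter=200):
    """Return a sparse coefficient vector a such that D @ a ~ x."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)           # gradient of the quadratic term
        z = a - grad / L                   # gradient step
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return a

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))         # overcomplete: 256 atoms in R^64
D /= np.linalg.norm(D, axis=0)             # unit-norm atoms
a_true = np.zeros(256)
a_true[rng.choice(256, size=5, replace=False)] = 1.0
x = D @ a_true                             # signal built from 5 active atoms
a_hat = ista(D, x, lam=0.05)
print("non-zeros recovered:", np.sum(np.abs(a_hat) > 1e-3))
```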

Cited by 4 publications (5 citation statements) · References: 67 publications
“…These are commonly used quantitative metrics to evaluate the performance of mitotic cell recognition. λ and γ are again trade-off parameters controlling the balance between reconstruction quality and sparsity [24, 25]. When comparing the performance of different dictionary learning strategies with four visual features and different configurations, λ and γ were set to 0.1 [26, 27] and C was set to 1.…”
Section: Results
Confidence: 99%
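
For context, trade-off weights like λ and γ typically appear in a dictionary-learning objective of the general shape below. This sketch is hypothetical: the citing paper's exact objective is not reproduced in the excerpt, and the assumption that γ weights an l2 (elastic-net style) term is ours.

```python
# Hypothetical sketch of an objective in which trade-off weights lam (λ)
# and gam (γ) appear. The citing paper's exact formulation is not given
# in the excerpt; gam is ASSUMED here to weight an l2 penalty.
import numpy as np

def dictionary_learning_objective(X, D, A, lam=0.1, gam=0.1):
    """Reconstruction error plus sparsity/stability penalties on codes A."""
    recon = 0.5 * np.linalg.norm(X - D @ A, "fro") ** 2   # reconstruction quality
    l1 = lam * np.abs(A).sum()                            # sparsity (λ term)
    l2 = 0.5 * gam * np.linalg.norm(A, "fro") ** 2        # stability (γ term, assumed)
    return recon + l1 + l2
```

Raising λ (and, under this assumption, γ) trades reconstruction fidelity for sparser, more stable codes, which is the balance the excerpt describes.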
“…However, overcompleteness presents a non-trivial challenge for computational models of neural representations. In comparison with biological data, most computational models of SC can find the optimal solution only if overcompleteness is at most 2- to 8-fold [16]. Importantly, a higher level of overcompleteness may increase the overall metabolic cost of neural coding for two reasons.…”
Section: Results
Confidence: 99%
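
To make the term concrete, "overcompleteness" here is the ratio of dictionary atoms to signal dimensions; a brief illustration with arbitrary example sizes:

```python
# "Overcompleteness" = number of dictionary atoms / signal dimension.
# An 8-fold overcomplete dictionary in R^64 has 512 atoms. Sizes are
# arbitrary examples, not values from the cited work.
import numpy as np

dim, fold = 64, 8
D = np.random.default_rng(1).standard_normal((dim, fold * dim))
print("overcompleteness:", D.shape[1] / D.shape[0])   # -> 8.0
```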
“…Its main limitation is its slow convergence rate. The combination, termed the Subspace Cross-Entropy (SCE) method [16], inherits the best of both worlds: it is reasonably fast and can still yield the optimal solution even at a high level of overcompleteness. Since we are interested in the formation of sparse codes at very high levels of overcompleteness, we used SCE in our numerical experiments.…”
Section: Results
Confidence: 99%
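
The excerpt does not describe the SCE algorithm itself. As a rough illustration of the cross-entropy flavor of search it builds on, the sketch below runs a generic cross-entropy search over k-atom supports, solving a least-squares fit within each sampled subspace. All parameter names and defaults are our own; this should not be read as the method of [16].

```python
# Generic cross-entropy (CE) search over sparse supports, to illustrate
# the flavor of CE-based sparse coding; NOT the SCE algorithm of [16].
import numpy as np

def ce_sparse_support(D, x, k=5, n_samples=200, n_elite=20, n_iter=30, alpha=0.7):
    """Search for a k-atom support minimizing the least-squares residual on x."""
    rng = np.random.default_rng(0)
    m = D.shape[1]
    p = np.full(m, k / m)                              # atom inclusion probabilities
    best_support, best_err = None, np.inf
    for _ in range(n_iter):
        scores, supports = [], []
        for _ in range(n_samples):
            s = rng.choice(m, size=k, replace=False, p=p / p.sum())
            a, *_ = np.linalg.lstsq(D[:, s], x, rcond=None)   # fit in the subspace
            err = np.linalg.norm(x - D[:, s] @ a)
            scores.append(err)
            supports.append(s)
            if err < best_err:
                best_err, best_support = err, s
        elite = np.argsort(scores)[:n_elite]           # lowest-error samples
        freq = np.zeros(m)
        for i in elite:
            freq[supports[i]] += 1.0 / n_elite
        p = alpha * freq + (1 - alpha) * p             # smoothed probability update
    return np.sort(best_support), best_err
```

The elite-sample update concentrates probability mass on atoms that recur in the best-scoring supports; this is the generic CE mechanism, independent of whatever refinements SCE adds.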