2022
DOI: 10.1167/jov.22.14.3592
Bio-inspired divisive normalization improves object recognition performance in ANNs

Cited by 4 publications (9 citation statements). References: 0 publications.
“…These results demonstrate how the connectivity of inhibition balanced networks is shaped by their input statistics and explain the experience-dependent formation of extra-classical receptive fields. [70][71][72][73][74] Unlike previous models, [75][76][77][78][79] our networks are composed of excitatory and inhibitory neurons with fully plastic recurrent connectivity.…”
Section: Discussion
Confidence: 99%
“…Surround suppression has been considered to have a number of beneficial roles in neural computation, for example, reducing coding redundancy and yielding more efficient neural codes [12, 14, 19, 21, 22, 24, 25, 27, 60, 61]. Some studies in machine learning noticed the lack of more sophisticated forms of brain-like divisive normalization in generic feedforward CNNs, and tried to integrate them into the network [47–51]. These studies found that incorporating divisive normalization in CNNs improves image classification in some limited cases, such as when the network is more shallow [49], when the dataset requires strong center-surround separation [49], or when the divisive normalization is combined with batch normalization [50].…”
Section: Discussion
Confidence: 99%
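The divisive normalization discussed in the citation above can be illustrated with a minimal NumPy sketch of the canonical form (each unit's response divided by a pooled sum of squared responses from its neighborhood). The pooling scope (across channels at each spatial location) and the parameter names are illustrative assumptions, not the specific layer used in the cited studies.

```python
import numpy as np

def divisive_normalization(x, sigma=1.0, exponent=0.5):
    # x: feature map of shape (channels, height, width).
    # Pool squared activity across channels at each spatial location
    # (an assumed, simple choice of normalization pool).
    pooled = np.sum(x ** 2, axis=0, keepdims=True)
    # Divide each response by the semi-saturated pooled activity.
    return x / (sigma ** 2 + pooled) ** exponent

# Uniform input: every unit is suppressed equally by its pool.
x = np.ones((3, 2, 2))
y = divisive_normalization(x)  # each value becomes 1 / (1 + 3) ** 0.5 = 0.5
```

In a CNN, such a layer would typically sit after a convolution, in place of (or alongside) batch normalization, which is the combination one of the cited studies found beneficial.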