KerCNNs: biologically inspired lateral connections for classification of corrupted images
Preprint, 2019. DOI: 10.48550/arxiv.1910.08336

Abstract: The state of the art in many computer vision tasks is represented by Convolutional Neural Networks (CNNs). Although their hierarchical organization and local feature extraction are inspired by the structure of primate visual systems, the lack of lateral connections in such architectures critically distinguishes their analysis from biological object processing. The idea of enriching CNNs with recurrent lateral connections of convolutional type has been put into practice in recent years, in the form of learned r…
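The truncated abstract describes enriching CNNs with recurrent lateral connections of convolutional type. As a rough illustration of that general idea (not the paper's KerCNN construction, which uses a diffusion kernel; see the sketch further below), here is a minimal PyTorch-style sketch in which a layer's own output is fed back through a lateral convolution for a fixed number of unrolled steps; the class name LateralConvBlock and all hyperparameters are hypothetical.

```python
# Sketch: a convolutional block with recurrent lateral connections,
# unrolled for a fixed number of iterations (assumed design, not the
# authors' implementation).
import torch.nn as nn
import torch.nn.functional as F

class LateralConvBlock(nn.Module):
    def __init__(self, in_ch, out_ch, n_iters=3):
        super().__init__()
        self.feedforward = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        # Lateral (intra-layer) connections: a convolution applied to the
        # block's own activation, i.e. recurrence within a single layer.
        self.lateral = nn.Conv2d(out_ch, out_ch, kernel_size=5, padding=2)
        self.n_iters = n_iters

    def forward(self, x):
        ff = self.feedforward(x)        # feedforward drive, computed once
        h = F.relu(ff)                  # initial response
        for _ in range(self.n_iters):   # unrolled lateral recurrence
            h = F.relu(ff + self.lateral(h))
        return h
```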

Cited by 7 publications (11 citation statements). References 33 publications.
“…The proposed PDE-G-CNNs form a new, unique class of equivariant neural networks, and we show in section 5.4 how regular continuous G-CNNs arise as a special case of our PDE-G-CNNs under a fixed, pre-defined choice of convection parameters. KerCNNs: an approach to introducing horizontal connectivity in CNNs that does not require a Lie group structure was proposed by Montobbio et al. [59,60] in the form of KerCNNs. In this biologically inspired metric model, a diffusion process is used to achieve intra-layer connectivity.…”
Section: Related Work (mentioning)
Confidence: 96%
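The statement above describes KerCNNs as using a diffusion process to achieve intra-layer connectivity in a metric model. Below is a minimal sketch of that general idea, assuming a data-dependent transition kernel built from cosine similarity between feature vectors at different spatial positions; the similarity construction, softmax normalization, and step count are illustrative assumptions, not the exact kernel of Montobbio et al.

```python
# Sketch: diffusion-driven lateral connectivity within one CNN layer,
# in the spirit of KerCNNs (assumed construction, not the paper's).
import torch
import torch.nn.functional as F

def lateral_diffusion(feat, n_steps=2, tau=0.5):
    """feat: (B, C, H, W) feature maps from a convolutional layer."""
    B, C, H, W = feat.shape
    v = F.normalize(feat.flatten(2), dim=1)  # (B, C, N): unit feature vector per position
    K = torch.einsum('bcn,bcm->bnm', v, v)   # (B, N, N) feature-similarity kernel
    K = torch.softmax(K / tau, dim=-1)       # row-stochastic transition kernel
    h = feat.flatten(2)                      # (B, C, N) activity to diffuse
    for _ in range(n_steps):                 # explicit diffusion steps
        h = torch.einsum('bnm,bcm->bcn', K, h)
    return h.reshape(B, C, H, W)
```

The dense (N, N) kernel is only practical for small feature maps; a localized (windowed) kernel would be the natural refinement.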
“…A possible interpretation of the proposed iteration with a kernel defined by the SE(2) group as a neural computation in V1 comes from the modeling of neural connectivity as a kernel operation (Wilson and Cowan, 1972; Ermentrout and Cowan, 1980; Citti and Sarti, 2015; Montobbio et al., 2018), especially when considered in the framework of a neural system that aims to learn group-invariant representations of visual stimuli (Anselmi and Poggio, 2014; Anselmi et al., 2020). A direct comparison of the proposed technique with kernel techniques recently introduced for radically different purposes in Montobbio et al. (2018) and Montobbio et al. (2019) shows, however, two main differences at the level of the kernel that is used: here, we need the dual wavelet to build the projection kernel, and the iteration kernel effectively contains the feature maps. On the other hand, a possible application is the inclusion of a solvability condition such as Equation (14) as an iterative step within learning frameworks such as those of Anselmi et al. (2019) and Anselmi et al. (2020).…”
Section: Discussion (mentioning)
Confidence: 99%
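The statement above traces the kernel iteration back to classical models of neural connectivity as a kernel operation (Wilson and Cowan, 1972; Ermentrout and Cowan, 1980). As a minimal illustration of such a computation (not the SE(2) construction or the dual-wavelet projection discussed in the cited work), here is a Wilson-Cowan-type update on a discretized one-dimensional periodic domain; the kernel, rate constants, and external input are assumptions.

```python
# Sketch: neural connectivity as a kernel operation (Wilson-Cowan-type),
# iterated on a 1-D periodic domain via circular convolution.
import numpy as np

def kernel_iteration(a, w, h, alpha=1.0, dt=0.1, n_steps=100):
    """a: initial activity (N,); w: connectivity kernel (N,); h: external input (N,)."""
    sigmoid = lambda s: 1.0 / (1.0 + np.exp(-s))
    for _ in range(n_steps):
        # circular convolution of the kernel with the current activity
        lateral = np.real(np.fft.ifft(np.fft.fft(w) * np.fft.fft(a)))
        # Euler step of da/dt = -alpha * a + sigma(w * a + h)
        a = a + dt * (-alpha * a + sigmoid(lateral + h))
    return a
```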
“…Moreover, the ideas behind the LNP model have been a main source of inspiration in other disciplines, notably for the design of relevant mathematical and computational theories such as wavelets and convolutional neural networks (Marr, 1980; LeCun et al., 2010). We also point out that the use of groups and invariances to describe the variability of neural activity has proved to be a solid idea for building effective models (Citti and Sarti, 2006; Anselmi and Poggio, 2014; Petitot, 2017), whose influence on the design of artificial learning architectures is still very strong (Anselmi et al., 2019, 2020; Montobbio et al., 2019; Lafarge et al., 2021).…”
Section: Introduction (mentioning)
Confidence: 87%