2023
DOI: 10.1016/j.acha.2022.08.006
Understanding neural networks with reproducing kernel Banach spaces

Cited by 14 publications (16 citation statements)
References 44 publications
“…It would then be of practical interest to study the fluctuation and concentration properties of such zero-mean noise term with respect to the size of the neural network. Other approximation methods for the MZ memory and noise terms can be built on the backbone of continuous-time approximation methods, e.g., based on functional analysis [53,54,29], combinatorics [55], data-driven methods [3,40,30,33], Markovian embedding techniques [21,18,32,25,7], or projections based on reproducing kernel Hilbert or Banach spaces [1,39,52].…”
Section: Discussion
confidence: 99%
“…Other cases where M is not square can be handled similarly [38,22]. By following the same procedure, it is possible also to derive the transition density for the general neural network model (1).…”
Section: Multi-layer Conditional Transition Density
confidence: 99%
“…With the corresponding norms, they are able to prove bounds on the generalisation error. On the other hand, Bartolucci et al. [2023] used reproducing kernel Banach spaces to understand neural networks and prove a general reproducing theorem for these spaces. Nowak [2020, 2021a,b] analysed the neural network using ridge splines, where the norm can be represented in terms of an inverse Radon transform of a bounded real measure.…”
Section: Neural Network
confidence: 99%
“…Nowak [2020, 2021a,b] analysed the neural network using ridge splines, where the norm can be represented in terms of an inverse Radon transform of a bounded real measure. Bartolucci et al. [2023] showed that this method is a constructive realisation of a reproducing kernel Banach space for certain activation functions like the ReLU.…”
Section: Neural Network
confidence: 99%
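
A minimal sketch of the construction these citation statements refer to, in notation assumed here rather than drawn verbatim from the cited papers: a shallow network is written as an integral over a bounded real measure \mu on weight-bias space, and the Banach norm is the smallest total variation over all measures representing the same function,

\[
  f_\mu(x) = \int_{\mathbb{S}^{d-1}\times\mathbb{R}} \sigma(\langle w, x\rangle - b)\, d\mu(w,b),
  \qquad
  \|f\|_{\mathcal{B}} = \inf_{\mu \,:\, f_\mu = f} \|\mu\|_{\mathcal{M}},
\]

where \sigma is an activation such as the ReLU and \|\cdot\|_{\mathcal{M}} denotes the total variation norm of the measure. For the ReLU this norm can be re-expressed in the Radon domain, which is the link to the inverse Radon transform of a bounded real measure mentioned above; the associated representer theorems then guarantee that regularised empirical risk minimisation over such a space admits finite-width network solutions.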