2021
DOI: 10.1016/j.neunet.2021.04.036
Towards a mathematical framework to inform neural network modelling via polynomial regression

Cited by 23 publications (13 citation statements)
References 24 publications
“…The initial step (Section 3.1) of NN2Poly involves considering several single-hidden-layer NNs in a regression setting, i.e., with linear output, a situation solved in Morala et al. (2021). Each of these NNs is a subnet whose output unit is one of the units in the second hidden layer of the original NN, and therefore a polynomial is obtained at each unit.…”
Section: NN2Poly: Theoretical Discussion (mentioning)
confidence: 99%
“…1). This situation is solved in Morala et al. (2021), and the solution obtained is a PR up to degree Q₁ = q₁, where q₁ is the degree at which the Taylor expansion is truncated when approximating the activation function at the first layer (l = 1). Therefore, the polynomial regression is:…”
Section: Single Hidden Layer Regression Case (mentioning)
confidence: 99%
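The excerpt above describes the core idea: once the first-layer activation is replaced by its degree-q₁ Taylor expansion, a single-hidden-layer network with linear output collapses into a polynomial regression of degree Q₁ = q₁ in the inputs. A minimal sketch of that idea follows, assuming a tanh activation, a scalar input, q₁ = 3, and small random weights — all illustrative choices, not the analytic coefficient formulas derived in Morala et al. (2021):

```python
import numpy as np
from numpy.polynomial import Polynomial

q1 = 3  # hypothetical truncation degree for the Taylor expansion

# Degree-3 Taylor expansion of tanh around 0: tanh(z) ≈ z - z^3/3
taylor_tanh = Polynomial([0.0, 1.0, 0.0, -1.0 / 3.0])

rng = np.random.default_rng(0)
h = 4                          # hidden units (illustrative size)
v = rng.normal(size=h)         # input-to-hidden weights (scalar input x)
b = 0.1 * rng.normal(size=h)   # small biases keep the expansion accurate
w = rng.normal(size=h)         # hidden-to-output weights (linear output)

# Substitute the affine map v_j*x + b_j into the truncated activation and
# sum over units: the network collapses to a polynomial of degree q1 in x.
poly = Polynomial([0.0])
for vj, bj, wj in zip(v, b, w):
    poly = poly + wj * taylor_tanh(Polynomial([bj, vj]))

x = 0.1
nn_out = float(np.sum(w * np.tanh(v * x + b)))  # exact network output
pr_out = float(poly(x))                          # polynomial-regression output
print(poly.degree(), abs(nn_out - pr_out))
```

Composing `taylor_tanh` with the affine polynomial `v_j·x + b_j` (which `numpy.polynomial.Polynomial` performs when called on another polynomial) makes the degree bound explicit: each unit contributes a degree-q₁ polynomial, so the sum is again a PR of degree at most q₁.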