2020
DOI: 10.1007/s10208-020-09461-0
Topological Properties of the Set of Functions Generated by Neural Networks of Fixed Size

Abstract: We analyze the topological properties of the set of functions that can be implemented by neural networks of a fixed size. Surprisingly, this set has many undesirable properties. It is highly non-convex, except possibly for a few exotic activation functions. Moreover, the set is not closed with respect to $$L^p$$-norms, $$0 < p < \infty$$ …
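The non-closedness stated in the abstract has a concrete consequence: a function can lie in the $$L^p$$-closure of the set of fixed-size realizations without being realizable itself, so any sequence of networks approximating it must have parameters that blow up. Below is a minimal numerical sketch of this standard phenomenon (our own illustration, not code from the paper): two ReLU neurons approximate a step function in $$L^2([-1,1])$$, with the error decaying like $$1/\sqrt{3n}$$ while the largest weight grows like $$n$$.

```python
# Minimal sketch (illustration only, not code from the paper): a sequence of
# fixed-size ReLU networks converging in L^2 to a discontinuous step function,
# which no fixed-size (continuous) ReLU network can realize exactly.
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def f_n(x, n):
    # Two-neuron ReLU network: n * ReLU(x + 1/n) - n * ReLU(x).
    # Equals 1 for x >= 0, 0 for x <= -1/n, and a linear ramp in between.
    return n * relu(x + 1.0 / n) - n * relu(x)

# Limit function: the (discontinuous) step, outside the realization set.
step = lambda x: (x >= 0).astype(float)

x = np.linspace(-1.0, 1.0, 200_001)
for n in [10, 100, 1000]:
    err = np.sqrt(np.trapz((f_n(x, n) - step(x)) ** 2, x))  # L^2([-1,1]) error
    print(f"n={n:5d}  L2 error ~ {err:.4f}  (theory: {np.sqrt(1/(3*n)):.4f})  max weight = {n}")
```

The error tends to zero, yet the weight $$n$$ diverges: the approximating parameters leave every bounded set, which is exactly the pathology a non-closed realization set permits.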

Cited by 56 publications (58 citation statements)
References 36 publications
“…MAP means a polynomial model fit was carried out first, DATA means that the ISF was directly fitted to the data, O(α) indicates that order α polynomials were used … deep neural networks [5,12], or other kinds of nonlinear approximation methods [9], that allow one to represent high-dimensional functions with reasonable efficiency, as opposed to polynomials. The challenge with nonlinear approximations, in particular with neural networks, is that they can be difficult to fit to data, because the distance between parameters that provide small improvements in accuracy can be large and therefore not easy to find [21]. Nevertheless, deep neural networks have enabled great advances in many fields of engineering and therefore this approach will be explored elsewhere.…”
Section: Discussion (mentioning)
confidence: 99%
“…We can accurately identify the dynamics on an ISF and determine its instantaneous damping ratio (22) and angular frequency (21). It is, however, not possible to attach a unique amplitude to a leaf within a foliation.…”
Section: The Backbone and Damping Curves of an ISF (mentioning)
confidence: 99%
“…We would like to emphasize that the finite-dimensional problem (5) is not any easier than the infinite-dimensional problem (4); they simply require different techniques. In particular, our results do not follow from the results in [3,7,14] for infinite-dimensional spaces; we will have more to say about this in Section 2.…”
Section: Introduction (mentioning)
confidence: 58%
“…The construction of Example 4 parallels the formulation given in [46,47]. However, in [47] elements of F are referred to as neural networks, and functions in NN(F, ·) are called their realizations.…”
Section: Remark (mentioning)
confidence: 95%
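To make the terminological distinction in that last remark concrete: a "neural network" in this usage is the parameter tuple (the weight matrices and biases), while its "realization" is the function obtained by composing the corresponding affine maps with the activation. A minimal sketch of this distinction, with hypothetical names (`Network`, `realize`) that are not notation from the cited papers:

```python
# Minimal sketch: a "network" as a parameter tuple vs. its "realization"
# as a function. Names (Network, realize) are illustrative only.
from dataclasses import dataclass
from typing import Callable, List, Tuple
import numpy as np

@dataclass
class Network:
    # The network itself: just a list of (weight matrix, bias vector) pairs.
    layers: List[Tuple[np.ndarray, np.ndarray]]

def realize(net: Network, activation: Callable[[np.ndarray], np.ndarray]) -> Callable:
    """Map a network (parameters) to its realization (a function)."""
    def f(x: np.ndarray) -> np.ndarray:
        for W, b in net.layers[:-1]:
            x = activation(W @ x + b)
        W, b = net.layers[-1]
        return W @ x + b  # no activation on the output layer
    return f

# The realization map is not injective: distinct parameter tuples can
# compute the same function (both networks below realize x -> ReLU(x)).
relu = lambda x: np.maximum(x, 0.0)
net1 = Network([(np.array([[1.0]]), np.array([0.0])), (np.array([[1.0]]), np.array([0.0]))])
net2 = Network([(np.array([[2.0]]), np.array([0.0])), (np.array([[0.5]]), np.array([0.0]))])
x = np.array([3.0])
assert np.allclose(realize(net1, relu)(x), realize(net2, relu)(x))
```

The non-injectivity of the realization map is one reason parameter-space and function-space questions about networks of fixed size behave so differently.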