2021
DOI: 10.1002/sam.11531
Fourier neural networks as function approximators and differential equation solvers

Abstract: We present a Fourier neural network (FNN) that can be mapped directly to the Fourier decomposition. The choice of activation and loss function yields results that closely replicate a Fourier series expansion while preserving a straightforward architecture with a single hidden layer. The simplicity of this network architecture facilitates integration with other, higher-complexity networks at a data pre- or postprocessing stage. We validate this FNN on naturally periodic smooth functions and on piecewise …
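The abstract describes a network that maps directly to a Fourier decomposition: one hidden layer of sinusoidal units whose linear readout plays the role of the series coefficients. Below is a minimal sketch of that idea in NumPy, assuming fixed integer-multiple frequencies and a least-squares fit for the output weights; the paper's actual activation and loss choices are not visible in the truncated abstract, so function names and the fitting procedure here are illustrative assumptions, not the authors' method.

```python
import numpy as np

# Sketch: a single hidden layer of cos/sin units at fixed harmonics of a
# base frequency, followed by a linear output layer. The trained network
# is then literally a truncated Fourier series. (Least-squares fitting of
# the output weights is an assumption of this sketch.)

def fourier_features(x, n_harmonics, period):
    omega = 2 * np.pi / period
    k = np.arange(n_harmonics + 1)                     # harmonics 0..n
    return np.concatenate(
        [np.cos(np.outer(x, k * omega)),               # cos(0)=1 supplies the bias
         np.sin(np.outer(x, k[1:] * omega))], axis=1)

def fit_fnn(x, y, n_harmonics=10, period=2 * np.pi):
    H = fourier_features(x, n_harmonics, period)
    coeffs, *_ = np.linalg.lstsq(H, y, rcond=None)
    return coeffs

# Usage: approximate a periodic square wave (a piecewise-smooth target).
x = np.linspace(0, 2 * np.pi, 500, endpoint=False)
y = np.sign(np.sin(x))
c = fit_fnn(x, y)
y_hat = fourier_features(x, 10, 2 * np.pi) @ c
print(round(float(np.mean((y_hat - y) ** 2)), 4))      # small MSE; Gibbs ringing at jumps
```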

Cited by 13 publications (5 citation statements) · References 16 publications (31 reference statements)
“…Ref. [22] proposed trigonometric activation functions, which by coincidence are very similar to the example that will be presented below. The design and choice of these activations are, however, seldom guided by principles and methods, but rather by trial and error.…”
Section: Neural Network
confidence: 81%
“…Periodic activation functions have seen limited adoption in machine learning outside of specific areas such as Fourier Neural Networks [20] and Implicit Neural Representations [21]. This is attributable primarily to the training difficulties introduced by the non-convex loss landscapes of neurons with periodic activations [22].…”
Section: Periodic Activation Functions
confidence: 99%
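The non-convexity that [22] blames for the training difficulties shows up even for a single periodic neuron. The toy sketch below (my own illustration, not taken from the cited works) scans the squared loss of y = sin(w·x) over the frequency w and counts the local minima it produces:

```python
import numpy as np

# Fit even one neuron y = sin(w * x) to data generated with w_true = 5 and
# the loss surface in w already has many local minima, unlike a convex
# problem with a single basin. (Purely a didactic sketch.)
x = np.linspace(0, 2 * np.pi, 200)
y = np.sin(5.0 * x)
ws = np.linspace(0.0, 10.0, 1000)
loss = np.array([np.mean((np.sin(w * x) - y) ** 2) for w in ws])

# Count interior local minima of the 1-D loss curve.
minima = np.sum((loss[1:-1] < loss[:-2]) & (loss[1:-1] < loss[2:]))
print(f"local minima along w: {minima}")   # several, not one
```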
“…We here capitalize on the straightforward formalization offered by SysId methods. The AI alternatives form a pioneering field of research that has recently garnered attention [34]: it leverages the algebraic similarities between the structure of conventional time-regression methods and the convolution operations at the basis of convolutional neural networks. It remains unclear, though, whether such networks can be distilled for deployment on resource-constrained devices.…”
Section: Model Parameter Estimation
confidence: 99%
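The "algebraic similarities" this statement alludes to can be made concrete: a finite-impulse-response (FIR) time-regression model is exactly a causal 1-D convolution, the core operation of a convolutional layer. The FIR model class and the variable names in the NumPy check below are my assumptions for illustration; the cited work's exact model class is not specified in the statement.

```python
import numpy as np

# A FIR model y[t] = sum_k b[k] * u[t-k] written two ways: as a linear
# regression on lagged inputs (the SysId view) and as a causal 1-D
# convolution (the CNN view). Both produce identical outputs.
rng = np.random.default_rng(0)
u = rng.standard_normal(100)           # input signal
b = np.array([0.5, -0.3, 0.2])         # FIR coefficients / conv kernel

# System-identification view: regression on lagged copies of the input.
cols = []
for k in range(len(b)):
    col = np.roll(u, k)
    col[:k] = 0.0                      # discard wrapped-around samples u[t-k], t < k
    cols.append(col)
y_regression = np.column_stack(cols) @ b

# CNN view: the same computation as a causal convolution.
y_conv = np.convolve(u, b)[:len(u)]
print(np.allclose(y_regression, y_conv))   # True
```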