2020
DOI: 10.48550/arxiv.2012.09940
Preprint
Reduced Order Modeling using Shallow ReLU Networks with Grassmann Layers

Abstract: This paper presents a nonlinear model reduction method for systems of equations using a structured neural network. The neural network takes the form of a "three-layer" network with the first layer constrained to lie on the Grassmann manifold and the first activation function set to identity, while the remaining network is a standard two-layer ReLU neural network. The Grassmann layer determines the reduced basis for the input space, while the remaining layers approximate the nonlinear input-output system. The t…
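The architecture described in the abstract can be sketched directly: a point on the Grassmann manifold Gr(k, n) is represented by an n × k matrix with orthonormal columns, so the first layer (with identity activation) is just a linear projection onto a k-dimensional subspace, followed by an ordinary two-layer ReLU network. The sketch below is a minimal illustration under assumed dimensions; the variable names and sizes are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): full input dim n,
# reduced dim k, hidden width h, output dim m.
n, k, h, m = 20, 3, 16, 1

# Grassmann layer: an n x k matrix with orthonormal columns represents
# a point on Gr(k, n). QR of a random matrix gives such a basis.
A = rng.standard_normal((n, k))
Q, _ = np.linalg.qr(A)  # Q has orthonormal columns: Q.T @ Q = I_k

# Remaining standard two-layer ReLU network.
W1 = rng.standard_normal((h, k))
b1 = rng.standard_normal(h)
W2 = rng.standard_normal((m, h))
b2 = rng.standard_normal(m)

def forward(x):
    z = Q.T @ x                        # Grassmann layer, identity activation
    a = np.maximum(W1 @ z + b1, 0.0)   # ReLU hidden layer
    return W2 @ a + b2                 # linear output layer

y = forward(rng.standard_normal(n))
print(y.shape)  # (1,)
```

In training, the projection matrix would be optimized over the Grassmann manifold (e.g. via Riemannian gradient steps that preserve orthonormality) rather than held fixed as in this sketch.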

Cited by 1 publication (1 citation statement)
References 19 publications
“…A recent topic of interest in scientific machine learning is the deployment of neural networks as parametric surrogates [4,6,13,20,22,24,31,32,38]. Of particular relevance to this work are projection based parametric neural surrogates which seek to parametrize high dimensional maps by use of linear and nonlinear dimension reduction strategies [4,6,13,31,38].…”
Section: Relevant Work
Mentioning confidence: 99%