2020
DOI: 10.1137/19m128243x
Parameterization Method for Unstable Manifolds of Standing Waves on the Line

Cited by 6 publications (7 citation statements)
References 74 publications
“…The potential of this feature is shown in many preceding works (e.g. [3,8,30,50,59]) for obtaining global feature of dynamical systems. We then extend the locally validated stable manifolds through the time integration of the time-reversal desingularized vector fields.…”
Section: Basic Methodology
confidence: 91%
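The strategy quoted above — growing a locally validated stable manifold by time integration of the time-reversed vector field — can be illustrated on a toy ODE. Everything below (the planar saddle system, the step counts, the function names) is an illustrative assumption, not the cited paper's desingularized PDE setting:

```python
# Sketch: extend a local stable manifold by integrating the
# TIME-REVERSED vector field. Toy planar saddle (assumed example):
#   x' = -x  (stable direction: the x-axis is the stable manifold)
#   y' = 2y  (unstable direction)
def rk4_step(f, x, h):
    """One classical Runge-Kutta 4 step for x' = f(x), x a list."""
    k1 = f(x)
    k2 = f([xi + 0.5 * h * ki for xi, ki in zip(x, k1)])
    k3 = f([xi + 0.5 * h * ki for xi, ki in zip(x, k2)])
    k4 = f([xi + h * ki for xi, ki in zip(x, k3)])
    return [xi + (h / 6.0) * (a + 2 * b + 2 * c + d)
            for xi, a, b, c, d in zip(x, k1, k2, k3, k4)]

def f(x):
    """Forward field of the toy saddle."""
    return [-x[0], 2.0 * x[1]]

def f_reversed(x):
    """Time-reversed field: trajectories on the stable manifold now
    flow AWAY from the equilibrium, tracing out more of the manifold."""
    return [-v for v in f(x)]

def extend(x0, T=1.0, steps=1000):
    """Flow a point on the local stable manifold for time T under the
    reversed field, extending the validated local piece outward."""
    h = T / steps
    x = list(x0)
    for _ in range(steps):
        x = rk4_step(f_reversed, x, h)
    return x

# A point on the local stable manifold (the x-axis) flows outward
# along that same manifold: x(T) = 0.1 * e^T, y stays 0.
pt = extend([0.1, 0.0])
```

The reversal trick works because the stable manifold of the forward field is invariant under the flow in both time directions, but under the reversed field the equilibrium repels along it, so short integrations of boundary points of the validated local chart sweep out a larger portion of the global manifold.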
“…The sparsity challenges can be handled by incorporating generative adversarial networks (GANs) [123]. The method of manufactured learning can leverage both mathematical functions (e.g., Chebyshev, Dickens, wavelet families) or physical sets (e.g., proper orthogonal decomposition modes [45,193,281], Lagrangian coherent structures [133,144,301], exact coherent structures [39,92,314,346], terrain induced vortices [88,284,329], traveling waves [33,132,155], periodic or relative periodic orbits [72]) to train 𝒩 without requiring an expensive “truth”-labeled data. It is also pivotal when we construct self-evolving surrogate models on locally embedded structures.…”
Section: Hybrid Analysis and Modeling
confidence: 99%
“…The sparsity challenges can be handled by incorporating generative adversarial networks (GANs) [218]. The method of manufactured learning can leverage both mathematical functions (e.g., Chebyshev, Dickens, wavelet families) or physical sets (e.g., proper orthogonal decomposition modes [96,219,220], Lagrangian coherent structures [221–223], exact coherent structures [224–227], terrain induced vortices [228–230], traveling waves [231–233], periodic or relative periodic orbits [234]) to train 𝒩 without requiring an expensive “truth”-labeled data. It is also pivotal when we construct self-evolving surrogate models on locally embedded structures.…”
Section: Figure
confidence: 99%
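The "manufactured learning" idea in the statements above — synthesizing labeled training pairs from known function families instead of expensive ground-truth data — can be sketched with Chebyshev polynomials, one of the mathematical families the statements list. The dataset layout, degree choices, and function names below are illustrative assumptions, not the cited method's actual pipeline:

```python
# Hedged sketch of manufactured training data: sample known Chebyshev
# polynomials T_n on [-1, 1] to get (input, label) pairs for free.
import math

def chebyshev(n, x):
    """Chebyshev polynomial of the first kind: T_n(x) = cos(n*arccos(x))."""
    return math.cos(n * math.acos(x))

def manufactured_dataset(degrees=(0, 1, 2, 3), num_points=101):
    """Return (x, [T_n(x) for n in degrees]) pairs on a uniform grid.

    Each pair is a synthetic labeled sample: the labels come from an
    exactly known function family, so no "truth" data must be measured.
    """
    xs = [-1.0 + 2.0 * i / (num_points - 1) for i in range(num_points)]
    return [(x, [chebyshev(n, x) for n in degrees]) for x in xs]

data = manufactured_dataset()  # 101 samples, 4 labels each
```

A surrogate network would then be fit to these pairs; because the target functions are analytic and cheap to evaluate, the training set can be made arbitrarily dense before any real (sparse, expensive) data enters.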