2022
DOI: 10.1103/physreve.105.014217
Constructing periodic orbits of high-dimensional chaotic systems by an adjoint-based variational method

Abstract: Chaotic dynamics in systems ranging from low-dimensional nonlinear differential equations to high-dimensional spatio-temporal systems including fluid turbulence is supported by non-chaotic, exactly recurring time-periodic solutions of the governing equations. These unstable periodic orbits capture key features of the turbulent dynamics and sufficiently large sets of orbits promise a framework to predict the statistics of the chaotic flow. Computing periodic orbits for high-dimensional spatio-temporally chaotic…
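To make the variational idea in the abstract concrete, the following is a minimal, heavily simplified sketch of an adjoint-type descent toward a periodic orbit, written for the three-dimensional Lorenz system rather than a high-dimensional spatio-temporal one. The loop parametrisation, the Lorenz test case, the initial guess, the step size and the iteration count are all assumptions made for illustration; this is not the paper's implementation.

```python
# Minimal sketch (not the paper's code): adjoint-based descent toward a
# periodic orbit of the Lorenz system.  A closed loop u(s), s in [0, 2*pi),
# and its period T are evolved in fictitious time so as to reduce the
# residual of the governing equations; all values below are illustrative.
import numpy as np

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

def f(u):
    """Lorenz right-hand side for an array of loop points, shape (N, 3)."""
    x, y, z = u[:, 0], u[:, 1], u[:, 2]
    return np.stack([SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z], axis=1)

def df_transpose_times(u, r):
    """(Df(u))^T r at every loop point, needed for the adjoint gradient."""
    x, y, z = u[:, 0], u[:, 1], u[:, 2]
    rx, ry, rz = r[:, 0], r[:, 1], r[:, 2]
    return np.stack([-SIGMA * rx + (RHO - z) * ry + y * rz,
                      SIGMA * rx - ry + x * rz,
                     -x * ry - BETA * rz], axis=1)

def ds(u):
    """Spectral derivative d/ds along the periodic loop."""
    N = u.shape[0]
    k = np.fft.fftfreq(N, d=1.0 / N)          # integer wavenumbers
    return np.real(np.fft.ifft(1j * k[:, None] * np.fft.fft(u, axis=0), axis=0))

def residual(u, T):
    return (2 * np.pi / T) * ds(u) - f(u)     # r = (2*pi/T) u_s - f(u)

# Initial guess: an arbitrary closed curve near the attractor (illustrative).
N = 256
s = np.linspace(0.0, 2 * np.pi, N, endpoint=False)
u = np.stack([10 * np.cos(s), 10 * np.sin(s), 25 + 5 * np.cos(2 * s)], axis=1)
T = 1.5

step = 1e-5                                   # fixed fictitious-time step; may need tuning
for it in range(50001):
    r = residual(u, T)
    J = 0.5 * np.mean(np.sum(r**2, axis=1)) * 2 * np.pi
    # Functional derivatives of J = 0.5 * int |r|^2 ds with respect to the
    # loop and the period (integration by parts over the periodic loop).
    grad_u = -(2 * np.pi / T) * ds(r) - df_transpose_times(u, r)
    grad_T = -np.mean(np.sum(r * (2 * np.pi / T**2) * ds(u), axis=1)) * 2 * np.pi
    u -= step * grad_u
    T -= step * grad_T
    if it % 10000 == 0:
        print(f"iteration {it:6d}   J = {J:.3e}   T = {T:.4f}")
```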

Cited by 11 publications (12 citation statements)
References 50 publications
“…Due to the line search requiring more evaluations, each SV iteration is on average around 40 % slower than the two PV methods, which leads to fewer iterations in 72 h. More importantly, SV shows a significantly slower convergence rate. All three methods show an initial fast improvement of the objective function, followed by a much slower period, consistent with Azimi et al. (2022); the convergence rate in this latter region is much faster with the PV methods, as much smaller steps were required to be taken for SV. The precise reason for this is unclear, and we make no claim that our conjugate-gradient algorithm uses optimal parameters, but this result was robust after significant trial-and-error tweaking of them.…”
Section: Comparison of Methods (supporting)
confidence: 53%
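The cost difference noted in the quote above comes from the line search itself: a line-searched update must evaluate the objective several times per iteration, whereas a fixed-step update needs only one. The sketch below is a generic Armijo backtracking line search applied to a toy quadratic objective, included purely to illustrate that bookkeeping; it is not the conjugate-gradient implementation the quoted authors used, and all names and parameter values are illustrative.

```python
# Illustrative only: an Armijo backtracking line search of the kind that makes
# each line-searched iteration cost several extra evaluations of the objective
# J, compared with a single evaluation for a fixed-step update.
import numpy as np

def armijo_step(J, grad_J, x, direction, alpha0=1.0, c=1e-4, shrink=0.5):
    """Return (step, n_evals): a step satisfying the Armijo sufficient-decrease condition."""
    J0 = J(x)
    slope = np.dot(grad_J(x), direction)       # directional derivative (negative for descent)
    alpha, n_evals = alpha0, 1                 # J0 already cost one evaluation
    while J(x + alpha * direction) > J0 + c * alpha * slope:
        n_evals += 1                           # rejected trial: one more J evaluation
        alpha *= shrink
    n_evals += 1                               # the accepted trial also cost one
    return alpha, n_evals

# Toy quadratic objective to exercise the routine (a stand-in, not the cited solvers).
A = np.diag([1.0, 10.0, 100.0])
J = lambda x: 0.5 * x @ A @ x
grad_J = lambda x: A @ x

x = np.array([1.0, 1.0, 1.0])
d = -grad_J(x)                                 # steepest-descent direction
alpha, n_evals = armijo_step(J, grad_J, x, d)
print(f"accepted step {alpha:.3g} after {n_evals} objective evaluations")
```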
“…The simplest method is gradient descent with a fixed step size at each iteration. Previous authors (Farazmand 2016; Azimi et al. 2022) have considered a dynamical system…”
Section: Optimisation Methods for Minimizing the Objective Functionals (mentioning)
confidence: 99%
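The truncated sentence above refers to descending the objective by evolving a dynamical system; a common reading is a gradient flow in fictitious time, dx/dτ = −∇J(x), whose forward-Euler discretisation with a fixed step Δτ is precisely gradient descent with a fixed step size. The snippet below illustrates only that generic equivalence on a stand-in objective; it does not reproduce the objective functional or descent dynamics of the cited works.

```python
import numpy as np

# Stand-in objective J(x) = 0.5 * ||x||^2 and its gradient; the cited works
# descend a residual functional of the governing equations instead.
def grad_J(x):
    return x

x = np.array([2.0, -1.0, 0.5])   # arbitrary initial guess
dtau = 0.1                       # fixed fictitious-time step (illustrative)

# Forward-Euler integration of dx/dtau = -grad J(x): each Euler step is
# exactly one fixed-step gradient-descent iteration.
for n in range(100):
    x = x - dtau * grad_J(x)

print("final state:", x, "  J =", 0.5 * float(np.dot(x, x)))
```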