2013
DOI: 10.48550/arxiv.1309.0260
Preprint

Learning from the past, predicting the statistics for the future, learning an evolving system

Abstract: Rough path theory is a mathematical toolbox providing for the deterministic modelling of interactions between highly oscillatory systems (rough paths). The theory is rich enough to capture and extend classical Itô stochastic calculus but has far wider significance. Fundamental to this approach is the realisation that the evolving state of the system is best described or measured over short time intervals by considering the realised effect of the system on certain controlled systems (the measurement instruments…

Cited by 35 publications (70 citation statements)
References 26 publications
“…In order to associate a signature to this set of points, some interpolation is required. Among several approaches proposed in the literature, we present here the piecewise linear and the rectilinear interpolations used by Levin, Lyons and Ni [11].…”
Section: Clustering Paths Via Signatures (mentioning, confidence: 99%)
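The excerpt above contrasts two ways of turning a discrete set of observed points into a continuous path before a signature can be computed. As a rough illustration only (the function names, sampling density, and toy data below are mine, not taken from Levin, Lyons and Ni), piecewise linear interpolation joins consecutive observations by straight lines, while rectilinear interpolation moves along one coordinate axis at a time:

```python
import numpy as np

def piecewise_linear(points, steps_per_segment=10):
    """Join consecutive observations by straight lines, sampled finely."""
    points = np.asarray(points, dtype=float)
    pieces = []
    for a, b in zip(points[:-1], points[1:]):
        t = np.linspace(0.0, 1.0, steps_per_segment, endpoint=False)[:, None]
        pieces.append((1.0 - t) * a + t * b)
    pieces.append(points[-1:])               # finish at the last observation
    return np.vstack(pieces)

def rectilinear(points):
    """Between consecutive observations, change one coordinate at a time,
    producing an axis-parallel 'staircase' through the same points."""
    points = np.asarray(points, dtype=float)
    path = [points[0]]
    for a, b in zip(points[:-1], points[1:]):
        current = a.copy()
        for k in range(points.shape[1]):     # update coordinates one by one
            current = current.copy()
            current[k] = b[k]
            path.append(current)
    return np.vstack(path)

pts = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 1.0]])   # toy data stream
print(piecewise_linear(pts).shape)   # (21, 2): 2 segments x 10 samples + endpoint
print(rectilinear(pts))              # 5 points tracing a staircase through pts
```

Both functions return an (m, d) array of path points that can then be handed to a signature routine such as the sketch further below.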
“…An essential ingredient towards our construction is the signature of a continuous-time process, which we briefly present here. We refer to Chevyrev and Kormilitzin (2016) for a gentle introduction and to Lyons et al (2007); Levin et al (2013) for details.…”
Section: The Signature (mentioning, confidence: 99%)
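For orientation, the signature referred to in these excerpts is the sequence of iterated integrals of the path, graded by level. The following pure-NumPy sketch is my own illustration, not code from any of the cited papers: for a single linear segment the level-k term is the k-fold tensor power of the increment divided by k!, and segments are combined with Chen's identity. In practice dedicated packages such as iisignature or esig are typically used instead of hand-rolled code like this.

```python
import numpy as np

def segment_signature(increment, depth):
    """Truncated signature of one linear segment:
    level k is the k-fold outer product of the increment divided by k!."""
    sig = [np.ones(())]                      # level 0 is the scalar 1
    term = np.ones(())
    for k in range(1, depth + 1):
        term = np.multiply.outer(term, increment) / k
        sig.append(term)
    return sig

def chen_product(sig1, sig2, depth):
    """Signature of a concatenated path via Chen's identity:
    level k of the product is the sum over i of sig1[i] (outer) sig2[k-i]."""
    return [sum(np.multiply.outer(sig1[i], sig2[k - i]) for i in range(k + 1))
            for k in range(depth + 1)]

def truncated_signature(path, depth):
    """Truncated signature (levels 0..depth) of a piecewise linear path,
    given as an (n, d) array of points."""
    path = np.asarray(path, dtype=float)
    sig = segment_signature(path[1] - path[0], depth)
    for a, b in zip(path[1:-1], path[2:]):
        sig = chen_product(sig, segment_signature(b - a, depth), depth)
    return sig

pts = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 1.0]])
sig = truncated_signature(pts, depth=2)
print(sig[1])   # level 1: the total increment of the path, here [3., 1.]
print(sig[2])   # level 2: a 2x2 array of second iterated integrals
```

Level 1 of the result is simply the total increment of the path; higher levels are higher-order iterated integrals and carry information that increments alone miss.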
“…Although this definition is technical, the signature should simply be thought of as a feature map that embeds a bounded variation process into an infinite-dimensional tensor space. The signature has several good properties that make it a relevant tool for machine learning (e.g., Levin et al, 2013; Chevyrev and Kormilitzin, 2016; Fermanian, 2021). In particular, under certain assumptions, S(X) characterizes X up to translations and reparameterizations, and has good approximation properties.…”
Section: The Signature (mentioning, confidence: 99%)
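The invariance claims in this last excerpt are easy to check numerically. The snippet below continues my illustrative sketch above and reuses its truncated_signature helper (again, not an implementation from the cited papers): translating the path, or re-sampling the same trace at a different speed by inserting a collinear midpoint, leaves the truncated signature unchanged.

```python
import numpy as np
# Reuses truncated_signature() from the sketch above.

def flat(sig):
    """Stack all levels above level 0 into a single feature vector."""
    return np.concatenate([level.ravel() for level in sig[1:]])

pts     = np.array([[0.0, 0.0], [1.0, 2.0], [3.0, 1.0]])
shifted = pts + np.array([5.0, -3.0])        # translated copy of the path
# Same trace sampled differently: a collinear midpoint is inserted on the
# first segment, so only the traversal speed changes, not the trace.
resampled = np.array([[0.0, 0.0], [0.5, 1.0], [1.0, 2.0], [3.0, 1.0]])

s = flat(truncated_signature(pts, depth=3))
print(np.allclose(s, flat(truncated_signature(shifted, depth=3))))    # True
print(np.allclose(s, flat(truncated_signature(resampled, depth=3))))  # True
```

For a piecewise linear path, inserting collinear points is the discrete analogue of reparameterizing time: the trace visited is identical, so the iterated integrals, and hence the signature, do not change.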