Reservoir computing is a best-in-class machine learning algorithm for processing information generated by dynamical systems using observed time-series data. Importantly, it requires very small training data sets, uses linear optimization, and thus requires minimal computing resources. However, the algorithm uses randomly sampled matrices to define the underlying recurrent neural network and has a multitude of metaparameters that must be optimized. Recent results demonstrate the equivalence of reservoir computing to nonlinear vector autoregression, which requires no random matrices, fewer metaparameters, and provides interpretable results. Here, we demonstrate that nonlinear vector autoregression excels at reservoir computing benchmark tasks and requires even shorter training data sets and training time, heralding the next generation of reservoir computing.
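As a point of reference, the following is a minimal sketch of the nonlinear vector autoregression idea summarized above: the feature vector is built from time-delayed copies of the observed data plus their quadratic products, and the readout is obtained by ridge regression, i.e., linear optimization. The Lorenz test series, the delay depth, and the ridge parameter are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch of nonlinear vector autoregression ("next-generation reservoir computing"):
# delayed copies of the data + their quadratic products, fit with ridge regression.
import numpy as np

def lorenz_series(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Generate a Lorenz trajectory with simple Euler steps (toy data only)."""
    x = np.empty((n_steps, 3))
    x[0] = (1.0, 1.0, 1.0)
    for i in range(n_steps - 1):
        X, Y, Z = x[i]
        x[i + 1] = x[i] + dt * np.array([sigma * (Y - X), X * (rho - Z) - Y, X * Y - beta * Z])
    return x

def nvar_features(data, k=2):
    """Linear part: k delayed copies of the state; nonlinear part: their unique quadratic products."""
    n, d = data.shape
    lin = np.hstack([data[k - 1 - j: n - j] for j in range(k)])    # shape (n - k + 1, k * d)
    iu = np.triu_indices(lin.shape[1])                             # unique pairwise products
    quad = np.array([np.outer(v, v)[iu] for v in lin])
    return np.hstack([np.ones((lin.shape[0], 1)), lin, quad])      # constant + linear + quadratic

# Train a one-step-ahead predictor; the only "training" is solving a linear system.
data = lorenz_series(2000)
k, ridge = 2, 1e-6                                 # assumed metaparameters for illustration
features = nvar_features(data[:-1], k)             # features built from steps k-1 .. n-2
targets = data[k:]                                 # next state to be predicted
W = np.linalg.solve(features.T @ features + ridge * np.eye(features.shape[1]),
                    features.T @ targets)
print("one-step RMS error:", np.sqrt(np.mean((features @ W - targets) ** 2)))
```

Because the feature vector is an explicit polynomial in the observed variables, the learned weights can be read off directly, which is what makes this approach interpretable compared with a randomly sampled recurrent network.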
We study a theoretical model describing a laser with a modulated parameter, concentrating on the appearance of extreme events, also called optical rogue pulses. It is shown that two conditions are required for the appearance of such events in this type of nonlinear system: the existence of generalized multistability and the collision of chaotic attractors with unstable orbits in an external crisis, which expands the attractor so that it visits new regions of phase space.
We demonstrate experimentally how semiconductor lasers subjected to double optical feedback change the statistics of their chaotic spiking dynamics from Gaussian to long-tailed power-law distributions associated with the emergence of bursting. These chaotic regimes, which are features of excitable complex systems, are quantified by the tail exponent α and appear when the ratio between the feedback times is changed. Transitions to bursting occur in the neighbourhood of low-order Farey fractions. The physics behind these transitions is related to the variation of the threshold pump current in the compound system, as obtained from a deterministic set of rate equations. Numerical integration also verifies the observed chaos transitions, indicating the possibility of controlling the bursting chaotic statistics.
Forecasting the behavior of high-dimensional dynamical systems using machine learning requires efficient methods to learn the underlying physical model. We demonstrate spatiotemporal chaos prediction using a machine learning architecture that, when combined with a next-generation reservoir computer, displays state-of-the-art performance with a computational time [Formula: see text]–[Formula: see text] times faster for the training process and a training data set [Formula: see text] times smaller than other machine learning algorithms. We also take advantage of the translational symmetry of the model to further reduce the computational cost and training data, each by a factor of ∼10.
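The translational-symmetry idea mentioned above can be illustrated with a hedged sketch: when the underlying model is spatially homogeneous, a single local predictor with shared weights can be trained on data pooled from every lattice site, instead of fitting an independent readout per site. The toy coupled-map lattice, the neighborhood half-width, and the quadratic features below are assumptions for illustration only, not the authors' architecture.

```python
# Hedged sketch of exploiting translational symmetry: one local readout, shared across
# all sites of a spatially homogeneous system, trained on data pooled over space and time.
import numpy as np

rng = np.random.default_rng(0)

def toy_field(n_steps=400, n_sites=64, c=0.4):
    """Diffusively coupled logistic lattice on a ring; stands in for a spatiotemporal system."""
    u = np.empty((n_steps, n_sites))
    u[0] = rng.uniform(0.1, 0.9, n_sites)
    for t in range(n_steps - 1):
        f = 3.9 * u[t] * (1.0 - u[t])
        u[t + 1] = (1 - c) * f + 0.5 * c * (np.roll(f, 1) + np.roll(f, -1))
    return u

def local_features(u_t, half_width=2):
    """For every site: its periodic neighborhood plus quadratic terms (same form at all sites)."""
    neigh = np.stack([np.roll(u_t, s) for s in range(-half_width, half_width + 1)], axis=1)
    return np.hstack([np.ones((neigh.shape[0], 1)), neigh, neigh ** 2])

u = toy_field()
X = np.vstack([local_features(u[t]) for t in range(u.shape[0] - 1)])   # pooled over sites and times
y = u[1:].ravel()                                                      # next value at each site
ridge = 1e-6                                                           # assumed regularization
W = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ y)     # one readout shared by all sites
print("one-step RMS error:", np.sqrt(np.mean((X @ W - y) ** 2)))
```

Sharing the weights across sites is what shrinks both the model size and the amount of training data needed, since every site contributes training samples to the same small linear fit.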