Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually large number of required elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way for feasible and resource-efficient technological implementations of reservoir computing.
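To make the single-node-with-delay architecture concrete, the following is a minimal numerical sketch, not the authors' experimental implementation: a single tanh nonlinearity with delayed feedback is time-multiplexed into N virtual nodes by a random input mask, and a linear readout is trained by ridge regression. The node model, the parameter values N, eta and gamma, and the toy prediction task are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50                  # number of virtual nodes along the delay line (assumed)
eta, gamma = 0.5, 0.05  # feedback and input scaling (illustrative values)
mask = rng.uniform(-1.0, 1.0, N)   # random time-multiplexing input mask

def run_reservoir(u):
    """Drive the single delayed-feedback node with a scalar input sequence u
    and collect the N virtual-node states for every input sample."""
    x = np.zeros(N)                    # states held over one delay interval
    states = np.empty((len(u), N))
    for t, u_t in enumerate(u):
        for i in range(N):
            # each virtual node mixes its value from one delay earlier
            # with the masked input sample
            x[i] = np.tanh(eta * x[i] + gamma * mask[i] * u_t)
        states[t] = x
    return states

def train_readout(states, targets, reg=1e-6):
    """Linear readout fitted by ridge regression on the collected states."""
    S = np.hstack([states, np.ones((len(states), 1))])   # add bias column
    return np.linalg.solve(S.T @ S + reg * np.eye(S.shape[1]), S.T @ targets)

# Toy usage: one-step-ahead prediction of a noisy sine wave
u = np.sin(0.2 * np.arange(500)) + 0.05 * rng.standard_normal(500)
W = train_readout(run_reservoir(u[:-1]), u[1:])
```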
Many information processing challenges are difficult to solve with traditional Turing or von Neumann approaches. Implementing unconventional computational methods is therefore essential, and optics provides promising opportunities. Here we experimentally demonstrate optical information processing using a nonlinear optoelectronic oscillator subject to delayed feedback. We implement a neuro-inspired concept called Reservoir Computing, which has been proven to possess universal computational capabilities. In particular, we exploit the transient response of a complex dynamical system to an input data stream. We employ spoken digit recognition and time series prediction tasks as benchmarks, achieving competitive figures of merit.
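Nonlinear optoelectronic oscillators of this kind are often modeled with an Ikeda-type sine-squared nonlinearity (the intensity transfer of an electro-optic modulator); whether that exact form applies to this particular experiment is an assumption here. In a simulation it would simply replace the tanh node update of the sketch above, for example:

```python
import numpy as np

def node_update(x_delayed, masked_input, beta=0.8, phi0=0.25 * np.pi):
    """Hypothetical Ikeda-type node: sine-squared transfer of the delayed state
    plus the masked input, offset by an operating-point phase phi0.
    beta and phi0 are assumed, illustrative values."""
    return beta * np.sin(x_delayed + masked_input + phi0) ** 2
```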
Reservoir computing is a paradigm in machine learning whose processing capabilities rely on the dynamical behavior of recurrent neural networks. We present a mixed analog and digital implementation of this concept with a nonlinear analog electronic circuit as the main computational unit. In our approach, the reservoir network can be replaced by a single nonlinear element with delay via time-multiplexing. We analyze the influence of noise on the performance of the system for two benchmark tasks: 1) a classification problem and 2) a chaotic time-series prediction task. Special attention is given to the role of quantization noise, which is studied by varying the resolution of the conversion interface between the analog and digital worlds.
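To illustrate how quantization noise enters through the analog-digital interface, here is a small sketch of a uniform converter model with a sweep over the resolution; the converter range, the bit depths, and the tanh stand-in for the analog reservoir states are assumptions, not the authors' exact setup.

```python
import numpy as np

def quantize(x, n_bits, x_min=-1.0, x_max=1.0):
    """Emulate an ADC/DAC stage: clip to the converter range and round to
    2**n_bits uniformly spaced levels (assumed uniform quantizer)."""
    x = np.clip(x, x_min, x_max)
    step = (x_max - x_min) / (2 ** n_bits - 1)
    return x_min + np.round((x - x_min) / step) * step

# Sweep the converter resolution and measure the induced error
x = np.tanh(np.linspace(-3, 3, 1000))   # stand-in for analog reservoir states
for n_bits in (4, 8, 12):
    rms = np.sqrt(np.mean((quantize(x, n_bits) - x) ** 2))
    print(f"{n_bits}-bit conversion: RMS quantization error {rms:.2e}")
```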
Reservoir computing is a novel bio-inspired computing method, capable of solving complex tasks in a computationally efficient way. It has recently been successfully implemented using delayed feedback systems, allowing the hardware complexity of brain-inspired computers to be reduced drastically. In this approach, the pre-processing procedure relies on the definition of a temporal mask that serves as a scaled time-multiplexing of the input. Originally, random masks were chosen, motivated by the random connectivity in reservoirs. This random generation can, however, sometimes fail to yield a well-performing mask. Moreover, for hardware implementations, random generation is not ideal due to its complexity and the requirement for trial and error. We outline a procedure to reliably construct a mask pattern that is optimal in terms of multipurpose performance, derived from the concept of maximum length sequences. Not only does this ensure the creation of the shortest possible mask that leads to maximum variability in the reservoir states for the given reservoir, but it also allows for an interpretation of the statistical significance of the provided training samples for the task at hand.
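Maximum length sequences can be generated with a linear-feedback shift register; the sketch below illustrates that general idea. The primitive polynomial x^5 + x^3 + 1, the sequence length of 31, and the mapping to a two-level +/-1 mask are assumptions made for the example, not necessarily the exact construction used in the paper.

```python
import numpy as np

def mls_mask(taps=(5, 3), n=5):
    """Maximum length sequence of length 2**n - 1 from a Fibonacci LFSR.
    taps are the 1-indexed stages of a primitive polynomial, here x^5 + x^3 + 1
    (assumed example); the output is mapped to a two-level mask of +/-1."""
    state = [1] * n                      # any non-zero seed works
    seq = []
    for _ in range(2 ** n - 1):
        seq.append(state[-1])            # output the oldest register bit
        fb = 0
        for t in taps:
            fb ^= state[t - 1]           # XOR the tapped stages
        state = [fb] + state[:-1]        # shift and feed back
    return 2 * np.array(seq) - 1

mask = mls_mask()   # length-31 binary mask, e.g. for 31 virtual nodes
print(mask)
```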