Reservoir Computing with Computational Matter

Zoran Konkoli, Stefano Nichele, Matthew Dale, Susan Stepney

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

The reservoir computing paradigm of information processing has emerged as a natural response to the problem of training recurrent neural networks. It has been realized that the training phase can be avoided provided a network has some well-defined properties, e.g. the echo state property. This idea has been generalized to arbitrary artificial dynamical systems. In principle, any dynamical system could be used for advanced information processing applications, provided that it has the separation and approximation properties. To carry out this idea in practice, the only auxiliary equipment needed is a simple read-out layer that can be used to access the internal states of the system. In the following, several application scenarios of this generic idea are discussed, together with related engineering aspects. We cover practical problems one might meet when trying to implement the idea, and discuss several strategies for solving such problems.
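The core idea of the abstract — keeping the dynamical system fixed and training only a simple read-out layer on its internal states — can be sketched as a minimal echo state network. The reservoir size, weight scaling, washout length, and toy sine-prediction task below are illustrative assumptions, not details from the chapter:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: 1 input, 50 reservoir units, 1 output.
n_in, n_res = 1, 50

# Fixed random input and recurrent weights; these are never trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Rescale so the spectral radius is below 1 -- a common heuristic
# for obtaining the echo state property.
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Drive the fixed dynamical system and record its internal states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next value of a sine wave.
t = np.arange(300)
u = np.sin(0.1 * t)
target = np.sin(0.1 * (t + 1))

X = run_reservoir(u)
washout = 50  # discard the initial transient
X_tr, y_tr = X[washout:], target[washout:]

# Linear read-out fitted by ridge regression: the only trained component.
ridge = 1e-6
W_out = np.linalg.solve(X_tr.T @ X_tr + ridge * np.eye(n_res), X_tr.T @ y_tr)

pred = X_tr @ W_out
mse = np.mean((pred - y_tr) ** 2)
```

The read-out is an ordinary linear regression on the recorded states, which is why the scheme transfers to any physical substrate whose internal states can be observed.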
Original language: English
Title of host publication: Computational Matter
Editors: Susan Stepney, Martyn Amos, Steen Rasmussen
Publisher: Springer
Pages: 269-293
Number of pages: 25
Volume: 1
Edition: 1
ISBN (Electronic): 9783319658261
ISBN (Print): 9783319658261
Publication status: Published - 20 Jul 2018

Publication series

Name: Natural Computing Series
Publisher: Springer
