Neuroevolution of hierarchical reservoir computers

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Reservoir Computers such as Echo State Networks (ESNs) represent an alternative recurrent neural network model that provides fast training and state-of-the-art performance on supervised learning problems. Classic ESNs suffer from two limitations: hyperparameter selection and learning of multiple temporal and spatial scales. To learn multiple scales, hierarchies are proposed, and to overcome manual tuning, optimisation is used. However, the autonomous design of hierarchies and the optimisation of multi-reservoir systems have not yet been demonstrated. In this work, an evolvable architecture called the Reservoir-of-Reservoirs (RoR) is proposed, in which sub-networks of neurons (ESNs) are interconnected to form the reservoir. The design of each sub-network, its hyperparameters, the global connectivity and the hierarchical structure are evolved using a genetic algorithm (GA) known as the Microbial GA.
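To make the two components concrete, the sketch below shows a standard leaky-integrator ESN update with a ridge-regression readout, plus a single Microbial GA tournament step in which the loser copies genes from the winner and then mutates. This is a minimal Python illustration with placeholder hyperparameters (reservoir size, spectral radius, leak rate, mutation rate and gene encoding are all assumptions), not the authors' RoR encoding or implementation.

```python
# Minimal illustrative sketch only; hyperparameters and encoding are placeholders,
# not the paper's RoR implementation.
import numpy as np

rng = np.random.default_rng(0)

def make_esn(n_in, n_res, spectral_radius=0.9, density=0.1):
    """Random input and reservoir weights, rescaled to a target spectral radius."""
    W_in = rng.uniform(-1, 1, (n_res, n_in))
    W = rng.uniform(-1, 1, (n_res, n_res)) * (rng.random((n_res, n_res)) < density)
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(W_in, W, u_seq, leak=0.3):
    """Collect leaky-integrator reservoir states for an input sequence."""
    x = np.zeros(W.shape[0])
    states = []
    for u in u_seq:
        x = (1 - leak) * x + leak * np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    """Closed-form ridge regression for the linear readout (the only trained part)."""
    X = states
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ targets)

def microbial_step(pop, fitness, p_rec=0.5, p_mut=0.1):
    """One Microbial GA tournament: the loser is 'infected' by the winner, then mutated."""
    i, j = rng.choice(len(pop), size=2, replace=False)
    win, lose = (i, j) if fitness(pop[i]) >= fitness(pop[j]) else (j, i)
    mask = rng.random(pop[lose].shape) < p_rec               # copy winner's genes
    pop[lose][mask] = pop[win][mask]
    pop[lose] += p_mut * rng.standard_normal(pop[lose].shape)  # mutate the loser
    return pop
```

In the paper, the genotype evolved by the Microbial GA describes each sub-reservoir and the connectivity between them; here a flat real-valued vector stands in for that encoding purely for illustration.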

To evaluate the RoR and the Microbial GA, single networks and other hierarchical ESNs are evolved. The results show that the Microbial GA can dramatically increase the performance of single networks compared to other optimisation techniques. The evolutionary process also leads to competitive results with RoRs and other hierarchical ESNs, despite these having fewer connections than a single network. In the final section, it is revealed that the RoR architecture may learn generalised features that other architectures cannot, offering improved generalisation to other tasks.
Original language: English
Title of host publication: Genetic and Evolutionary Computation Conference
Subtitle of host publication: GECCO '18: Proceedings of the Genetic and Evolutionary Computation Conference
Publisher: ACM
Pages: 410-417
Number of pages: 8
ISBN (Electronic): 9781450356183
DOIs
Publication status: Published - 2 Jul 2018