Unified framework for the adaptive operator selection of discrete parameters

Mudita Sharma, Manuel López-Ibáñez, Dimitar Lubomirov Kazakov

Research output: Working paper › Preprint

Abstract

We conduct an exhaustive survey of adaptive operator selection (AOS) methods in Evolutionary Algorithms (EAs). Building on the existing categorisation of AOS methods, we simplify the AOS structure by decomposing it into additional components, and we identify commonalities among AOS methods from the literature in order to generalise them. Each component is presented with a number of alternative choices, each represented by a formula. We make three sets of comparisons. First, the methods from the literature are tested on the BBOB test bed with their default hyperparameters. Second, the hyperparameters of these methods are tuned using the offline configurator irace. Third, for a given set of problems, we use irace to select the best combination of components and to tune their hyperparameters.
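
As a deliberately simplified illustration of what an AOS method does inside an EA, the sketch below implements probability matching, one classic AOS scheme from the literature. It is not the unified framework of the paper: the class name, the quality-update rule, and the hyperparameters p_min and alpha are illustrative assumptions.

    import random

    class ProbabilityMatchingAOS:
        """Minimal probability-matching AOS sketch: keep a quality estimate per
        operator and select operators with probability proportional to it, with
        a floor p_min so that no operator is ever discarded entirely."""

        def __init__(self, n_ops, p_min=0.05, alpha=0.3):
            self.n_ops = n_ops
            self.p_min = p_min          # minimum selection probability per operator
            self.alpha = alpha          # adaptation rate of the quality estimate
            self.quality = [1.0] * n_ops

        def select(self):
            # Selection probabilities sum to 1 and never drop below p_min.
            total = sum(self.quality)
            probs = [self.p_min + (1.0 - self.n_ops * self.p_min) * q / total
                     for q in self.quality]
            return random.choices(range(self.n_ops), weights=probs, k=1)[0]

        def update(self, op, reward):
            # Exponential recency-weighted average of the observed reward
            # (e.g. the fitness improvement produced by applying operator `op`).
            self.quality[op] += self.alpha * (reward - self.quality[op])

Typical use inside an EA generation loop: call select() to pick which variation operator to apply, measure the resulting reward (such as offspring fitness improvement), then call update() with that reward.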
Original language: English
Number of pages: 36
Publication status: Published - 12 May 2020

Publication series

Name: arXiv

Keywords

  • online tuning
  • IRACE
  • differential evolution
