Mirrored Variants of the (1,4)-CMA-ES Compared on the Noisy BBOB-2010 Testbed
Abstract
Derandomization by means of mirrored samples has recently been introduced to enhance the performance of $(1,\lambda)$- and $(1+2)$-Evolution-Strategies (ESs) with the aim of designing fast stochastic local search algorithms. In this paper, we investigate the impact of mirrored samples on noisy optimization. Since elitist selection is detrimental in noisy optimization, we investigate only non-elitist ESs here. On the BBOB-2010 noisy benchmark testbed, we compare two variants of the (1,4)-CMA-ES that implement mirrored samples with the baseline (1,4)-CMA-ES. Each algorithm implements a restart mechanism, and a total budget of $10^{4} D$ function evaluations per trial is used, where $D$ is the dimension of the search space. The comparison shows that using mirroring within the (1,4)-CMA-ES improves the performance in the noisy BBOB-2010 scenario: the (1,4$_m$)-CMA-ES with mirrored mutations improves significantly over the (1,4)-CMA-ES by 13--60% on 6 functions, while no function with decreased performance can be reported. The (1,4$_m^s$)-CMA-ES, which employs sequential selection in addition to mirroring, further improves over the (1,4$_m$)-CMA-ES by an additional 20--62%, depending on the function. Compared to the BBOB-2009 benchmarking results, the (1,4$_m^s$)-CMA-ES improves over the function-wise best algorithm on 7 functions with Cauchy noise by 12--68% (in both 5D and 20D).
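To illustrate the two ingredients named in the abstract, mirrored mutations and sequential selection, the sketch below combines them in a simple non-elitist (1,4)-ES with isotropic mutations. This is a minimal illustration of the general idea under stated assumptions, not the (1,4)-CMA-ES benchmarked in the paper: the fixed step size, the toy sphere objective, and all function names are assumptions made for this example, and covariance matrix adaptation, step-size adaptation, and restarts are omitted.

```python
import numpy as np

def sphere(x):
    # toy objective used only for this illustration (assumption)
    return float(np.sum(np.asarray(x) ** 2))

def one_comma_four_mirrored_sequential(f, x0, sigma=0.3, budget=2000, sequential=True):
    """Sketch of a (1,4)-ES with mirrored mutations and optional sequential selection.

    Mirrored mutations: offspring come in pairs x + sigma*z and x - sigma*z, so
    the four offspring use only two independent Gaussian directions.
    Sequential selection: offspring are evaluated one by one and the iteration
    stops as soon as an offspring is better than the parent.
    NOTE: isotropic toy sketch only; no covariance matrix adaptation,
    no step-size adaptation, no restarts as in the paper's CMA-ES variants.
    """
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    evals = 1
    dim = x.size
    while evals < budget:
        # two independent directions yield four mirrored offspring
        z1, z2 = np.random.randn(dim), np.random.randn(dim)
        offspring = [x + sigma * z1, x - sigma * z1, x + sigma * z2, x - sigma * z2]
        best_x, best_f = None, np.inf
        for y in offspring:
            fy = f(y)
            evals += 1
            if fy < best_f:
                best_x, best_f = y, fy
            # sequential selection: stop evaluating once the parent is beaten
            if sequential and fy < fx:
                break
            if evals >= budget:
                break
        # non-elitist (comma) selection: the best offspring replaces the parent
        x, fx = best_x, best_f
    return x, fx

# usage example on the toy sphere function in dimension 5
xbest, fbest = one_comma_four_mirrored_sequential(sphere, np.ones(5))
print(fbest)
```

With sequential selection, an iteration may spend fewer than four evaluations whenever an early offspring already improves on the parent, which is one intuition behind the additional speed-up reported for the (1,4$_m^s$)-CMA-ES over the (1,4$_m$)-CMA-ES.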