In this paper, we evaluate the GNN-CMA-ES algorithm on the BBOB noiseless testbed. GNN-CMA-ES was recently proposed as a plug-in extension to CMA-ES that makes it possible to train flexible search distributions, in contrast to the standard parametric choices such as the multivariate Gaussian. By comparing GNN-CMA-ES with CMA-ES, we show the benefits of this extension on some unimodal functions as well as on a variety of multimodal functions. We also identify a family of unimodal functions where GNN-CMA-ES can degrade the performance of CMA-ES, and we discuss the possible reasons behind this behavior.
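To make the setting concrete, the following is a minimal, self-contained sketch of the ask-and-tell loop that evolution strategies such as CMA-ES are built on, here with a plain isotropic Gaussian search distribution on an illustrative sphere objective. This is a toy baseline, not CMA-ES itself and not the GNN search distribution evaluated in the paper; all names and parameters are illustrative.

```python
import random

def sphere(x):
    """Illustrative unimodal objective: minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def gaussian_es(f, dim=5, pop=20, sigma=0.3, iters=200, seed=0):
    """Toy evolution strategy with an isotropic Gaussian search distribution.
    Illustrative only: CMA-ES additionally adapts a full covariance matrix,
    and GNN-CMA-ES replaces the Gaussian with a learned, flexible model."""
    rng = random.Random(seed)
    mean = [rng.uniform(-1.0, 1.0) for _ in range(dim)]
    for _ in range(iters):
        # "ask": sample a population from the current search distribution
        cands = [[m + sigma * rng.gauss(0.0, 1.0) for m in mean]
                 for _ in range(pop)]
        # "tell": rank candidates by fitness, recombine the best half
        cands.sort(key=f)
        elite = cands[: pop // 2]
        mean = [sum(c[i] for c in elite) / len(elite) for i in range(dim)]
        sigma *= 0.98  # crude step-size decay in place of CMA adaptation
    return mean, f(mean)

mean, val = gaussian_es(sphere)
```

On this simple unimodal objective the Gaussian search distribution suffices; the flexibility of a learned distribution matters on landscapes where a single Gaussian is a poor fit.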