Conference paper, 2019

Making a case for (Hyper-)parameter tuning as benchmark problems

Abstract

One of the biggest challenges in evolutionary computation is the selection and configuration of the heuristic best suited for a given problem. While both of these tasks were in the past primarily addressed by building on expert experience, the last decade has witnessed a significant shift towards automated decision making, which capitalizes on techniques proposed in the machine learning literature. A key success factor for automated algorithm selection and configuration is the availability of good training sets, whose performance data can be leveraged to build accurate performance prediction models. With the long-term goal of building landscape-aware parameter control mechanisms for iterative optimization heuristics, we consider in this discussion paper the question of how well the 24 functions of the BBOB test bed cover the characteristics of (hyper-)parameter tuning problems. To this end, we perform a preliminary landscape analysis of two hyper-parameter selection problems and compare their feature values with those of the BBOB functions. While we do see a good fit for one of the tuning problems, our findings also indicate that some parameter tuning problems may not be well represented by the BBOB functions. This raises the question of whether one can nevertheless derive reliable performance prediction models for hyper-parameter tuning problems from the BBOB test bed, or whether for this specific target the BBOB benchmark should be adjusted by adding or replacing some of its functions. Independently of their use for training automated algorithm selection and configuration techniques, hyper-parameter tuning problems offer a plethora of instances that are worthwhile to study in the context of benchmarking iterative optimization heuristics.
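As a rough illustration of the landscape comparison described above, the following Python sketch contrasts a toy hyper-parameter tuning problem with a BBOB-style test function. It is not the authors' code: the SVM tuning task, the unshifted Rastrigin stand-in, and the three hand-computed features are illustrative assumptions; the paper relies on much richer exploratory landscape analysis (ELA) feature sets, as provided e.g. by the flacco package.

# Hedged sketch, not the authors' pipeline: compare a few simple ELA-style
# landscape features of a hyper-parameter tuning problem with those of a
# BBOB-style test function. The tuning task (SVM on iris) and the unshifted
# Rastrigin function are illustrative assumptions.
import numpy as np
from scipy.stats import skew, kurtosis, pearsonr
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(42)
iris = load_iris()

def tuning_objective(x):
    # Hyper-parameter tuning problem: 3-fold cross-validation error of an
    # SVM, searched over (log10 C, log10 gamma).
    clf = SVC(C=10.0 ** x[0], gamma=10.0 ** x[1])
    return 1.0 - cross_val_score(clf, iris.data, iris.target, cv=3).mean()

def rastrigin(x):
    # Stand-in for a multimodal BBOB function (unshifted Rastrigin).
    return 10 * len(x) + float(np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

def landscape_features(f, lower, upper, dim, n):
    # A few cheap ELA-style features from a uniform random sample; a full
    # analysis would compute complete ELA feature sets with flacco.
    X = rng.uniform(lower, upper, size=(n, dim))
    y = np.array([f(x) for x in X])
    dist_to_best = np.linalg.norm(X - X[np.argmin(y)], axis=1)
    fdc = pearsonr(y, dist_to_best)[0]  # fitness-distance correlation
    return {"y_skewness": skew(y), "y_kurtosis": kurtosis(y), "fdc": fdc}

print("SVM tuning:", landscape_features(tuning_objective, -3, 3, dim=2, n=50))
print("Rastrigin :", landscape_features(rastrigin, -5, 5, dim=2, n=200))

Comparing such feature vectors across many samples is, in spirit, how one can judge whether a tuning problem falls inside the region of feature space covered by the 24 BBOB functions.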

Dates and versions

hal-02179587, version 1 (14-01-2020)

Cite

Carola Doerr, Johann Dréo, Pascal Kerschke. Making a case for (Hyper-)parameter tuning as benchmark problems. Genetic and Evolutionary Computation Conference, Companion Material, Jul 2019, Prague, Czech Republic. pp. 1755-1764. ⟨10.1145/3319619.3326857⟩. ⟨hal-02179587⟩