Benchmarking and analyzing iterative optimization heuristics with IOHprofiler
Abstract
Comparing and evaluating optimization algorithms is an important part of evolutionary computation, and doing it well requires a robust benchmarking setup. IOHprofiler supports researchers in this task by providing an easy-to-use, interactive, and highly customizable environment for benchmarking iterative optimizers.
IOHprofiler is designed as a modular benchmarking tool. The experimenter module provides easy access to common problem sets (e.g. the BBOB functions) and modular logging functionality that can easily be attached to any iterative optimization algorithm. The resulting logs (and logs from other platforms, e.g. COCO and Nevergrad) are fully interoperable with the IOHanalyzer module, which provides highly interactive performance analysis in the form of a wide array of visualizations and statistical analyses. A GUI, hosted at https://iohanalyzer.liacs.nl/, makes these analysis tools easy to access. Data from many repositories (e.g. COCO, Nevergrad) is pre-processed, which greatly reduces the effort required to compare performance against existing algorithms.
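To make this workflow concrete, the sketch below shows how an experiment might be set up with the Python interface of the experimenter module (the `ioh` package): a BBOB problem is instantiated, an IOHanalyzer-compatible logger is attached, and an arbitrary optimizer (here, plain random search) is run on it. The names `get_problem`, `logger.Analyzer`, and `attach_logger` follow recent versions of the package, but exact signatures may differ between releases, so treat this as an illustrative sketch rather than an authoritative reference.

```python
import numpy as np
import ioh

# Instantiate a BBOB problem: function "Sphere", instance 1, dimension 5.
problem = ioh.get_problem("Sphere", instance=1, dimension=5)

# Attach a logger that writes IOHanalyzer-compatible data to disk.
logger = ioh.logger.Analyzer(
    root="my_experiment",            # output directory
    folder_name="random_search",     # sub-folder for this experiment
    algorithm_name="random_search",  # name shown in IOHanalyzer
)
problem.attach_logger(logger)

# Any iterative optimizer can be plugged in; random search as a stand-in.
rng = np.random.default_rng(42)
for _ in range(1000):
    x = rng.uniform(problem.bounds.lb, problem.bounds.ub)
    problem(x)  # each evaluation is logged automatically

logger.close()  # flush logs to disk (some versions flush on deletion instead)
```

The resulting data folder can then be uploaded to the GUI at https://iohanalyzer.liacs.nl/ for interactive analysis alongside the pre-processed repository data.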
This tutorial will introduce the key features of IOHprofiler by providing background on benchmarking in evolutionary computation and showing how it can be carried out using the modules of IOHprofiler. The organizers will highlight and demonstrate the key components, with a focus on functionality introduced in the year since the last tutorial, such as support for constrained optimization. Guided examples will illustrate the many aspects of algorithm performance that can be explored using the interactive GUI.
Domains
Computer Science [cs]

Origin: Files produced by the author(s)