Automated Dynamic Algorithm Configuration

Authors: Steven Adriaensen, André Biedenkapp, Gresa Shala, Noor Awad, Theresa Eimer, Marius Lindauer, Frank Hutter

JAIR 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Specifically, we (iv) conduct empirical case studies for using DAC in evolutionary optimization, AI planning, and machine learning."
Researcher Affiliation | Collaboration | Steven Adriaensen EMAIL, André Biedenkapp EMAIL, Gresa Shala EMAIL, Noor Awad EMAIL (University of Freiburg, Machine Learning Lab); Theresa Eimer EMAIL, Marius Lindauer EMAIL (Leibniz University Hannover, Institute of Artificial Intelligence); Frank Hutter EMAIL (University of Freiburg, Machine Learning Lab & Bosch Center for Artificial Intelligence)
Pseudocode | Yes | "Algorithm 1: Stepwise execution of a dynamically configured target algorithm A"
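The quoted Algorithm 1 describes stepwise execution of a target algorithm whose configuration is chosen anew at every step by a policy. A minimal sketch of that control loop, not the paper's actual Algorithm 1 (the function names and the toy target algorithm below are hypothetical), could look like:

```python
# Hypothetical sketch of dynamic algorithm configuration (DAC):
# at each step, a policy maps the target algorithm's current state
# to a configuration, and the algorithm advances one step under it.

def run_dac(policy, init_state, step, is_done, max_steps=100):
    """Execute target algorithm A stepwise under dynamic configuration.

    policy:   state -> configuration
    step:     (state, configuration) -> next state (one step of A)
    is_done:  state -> bool (termination test for A)
    """
    state = init_state
    trajectory = []
    for _ in range(max_steps):
        config = policy(state)        # policy selects a configuration
        state = step(state, config)   # A runs one step under that config
        trajectory.append((config, state))
        if is_done(state):
            break
    return state, trajectory
```

As a toy usage, `run_dac(lambda s: 0.25, 1.0, lambda s, c: s - c * 2 * s, lambda s: abs(s) < 1e-6)` would dynamically set the step size of a one-dimensional gradient descent on f(x) = x².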
Open Source Code | Yes | Footnote 12: "Code for reproducing these experiments is publicly available: https://github.com/automl/2022_JAIR_DAC_experiments"
Open Datasets | Yes | "In both cases, the functions used were taken from the BBOB-2009 competition (Hansen et al., 2009). ... Daniel et al. (2016) consider image classification, using examples from the MNIST and CIFAR-10 datasets."
Dataset Splits | Yes | "Here, the training setup consists of 100 training instances: 10 different black box functions, with 10 different initial search distributions each. For testing, 12 other black box functions were used with a specific initial search distribution. ... The target problems consist of 100 training and 100 disjoint test problem instances taken from each of six different domains ... For meta-training the two η-controllers, we used a meta-training set I_D of 100 instances and the default parameter settings of SMAC, and optimized λ ∈ [-10, 10]^5, using a symmetric log-scale with linear threshold 10^-6, for 5000 inner training runs."
Hardware Specification | No | "each performing a total of 40000 CMA-ES runs and taking 8-10 CPU hours on our system. ... an RL agent experiences 10^6 steps of the planning system, taking 8-12 hours on our system. ... Each SMAC run took less than 2 CPU-days on our system."
Software Dependencies | Yes | Footnote 8: "A new version of https://github.com/automl/DACBench (v. 0.1) was released alongside this article."
Experiment Setup | Yes | "Replicating the original setup, we set population size λ = 10, history length h = 40, terminate CMA-ES after 50 generations, and model policies as fully connected feed-forward neural networks having two hidden layers with 50 hidden units each and ReLU activations. ... Following Speck et al. (2021), we learn a separate policy for each domain, however, to reduce the computational cost, we limit ourselves to a representative set of three out of six domains. ... For meta-training the two η-controllers, we used a meta-training set I_D of 100 instances and the default parameter settings of SMAC, and optimized λ ∈ [-10, 10]^5, using a symmetric log-scale with linear threshold 10^-6, for 5000 inner training runs."
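The policy architecture quoted above (a fully connected feed-forward network with two hidden layers of 50 ReLU units each) can be sketched as follows. The input and output dimensions here are placeholders; the paper's actual state features and action encoding are not reproduced:

```python
import numpy as np

def init_policy(in_dim, out_dim, hidden=50, seed=0):
    """Initialize weights for a 2-hidden-layer feed-forward policy net
    (He initialization; layer sizes follow the quoted setup)."""
    rng = np.random.default_rng(seed)
    dims = [in_dim, hidden, hidden, out_dim]
    return [(rng.standard_normal((a, b)) * np.sqrt(2.0 / a), np.zeros(b))
            for a, b in zip(dims[:-1], dims[1:])]

def policy_forward(params, x):
    """Forward pass: ReLU on the two hidden layers, linear output."""
    h = x
    for i, (W, b) in enumerate(params):
        h = h @ W + b
        if i < len(params) - 1:   # no activation on the output layer
            h = np.maximum(h, 0.0)
    return h
```

For example, `policy_forward(init_policy(4, 2), np.ones(4))` returns a length-2 output vector for a 4-dimensional state.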