Matthew Groves, Michael Pearce, Juergen Branke
Parallelizing Bayesian Optimization has recently attracted a lot of attention. The challenge usually lies in estimating the effect that multiple new samples will have on the posterior distribution of the objective function, and in the combinatorial explosion of possible sample locations. In this paper, we show that, at least for multi-task Bayesian Optimization, parallelization is straightforward, because the benefits of samples are independent as long as the samples are sufficiently far apart in the task space. We propose a simple penalization approach, proportional to the kernel-based correlation between samples, and demonstrate that for the problem settings we considered, the efficacy of our parallel multi-task Bayesian Optimization algorithm is close to that of the sequential version, while exploiting parallel computation to speed up optimization. Depending on the setup, we observe parallelization efficiencies between 69% and 100%.
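To illustrate the idea of penalizing candidates in proportion to their kernel-based correlation with points already chosen for a batch, here is a minimal sketch. It assumes an RBF kernel over the task space and a greedy batch-construction loop; the function names (`task_correlation`, `penalized_acquisition`, `select_batch`) and the specific penalty factor `1 - max_corr` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def task_correlation(t1, t2, lengthscale=1.0):
    """Kernel-based correlation between two task-space locations.
    An RBF kernel is assumed here for concreteness."""
    d = np.linalg.norm(np.asarray(t1, dtype=float) - np.asarray(t2, dtype=float))
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def penalized_acquisition(acq_value, candidate_task, batch_tasks, lengthscale=1.0):
    """Down-weight a candidate's acquisition value in proportion to its
    maximum correlation with tasks already in the batch. Candidates far
    away in task space (near-zero correlation) are barely penalized,
    reflecting that their benefits are (approximately) independent."""
    if not batch_tasks:
        return acq_value
    max_corr = max(task_correlation(candidate_task, t, lengthscale)
                   for t in batch_tasks)
    return acq_value * (1.0 - max_corr)

def select_batch(candidates, acq_values, batch_size, lengthscale=1.0):
    """Greedily assemble a parallel batch: repeatedly pick the candidate
    with the highest penalized acquisition value."""
    batch = []
    remaining = list(zip(candidates, acq_values))
    while remaining and len(batch) < batch_size:
        scored = [(penalized_acquisition(a, c, batch, lengthscale), i)
                  for i, (c, a) in enumerate(remaining)]
        _, best_i = max(scored, key=lambda s: s[0])
        batch.append(remaining.pop(best_i)[0])
    return batch

# Usage: pick 3 of 5 candidate task locations given precomputed acquisition values.
cands = [[0.0], [0.1], [1.0], [2.0], [2.1]]
acqs = [0.9, 0.85, 0.6, 0.7, 0.65]
print(select_batch(cands, acqs, batch_size=3))
```

Because the penalty is multiplicative, a candidate highly correlated with an already-selected task loses most of its acquisition value, while a distant task keeps it, so the greedy loop naturally spreads the batch across the task space.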