
High-dimensional Bayesian optimization with intrinsically low-dimensional response surfaces

Abstract

Bayesian optimization is a powerful technique for the optimization of expensive black-box functions. It is used in a wide range of applications, such as drug and material design and the training of machine learning models, e.g. large deep networks. We propose to extend this approach to high-dimensional settings, that is, where the number of parameters to be optimized exceeds 10--20. In this thesis, we scale Bayesian optimization by exploiting different types of projections and the assumption that the objective function is intrinsically low-dimensional. We reformulate the problem in a low-dimensional subspace, learn a response surface there, and maximize an acquisition function in this low-dimensional projection. Contributions include i) a probabilistic model for axis-aligned projections, such as the quantile-Gaussian process, and ii) a probabilistic model for learning a feature space by means of manifold Gaussian processes. In the latter contribution, we propose to learn a low-dimensional feature space jointly with (a) the response surface and (b) a reconstruction mapping. Finally, we present empirical results against well-known baselines in high-dimensional Bayesian optimization and provide possible directions for future research in this field.
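
The abstract describes the generic loop at a high level: project into a low-dimensional subspace, fit a response surface on the projected points, and maximize an acquisition function in that subspace. The sketch below is a minimal illustrative Python implementation of such a projection-based loop, assuming a random linear embedding, a plain Gaussian process surrogate, and expected improvement maximized over random candidates; the toy objective, box bounds, and kernel settings are assumptions for illustration and do not reproduce the thesis's quantile-Gaussian-process or manifold-Gaussian-process models.

# Minimal sketch of Bayesian optimization in a random low-dimensional linear
# embedding. All names and settings are illustrative assumptions; only
# numpy/scipy are used.
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.stats import norm

rng = np.random.default_rng(0)

D, d = 100, 2                       # ambient and assumed intrinsic dimensionality
A = rng.standard_normal((D, d))     # random projection from low-dim z to high-dim x


def objective(x):
    """Toy high-dimensional black box that only depends on a few coordinates."""
    return -np.sum((x[:2] - 0.5) ** 2)


def embed(z):
    """Map a low-dimensional point into the ambient space and clip to the box."""
    return np.clip(A @ z, -1.0, 1.0)


def rbf(X1, X2, ls=0.5):
    """Squared-exponential kernel with unit signal variance."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)


def gp_posterior(Z, y, Zq, noise=1e-6):
    """GP posterior mean and standard deviation at query points Zq."""
    K = rbf(Z, Z) + noise * np.eye(len(Z))
    L = cho_factor(K)
    Ks = rbf(Z, Zq)
    mu = Ks.T @ cho_solve(L, y)
    var = 1.0 - np.sum(Ks * cho_solve(L, Ks), axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))


def expected_improvement(mu, sigma, best):
    """Expected improvement for maximization."""
    gamma = (mu - best) / sigma
    return sigma * (gamma * norm.cdf(gamma) + norm.pdf(gamma))


# Initial design in the low-dimensional subspace, then the standard BO loop:
# fit the surrogate on projected points, pick the EI maximizer among random
# candidates, evaluate the black box at the embedded point.
Z = rng.uniform(-1, 1, size=(5, d))
y = np.array([objective(embed(z)) for z in Z])

for _ in range(20):
    Zc = rng.uniform(-1, 1, size=(2000, d))
    mu, sigma = gp_posterior(Z, y, Zc)
    z_next = Zc[np.argmax(expected_improvement(mu, sigma, y.max()))]
    Z = np.vstack([Z, z_next])
    y = np.append(y, objective(embed(z_next)))

print("best value found:", y.max())

Because the surrogate and the acquisition search both live in the d-dimensional subspace, the cost of each iteration is independent of the ambient dimension D; only the black-box evaluation sees the full-dimensional point.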

