
Bayesian inference by active sampling

Abstract

Uncertainty quantification is a critical component in increasing the safety and reliability of machine learning systems. Bayesian statistics is immensely useful for the design of learning algorithms that accommodate uncertainty quantification. Bayesian inference constitutes the computation involved in Bayesian methods and is synonymous with learning when applied to parameters. Many models do not allow for exact inference and so require approximate methods. However, many current approximate inference methods are special-purpose, can be challenging to use, or are not robust to problem specifics.

Active sampling is a class of algorithms that chooses instances for which to acquire data or information. When combined with flexible and expanding representations, this class allows the treatment of black-box problems whose details can be unknown until the algorithm discovers them at runtime. In this thesis, our focus is to address Bayesian inference through active sampling to produce methods that are robust, easy to use, general, and yet efficient.

We first consider inference of the optimum of an unknown objective function, for which Bayesian optimisation (BO) using a Gaussian process (GP) surrogate model is a popular methodology. Many functions arising in real-world applications are challenging to treat using popular GP models, which often leads to unexpectedly poor optimisation performance when a practitioner or system applies BO to a black-box problem. Such functions include, for example, objective functions with discontinuities, which are exceedingly improbable under the GP function prior. We address this by proposing a BO methodology in which the surrogate model is used to predict only the structures of the objective function that are useful for locating the optimum. We show that our approach allows the search to maintain sample efficiency when faced with challenging structures, increasing the overall reliability of the BO method.
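To make the standard BO-with-GP-surrogate loop concrete, here is a minimal one-dimensional sketch in NumPy. It is an illustration of the conventional approach the paragraph describes, not the thesis's proposed method; the kernel lengthscale, the lower-confidence-bound acquisition rule, and the toy objective are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=0.2):
    """Squared-exponential kernel between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(x_train, y_train, x_query, jitter=1e-6):
    """GP posterior mean and variance at query points (zero prior mean)."""
    K = rbf_kernel(x_train, x_train) + jitter * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_query)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks.T @ alpha
    v = np.linalg.solve(K, Ks)
    # k(x, x) = 1 for the RBF kernel, so the prior variance is 1.
    var = np.clip(1.0 - np.sum(Ks * v, axis=0), 1e-12, None)
    return mean, var

def bayes_opt(objective, n_iter=20, seed=0):
    """Minimise `objective` on [0, 1] with a lower-confidence-bound rule."""
    rng = np.random.default_rng(seed)
    x_train = rng.uniform(0.0, 1.0, size=3)          # small random initial design
    y_train = np.array([objective(x) for x in x_train])
    grid = np.linspace(0.0, 1.0, 201)                # candidate evaluation points
    for _ in range(n_iter):
        mean, var = gp_posterior(x_train, y_train, grid)
        lcb = mean - 2.0 * np.sqrt(var)              # acquisition: exploit low mean, explore high variance
        x_next = grid[np.argmin(lcb)]
        x_train = np.append(x_train, x_next)
        y_train = np.append(y_train, objective(x_next))
    best = np.argmin(y_train)
    return x_train[best], y_train[best]

# Toy smooth objective with its minimum at x = 0.7.
x_best, y_best = bayes_opt(lambda x: (x - 0.7) ** 2)
```

A discontinuous objective (e.g. one with a step) would violate the smoothness encoded by the RBF prior above, which is exactly the failure mode the paragraph highlights.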
We then treat general Bayesian inference through active sampling, where the target is a general, black-box distribution, such as an intractable posterior distribution. The algorithm we propose is asymptotically exact, efficient, and scalable to millions of evaluations of the density function, allowing close approximations to any bounded distribution. The algorithm produces an approximation to the black-box distribution that supports efficient operations, including fast, constant-time sampling, computation of the normalisation constant, and the derivation of conditional distributions. We show how our methodology addresses a diverse set of tasks, including posterior inference, evidence estimation, divergence estimation, and the propagation of uncertainty through probabilistic function models.
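The efficient operations listed above can be illustrated with a toy one-dimensional stand-in: tabulate an unnormalised black-box log-density on a grid, recover the normalisation constant by trapezoidal quadrature, and draw constant-time samples by inverse-CDF lookup. This is a sketch of the general idea only, not the thesis's algorithm; the Gaussian test density and grid bounds are assumptions for the example.

```python
import numpy as np

def tabulate_density(logpdf, lo, hi, n=10_000):
    """Evaluate an unnormalised log-density on a grid; return normalised pdf,
    normalisation constant, and a discrete CDF for sampling."""
    x = np.linspace(lo, hi, n)
    p = np.exp(logpdf(x))
    # Trapezoidal estimate of the normalisation constant Z.
    z = 0.5 * np.sum((p[:-1] + p[1:]) * np.diff(x))
    cdf = np.cumsum(p)
    cdf /= cdf[-1]
    return x, p / z, z, cdf

def sample(x, cdf, n_samples, seed=0):
    """Constant-time draws via inverse-CDF lookup on the tabulated grid."""
    u = np.random.default_rng(seed).uniform(size=n_samples)
    return x[np.searchsorted(cdf, u)]

# Unnormalised standard Gaussian: exp(-x^2 / 2); true Z = sqrt(2*pi).
x, pdf, z, cdf = tabulate_density(lambda t: -0.5 * t**2, -8.0, 8.0)
draws = sample(x, cdf, 50_000)
```

A grid scales poorly beyond a few dimensions, which is one reason the thesis pairs active sampling with flexible, expanding representations rather than a fixed tabulation.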


This paper was published in Explore Bristol Research.
