DISSERTATION

Efficient Deterministic Approximate Bayesian Inference for Gaussian Process models

Thang D. Bui

Year: 2017   Repository: Apollo (University of Cambridge)   Publisher: University of Cambridge

Abstract

Gaussian processes are powerful nonparametric distributions over continuous functions that have become a standard tool in modern probabilistic machine learning. However, the applicability of Gaussian processes in the large-data regime and in hierarchical probabilistic models is severely limited by analytic and computational intractabilities. It is therefore important to develop practical approximate inference and learning algorithms that can address these challenges. To this end, this dissertation provides a comprehensive and unifying perspective on pseudo-point-based deterministic approximate Bayesian learning for a wide variety of Gaussian process models, one that connects previously disparate strands of the literature, greatly extends them, and allows new state-of-the-art approximations to emerge. We start by building a posterior approximation framework based on Power Expectation Propagation for Gaussian process regression and classification. This framework relies on a structured approximate Gaussian process posterior based on a small number of pseudo-points, which are judiciously chosen to summarise the actual data and to enable tractable, efficient inference and hyperparameter learning. Many existing sparse approximations are recovered as special cases of this framework and can now be understood as performing approximate posterior inference with a common approximate posterior. Critically, extensive empirical evidence suggests that new approximation methods arising from this unifying perspective outperform existing approaches in many real-world regression and classification tasks. We then extend the framework to Gaussian process state-space models, Gaussian process latent variable models and deep Gaussian processes, which also unifies many recently developed approximation schemes for these models. Several mean-field and structured approximate posterior families for the hidden variables in these models are studied.
We also discuss several methods for approximate uncertainty propagation in recurrent and deep architectures based on Gaussian projection, linearisation and simple Monte Carlo. The benefits of the unified inference and learning frameworks for these models are illustrated in a variety of real-world state-space modelling and regression tasks.
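The unifying Power-EP view described above can be sketched numerically. For sparse Gaussian process regression, the Power-EP approximate log marginal likelihood interpolates between two well-known special cases: at power α = 1 it recovers the FITC marginal likelihood, and as α → 0 it recovers the variational free energy (Titsias's bound). The following is an illustrative NumPy sketch under assumed settings (a unit-variance RBF kernel with fixed hyperparameters; function names and the jitter value are our own choices), not the dissertation's actual code:

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel matrix between two sets of inputs."""
    d2 = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
          - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def pep_energy(X, y, Z, alpha, noise_var=0.1):
    """Power-EP approximate log marginal likelihood for sparse GP regression.

    Z holds the pseudo-point (inducing) inputs. alpha = 1 recovers the
    FITC marginal likelihood; alpha -> 0 recovers the VFE (Titsias) bound.
    """
    N, M = len(X), len(Z)
    Kff_diag = np.full(N, 1.0)                 # diag(Kff) for a unit-variance kernel
    Kuf = rbf(Z, X)
    Kuu = rbf(Z, Z) + 1e-8 * np.eye(M)         # jitter for numerical stability
    L = np.linalg.cholesky(Kuu)
    V = np.linalg.solve(L, Kuf)                # so the Nystrom term Qff = V.T @ V
    h = Kff_diag - np.sum(V**2, 0)             # diag(Kff - Qff) >= 0
    D = noise_var + alpha * h                  # per-point effective noise
    # log N(y; 0, Qff + diag(D)) via the matrix inversion lemma
    B = np.eye(M) + (V / D) @ V.T
    LB = np.linalg.cholesky(B)
    beta = np.linalg.solve(LB, (V / D) @ y)
    quad = y @ (y / D) - beta @ beta
    logdet = np.sum(np.log(D)) + 2.0 * np.sum(np.log(np.diag(LB)))
    lml = -0.5 * (N * np.log(2 * np.pi) + logdet + quad)
    # Power-EP correction: zero at alpha = 1 (FITC); tends to the VFE
    # trace penalty -sum(h) / (2 * noise_var) as alpha -> 0
    return lml - (1 - alpha) / (2 * alpha) * np.sum(np.log1p(alpha * h / noise_var))
```

Optimising such an energy with respect to the pseudo-inputs `Z` and kernel hyperparameters costs O(NM²) per evaluation rather than the O(N³) of exact GP regression, which is the practical point of pseudo-point approximations.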

Keywords:
Gaussian process, Inference, Bayesian inference, Computer science, Bayesian probability, Process (computing), Machine learning, Artificial intelligence, Gaussian, Programming language, Physics

Metrics

Cited By: 4
FWCI (Field-Weighted Citation Impact): 0.00
References: 0

Topics

Gaussian Processes and Bayesian Inference (Physical Sciences → Computer Science → Artificial Intelligence)
Target Tracking and Data Fusion in Sensor Networks (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

Deterministic approximate inference techniques for conditionally Gaussian state space models

Onno Zoeter, Tom Heskes

Journal: Statistics and Computing, Year: 2006, Vol: 16 (3), Pages: 279-292
JOURNAL ARTICLE

Efficient Bayesian inference for Gaussian copula regression models

M. Pitt, David Chan, Robert Kohn

Journal: Biometrika, Year: 2006, Vol: 93 (3), Pages: 537-554
JOURNAL ARTICLE

Approximate Bayesian inference for hierarchical Gaussian Markov random field models

Håvard Rue, Sara Martino

Journal: Journal of Statistical Planning and Inference, Year: 2007, Vol: 137 (10), Pages: 3177-3192
JOURNAL ARTICLE

Fully Bayesian Inference for Latent Variable Gaussian Process Models

Suraj Yerramilli, Akshay Iyer, Wei Chen, Daniel W. Apley

Journal: SIAM/ASA Journal on Uncertainty Quantification, Year: 2023, Vol: 11 (4), Pages: 1357-1381
JOURNAL ARTICLE

Improving the INLA approach for approximate Bayesian inference for latent Gaussian models

Egil Ferkingstad, Håvard Rue

Journal: Electronic Journal of Statistics, Year: 2015, Vol: 9 (2)