JOURNAL ARTICLE

Towards end‐to‐end likelihood‐free inference with convolutional neural networks

Stefan T. Radev, Ulf K. Mertens, Andreas Voß, Ullrich Köthe

Year: 2019 | Journal: British Journal of Mathematical and Statistical Psychology | Vol: 73 (1) | Pages: 23-43 | Publisher: Wiley

Abstract

Complex simulator-based models with non-standard sampling distributions require sophisticated design choices for reliable approximate parameter inference. We introduce a fast, end-to-end approach for approximate Bayesian computation (ABC) based on fully convolutional neural networks. The method enables users of ABC to derive simultaneously the posterior mean and variance of multidimensional posterior distributions directly from raw simulated data. Once trained on simulated data, the convolutional neural network is able to map real data samples of variable size to the first two posterior moments of the relevant parameters' distributions. Thus, in contrast to other machine learning approaches to ABC, our approach allows us to generate reusable models that can be applied by different researchers employing the same model. We verify the utility of our method on two common statistical models (i.e., a multivariate normal distribution and a multiple regression scenario), for which the posterior parameter distributions can be derived analytically. We then apply our method to recover the parameters of the leaky competing accumulator (LCA) model and compare our results with the current state-of-the-art technique, probability density approximation (PDA). Results show that our method exhibits a lower approximation error compared with other machine learning approaches to ABC. It also performs similarly to PDA in recovering the parameters of the LCA model.
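The core idea described in the abstract, mapping raw samples of variable size to posterior moments via convolution and pooling rather than hand-crafted summary statistics, can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the authors' implementation: it uses fixed random 1-D "convolutional" kernels with ReLU and global average pooling to produce a fixed-length summary of a variable-length sample, then fits a linear head by least squares to predict the analytic posterior mean of a conjugate normal model (prior N(0, 1), likelihood N(theta, 1)), for which the true posterior mean is n·x̄/(n + 1).

```python
import numpy as np

rng = np.random.default_rng(42)
K, W = 8, 3
# +/- paired random kernels, so a linear head over pooled ReLU features
# can represent (approximately) the sample mean: ReLU(a) - ReLU(-a) = a
base = rng.normal(size=(K // 2, W))
kernels = np.vstack([base, -base])

def summarize(x):
    """Map a variable-length 1-D sample to a fixed-length learned summary."""
    wins = np.lib.stride_tricks.sliding_window_view(x, W)  # (n - W + 1, W)
    feats = np.maximum(wins @ kernels.T, 0.0)              # "conv" + ReLU
    return feats.mean(axis=0)                              # global avg pool

def simulate(n_datasets):
    """Simulate datasets of variable size; target is the analytic posterior mean."""
    X, y = [], []
    for _ in range(n_datasets):
        n = int(rng.integers(50, 200))
        theta = rng.normal()                          # prior N(0, 1)
        x = rng.normal(theta, 1.0, size=n)            # likelihood N(theta, 1)
        X.append(np.append(summarize(x), 1.0))        # bias feature
        y.append(n * x.mean() / (n + 1))              # conjugate posterior mean
    return np.array(X), np.array(y)

# "Train" the head on simulated data, then evaluate on fresh simulations
Xtr, ytr = simulate(2000)
coef, *_ = np.linalg.lstsq(Xtr, ytr, rcond=None)

Xte, yte = simulate(500)
rmse = np.sqrt(np.mean((Xte @ coef - yte) ** 2))
```

Because the pooled features are averages over sliding windows, the same summary function accepts samples of any length, which is the property the abstract highlights; the paper's actual network learns the kernels end-to-end and outputs both posterior mean and variance.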

Keywords:
Approximate Bayesian computation, computer science, posterior probability, convolutional neural network, inference, Bayesian probability, Bayesian linear regression, statistical inference, computation, algorithm, Bayesian inference, artificial intelligence, Bayesian experimental design, multivariate statistics, machine learning, statistics, mathematics

Metrics

Cited by: 41
FWCI (Field-Weighted Citation Impact): 5.88
References: 34
Citation Normalized Percentile: 0.96 (in top 10%)

Topics

Markov Chains and Monte Carlo Methods
Physical Sciences → Mathematics → Statistics and Probability
Gaussian Processes and Bayesian Inference
Physical Sciences → Computer Science → Artificial Intelligence
Statistical Methods and Bayesian Inference
Physical Sciences → Mathematics → Statistics and Probability
