Approximate Bayesian computation based on pseudo-prior adjustment and its adhibition in bioscience

Gan Liu, Yongzhen Pei, Changguo Li


Approximate Bayesian Computation (ABC) is a powerful tool for likelihood-free inference. ABC variants of Markov chain Monte Carlo and sequential Monte Carlo are effective techniques for drawing posterior samples. However, without careful attention to convergence criteria and the choice of proposal kernels, these methods can lead to inefficient sampling or large biases in statistical inference. By contrast, ABC rejection sampling, although computationally inefficient, yields independent and identically distributed samples from the approximate posterior. To combine the advantages of these methods, an alternative approach is proposed that accelerates likelihood-free Bayesian inference by replacing the prior in the ABC rejection algorithm with a pseudo-prior and weighting each accepted sample point, where the prior is constructed from historical information and experience and the pseudo-prior is a different distribution chosen to improve sampling efficiency. The weighted samples are treated as draws from the target distribution. In our method, a suitable choice of pseudo-prior not only greatly improves the efficiency of the algorithm but also retains the accuracy advantages of rejection sampling. The approach is illustrated with parameter estimation problems in bioscience.
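The scheme described in the abstract can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: the Gaussian toy model, the specific prior N(0, 25), the pseudo-prior N(2, 1), the tolerance, and the sample-mean summary statistic are all assumptions chosen to show the mechanics of sampling from a pseudo-prior, accepting by distance to the observed summary, and reweighting accepted draws by the prior-to-pseudo-prior density ratio.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setup: observed data from N(theta_true, 1); summary = sample mean.
theta_true = 2.0
y_obs = rng.normal(theta_true, 1.0, size=50)
s_obs = y_obs.mean()

def prior_pdf(theta):
    # Prior pi(theta): N(0, 5^2), a wide distribution reflecting vague prior knowledge.
    return np.exp(-theta**2 / (2 * 25.0)) / np.sqrt(2 * np.pi * 25.0)

def pseudo_prior_sample():
    # Pseudo-prior q(theta): N(2, 1), concentrated where the posterior mass is
    # expected, so proposals are rejected far less often than prior draws.
    return rng.normal(2.0, 1.0)

def pseudo_prior_pdf(theta):
    return np.exp(-(theta - 2.0)**2 / 2.0) / np.sqrt(2 * np.pi)

eps = 0.05                      # ABC tolerance on the summary-statistic distance
accepted, weights = [], []
while len(accepted) < 1000:
    theta = pseudo_prior_sample()                    # draw from pseudo-prior
    s_sim = rng.normal(theta, 1.0, size=50).mean()   # simulate pseudo-data summary
    if abs(s_sim - s_obs) <= eps:                    # ABC rejection step
        accepted.append(theta)
        # Importance weight corrects for sampling from q instead of pi.
        weights.append(prior_pdf(theta) / pseudo_prior_pdf(theta))

accepted = np.array(accepted)
weights = np.array(weights)
post_mean = np.sum(weights * accepted) / np.sum(weights)
print(f"weighted posterior mean estimate: {post_mean:.3f}")
```

Because the pseudo-prior concentrates proposals in the high-posterior region, the acceptance rate is far higher than when drawing from the wide prior, while the weights restore the correct target: the weighted sample approximates the same ABC posterior that plain rejection sampling from the prior would produce.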


Published: 2020-03-16

How to Cite this Article:

Gan Liu, Yongzhen Pei, Changguo Li, Approximate Bayesian computation based on pseudo-prior adjustment and its adhibition in bioscience, Commun. Math. Biol. Neurosci., 2020 (2020), Article ID 11

Copyright © 2020 Gan Liu, Yongzhen Pei, Changguo Li. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Commun. Math. Biol. Neurosci.

ISSN 2052-2541
