Accelerated Gibbs sampling of normal distributions using matrix splittings and polynomials

Date

2017-11

Abstract

Standard Gibbs sampling applied to a multivariate normal distribution with a specified precision matrix is equivalent in fundamental ways to the Gauss-Seidel iterative solution of linear equations in the precision matrix. Specifically, the iteration operators, the conditions under which convergence occurs, and geometric convergence factors (and rates) are identical. These results hold for arbitrary matrix splittings from classical iterative methods in numerical linear algebra, giving easy access to mature results in that field, including existing convergence results for antithetic-variable Gibbs sampling, REGS sampling, and generalizations. Hence, efficient deterministic stationary relaxation schemes lead to efficient generalizations of Gibbs sampling. The technique of polynomial acceleration that significantly improves the convergence rate of an iterative solver derived from a symmetric matrix splitting may be applied to accelerate the equivalent generalized Gibbs sampler. Identicality of error polynomials guarantees convergence of the inhomogeneous Markov chain, while equality of convergence factors ensures that the optimal solver leads to the optimal sampler. Numerical examples are presented, including a Chebyshev-accelerated SSOR Gibbs sampler applied to a stylized demonstration of low-level Bayesian image reconstruction in a large 3-dimensional linear inverse problem.
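
To illustrate the matrix-splitting view of Gibbs sampling described above, the following is a minimal Python sketch (not the authors' code) of a sampler targeting N(0, A^{-1}) for a symmetric positive-definite precision matrix A. It uses the Gauss-Seidel splitting A = M - N with M the lower triangle of A, for which the required noise covariance M^T + N reduces to the diagonal of A, recovering standard component-wise Gibbs sampling. The function name split_gibbs, the sweep count, and the empirical-covariance check are illustrative assumptions, not taken from the paper.

import numpy as np

def split_gibbs(A, n_sweeps=200, rng=None):
    # Sketch of a matrix-splitting Gibbs sampler for N(0, inv(A)).
    # Gauss-Seidel splitting: M = lower triangle of A (incl. diagonal), N = M - A.
    # One sweep: y <- M^{-1} (N y + c), with c ~ N(0, M^T + N); for this
    # splitting M^T + N equals diag(A), so the noise is cheap to draw.
    rng = np.random.default_rng() if rng is None else rng
    n = A.shape[0]
    M = np.tril(A)
    N = M - A
    noise_std = np.sqrt(np.diag(A))        # sqrt of diag(A) = sqrt of M^T + N here
    y = np.zeros(n)
    for _ in range(n_sweeps):
        c = noise_std * rng.standard_normal(n)
        y = np.linalg.solve(M, N @ y + c)  # M is lower triangular; a triangular
                                           # solve would be used in practice
    return y

# Illustrative check: the empirical covariance of many draws should approach inv(A).
if __name__ == "__main__":
    A = np.array([[2.0, -1.0, 0.0],
                  [-1.0, 2.0, -1.0],
                  [0.0, -1.0, 2.0]])       # small SPD tridiagonal precision matrix
    samples = np.stack([split_gibbs(A) for _ in range(5000)])
    print(np.cov(samples, rowvar=False))   # compare with np.linalg.inv(A)

Other splittings from the paper (e.g. SOR or SSOR) fit the same template by changing M and N and drawing the noise with covariance M^T + N; polynomial (Chebyshev) acceleration additionally varies the iteration parameters over sweeps.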

Citation

Fox C, Parker AE, "Accelerated Gibbs sampling of normal distributions using matrix splittings and polynomials," Bernoulli; 2017 Nov; 23(4B):3711–3743. doi: 10.3150/16-bej863
