Browsing by Author "Fox, Colin"
Item: Accelerated Gibbs sampling of normal distributions using matrix splittings and polynomials (2017-11)
Fox, Colin; Parker, Albert E.
Standard Gibbs sampling applied to a multivariate normal distribution with a specified precision matrix is equivalent in fundamental ways to the Gauss-Seidel iterative solution of linear equations in the precision matrix. Specifically, the iteration operators, the conditions under which convergence occurs, and the geometric convergence factors (and rates) are identical. These results hold for arbitrary matrix splittings from classical iterative methods in numerical linear algebra, giving easy access to mature results in that field, including existing convergence results for antithetic-variable Gibbs sampling, REGS sampling, and generalizations. Hence, efficient deterministic stationary relaxation schemes lead to efficient generalizations of Gibbs sampling. The technique of polynomial acceleration that significantly improves the convergence rate of an iterative solver derived from a symmetric matrix splitting may be applied to accelerate the equivalent generalized Gibbs sampler. Equality of the error polynomials guarantees convergence of the inhomogeneous Markov chain, while equality of convergence factors ensures that the optimal solver leads to the optimal sampler. Numerical examples are presented, including a Chebyshev accelerated SSOR Gibbs sampler applied to a stylized demonstration of low-level Bayesian image reconstruction in a large 3-dimensional linear inverse problem.

Item: Convergence in variance of Chebyshev accelerated Gibbs samplers (2014-02)
Fox, Colin; Parker, Albert E.
A stochastic version of a stationary linear iterative solver may be designed to converge in distribution to a probability distribution with a specified mean $\mu$ and covariance matrix $A^{-1}$. A common example is Gibbs sampling applied to a multivariate Gaussian distribution, which is a stochastic version of the Gauss-Seidel linear solver. The iteration operator that acts on the error in mean and covariance in the stochastic iteration is the same iteration operator that acts on the solution error in the linear solver, and thus both the stationary sampler and the stationary solver have the same error polynomial and geometric convergence rate. The polynomial acceleration techniques that are well known in numerical analysis for accelerating the linear solver may also be used to accelerate the stochastic iteration. We derive first-order and second-order Chebyshev polynomial acceleration for the stochastic iteration to accelerate convergence in the mean and covariance by mimicking the derivation for the linear solver. In particular, we show that the error polynomials are identical and hence so are the convergence rates. Thus, optimality of the Chebyshev accelerated solver implies optimality of the Chebyshev accelerated sampler. We give an algorithm for the stochastic version of the second-order Chebyshev accelerated SSOR (symmetric successive overrelaxation) iteration and provide numerical examples of sampling from multivariate Gaussian distributions to confirm that the desired convergence properties are achieved in finite precision.
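As a concrete illustration of the solver/sampler equivalence described in the first abstract above: for a splitting A = M - N of the precision matrix, the stochastic iteration x <- M^{-1}(N x + c) with noise c ~ N(0, M^T + N) converges in distribution to N(0, A^{-1}). With the Gauss-Seidel splitting M = D + L, the noise covariance M^T + N reduces to the diagonal D, recovering component-wise Gibbs sampling. The sketch below is a minimal dense-matrix illustration, not the authors' code; the function name `splitting_gibbs_sweep` is ours.

```python
import numpy as np
from scipy.linalg import solve_triangular

def splitting_gibbs_sweep(A, x, rng):
    # One sweep of the stochastic iteration x <- M^{-1}(N x + c),
    # c ~ N(0, M^T + N), for the Gauss-Seidel splitting A = M - N
    # with M = D + L.  Here M^T + N = D, so the noise is diagonal and
    # the sweep reproduces component-wise Gibbs sampling of N(0, A^{-1}).
    M = np.tril(A)                          # lower-triangular part, D + L
    N = M - A                               # -U, negated strictly upper part
    c = np.sqrt(np.diag(A)) * rng.standard_normal(len(x))
    return solve_triangular(M, N @ x + c, lower=True)

rng = np.random.default_rng(0)
A = np.array([[2.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 2.0]])            # toy SPD precision matrix
x = np.zeros(3)
draws = []
for k in range(20000):
    x = splitting_gibbs_sweep(A, x, rng)
    if k >= 1000:                           # discard burn-in
        draws.append(x.copy())
print(np.cov(np.array(draws).T))            # approaches A^{-1}
print(np.linalg.inv(A))
```

Running the loop longer drives the sample covariance toward $A^{-1}$ at the geometric rate of the Gauss-Seidel solver, which is the equivalence the paper exploits.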
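Both abstracts above rest on the classical second-order Chebyshev recursion for a symmetric splitting. The following sketch shows the deterministic (solver) side of that recursion with an SSOR preconditioner, assuming bounds lam_min and lam_max on the eigenvalues of M^{-1}A are available; the accelerated sampler of the 2014 paper follows the same recursion with suitably scaled noise injected at each step (those scalings are derived in the paper and not reproduced here). Function names are illustrative.

```python
import numpy as np

def ssor_apply(A, r, omega=1.0):
    # z = M^{-1} r for the SSOR splitting matrix
    # M = (1/(omega*(2-omega))) (D/omega + L) (D/omega)^{-1} (D/omega + L)^T,
    # where A = L + D + L^T is symmetric positive definite.
    D = np.diag(np.diag(A))
    M1 = D / omega + np.tril(A, -1)
    y = np.linalg.solve(M1, r)                    # forward sweep
    z = np.linalg.solve(M1.T, (D / omega) @ y)    # backward sweep
    return omega * (2.0 - omega) * z

def chebyshev_ssor_solve(A, b, lam_min, lam_max, iters=30):
    # Second-order Chebyshev acceleration of the SSOR-preconditioned
    # iteration; lam_min, lam_max bound the eigenvalues of M^{-1} A.
    theta = 0.5 * (lam_max + lam_min)             # spectrum midpoint
    delta = 0.5 * (lam_max - lam_min)             # spectrum half-width
    sigma = theta / delta
    rho = 1.0 / sigma
    x = np.zeros_like(b)
    r = b - A @ x
    d = ssor_apply(A, r) / theta
    for _ in range(iters):
        x = x + d
        r = r - A @ d
        rho_next = 1.0 / (2.0 * sigma - rho)
        d = rho_next * rho * d + (2.0 * rho_next / delta) * ssor_apply(A, r)
        rho = rho_next
    return x

# Toy usage: estimate the eigenvalue bounds densely (illustration only).
A = np.array([[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 2.0]])
b = np.ones(3)
lams = np.linalg.eigvals(np.column_stack([ssor_apply(A, c) for c in A.T])).real
x = chebyshev_ssor_solve(A, b, lams.min(), lams.max())
print(np.allclose(x, np.linalg.solve(A, b)))      # True
```

Because the error polynomials of solver and sampler are identical, the convergence rate printed for this solver is also the rate at which the corresponding accelerated sampler converges in mean and covariance.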
Item: Sampling Gaussian distributions in Krylov spaces with conjugate gradients (2012-06)
Parker, Albert E.; Fox, Colin
This paper introduces a conjugate gradient sampler that is a simple extension of the method of conjugate gradients (CG) for solving linear systems. The CG sampler iteratively generates samples from a Gaussian probability density, using either a symmetric positive definite covariance or precision matrix, whichever is more convenient to model. Similar to how the Lanczos method solves an eigenvalue problem, the CG sampler approximates the covariance or precision matrix in a low-dimensional Krylov space. As with any iterative method, the CG sampler is efficient for high-dimensional problems where forming the covariance or precision matrix is impractical but applying it to a vector is feasible. In exact arithmetic, the sampler generates Gaussian samples whose realized covariance converges to the covariance of interest. In finite precision, the sampler produces a Gaussian sample whose realized covariance is the best approximation to the desired covariance in the lower-dimensional Krylov space. The paper gives an analysis of the sampler and presents examples showing the usefulness and limitations of the Krylov approximations.
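A minimal sketch of the mechanism behind the CG sampler, built on the standard CG recursion: run to completion, the search directions p_i are A-conjugate and satisfy A^{-1} = sum_i p_i p_i^T / (p_i^T A p_i), so injecting an independent standard normal coefficient along each direction builds a sample whose realized covariance is the k-dimensional Krylov-space approximation to A^{-1} described above. The interface below is ours, not the paper's.

```python
import numpy as np

def cg_sampler(A, b, k, rng):
    # Standard CG on A x = b, augmented with one extra vector y:
    # the search directions p are A-conjugate, and a full set satisfies
    # A^{-1} = sum_i p_i p_i^T / (p_i^T A p_i).  Injecting an independent
    # N(0, 1) coefficient along each direction therefore builds a sample
    # y whose realized covariance is the k-dimensional Krylov-space
    # approximation to A^{-1}.
    x = np.zeros_like(b)
    y = np.zeros_like(b)                   # the Gaussian sample being built
    r = b - A @ x
    p = r.copy()
    for _ in range(k):
        Ap = A @ p
        d = p @ Ap                         # conjugacy scale p^T A p
        alpha = (r @ r) / d
        x = x + alpha * p                  # usual CG solution update
        y = y + (rng.standard_normal() / np.sqrt(d)) * p   # sampler update
        r_next = r - alpha * Ap
        p = r_next + ((r_next @ r_next) / (r @ r)) * p
        r = r_next
    return x, y

rng = np.random.default_rng(1)
n = 50
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # toy SPD precision
x, y = cg_sampler(A, rng.standard_normal(n), k=25, rng=rng)
```

The right-hand side b only determines the Krylov space that is explored; k should stay below the iteration at which CG converges, since the residual norm appears in a denominator.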