3:30 PM – 4:30 PM
Gibbs samplers are a class of Markov chain Monte Carlo (MCMC) algorithms commonly used in statistics for sampling from intractable probability distributions. In this talk, I will demonstrate how Halmos's (1969) theory of two projections can be applied to study Gibbs samplers with two components. I will first give an introduction to MCMC algorithms, with a focus on Gibbs samplers. Then, I will explain how problems regarding the asymptotic variance and convergence rate of a two-component Gibbs sampler can be translated into simple linear-algebraic problems through Halmos's theory. In particular, I will compare the deterministic-scan and random-scan versions of the two-component Gibbs sampler. It is found that, in terms of asymptotic variance, the random-scan version is more robust than the deterministic-scan version, provided that the selection probability is appropriately chosen. On the other hand, the deterministic-scan version always has a faster convergence rate.
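To make the deterministic-scan versus random-scan distinction concrete, here is a minimal sketch of both schemes for a two-component Gibbs sampler. The bivariate normal target (with correlation `rho`), the function names, and the parameter `p` (the selection probability for the first component) are illustrative assumptions, not taken from the talk; the abstract itself does not specify a particular target distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target (assumed, not from the talk): a standard bivariate normal
# with correlation rho, whose full conditionals X|Y and Y|X are univariate normals.
rho = 0.9

def sample_x_given_y(y):
    return rng.normal(rho * y, np.sqrt(1 - rho**2))

def sample_y_given_x(x):
    return rng.normal(rho * x, np.sqrt(1 - rho**2))

def deterministic_scan(n_iter, x0=0.0, y0=0.0):
    """Deterministic-scan Gibbs: update both components in a fixed order each iteration."""
    x, y = x0, y0
    chain = np.empty((n_iter, 2))
    for t in range(n_iter):
        x = sample_x_given_y(y)
        y = sample_y_given_x(x)
        chain[t] = x, y
    return chain

def random_scan(n_iter, p=0.5, x0=0.0, y0=0.0):
    """Random-scan Gibbs: update one randomly chosen component per iteration,
    choosing the first component with selection probability p."""
    x, y = x0, y0
    chain = np.empty((n_iter, 2))
    for t in range(n_iter):
        if rng.random() < p:
            x = sample_x_given_y(y)
        else:
            y = sample_y_given_x(x)
        chain[t] = x, y
    return chain

# Usage: run both chains and compare, e.g., autocorrelations of a function of interest.
det_chain = deterministic_scan(10_000)
ran_chain = random_scan(10_000, p=0.5)
```

The sketch only illustrates the two update schemes; the asymptotic-variance and convergence-rate comparison via Halmos's two-projections theory is the subject of the talk itself.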