# Metropolis-Hastings Algorithms

CPSC 535: Metropolis-Hastings (March 2007)

Topics: Markov chain Monte Carlo, Gibbs sampling, Metropolis-Hastings algorithm

Gibbs Sampler

Initialization: select, deterministically or randomly, $\theta^{(0)} = \left(\theta_1^{(0)}, \ldots, \theta_p^{(0)}\right)$.

Iteration $i$, $i \geq 1$: for $k = 1:p$, sample

$$\theta_k^{(i)} \sim \pi\left(\theta_k \mid \theta_{-k}^{(i)}\right), \quad \text{where } \theta_{-k}^{(i)} = \left(\theta_1^{(i)}, \ldots, \theta_{k-1}^{(i)}, \theta_{k+1}^{(i-1)}, \ldots, \theta_p^{(i-1)}\right).$$
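As a concrete illustration of the sweep above, here is a minimal Gibbs sampler for the two-dimensional case ($p = 2$) with a bivariate normal target; the zero-mean, unit-variance target and the correlation parameter `rho` are assumptions of this sketch, not taken from the slides.

```python
# Gibbs sampler for a bivariate normal target with correlation rho
# (an illustrative p = 2 instance of the general algorithm).
import math
import random

def gibbs_bivariate_normal(rho, n_iter, seed=0):
    """Sample (theta1, theta2) ~ N(0, [[1, rho], [rho, 1]]) by Gibbs sampling.

    Each full conditional is Gaussian:
        theta1 | theta2 ~ N(rho * theta2, 1 - rho**2)
        theta2 | theta1 ~ N(rho * theta1, 1 - rho**2)
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho ** 2)   # conditional standard deviation
    theta1, theta2 = 0.0, 0.0        # deterministic initialization theta^(0)
    samples = []
    for _ in range(n_iter):
        # Sweep over the components k = 1:p, always conditioning on the
        # most recently updated values of the other components.
        theta1 = rng.gauss(rho * theta2, sd)
        theta2 = rng.gauss(rho * theta1, sd)
        samples.append((theta1, theta2))
    return samples

chain = gibbs_bivariate_normal(rho=0.8, n_iter=20000)
# After burn-in, the empirical correlation of the chain approaches rho.
```

Note that each update conditions on the freshest available values: $\theta_2$ is drawn given the $\theta_1$ just sampled in the same sweep, exactly as in the indexing above.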


The Gibbs sampler requires sampling from the full conditional distributions $\pi(\theta_k \mid \theta_{-k})$. For many complex models, it is impossible to sample from several of these "full" conditional distributions. Even when the Gibbs sampler can be implemented, the algorithm might be very inefficient, because the variables are highly correlated or because sampling from the full conditionals is extremely expensive.

Metropolis-Hastings Algorithm

The Metropolis-Hastings algorithm is an alternative way to sample from a probability distribution $\pi(\theta)$ known only up to a normalizing constant. It can be interpreted as the basis of all MCMC algorithms: it provides a generic way to build a Markov kernel admitting $\pi(\theta)$ as an invariant distribution. The Metropolis algorithm was named one of the "top algorithms of the 20th century" by computer scientists, mathematicians, and physicists.


Introduce a proposal distribution/kernel $q(\theta, \theta')$, i.e.

$$\int q(\theta, \theta')\, d\theta' = 1 \quad \text{for any } \theta.$$

The basic idea of the MH algorithm is to propose a new candidate $\theta'$ based on the current state of the Markov chain $\theta$. We accept this candidate only with a probability $\alpha(\theta, \theta')$ chosen to ensure that the invariant distribution of the transition kernel is the target distribution $\pi(\theta)$.
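The accept step can be sketched as follows, working with log-densities for numerical stability; the standard-normal target and the Gaussian random-walk proposal are illustrative assumptions of this sketch, not choices made by the slides.

```python
import math

def log_target(theta):
    # Unnormalized log-density of the (assumed) target: a standard normal.
    return -0.5 * theta ** 2

def accept_prob(theta, theta_prime, log_q):
    """alpha(theta, theta') for the MH algorithm.

    log_q(a, b) is the log proposal density of moving from a to b;
    alpha = min(1, pi(theta') q(theta', theta) / (pi(theta) q(theta, theta'))).
    """
    log_ratio = (log_target(theta_prime) + log_q(theta_prime, theta)
                 - log_target(theta) - log_q(theta, theta_prime))
    return min(1.0, math.exp(min(0.0, log_ratio)))

# A symmetric Gaussian random-walk proposal: q(a, b) = q(b, a), so the
# proposal terms cancel and alpha reduces to min(1, pi(theta')/pi(theta)).
def log_q_rw(a, b, scale=1.0):
    return -0.5 * ((b - a) / scale) ** 2 - math.log(scale * math.sqrt(2 * math.pi))
```

With a symmetric proposal, a candidate with higher target density is always accepted, while a downhill move is accepted only with probability $\pi(\theta')/\pi(\theta)$.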


Initialization: select deterministically or randomly $\theta^{(0)}$.

Iteration $i$, $i \geq 1$: sample $\theta' \sim q\left(\theta^{(i-1)}, \cdot\right)$ and compute

$$\alpha\left(\theta^{(i-1)}, \theta'\right) = \min\left(1, \frac{\pi(\theta')\, q\left(\theta', \theta^{(i-1)}\right)}{\pi\left(\theta^{(i-1)}\right)\, q\left(\theta^{(i-1)}, \theta'\right)}\right).$$
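The iteration above can be sketched as a complete sampler; the symmetric Gaussian random-walk proposal (for which the $q$ terms cancel in $\alpha$) and the standard-normal target, known only up to a constant, are assumptions of this sketch.

```python
# Random-walk Metropolis-Hastings sampler (a minimal illustrative sketch).
import math
import random

def metropolis_hastings(log_target, n_iter, theta0=0.0, scale=1.0, seed=0):
    """Sample from an unnormalized log-density via random-walk MH."""
    rng = random.Random(seed)
    theta = theta0                          # initialization theta^(0)
    samples = []
    for _ in range(n_iter):
        # Propose theta' ~ q(theta^(i-1), .): a symmetric Gaussian random
        # walk, so q(theta', theta)/q(theta, theta') = 1 and alpha reduces
        # to min(1, pi(theta') / pi(theta)).
        theta_prime = theta + rng.gauss(0.0, scale)
        alpha = math.exp(min(0.0, log_target(theta_prime) - log_target(theta)))
        # Accept the candidate with probability alpha; otherwise the chain
        # stays at the current state.
        if rng.random() < alpha:
            theta = theta_prime
        samples.append(theta)
    return samples

# pi(theta) known only up to a normalizing constant: a standard normal.
chain = metropolis_hastings(lambda t: -0.5 * t * t, n_iter=50000)
```

Note that the target's normalizing constant never appears: it cancels in the ratio defining $\alpha$, which is precisely why MH only needs $\pi(\theta)$ up to a constant.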