Class Session 08: Coupling from the past and Heat-bath algorithm for the Ising model

The big problem in a Monte Carlo simulation is to determine the convergence time, or at least to estimate it correctly. In many applications, this is a complex, controversial problem, often a nightmare. Here we study perfect simulations, which completely solve this problem whenever they are applicable. After examining the simple case of a random walk on a finite interval, we show that the coupling-from-the-past method can be applied to the heat-bath algorithm for the Ising model.

Forward Coupling of Markov chains

Random maps


Consider a particle diffusing on sites 1 to 5, with reflecting boundary conditions. The transition probabilities are as follows: on site 1 (resp. site 5), the particle stays on site 1 (resp. site 5) with probability 2/3 and jumps to site 2 (resp. site 4) with probability 1/3. On sites 2, 3, 4, the three events of staying on site and jumping to either neighbor each occur with probability 1/3. Is detailed balance satisfied, and with respect to which probability distribution?
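As a quick numerical check of this question (a minimal sketch, not part of the original exercise; matrix indices 0..4 stand for sites 1..5), one may build the 5 × 5 transition matrix and test detailed balance against the uniform distribution:

import numpy

N = 5
P = numpy.zeros((N, N))
for k in range(N):
    for move in (-1, 0, 1):                       # down, straight, up, each with probability 1/3
        target = min(max(k + move, 0), N - 1)     # reflecting boundaries: a blocked move stays put
        P[k, target] += 1.0 / 3.0

pi = numpy.ones(N) / N                            # candidate stationary distribution: uniform
flow = pi[:, None] * P                            # flow[i, j] = pi_i p(i -> j)
print(numpy.allclose(P.sum(axis=1), 1.0))         # each row sums to one
print(numpy.allclose(flow, flow.T))               # detailed balance: pi_i p(i->j) = pi_j p(j->i)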

To understand how convergence occurs at large times, irrespective of the initial condition, consider a complete Monte Carlo simulation (see the figure below, taken from [1]) in which a random map prescribes the motion (up, down, straight) of a particle on a lattice of N = 5 sites, for every position of the particle at each time step.

One_d_diffusion.jpg
A simulation of a particle moving on the 1D lattice of 5 sites, using random maps.
The arrows are drawn in correspondence with the particle's jump transition probabilities: on site 1 (site 5), arrows go straight with probability 2/3 and up (down) with probability 1/3. On sites 2, 3, 4, arrows go in each of the three possible directions with probability 1/3.
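For illustration (a minimal sketch, not the page's reference code), one time slice of such a random map, with one arrow per site and the boundary arrows bent back onto the interval, can be drawn as follows:

import random

N = 5
arrows = [random.randint(-1, 1) for k in range(N)]   # -1: down, 0: straight, +1: up, probability 1/3 each
if arrows[0] == -1: arrows[0] = 0                    # site 1: a down arrow becomes a straight arrow
if arrows[N - 1] == 1: arrows[N - 1] = 0             # site 5: an up arrow becomes a straight arrow
print(arrows)
print([k + arrows[k] for k in range(N)])             # where each of the five positions is mapped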

Computation of coupling time (example)

Use an (electronic) pencil to determine the coupling time tcoup, after which the Monte Carlo simulation (as realized in the above figure) no longer depends on the initial condition.

Probability distribution of the coupling time

The coupling time tcoup is not a fixed number, but a random variable, which depends on the way we have drawn the arrows in the figure. Show (without any calculation) that the distribution π(tcoup) of tcoup decays at least exponentially for large coupling times.

Exponential decay of coupling-time distribution

Use the reasoning of the previous question to show that the distribution decays exactly exponentially (without any calculation).
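Although the exercise asks for no calculation, the decay can be visualized numerically. The sketch below (an illustration with an arbitrarily chosen sample size, using the same pylab plotting as the programs further down) histograms the forward coupling time for the five-site random maps; on a logarithmic vertical scale the tail appears as a straight line:

import random, pylab

N = 5
t_coup = []
for sample in range(10000):
    positions = set(range(N))
    t = 0
    while len(positions) > 1:                          # draw random maps until all branches merge
        t += 1
        arrows = [random.randint(-1, 1) for k in range(N)]
        if arrows[0] == -1: arrows[0] = 0
        if arrows[N - 1] == 1: arrows[N - 1] = 0
        positions = set(k + arrows[k] for k in positions)
    t_coup.append(t)
pylab.hist(t_coup, bins=50)
pylab.yscale('log')
pylab.xlabel('coupling time')
pylab.ylabel('counts')
pylab.show()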

Bias in forward coupling

Show (use the algorithm given below) that the position at which the coupling takes place (from 1 to 5) is not uniformly distributed. Comment.

Backward Coupling of Markov chains

Coupling from the past: perfect sampling

Below, please find an infinite Monte Carlo calculation (not all arrows have been drawn). Use pencil and eraser to find the position at t = 0 for any of the initial conditions at t = -∞. Show that this position is uniformly distributed (if we average over the ways of drawing the arrows).

One_d_cftp.jpg
A simulation on a one-dimensional lattice that has run for more than one billion years.

The coupling-from-the-past method to sample a site is as follows: starting from time 0, go back into the past until the random maps have coupled. The (unique) position of the coupled random map at time t = 0 is distributed according to the steady-state distribution. Explain why.

Algorithms

Forward coupling

We consider below the forward coupling for the generic N-site problem depicted above. The jump probabilities are

p(k → k-1) = p(k → k) = p(k → k+1) = 1/3   for 1 < k < N,

except at the boundaries, where

p(1 → 1) = p(N → N) = 2/3   and   p(1 → 2) = p(N → N-1) = 1/3.

This choice implements reflecting boundary conditions. Since the rates are symmetric,

p(k → k+1) = p(k+1 → k),

detailed balance is satisfied with the uniform probability distribution

πk = 1/N.
To obtain a histogram of the site occupation at coupling time, one may run the following Python program:
import random, pylab
 
N = 8
pos = []
for statistic in range(5000):
    positions = set(range(N))                      # all possible initial positions
    while True:
        arrows = [random.randint(-1, 1) for i in range(N)]
        if arrows[0] == -1: arrows[0] = 0          # reflecting boundary at the first site
        if arrows[N - 1] == 1: arrows[N - 1] = 0   # reflecting boundary at the last site
        positions = set(b + arrows[b] for b in positions)
        if len(positions) == 1: break              # all branches have coupled
    pos.append(positions.pop())
pylab.title('Forward coupling: distribution of the coupled configuration')
pylab.hist(pos, bins = N, range = (-0.5, N - 0.5), normed = True)
pylab.xlabel('Position')
pylab.ylabel('Histogram')
pylab.show()
One checks that the histogram is not flat: forward coupling does not yield the uniform steady state πk = 1/N.

Forward_coupling_histogram.png


Exact sampling from backward coupling

We consider the same system, now using the backward sampling method: starting from t = 0, we go backwards in time, constructing the graph starting from decreasing initial times t = -1, t = -2, ... , and waiting until the graph at t = 0 contains only one branch.
Note that, in contrast to the forward coupling method, the coupling may have occurred at any negative time t < 0, and not at the observation time t = 0.
To histogram the site occupation at t = 0, one may run the following Python program:
import random, pylab
 
 
N = 8
pos = []
times_tot = []
for statistic in range(5000):
    all_arrows = {}
    time_tot = 0
    while True:
        time_tot -= 1
        arrows = [random.randint(-1, 1) for i in range(N)]
        if arrows[0] == -1: arrows[0] = 0
        if arrows[N-1] == 1: arrows[N-1] = 0
        all_arrows[time_tot] = arrows
        positions = set(range(N))                  # all possible positions at the starting time
        for t in range(time_tot, 0):               # propagate every branch forward to t = 0
            positions = set(b + all_arrows[t][b] for b in positions)
        if len(positions) == 1: break              # a single branch remains at t = 0
    times_tot.append(-time_tot) # here we store the coupling times
    pos.append(positions.pop())
pylab.title('Backward coupling for a particle diffusing on [1,N]: histogram of the position at t=0')
pylab.hist(pos, bins = N, range = (-0.5, N - 0.5), normed = True)
pylab.xlabel('Position')
pylab.ylabel('Histogram')
pylab.savefig('backward_coupling_position.png')
pylab.show()
This coupling-from-the-past method for sampling is perfect: configurations at time t = 0 are now exactly distributed according to the uniform steady state distribution:
Backward_coupling_histogram.png

Application to the Ising model with the heat-bath algorithm


We discuss a simple algorithm for the Ising model at equilibrium with a bath at inverse temperature β (see Section 5.2.2 in SMAC). The system is described by a set of N spins {σ1,...,σN}, each of them taking the value +1 or -1. The statistical weight of a configuration is proportional to the Boltzmann weight exp(-βE) of its energy

E = - Σ<k,l> σk σl,

where the sum runs over pairs of neighboring sites.

The heat bath dynamics

During the lecture, we discussed the Metropolis algorithm for this system. Rather than flipping a spin at a random site, we now thermalize this spin with its local environment: the state of the spin after the update does not depend on its state before the update, but only on its molecular field h (defined as the sum of the spins of its neighbors).
Show that, in the presence of a molecular field h at site k, a spin points up or down with the probabilities

π+(h) = 1 / (1 + exp(-2βh)),    π-(h) = 1 - π+(h) = 1 / (1 + exp(+2βh)).
These define the transition probabilities at each step of the heat-bath algorithm (for a uniformly chosen spin to become up or down, respectively, irrespective of its initial state). Show that detailed balance is satisfied with respect to the Boltzmann-Gibbs weight at inverse temperature β. The following program implements this algorithm in two dimensions, on a square lattice of linear size L with periodic boundary conditions:
import random, math, pylab, numpy
L = 100
Nstep = 100000
beta = 0.3
neighbors = [(0,-1),(0,1),(-1,0),(1,0)]
#initialisation
s = [[ 1 for i in range(L)] for j in range(L)]
energy = -2. * L**2
toten = 0.
#evolution
for step in range(Nstep):
    kx = random.randrange(L)
    ky = random.randrange(L)
    h = sum( s[(kx+dx)%L][(ky+dy)%L] for (dx,dy) in neighbors)
    olds = s[kx][ky]
    Upsilon = random.random()
    s[kx][ky] = -1
    if Upsilon < 1./(1+math.exp(-2*beta*h)): s[kx][ky] = 1   # heat-bath rule: spin up with probability pi_+(h)
    if olds != s[kx][ky]: energy -= 2*h*s[kx][ky]            # update the energy only if the spin actually flipped
    toten += energy
print(toten / Nstep / L**2.)    # average energy per spin
print(sum(sum(s[i][j] for i in range(L)) for j in range(L)) / L**2.)    # average magnetization per spin
pylab.matshow(numpy.array(s))
pylab.show()
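As a sanity check of the detailed-balance question above (a minimal numerical sketch, with arbitrarily chosen values of β and h, not part of the original page), one can compare the probability flows between the up and down states of a single spin in a molecular field h, weighted by their Boltzmann factors exp(±βh):

import math

beta, h = 0.3, 2                                        # arbitrary inverse temperature and molecular field
p_up = 1. / (1. + math.exp(-2. * beta * h))             # heat-bath probability to end up ...
p_down = 1. - p_up                                      # ... or down, independently of the old state
w_up, w_down = math.exp(beta * h), math.exp(-beta * h)  # Boltzmann weights of the two spin states
print(abs(w_up * p_down - w_down * p_up) < 1.e-12)      # detailed balance: flow up->down equals flow down->up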

Half order and coupling algorithms


Half_order_in_the_Ising_model.png
Figure 1. Half order in the Ising model.


Let us first introduce the concept of half order for the configurations of the Ising model (see figure 1). We define that, for a given site, an up spin is larger than a down spin, and that a whole configuration {σ1,...,σN} of spins is larger than another configuration {σ'1,...,σ'N} if the spins at every site k satisfy σk ≥ σ'k. This defines a half order among the configurations (half, because not all pairs of configurations can be compared).
  • Show that the half order is preserved at each iteration of the heat-bath algorithm (a numerical check is sketched after this list).
  • In the forward coupling, show that it is sufficient to focus on the coupling of two particular configurations. Which ones?
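The following random test (a minimal sketch, not part of the original page; the lattice size, the temperature and the number of trials are arbitrary) applies the same heat-bath update, with the same site and the same random number Upsilon, to two ordered configurations and checks that the half order survives:

import random, math

L, beta = 6, 0.4
neighbors = [(0, -1), (0, 1), (-1, 0), (1, 0)]
ssmall = [[random.choice([-1, 1]) for i in range(L)] for j in range(L)]
slarge = [[max(ssmall[j][i], random.choice([-1, 1])) for i in range(L)] for j in range(L)]   # slarge >= ssmall on every site
ordered = True
for trial in range(10000):
    kx, ky = random.randrange(L), random.randrange(L)
    Upsilon = random.random()
    for s in (slarge, ssmall):                         # identical update applied to both configurations
        h = sum(s[(kx + dx) % L][(ky + dy) % L] for (dx, dy) in neighbors)
        s[kx][ky] = -1
        if Upsilon < 1. / (1. + math.exp(-2. * beta * h)): s[kx][ky] = 1
    ordered = ordered and all(slarge[j][i] >= ssmall[j][i] for i in range(L) for j in range(L))
print(ordered)                                         # stays True: the heat-bath update preserves the half order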

The same remark applies to the backward coupling, and allows one to draw a perfect sample. Here is an implementation:
import random, math, pylab, numpy

L = 10
beta = 0.01
neighbors = [(0, -1), (0, 1), (-1, 0), (1, 0)]
# initialisation: the two extremal configurations
splus0 = [[1 for i in range(L)] for j in range(L)]       # all spins up
sminus0 = [[-1 for i in range(L)] for j in range(L)]     # all spins down
splus, sminus = [row[:] for row in splus0], [row[:] for row in sminus0]
changes = []
# evolution: extend the random map further into the past until both copies couple at t = 0
while splus != sminus:
    splus, sminus = [row[:] for row in splus0], [row[:] for row in sminus0]   # restart from the extremal configurations (copy row by row; splus0[:] alone would share the rows)
    kx = random.randrange(L)
    ky = random.randrange(L)
    Upsilon = random.random()
    changes.insert(0, [kx, ky, Upsilon])          # prepend: the new update acts at the earliest time
    for [kx, ky, Upsilon] in changes:             # replay all stored updates up to t = 0
        for s in (splus, sminus):
            h = sum(s[(kx + dx) % L][(ky + dy) % L] for (dx, dy) in neighbors)
            s[kx][ky] = -1
            if Upsilon < 1. / (1 + math.exp(-2 * beta * h)): s[kx][ky] = 1
print("here is a perfect sample", splus)
print("steps to find it", len(changes))
pylab.matshow(numpy.array(splus))
pylab.show()
What is the advantage of this method compared to the naive implementation of the backward coupling?

  1. C. Chanal, W. Krauth, Convergence and coupling for spin glasses and hard spheres, Physical Review E 81, 016705 (2010).