DM 1: Markov Chains: the Monte Carlo Markov Chain and the Markov matrix. Don't hesitate to ask questions and make remarks on this wiki page.

This exercise will allow you to familiarize yourselves with Markov chains.

In Part 1 we go back to the heliport and discover the « 1/2 thumb rule », which is an empirical – yet very important – rule for successful Monte Carlo Markov Chain algorithms.

In Part 2 we study the scaling of the correlation time of a Markov Chain. We consider systems of particles hopping on a lattice and construct the associated Markov matrix

Basic concepts: Detailed and Global Balance

We would like to sample a configuration C with a given probability π_C. A solution to this problem is the Markov Chain Monte Carlo method. We define a dynamics over the ensemble of configurations {C} such that, in the long-time limit, each configuration is visited with the correct probability π_C. As discussed in the lecture, a Markov chain describes an evolution that is fully specified by the transition probabilities

p(C → C')

between configurations C and C'. The probability π_C(t) of being at time t in configuration C evolves according to

π_C(t+1) = Σ_C' p(C' → C) π_C'(t).

To ensure that π_C(t) converges towards the probability distribution π_C, one must satisfy the global balance condition:

Σ_C' p(C' → C) π_C' = π_C.

The detailed balance condition is more restrictive:

π_C p(C → C') = π_C' p(C' → C) for every pair C, C',

but, if it is verified by the transition probabilities, it also leads to the same probability distribution in the long-time limit.
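Both conditions can be checked numerically on a toy example. The snippet below is a sketch, not part of the exercise: for a hypothetical three-state chain with target distribution π = (0.2, 0.3, 0.5), it builds Metropolis transition probabilities (stored column-wise, entry [C', C] holding p(C → C')) and asserts global and detailed balance.

```python
# Sketch: verify global and detailed balance on a hypothetical 3-state chain.
import numpy as np

pi = np.array([0.2, 0.3, 0.5])           # target distribution pi_C (our choice)
P = np.zeros((3, 3))                     # P[cp, c] = p(c -> cp)
for c in range(3):
    for cp in (c - 1, c + 1):            # propose a neighboring state, prob. 1/2
        if 0 <= cp < 3:
            P[cp, c] = 0.5 * min(1.0, pi[cp] / pi[c])   # Metropolis acceptance
    P[c, c] = 1.0 - P[:, c].sum()        # rejected moves stay in place

# Global balance: pi is stationary, i.e. P pi = pi.
assert np.allclose(P @ pi, pi)
# Detailed balance: pi_C p(C -> C') = pi_C' p(C' -> C) for every pair.
assert np.allclose(P * pi[None, :], (P * pi[None, :]).T)
```

Global balance only requires the total probability flow into each state to match the flow out; detailed balance requires the flows to match pair by pair, which is what the last assertion checks.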

Part 1 - The heliport and the 1/2 thumb rule [GROUP 1]

Consider the following Markov chain Monte Carlo algorithm:

import random

def markov_pi(N, delta):
    x, y = 1.0, 1.0
    n_hits = 0
    for i in range(N):
        del_x, del_y = random.uniform(-delta, delta), random.uniform(-delta, delta)
        if abs(x + del_x) < 1.0 and abs(y + del_y) < 1.0:
            x, y = x + del_x, y + del_y
        if x ** 2 + y ** 2 < 1.0:
            n_hits += 1
    return n_hits

n_runs = 1000
n_trials = 4000
delta = 0.1
for run in range(n_runs):
    print(4.0 * markov_pi(n_trials, delta) / float(n_trials))

Run the program and convince yourself that, for large enough n_trials, the output of each run converges to the value of π. In the following, set n_runs=500 and n_trials=1000. Modify the program so that it calculates the acceptance ratio, which corresponds to the number of moves that stay inside the square divided by n_trials. Run the new program and compute the acceptance ratio for the following values of delta (delta=0.1, delta=0.2, delta=0.3, ..., delta=5.0). Plot the computed acceptance ratios as a function of delta using a linear scale on all axes.
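One possible way to track the acceptance ratio is sketched below; the function name markov_pi_acceptance is ours, not part of the original program, and other bookkeeping choices are equally valid.

```python
import random

def markov_pi_acceptance(N, delta):
    # Same heliport walk as markov_pi, but also counts accepted moves.
    x, y = 1.0, 1.0
    n_hits, n_accept = 0, 0
    for i in range(N):
        del_x = random.uniform(-delta, delta)
        del_y = random.uniform(-delta, delta)
        if abs(x + del_x) < 1.0 and abs(y + del_y) < 1.0:
            x, y = x + del_x, y + del_y
            n_accept += 1                     # move stayed inside the square
        if x ** 2 + y ** 2 < 1.0:
            n_hits += 1
    return n_hits, n_accept

n_hits, n_accept = markov_pi_acceptance(1000, 0.1)
print('acceptance ratio:', n_accept / 1000.0)
```

Running this for each delta in 0.1, 0.2, ..., 5.0 gives the data for the requested plot.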

The 1/2 thumb rule predicts that the best performance of a Markov chain Monte Carlo is expected for an acceptance ratio of approximately 1/2. Find the delta-interval in which the acceptance ratio equals 1/2. Example: If the interval where you expect the acceptance ratio of 1/2 is between 3.5 and 3.6, you write "3.5-3.6". Intervals longer than 0.1 are considered as wrong results.

Study the performance of the algorithm as a function of delta. Modify the program so that it calculates the standard deviation of the n_runs data for values of delta=0.1, 0.2, ..., 5.0. Plot the standard deviation as a function of delta using linear scaling. Comment on your results in one or two sentences.

N.B. We are asking you to produce a scientific diagram. The general criteria for the quality of diagrams in this lecture are the following. Each diagram can give a total of 2 points:

- 1 point is given if the diagram meets the following criteria of graphical quality: (i) the diagram shows exactly the quantities that were asked for; (ii) all axes are correctly labelled; (iii) all axes have the correct scaling (linear, logarithmic, semi-logarithmic, ...); (iv) the diagram shows at least the range of data points that was asked for.
- 1 point is given if the diagram represents the correct results.

If the plot meets the criteria of graphical quality and shows the correct results, you will earn a total of 2 points.

Part 2 - Hopping on a lattice: the correlation time

As an example we consider a particle on a system of L sites with a hard wall at sites 0 and L-1. At equilibrium the particle occupies site i with probability π_i = 1/L. To sample a configuration according to that probability, one possibility is to implement a dynamics in which the particle jumps from one site to one of its neighboring sites with a jump probability verifying detailed balance with respect to π_i.

[Figure: a particle hopping on a one-dimensional lattice with L sites.]
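Before turning to the Markov matrix, this dynamics can be simulated directly. The sketch below assumes the usual rejection rule (a jump into a wall leaves the particle where it is, which keeps detailed balance with respect to the uniform π_i) and histograms the visited sites; each frequency should approach 1/L.

```python
import random

random.seed(0)                    # fixed seed, for reproducibility only
L = 5
n_steps = 100000
site = 0
visits = [0] * L
for step in range(n_steps):
    move = random.choice([-1, 1])         # propose left or right, prob. 1/2 each
    if 0 <= site + move <= L - 1:         # hard walls: reject moves beyond them
        site += move                      # a rejected move leaves the particle in place
    visits[site] += 1

# Each site should be visited with frequency close to 1/L = 0.2.
print([v / float(n_steps) for v in visits])
```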

The ensemble of all transition probabilities can be represented in the Markov matrix of the system. The function matrix_calc computes this matrix. We introduce a new package: numpy. This package is very useful for handling matrices, linear algebra operations...

import numpy

def matrix_calc(L):
    # Column k contains the jump probabilities out of site k; at the hard
    # walls, the forbidden move maps back onto the same site (the particle stays).
    neighbor = [[min(k + 1, L - 1), max(k - 1, 0)] for k in range(L)]
    hopping1d = numpy.zeros((L, L))
    for k in range(L):
        hopping1d[neighbor[k][0], k] = 0.5
        hopping1d[neighbor[k][1], k] = 0.5
    return hopping1d

As you discussed in Class Session 1 the Markov matrix controls the time evolution of all possible simulations performed with Markov Chain algorithms. The eigenvalues and eigenvectors of the Markov matrix can be computed using the package numpy.
(Remark: you have to copy and paste the function matrix_calc into the previous program to get a single file.)

We call λ2 the second largest eigenvalue in absolute value. Explain why the time scale Δ = -1 / (log |λ2|) gives the relaxation time towards equilibrium.

Plot the behavior of Δ as a function of L (L=4, 8, 16, ..., 256) on a logarithmic scale. How fast does the correlation time grow with L? [Tip: it can be useful to use eigenvalues.sort() in order to identify the second largest eigenvalue.]
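The eigenvalue computation can be sketched as follows (matrix_calc is repeated so the snippet is self-contained; instead of eigenvalues.sort() we sort the absolute values, since λ2 is defined as the second largest eigenvalue in absolute value):

```python
import math
import numpy

def matrix_calc(L):
    # Hard-wall hopping matrix: column k holds the jump probabilities out of site k.
    neighbor = [[min(k + 1, L - 1), max(k - 1, 0)] for k in range(L)]
    hopping1d = numpy.zeros((L, L))
    for k in range(L):
        hopping1d[neighbor[k][0], k] = 0.5
        hopping1d[neighbor[k][1], k] = 0.5
    return hopping1d

L = 16
eigenvalues = numpy.linalg.eigvals(matrix_calc(L))
moduli = sorted(abs(eigenvalues))      # sorted absolute values, largest last
lambda_2 = moduli[-2]                  # second largest eigenvalue in absolute value
Delta = -1.0 / math.log(lambda_2)      # relaxation time scale
print(L, Delta)
```

Looping this over L = 4, 8, ..., 256 produces the data for the requested plot.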

Modify the function matrix_calc in order to implement periodic boundary conditions. If you repeat the simulations, you will find that equilibrium is not reached. Explain why, and propose a possible solution to fix this problem.
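A possible periodic version of matrix_calc is sketched below (the name matrix_calc_periodic is ours). Computing its spectrum already hints at the problem to explain: for even L, more than one eigenvalue has modulus 1.

```python
import numpy

def matrix_calc_periodic(L):
    # Periodic neighbors: site k hops to (k+1) % L or (k-1) % L, prob. 1/2 each.
    neighbor = [[(k + 1) % L, (k - 1) % L] for k in range(L)]
    hopping1d = numpy.zeros((L, L))
    for k in range(L):
        hopping1d[neighbor[k][0], k] = 0.5
        hopping1d[neighbor[k][1], k] = 0.5
    return hopping1d

L = 8
eigenvalues = numpy.linalg.eigvals(matrix_calc_periodic(L))
moduli = sorted(abs(eigenvalues))
print(moduli)        # for even L, two eigenvalues have modulus 1
```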

Below we modify the function matrix_calc. Which hopping problem are we implementing?

Compute the size of the Markov matrix as a function of L and the dimension of the lattice.

Plot the behavior of Δ as a function of L (L=4, 8, 16, ..., 64) on a logarithmic scale. [If L=64 gives too large a matrix for your computer, plot L=4, 8, 16, 32 instead.] How fast does the correlation time grow with L?

More difficult: many particles on a lattice [-This part will not be evaluated-]

Consider a one-dimensional system of 4 sites and 2 particles. Each site is either empty or occupied by a single particle. Each particle can jump to its right or to its left, provided the target site is empty (this rule defines the Simple Symmetric Exclusion Process, or SSEP). A hard wall is placed at site 0 and at site 3 (boundary conditions are not periodic): when a particle is at site 0, it can only stay at 0 or jump to 1 if that site is empty; similar rules apply at site 3. Below we implement a Markov chain Monte Carlo corresponding to this process.

import random

random.seed(2)
L = 4
n_trials = 100000
state = [0 for k in range(L)]
state[0], state[1] = 1, 1
for step in range(n_trials):
    k = random.randint(0, L - 2)
    # Exchange the occupations of sites k and k+1 (a no-op when they are equal).
    state[k], state[k + 1] = state[k + 1], state[k]
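As a sanity check of this dynamics (a sketch, not required by the exercise), one can count how often each of the six configurations of 2 particles on 4 sites occurs; a correctly sampling chain should visit each with frequency close to 1/6.

```python
import random

random.seed(2)
L = 4
n_trials = 100000
state = [1, 1, 0, 0]                 # two particles on the left
counts = {}
for step in range(n_trials):
    k = random.randint(0, L - 2)     # pick a bond at random
    state[k], state[k + 1] = state[k + 1], state[k]   # swap (a no-op if equal)
    counts[tuple(state)] = counts.get(tuple(state), 0) + 1

for config, c in sorted(counts.items()):
    print(config, c / float(n_trials))   # each configuration near 1/6
```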

Determine if detailed balance is respected in presence of two particles.

Explain how to generalize this algorithm to the case of a system of L sites and n particles following the SSEP rules. How does the complexity of the algorithm (i.e. the number of operations) grow as a function of L and n?
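A minimal sketch of the generalized dynamics is shown below (the helper ssep_step and the initial condition are our own illustrative choices): each elementary move touches a single randomly chosen bond, so its cost is O(1) independently of n, while the state itself takes O(L) memory.

```python
import random

def ssep_step(state, L):
    # One SSEP move with hard walls: pick a bond and swap its two occupations.
    k = random.randint(0, L - 2)
    state[k], state[k + 1] = state[k + 1], state[k]

# Hypothetical initial condition: n particles packed on the left of L sites.
L, n = 10, 3
state = [1] * n + [0] * (L - n)
for step in range(1000):
    ssep_step(state, L)
assert sum(state) == n        # the particle number is conserved by every move
```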

Write explicitly the Markov matrix associated with the SSEP for L=4 and n=2 (be careful to impose the correct normalization).

Compute the size of the associated Markov matrix as a function of L and n. Comment on the practical possibilities of using the Markov matrix and the Markov Chain Monte Carlo method in complex systems.
