Class Session 11: Correlated noise

Introduction

Consider a sequence of random numbers

η = (η1, η2, ..., ηN)

used to simulate the result of a stochastic evolution. In the simplest cases, these numbers are taken to be independent and identically distributed, which is realistic enough to simulate many physical phenomena. However, this absence of correlations often corresponds to an idealized limit, and one may want to go beyond it.

In this exercise, we discuss how to generate N identically distributed but correlated Gaussian random numbers.

Covariance matrix

For simplicity we assume that each of the ηi is normally distributed, with zero mean and unit variance.

The covariance matrix C is the matrix whose entry (i,j) is equal to the following covariance:

Cij = ⟨ηi ηj⟩ − ⟨ηi⟩⟨ηj⟩ = ⟨ηi ηj⟩

(the second equality holds because the ηi have zero mean).
Show that:
  • C is a symmetric matrix.
  • Eigenvalues of C are positive or zero (the matrix is positive semi-definite).
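Both properties can be checked numerically on any empirical covariance matrix. Positive semi-definiteness follows from a^T C a = Var(a1 η1 + ... + aN ηN) ≥ 0 for any vector a. A minimal sketch (the sample data here is made up for illustration):

```python
import numpy

rng = numpy.random.default_rng(0)
# made-up data: 10000 samples of 5 linearly mixed Gaussian variables
samples = rng.standard_normal((10000, 5)) @ rng.standard_normal((5, 5))
C = numpy.cov(samples, rowvar=False)
print(numpy.allclose(C, C.T))                    # symmetric
print(numpy.linalg.eigvalsh(C).min() >= -1e-8)   # eigenvalues >= 0 up to roundoff
```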

Show that there exists a symmetric, positive semi-definite matrix A satisfying C = A². The matrix A is a square root of the matrix C.
  • How can one compute A in practice?
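One practical answer, sketched below with numpy (the helper name sqrt_psd is ours, not a library function): diagonalize C = V diag(w) V^T and set A = V diag(√w) V^T. scipy.linalg.sqrtm offers an alternative.

```python
import numpy

def sqrt_psd(C):
    """Symmetric square root of a positive semi-definite matrix:
    diagonalize C = V diag(w) V^T and return A = V diag(sqrt(w)) V^T."""
    w, v = numpy.linalg.eigh(C)      # eigh: real eigenvalues for symmetric C
    w = numpy.clip(w, 0.0, None)     # clip tiny negative roundoff
    return v @ numpy.diag(numpy.sqrt(w)) @ v.T

# small check on a 3x3 covariance matrix
C = numpy.array([[2.0, 1.0, 0.0],
                 [1.0, 2.0, 1.0],
                 [0.0, 1.0, 2.0]])
A = sqrt_psd(C)
print(numpy.allclose(A @ A, C))   # True
```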

Show that the correlated noise (ηi) can be generated in the following way:
  • Draw N independent numbers X = (X1, ..., XN) from a Gaussian distribution of zero mean and unit variance.
  • Compute η from: η = A·X.
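A quick sanity check of this recipe (the exponential covariance and the sample count below are arbitrary, illustrative choices): the empirical covariance of many independent vectors η = A·X should approach C.

```python
import numpy

rng = numpy.random.default_rng(1)
N = 4
# illustrative target covariance with exponential decay
C = numpy.array([[numpy.exp(-abs(i - j) / 2.0) for j in range(N)] for i in range(N)])
w, v = numpy.linalg.eigh(C)
A = v @ numpy.diag(numpy.sqrt(numpy.clip(w, 0.0, None))) @ v.T

# each column of 'samples' is one correlated noise vector eta = A.X
samples = A @ rng.standard_normal((N, 200000))
C_emp = numpy.cov(samples)
print(numpy.max(numpy.abs(C_emp - C)))   # small: shrinks as 1/sqrt(number of samples)
```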

Explain why Gaussianity is important. Use this program to test our conclusions:
import pylab, math, numpy
#numpy.random.seed(2)
N = 128
covariance = numpy.zeros((N, N))
A = numpy.zeros((N, N))
# exponentially decaying correlations: C_ij = exp(-|i-j|/200)
for i in range(N):
    for j in range(N):
        covariance[i, j] = math.exp(-abs(i - j) / 200.)
# eigh is meant for symmetric matrices and returns real eigenvalues
w, v = numpy.linalg.eigh(covariance)
# square root of C: A = V diag(sqrt(w)) V^T (clip tiny negative roundoff)
for i in range(N):
    for j in range(N):
        A[i, j] = sum(math.sqrt(max(w[k], 0.)) * v[i, k] * v[j, k] for k in range(N))
x = numpy.random.randn(N)
eta = numpy.dot(A, x)
pylab.title('Correlated vs non-correlated noise')
pylab.xlabel('i')
pylab.ylabel('eta, x')
pylab.plot(eta, 'b.-')
pylab.plot(x, 'r.-')
pylab.show()
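One way to probe the role of Gaussianity (an illustrative sketch, with uniform inputs standing in for a non-Gaussian distribution): the recipe η = A·X fixes only second moments, so non-Gaussian inputs still reproduce the covariance C, but the resulting η is no longer Gaussian. Here its excess kurtosis stays clearly negative, inherited from the uniform inputs.

```python
import numpy

rng = numpy.random.default_rng(3)
N = 8
# weakly correlated target covariance (arbitrary illustrative choice)
C = numpy.array([[numpy.exp(-2.0 * abs(i - j)) for j in range(N)] for i in range(N)])
w, v = numpy.linalg.eigh(C)
A = v @ numpy.diag(numpy.sqrt(numpy.clip(w, 0.0, None))) @ v.T

# uniform inputs with zero mean and unit variance (not Gaussian)
X = rng.uniform(-numpy.sqrt(3.0), numpy.sqrt(3.0), size=(N, 100000))
eta = A @ X
print(numpy.allclose(numpy.cov(eta), C, atol=0.05))       # covariance still matches C
kurt = numpy.mean(eta[0]**4) / numpy.mean(eta[0]**2)**2 - 3.0
print(kurt)   # clearly negative: eta is not Gaussian
```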

CFP_2010_noise.png
Figure 1. Comparison of correlated (blue) and uncorrelated (red) noise.


From correlated noise to non-Markovian random walk

We consider a one-dimensional Gaussian random walk of N steps, initially located at the origin. Its trajectory {x1, x2, ..., xN} can be written as:

xi = η1 + η2 + ... + ηi,  i.e.  xi = xi−1 + ηi  with  x0 = 0.
If the variables ηi are independent, the walk is Markovian, and the process x(i) becomes a Brownian motion at time t = i·dt (with steps of variance dt) in the continuum limit where the time step dt goes to zero. In particular, the autocorrelation function of the process is

⟨x(t1) x(t2)⟩ = min(t1, t2)

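This can be checked directly in the discrete setting, where ⟨xi xj⟩ = min(i, j) for independent unit-variance steps (the number of walks and the indices below are arbitrary choices):

```python
import numpy

rng = numpy.random.default_rng(2)
M, N = 100000, 50                      # number of walks, steps per walk
steps = rng.standard_normal((M, N))
x = numpy.cumsum(steps, axis=1)        # x_i = eta_1 + ... + eta_i for each walk
# <x_i x_j> should equal min(i, j) for independent unit-variance steps
i, j = 10, 30
print(numpy.mean(x[:, i - 1] * x[:, j - 1]))   # close to min(10, 30) = 10
```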
What happens when the noise (ηi) is correlated?

Fractional Brownian motion is an important class of non-Markovian processes. The autocorrelation function for these processes is

⟨x(t1) x(t2)⟩ = ( t1^(2H) + t2^(2H) − |t1 − t2|^(2H) ) / 2

Here H is the Hurst exponent (0 < H < 1); for H = 1/2 we recover Brownian motion and standard diffusion.
  • Compute the covariance matrix of the increments ηi = x(i) − x(i−1) associated with a generic fractional Brownian motion.
  • Looking at the properties of this matrix, explain why fractional Brownian motion is so famous.
  • Plot the behavior of the covariance matrix for H < 1/2 and for H > 1/2.
  • Generate some paths of fractional Brownian motion in python.

Use this program to test our conclusions:

import pylab, math, numpy
numpy.random.seed(2)
N = 128
H = 3. / 4.
HH = 2 * H
covariance = numpy.zeros((N, N))
A = numpy.zeros((N, N))
# covariance of fBm increments: C_ij = (|d-1|^2H + (d+1)^2H - 2 d^2H)/2, d = |i-j|
for i in range(N):
    for j in range(N):
        d = abs(i - j)
        covariance[i, j] = (abs(d - 1)**HH + (d + 1)**HH - 2 * d**HH) / 2.
# eigh is meant for symmetric matrices and returns real eigenvalues
w, v = numpy.linalg.eigh(covariance)
# square root of C: A = V diag(sqrt(w)) V^T (clip tiny negative roundoff)
for i in range(N):
    for j in range(N):
        A[i, j] = sum(math.sqrt(max(w[k], 0.)) * v[i, k] * v[j, k] for k in range(N))
x = numpy.random.randn(N)
eta = numpy.dot(A, x)
# positions are cumulative sums of the increments, starting from x_0 = 0
xfBm = [sum(eta[0:i]) for i in range(len(eta) + 1)]
xBM = [sum(x[0:i]) for i in range(len(x) + 1)]
pylab.title('fBm (blue) vs BM (red)')
pylab.xlabel('i')
pylab.ylabel('x(i)')
pylab.plot(xfBm, 'b.-')
pylab.plot(xBM, 'r.-')
pylab.show()
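A quantitative check on such paths (a sketch re-using the same increment covariance; the number of sample paths is an arbitrary choice): the variance of xi should grow as i^(2H), i.e. superdiffusively for H > 1/2 and subdiffusively for H < 1/2.

```python
import numpy

rng = numpy.random.default_rng(4)
N, H = 64, 0.75
HH = 2 * H
# fBm increment covariance, vectorized: C_ij = (|d-1|^2H + (d+1)^2H - 2 d^2H)/2
d = numpy.abs(numpy.arange(N)[:, None] - numpy.arange(N)[None, :])
C = (numpy.abs(d - 1)**HH + (d + 1)**HH - 2 * d**HH) / 2.0
w, v = numpy.linalg.eigh(C)
A = v @ numpy.diag(numpy.sqrt(numpy.clip(w, 0.0, None))) @ v.T

M = 20000                               # number of sample paths
eta = A @ rng.standard_normal((N, M))   # correlated increments, one path per column
x = numpy.cumsum(eta, axis=0)           # positions x_i along each path
# Var(x_i) should be close to i^(2H)
for i in (4, 16, 64):
    print(i, numpy.var(x[i - 1]), i**HH)
```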


CFP_2010_H34.png
Figure 2. Fractional Brownian motion for H=3/4.
CFP_2010_H14.png
Figure 3. Fractional Brownian motion for H=1/4.

