## Exact Markov chain Monte Carlo sampling

I don’t like the naming. The word “exact” could mislead us about what the concept means. Still, I use it in the title because it is the title of the chapter in the book “Information Theory, Inference, and Learning Algorithms” by David MacKay, from which I studied the theory.

The method is also known as perfect simulation or coupling from the past. Maybe these are better names.

In Markov chain Monte Carlo sampling, one of the problems is that it is hard to know when to stop sampling. We should stop once the Markov chain has converged to the equilibrium distribution. In the scripts below, `niters` plays that role.

Of course, we can start from a small number of iterations and stop increasing it once the results stay the same within the error bars. However, as the system gets more complicated, this procedure becomes tedious and time-consuming. The exact Monte Carlo method avoids it, and tells us the appropriate number of iterations.
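To illustrate the manual procedure, here is a small sketch (the function name and parameters are my own choices, not from the book): rerun the chain with increasing `niters` and watch when a summary statistic, here the sample mean of the final states, stops changing within its statistical error.

```python
import numpy as np

def walk_mean(niters, n_sample=1000, seed=0):
    """Mean of the final states of n_sample independent reflecting
    random walks on {0, ..., 20} after niters steps."""
    rng = np.random.default_rng(seed)
    x = rng.integers(0, 21, size=n_sample)  # initial states in 0..20
    for _ in range(niters):
        step = np.where(rng.random(n_sample) >= 0.5, 1, -1)
        # reflect at the boundaries: 20 always moves to 19, 0 always to 1
        x = np.where(x == 20, 19, np.where(x == 0, 1, x + step))
    return x.mean()

for niters in (10, 100, 1000):
    print(niters, walk_mean(niters))
```

When the printed means agree within roughly one standard error, we would declare the chain converged; the annoyance is that we never know in advance how far to push `niters`.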

We start from the Python script introduced in the previous post, except that now we run several samples from different initial points.

```python
import numpy as np
import matplotlib.pyplot as plt

niters = 500
n_sample = 10
# initial states in 0..20 (the upper bound of randint is exclusive)
List = np.random.randint(0, 21, size=(niters, n_sample))

for i in range(niters - 1):
    for j in range(n_sample):
        x0 = List[i][j]
        if x0 == 20:      # reflect at the upper boundary
            nxt = x0 - 1
        elif x0 == 0:     # reflect at the lower boundary
            nxt = x0 + 1
        else:
            nxt = x0 + 1 if np.random.random() >= 0.5 else x0 - 1
        List[i + 1][j] = nxt

plt.figure(figsize=(12, 4))
plt.plot(List, linewidth=0.3)
plt.show()
```


If we assume that samples can coalesce, we can change the update as follows.

```python
for j in range(n_sample):
    for k in range(n_sample):
        for i in range(niters - 1):
            if j != k and List[i, j] == List[i, k]:
                List[i + 1, j] = List[i + 1, k]
```


As soon as two samples take the same value, we make their subsequent values identical as well.

```python
plt.figure(figsize=(12, 4))
plt.plot(List, linewidth=0.3)
plt.show()
```


We see that the number of trajectories diminishes until only two lines remain. They correspond to the odd and even initial points: every step changes the state by ±1, so two chains whose initial points have different parity can never take the same value at the same time.
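To make the coalescence quantitative, here is a self-contained sketch (variable names and the seed are my own choices) that repeats the experiment and counts how many distinct values survive at each step. Unlike the triple loop above, it applies the merge step by step, so the count can only decrease over time.

```python
import numpy as np

rng = np.random.default_rng(1)
niters, n_sample = 500, 10
walks = np.zeros((niters, n_sample), dtype=int)
walks[0] = rng.integers(0, 21, size=n_sample)  # initial states in 0..20

for i in range(niters - 1):
    # independent move for each chain
    for j in range(n_sample):
        x0 = walks[i, j]
        if x0 == 20:
            nxt = x0 - 1
        elif x0 == 0:
            nxt = x0 + 1
        else:
            nxt = x0 + 1 if rng.random() >= 0.5 else x0 - 1
        walks[i + 1, j] = nxt
    # merge: chains that are equal now must stay equal at the next step
    for j in range(n_sample):
        for k in range(n_sample):
            if walks[i, j] == walks[i, k]:
                walks[i + 1, j] = walks[i + 1, k]

counts = [len(set(row)) for row in walks]
print(counts[0], counts[-1])
```

The sequence of counts is non-increasing, and in a typical run it settles at two, one surviving line per parity class.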

In the next simulator the random proposal (moving left or right) is the same for all chains, and this shared randomness makes them coalesce. Note how the else branch differs from the first script: we draw a single random number per iteration and apply it to all the samples.

```python
niters = 400
n_sample = 10
List = np.zeros((niters, n_sample))

for i in range(n_sample):
    List[0][i] = np.random.randint(0, 21)  # initial states in 0..20

for i in range(niters - 1):
    single_generator = np.random.random()  # one draw shared by all chains
    for j in range(n_sample):
        x0 = List[i][j]
        if x0 == 20:
            nxt = x0 - 1
        elif x0 == 0:
            nxt = x0 + 1
        else:
            nxt = x0 + 1 if single_generator >= 0.5 else x0 - 1
        List[i + 1][j] = nxt

plt.plot(List, linewidth=0.3)
plt.show()
```


We can guess that the number of iterations needed for convergence is around 400.
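Instead of eyeballing the plot, we can measure the coalescence time directly. The sketch below (my own code, with names and the iteration cap chosen by me) couples two chains started from the extreme states 0 and 20, which share the same parity and therefore can meet, and returns the first step at which they coincide.

```python
import numpy as np

def step(x, u):
    """One move of the reflecting walk on {0, ..., 20} driven by the draw u."""
    if x == 20:
        return x - 1
    if x == 0:
        return x + 1
    return x + 1 if u >= 0.5 else x - 1

def meeting_time(seed=0, max_iters=200_000):
    """First step at which two chains from 0 and 20, coupled by a
    shared random draw, take the same value; None if they never meet."""
    rng = np.random.default_rng(seed)
    lo, hi = 0, 20
    for t in range(1, max_iters + 1):
        u = rng.random()  # one shared draw per step, as in the script above
        lo, hi = step(lo, u), step(hi, u)
        if lo == hi:
            return t
    return None

print(meeting_time())
```

Because the two chains move in lockstep in the interior, their distance only shrinks when one of them is reflected at a boundary, which is why full coalescence takes on the order of hundreds of steps on this 21-state chain.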