# MCMC (3) : Importance Sampling

Importance sampling was the first sampling method I encountered when I studied the Monte Carlo method. Nevertheless, I haven't seen many examples of importance sampling, perhaps because it is not effective for high-dimensional systems. Its weak point is that its performance is determined by how closely we can choose the proposal distribution to the target distribution. Here, I present a simple example of importance sampling.
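As a minimal sketch of the idea (this is my illustration, not the example from the post itself): draw samples from a proposal distribution, weight each sample by the ratio of target to proposal density, and use the weighted average to estimate an expectation under the target. Here the target is assumed to be a standard normal, the proposal a wider normal, and we estimate E[x²] = 1.

```python
import math
import random

def normal_pdf(x, mu, sigma):
    # density of N(mu, sigma^2)
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

random.seed(0)
n = 100_000

# Proposal q = N(0, 2): broader than the target, so its tails cover the target's.
samples = [random.gauss(0, 2) for _ in range(n)]

# Importance weights w(x) = p(x) / q(x) for target p = N(0, 1).
weights = [normal_pdf(x, 0, 1) / normal_pdf(x, 0, 2) for x in samples]

# Self-normalized importance-sampling estimate of E_p[x^2] (true value: 1).
estimate = sum(w * x * x for w, x in zip(weights, samples)) / sum(weights)
```

If the proposal were narrower than the target, the weights in the tails would blow up and the estimator's variance could become huge, which is exactly the weak point described above.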

# MCMC (2) : Exact Monte Carlo Method

Exact Markov chain Monte Carlo sampling. I don't like the naming: the word "exact" could mislead us about the concept. Anyway, I used the word in the title because it was the title of the chapter in the book "Information Theory, Inference, and Learning Algorithms" by David MacKay, which I studied to learn the theory. Other names for it are perfect simulation and coupling from the past; maybe these are better names.

# MCMC (1) : Monte Carlo Method and Metropolis-Hastings Sampling

The Monte Carlo method is useful in Bayesian data modeling because maximizing the posterior probability is often very difficult, and fitting a Gaussian approximation to it can be just as hard. The Monte Carlo method is valuable when we want to generate samples from a distribution and also want to estimate expectation values of various functions under it. What we deal with in this post is only a small part of the Monte Carlo method. It would be good if I had a chance to introduce it all later on this blog, but if not, see one of the repositories on my GitHub.
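As a minimal sketch of Metropolis-Hastings (my illustration, not the post's own code): propose a move from the current state with a symmetric random walk, then accept it with probability min(1, p(x')/p(x)), which only requires the target density up to a normalizing constant. The standard-normal target and step size below are assumptions for the example.

```python
import math
import random

def log_target(x):
    # Unnormalized log-density of the assumed target, a standard normal.
    return -0.5 * x * x

random.seed(1)
x = 0.0
chain = []
for _ in range(50_000):
    # Symmetric random-walk proposal, so the Hastings correction cancels.
    proposal = x + random.gauss(0, 1.0)
    # Accept with probability min(1, p(proposal) / p(x)), done in log space.
    if math.log(random.random()) < log_target(proposal) - log_target(x):
        x = proposal
    chain.append(x)

# Sample moments should approach those of N(0, 1): mean 0, variance 1.
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
```

Because only the ratio of target densities appears, the same loop works whenever the posterior is known up to a constant, which is the typical situation in Bayesian modeling.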

#### Namshik Kim

physicist, data scientist
