
Commit 1270317

Add new MCMC dummy blog post.
1 parent 1ab2421 commit 1270317

File tree

3 files changed: +846, -2 lines

Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
+Title: MCMC Sampling for dummies
+date: 2015-11-10 09:00
+comments: true
+slug: mcmc-sampling
+tags: bayesian statistics
+
+{% notebook MCMC-sampling-for-dummies.v3.ipynb %}
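
The metadata file above pulls the post into the Pelican site through the `{% notebook %}` liquid tag. As a hedged sketch only (the setting values and paths below are assumptions, not taken from this repo), the pelicanconf.py wiring for the pelican-plugins liquid_tags notebook plugin would look roughly like:

    # Hypothetical pelicanconf.py excerpt -- assumes the pelican-plugins
    # liquid_tags plugin, which provides the {% notebook %} tag used above.
    PLUGIN_PATHS = ['pelican-plugins']    # plugin checkout location (assumed)
    PLUGINS = ['liquid_tags.notebook']    # enables the {% notebook %} tag
    NOTEBOOK_DIR = 'downloads/notebooks'  # where the .ipynb files live (assumed)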

content/downloads/notebooks/MCMC-sampling-for-dummies.ipynb

Lines changed: 4 additions & 2 deletions
@@ -28,7 +28,7 @@
     "\n",
     "Now we might say \"OK, if we can't solve something, could we try to approximate it? For example, if we could somehow draw samples from that posterior we can [Monte Carlo approximate](https://en.wikipedia.org/wiki/Monte_Carlo_method) it.\" Unfortunately, to directly sample from that distribution you not only have to solve Bayes formula, but also invert it, so that's even harder. \n",
     "\n",
-    "Then we might say \"Well, instead lets construct a Markov chain that has our posterior as the target distribution and sample from that\". I'm just kidding, most people wouldn't say that as it sounds bat-shit crazy. If you can't compute it, can't sample from it, then constructing that Markov chain with all these properties must be much much harder.\n",
+    "Then we might say \"Well, instead let's construct a Markov chain whose equilibrium distribution matches our posterior distribution\". I'm just kidding, most people wouldn't say that as it sounds bat-shit crazy. If you can't compute it, can't sample from it, then constructing that Markov chain with all these properties must be even harder.\n",
     "\n",
     "The surprising insight though is that this is actually very easy and there exists a general class of algorithms that do this called [**Markov chain Monte Carlo**](https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo) (constructing a Markov chain to do Monte Carlo approximation)."
    ]
@@ -851,7 +851,9 @@
     "\n",
     "We glossed over a lot of detail which is certainly important but there are many other posts that deal with that. Here we really wanted to communicate the idea of MCMC and the Metropolis sampler. Hopefully now you have the right intuition to read one of the more technical introductions to this topic.\n",
     "\n",
-    "Other, more fancy, MCMC algorithms like Hamiltonian Monte Carlo actually work very similar to this, they are just much more clever in proposing where to jump next."
+    "Other, more fancy, MCMC algorithms like Hamiltonian Monte Carlo actually work very similarly to this; they are just much more clever about proposing where to jump next.\n",
+    "\n",
+    "This blog post was written in a Jupyter Notebook; you can find the underlying notebook with all the code [here](https://github.com/twiecki/WhileMyMCMCGentlySamples/blob/master/content/downloads/notebooks/MCMC-sampling-for-dummies.ipynb)."
    ]
   }
 ],
