
Correct Implementation of Importance Sampling

In this lesson, we will correct our implementation of Importance Sampling by eliminating the constant-factor error.

Revisiting Our Previous Example

Let’s revisit our example one more time.

Suppose we have our nominal distribution p that possibly has “black swans” and our helper distribution q which has the same support, but no black swans.

We wish to compute the expected value of f when applied to samples from p, and we’ve seen that we can estimate it by computing the expected value of g:

x => f(x) * p.Weight(x) / q.Weight(x)

applied to samples of q.
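To make the setup concrete, here is a sketch in Python (the lesson's own code is C#; the distributions and names below are hypothetical, chosen only so the bias is easy to see). We take an unnormalized Gaussian weight for p (area sqrt(2π), not 1) and an unnormalized uniform weight on [-5, 5] for q (area 10, not 1), and average g over samples of q:

```python
import math
import random

def importance_estimate(f, p_weight, q_weight, q_sample, n=200_000):
    # Average g(x) = f(x) * p_weight(x) / q_weight(x) over samples of q.
    total = 0.0
    for _ in range(n):
        x = q_sample()
        total += f(x) * p_weight(x) / q_weight(x)
    return total / n

random.seed(0)
estimate = importance_estimate(
    f=lambda x: x * x,
    p_weight=lambda x: math.exp(-x * x / 2),  # unnormalized Gaussian, area sqrt(2*pi)
    q_weight=lambda x: 1.0,                   # unnormalized uniform on [-5, 5], area 10
    q_sample=lambda: random.uniform(-5.0, 5.0),
)
# The true expected value of x^2 under the normalized Gaussian is 1.0,
# but the estimate comes out near sqrt(2*pi)/10, roughly 0.25 -- off by
# exactly the quotient of the two areas.
```

Nothing is wrong with the sampling itself; the only problem is that the weight functions are not normalized, which is precisely the constant-factor error we are about to remove.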

Removing the Constant Factor Error

Unfortunately, in the last two lessons, we saw that the result will be wrong by a constant factor; the constant factor is the quotient of the normalization constants of q and p.
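To see where that constant factor comes from, one short derivation helps (the notation here is ours, not the lesson's). Write each weight function as a scaled version of the true, normalized density: $\hat p = N_p\,p$ and $\hat q = N_q\,q$, where $N_p$ and $N_q$ are the areas under the weight functions. Then:

```latex
\mathbb{E}_{q}\!\left[f(x)\,\frac{\hat p(x)}{\hat q(x)}\right]
  = \int q(x)\,f(x)\,\frac{N_p\,p(x)}{N_q\,q(x)}\,dx
  = \frac{N_p}{N_q}\int f(x)\,p(x)\,dx
  = \frac{N_p}{N_q}\,\mathbb{E}_{p}[f(x)]
```

The true densities cancel, leaving the desired expectation multiplied by the ratio of the two areas, which is a fixed constant for any given pair of weight functions.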

It seems like we’re stuck; it can be expensive or difficult to determine the normalization constant for an arbitrary distribution. We’ve created infrastructure for building weighted distributions and computing posteriors and all sorts of fun stuff, and none of it assumes that weights are normalized so that the area under the PDF is 1.0.
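Fortunately, there is a standard way out that never computes either normalization constant: divide the weighted sum by the sum of the weight ratios, so that the unknown constants appear in both numerator and denominator and cancel. This technique is commonly called self-normalized importance sampling. Here is a sketch in Python (the lesson's own code is C#; the example distributions are hypothetical):

```python
import math
import random

def self_normalized_estimate(f, p_weight, q_weight, q_sample, n=200_000):
    # Accumulate f(x) * w alongside w, where w = p_weight(x) / q_weight(x).
    # The unknown normalization constants scale numerator and denominator
    # identically, so they cancel in the quotient.
    numerator = 0.0
    denominator = 0.0
    for _ in range(n):
        x = q_sample()
        w = p_weight(x) / q_weight(x)
        numerator += f(x) * w
        denominator += w
    return numerator / denominator

# Hypothetical example: an unnormalized Gaussian weight for p (area
# sqrt(2*pi)) and an unnormalized uniform weight on [-5, 5] for q
# (area 10); neither area is ever computed.
random.seed(0)
estimate = self_normalized_estimate(
    f=lambda x: x * x,
    p_weight=lambda x: math.exp(-x * x / 2),
    q_weight=lambda x: 1.0,
    q_sample=lambda: random.uniform(-5.0, 5.0),
)
# estimate lands close to 1.0, the true expected value of x^2 under the
# normalized Gaussian -- the constant-factor error is gone.
```

The price of this trick is a small bias for finite sample sizes (the estimator is a ratio of two averages), but it vanishes as the sample count grows, and it spares us from ever integrating the weight functions.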