Solve Problems in Bayesian Inference
In this lesson, we will review the concepts learned so far and learn how to solve problems using Bayesian inference.
Review of the Concepts Learned So Far
Since we have covered many new concepts, this is a good time to quickly review where we’re at:
- We’re representing a particular discrete probability distribution over a small number of members of a particular type `A` by `IDiscreteDistribution<A>`.
- We can condition a distribution, by discarding certain possibilities from it, with `Where`.
- We can project a distribution from one type to another with `Select`.
- A conditional probability, the probability of `B` given that some `A` is true, is represented as a likelihood function of type `Func<A, IDiscreteDistribution<B>>`.
- We can “bind” a likelihood function onto a prior distribution with `SelectMany` to produce a joint distribution (a short sketch of these operations follows this list).
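To make these operations concrete, here is a minimal sketch. The die and coin distributions are made up purely for illustration, and the sketch assumes the `IDiscreteDistribution<A>` interface and the `ToWeighted`, `Where`, `Select`, and `SelectMany` extension methods built in earlier lessons:

```csharp
// Assumed from earlier lessons: IDiscreteDistribution<A> and the
// ToWeighted, Where, Select, SelectMany extensions. The distributions
// themselves are made up for illustration.
var die = new List<int>() { 1, 2, 3, 4, 5, 6 }
    .ToWeighted(1, 1, 1, 1, 1, 1);      // a fair six-sided die

// Condition: discard the odd faces.
var evens = die.Where(x => x % 2 == 0);

// Project: map each face to a string label.
var labels = evens.Select(x => $"rolled {x}");

// Bind: a likelihood function from int to a distribution of bool,
// combined with the prior to form a joint distribution of (int, bool).
IDiscreteDistribution<bool> biasedCoin(int x) =>
    new List<bool>() { true, false }.ToWeighted(x, 6 - x);
var joint = die.SelectMany(biasedCoin, (x, b) => (x, b));
```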
These are all good results, and we hope you agree that we have already produced a much richer and more powerful abstraction over randomness than `System.Random` provides.
Bayes’ Theorem
In this lesson, everything is really going to come together to reveal that we can use these tools to solve interesting problems in probabilistic inference.
To show how, we’ll start by reviewing Bayes’ Theorem.
If we have a prior $P(A)$ and a likelihood $P(B \mid A)$, we know that we can “bind” them together to form the joint distribution. That is, the probability of $A$ and $B$ both happening is the probability of $A$ multiplied by the probability that $B$ happens given that $A$ has happened:

$$P(A \,\&\, B) = P(A) \, P(B \mid A)$$

Obviously, that goes the other way. If we have $P(B)$ as our prior and $P(A \mid B)$ as our likelihood, then:

$$P(B \,\&\, A) = P(B) \, P(A \mid B)$$

But $P(A \,\&\, B)$ is the same as $P(B \,\&\, A)$, and things equal to the same are equal to each other. Therefore:

$$P(A) \, P(B \mid A) = P(B) \, P(A \mid B)$$

Let’s suppose that $P(A)$ is our prior and $P(B \mid A)$ is our likelihood. In the equation above, the term $P(A \mid B)$ is called the posterior and can be computed like this:

$$P(A \mid B) = \frac{P(A) \, P(B \mid A)}{P(B)}$$
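As a quick sanity check, with numbers that are purely illustrative and not drawn from any study: suppose $P(A) = \tfrac{1}{2}$, $P(B \mid A) = \tfrac{1}{3}$, and $P(B) = \tfrac{1}{4}$. Then:

$$P(A \mid B) = \frac{P(A) \, P(B \mid A)}{P(B)} = \frac{\tfrac{1}{2} \cdot \tfrac{1}{3}}{\tfrac{1}{4}} = \frac{2}{3}$$

The joint probability $P(A \,\&\, B) = \tfrac{1}{6}$ is no larger than either $P(A)$ or $P(B)$, so these made-up numbers are at least internally consistent.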
Let’s move away from abstract mathematics and work through an example using the code we’ve written so far.
We can step back a few lessons and re-examine our prior and likelihood example for Frob Syndrome. Recall that this was a made-up study of a made-up condition, which we believe may be linked to height. We’ll use the weights from the original lesson.
That is to say: we have the prior $P(\text{Height})$, we have the likelihood function $P(\text{Severity} \mid \text{Height})$, and we wish to first compute the joint probability distribution $P(\text{Height} \,\&\, \text{Severity})$:
```csharp
// Prior: heights weighted 5:2:1 for Tall, Medium, Short.
var heights = new List<Height>() { Tall, Medium, Short };
var prior = heights.ToWeighted(5, 2, 1);
[...]
// Likelihood: a severity distribution for each height.
IDiscreteDistribution<Severity> likelihood(Height h)
{
  switch (h)
  {
    case Tall: return severity.ToWeighted(10, 11, 0);
    case Medium: return severity.ToWeighted(0, 12, 5);
    default: return severity.ToWeighted(0, 0, 1);
  }
}
[...]
var joint = prior.Joint(likelihood);
Console.WriteLine(joint.ShowWeights());
```
This produces:
```
(Tall, Severe):850
(Tall, Moderate):935
(Medium, Moderate):504
(Medium, Mild):210
(Short, Mild):357
```
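For reference, a `Joint` helper like the one used above can be written as a thin wrapper over the `SelectMany` from our review list. The following is a sketch under that assumption, not necessarily the exact implementation from earlier lessons:

```csharp
// Sketch: pair each value of the prior with each value produced by the
// likelihood function, keeping both in a tuple.
public static IDiscreteDistribution<(A, B)> Joint<A, B>(
    this IDiscreteDistribution<A> prior,
    Func<A, IDiscreteDistribution<B>> likelihood) =>
        prior.SelectMany(likelihood, (a, b) => (a, b));
```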
Now the question is: what is the posterior, $P(\text{Height} \mid \text{Severity})$?
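We will answer that question in full shortly, but notice that one entry can already be read off the joint weights above. Given that the severity is Moderate, the height was either Tall (weight 935) or Medium (weight 504), so:

$$P(\text{Tall} \mid \text{Moderate}) = \frac{935}{935 + 504} = \frac{935}{1439} \approx 0.65$$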