Probability distribution

A probability distribution is a function that gives the probability of occurrence of a random event. It can be either discrete or continuous, depending on the domain. We can formally define it as:

\mathcal{P}: \mathcal{A} \to \mathcal{R}

Here, \mathcal{A} determines whether the distribution is discrete or continuous: the distribution is discrete if \mathcal{A} is a discrete set and continuous if \mathcal{A} is a continuous set.
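For instance, a discrete distribution over a finite set can be represented in JAX as an array of probabilities that sums to one. The sketch below uses made-up outcomes and probabilities purely for illustration:

```python
import jax
import jax.numpy as jnp

# Hypothetical discrete distribution over the set A = {0, 1, 2}
outcomes = jnp.array([0, 1, 2])
probs = jnp.array([0.5, 0.3, 0.2])  # P(0), P(1), P(2); must sum to 1

print(jnp.isclose(probs.sum(), 1.0))  # True

# Draw samples according to these probabilities
key = jax.random.PRNGKey(0)
samples = jax.random.choice(key, outcomes, shape=(10,), p=probs)
print(samples)
```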

There are a couple of useful ways to describe a probability distribution:

  • Cumulative Distribution Function (CDF)
  • Probability Density Function (PDF)

Cumulative Distribution Function (CDF)

The CDF of a random variable X gives the probability that X takes a value less than or equal to a given value x. Formally,

F_X(x) = P(X \leq x) = \int_{-\infty}^{x} f_X(u)\, du

Note: Since probabilities are always non-negative, the CDF is a non-decreasing function.
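As a concrete example, JAX provides CDFs for several standard distributions in `jax.scipy.stats`. The sketch below evaluates the standard normal CDF on an arbitrary grid of points and checks that it is non-decreasing:

```python
import jax.numpy as jnp
from jax.scipy.stats import norm

# Standard normal CDF F_X(x) = P(X <= x), evaluated on an arbitrary grid
x = jnp.linspace(-3.0, 3.0, 7)
cdf_vals = norm.cdf(x)  # loc=0, scale=1 by default

print(cdf_vals)
print(jnp.all(jnp.diff(cdf_vals) >= 0))  # True: the CDF is non-decreasing
```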

Probability Density Function (PDF)

More often, we are interested in the value at a particular x rather than the cumulative probability up to it. For this, we use the Probability Density Function (PDF), which can be calculated by taking the derivative of the CDF:

f_X(x) = \frac{d}{dx} F_X(x)
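In JAX, we can verify this relationship with automatic differentiation: differentiating the normal CDF with `jax.grad` recovers the normal PDF. The evaluation point `x = 0.7` below is arbitrary:

```python
import jax
from jax.scipy.stats import norm

# The PDF is the derivative of the CDF; jax.grad differentiates w.r.t. x
pdf_from_cdf = jax.grad(norm.cdf)

x = 0.7  # arbitrary evaluation point
print(pdf_from_cdf(x))  # d/dx F_X(x)
print(norm.pdf(x))      # f_X(x); matches the value above
```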

Moments

The expected value of a random variable is an intuitive measure of its mean; for a discrete random variable, it is formally defined as:

\mathrm{E}[X] = \sum_{i=1}^{\infty} x_i\, p_i

As a simple example, suppose the probabilities for a (biased) six-sided die roll are as follows:

P(1) = P(6) = 0.1

P(2) = P(3) = P(4) = P(5) = 0.2

The expected value will be:

\mathrm{E}[X] = \sum_{i=1}^{6} x_i\, p_i = 0.1(1) + 0.2(2) + 0.2(3) + 0.2(4) + 0.2(5) + 0.1(6) = 3.5
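In JAX, this expected value is just a weighted sum of the outcomes, using the die probabilities from the example above:

```python
import jax.numpy as jnp

# Outcomes of the biased die and their probabilities
outcomes = jnp.array([1., 2., 3., 4., 5., 6.])
probs = jnp.array([0.1, 0.2, 0.2, 0.2, 0.2, 0.1])

expected_value = jnp.sum(outcomes * probs)
print(expected_value)  # 3.5
```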

This notion of expected value can be extended to the variance and higher-order quantities, which are collectively known as moments.

The n^{\text{th}} moment about the mean can be defined as:

\mu_n = \mathrm{E}\left[(X - \mathrm{E}[X])^n\right] = \int_{-\infty}^{+\infty} (x - \mu)^n f(x)\, \mathrm{d}x

The first four moments are:

  1. Mean
  2. Variance
  3. Skewness
  4. Kurtosis

Usually, we consider only the mean and variance.
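As a rough sketch of how these moments can be estimated in JAX, the snippet below draws samples from a normal distribution (the parameters are arbitrary) and computes the sample mean, variance, and the standardized third and fourth central moments:

```python
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(42)
# Arbitrary example: normal samples with mean 2 and standard deviation 3
samples = 2.0 + 3.0 * jax.random.normal(key, shape=(100_000,))

mean = jnp.mean(samples)                    # 1st moment
variance = jnp.mean((samples - mean) ** 2)  # 2nd central moment

std = jnp.sqrt(variance)
skewness = jnp.mean(((samples - mean) / std) ** 3)  # standardized 3rd moment
kurtosis = jnp.mean(((samples - mean) / std) ** 4)  # standardized 4th moment

print(mean, variance, skewness, kurtosis)  # approx. 2, 9, 0, 3
```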

Next, we'll review some commonly used probability distributions along with their respective JAX functions.

