...

Moments of a Time Series: Mean and Variance

Learn about the first two, and most popular, moments of the distribution of a time series: the mean and the variance.

Motivation

The moments of a time series can often tell us a lot about its distribution. They summarize the key properties of the time series, thus giving us a lot of information about what future realizations may look like. Generally speaking, we can define an unconditional moment of order $k$ as:

$$E\left[y_t^k\right]$$
In this definition, $k$ represents the order of the moment of $y_t$. Also, note the word “unconditional.” We use this term to refer to the value of the moment regardless of any past realizations. Unsurprisingly, unconditional moments stand in contrast to conditional moments. We’ll have a look at the difference between them at the end of the lesson. For the moment, and unless stated otherwise, we’ll stick with unconditional moments.
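As a quick illustration (a minimal sketch, not code from the lesson), here is how we might estimate the $k$-th raw sample moment of a simulated series with NumPy. The series, seed, and function name are assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Simulate a simple stationary series: white noise around a constant level of 5.
y = 5.0 + rng.normal(loc=0.0, scale=2.0, size=10_000)

def raw_moment(series, k):
    """Estimate the k-th raw (unconditional) moment E[y_t^k] by a sample average."""
    return np.mean(series ** k)

print(raw_moment(y, 1))  # close to the true mean, 5.0
print(raw_moment(y, 2))  # close to E[y_t^2] = mu^2 + sigma^2 = 25 + 4 = 29
```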

The two most popular moments of any time series are the first, called the mean, and the second, the variance. Let’s look into them.

Mean

The mean is the first moment of a distribution, and it is used to locate (or center) the time series. The mean is the value that a stationary series converges to in the long run. We might encounter other terms that refer to the mean, such as expected value or average. We usually denote the mean with the Greek letter $\mu$, while assigning other letters or expressions to higher moments. Mathematically, we define the mean as:

$$\mu = E\left[y_t\right]$$
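To make the long-run interpretation concrete, the following sketch (an illustration under assumed parameters, not the lesson’s own code) simulates a stationary AR(1) process and checks that its running sample mean settles near the unconditional mean, which for an AR(1) with $|\phi| < 1$ equals $c / (1 - \phi)$:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Assumed AR(1): y_t = c + phi * y_{t-1} + eps_t, with |phi| < 1 (stationary).
c, phi, n = 1.0, 0.8, 50_000
mu = c / (1 - phi)  # unconditional mean of a stationary AR(1): 5.0 here

y = np.empty(n)
y[0] = mu  # start at the unconditional mean to avoid a burn-in period
eps = rng.normal(size=n)
for t in range(1, n):
    y[t] = c + phi * y[t - 1] + eps[t]

# In the long run, the sample mean converges to the unconditional mean.
print(f"unconditional mean: {mu:.3f}")
print(f"sample mean:        {y.mean():.3f}")
```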
Inference

The problem with the unconditional mean, as with all other unconditional moments, is that we don’t usually know them. They are abstract concepts that define the distribution of ...