Moments of a Time Series: Mean and Variance
Learn about the first two, and most popular, moments of a time series' distribution: the mean and the variance.
Motivation
The moments of a time series can often tell us a lot about its distribution. They summarize the key properties of the time series, giving us a great deal of information about what future realizations may look like. Generally speaking, we can define the $k$-th unconditional moment as:

$$m_k = E\left[y_t^k\right]$$

In this definition, $y_t$ is the value of the time series at time $t$, $E[\cdot]$ denotes the expectation taken over the series' unconditional distribution, and $k$ is the order of the moment.
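In practice, an unconditional moment can be estimated by averaging $y_t^k$ over the observed data. Below is a minimal sketch of that idea with NumPy; the simulated white-noise series, its mean of 5, and the series length are illustrative assumptions, not part of the lesson.

```python
import numpy as np

rng = np.random.default_rng(42)

# A toy time series: Gaussian white noise centered at 5.
# (The series and its parameters are illustrative choices.)
y = 5 + rng.normal(size=10_000)

def sample_moment(series, k):
    """Estimate the k-th unconditional moment E[y_t^k] by averaging y_t^k."""
    return np.mean(series ** k)

m1 = sample_moment(y, 1)  # first moment: the mean, close to 5
m2 = sample_moment(y, 2)  # second raw moment: E[y_t^2] = 5^2 + 1 = 26 here
print(m1, m2)
```

Averaging over time like this is justified for stationary series, where the unconditional moments do not change from one period to the next.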
The two most popular moments of any time series are the first, called the mean, and the second, the variance. Let’s look into them.
Mean
The mean is the first moment of a distribution, and it is used to locate (or center) the time series. The mean is the value that a stationary series converges to in the long run. We might encounter other terms for the mean, such as expected value or average. We usually denote the mean with the Greek letter $\mu$ (mu).
Inference
The problem with the unconditional mean, as with any other unconditional moment, is that we don't usually know it. These moments are abstract quantities that define the distribution of the time series, so in practice we have to estimate them from the data we observe.
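Because we only ever see a finite sample, the sample mean is an estimate of the unknown $\mu$, and it varies from one realization to the next. The sketch below makes that sampling variability visible by simulating many realizations of the same hypothetical white-noise series (the true mean is known here only because we generate the data ourselves).

```python
import numpy as np

rng = np.random.default_rng(7)
true_mean = 3.0  # known here only because we simulate the data ourselves

# Estimate the mean from each of many independent realizations; the
# estimates scatter around the true (in practice unknown) value.
estimates = [
    (true_mean + rng.normal(size=500)).mean()  # sample mean of one realization
    for _ in range(1_000)
]

print(np.mean(estimates))  # the estimates center on the true mean
print(np.std(estimates))   # ...with spread of roughly 1 / sqrt(500)
```

The spread of the estimates shrinks as the sample grows, which is what makes inference about the unconditional mean possible from a single, sufficiently long realization.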