Model Evaluation Measures (Explained Variance Score, MAE, MSE)
In this lesson we will look at different evaluation measures for Regression Models.
Regression Models Evaluation Metrics
Once we have built a model on the training dataset, it is time to evaluate the model on the test dataset to check how well it performs. Evaluation also helps us determine:
- If the model is overfitting
- If the model is underfitting
- If we need to revise our Feature Engineering or Feature Selection techniques.
We use the following measures to assess the performance of a Regression Model.
Explained Variance Score
Explained variance is one of the key measures for evaluating regression models. In statistics, explained variation measures the proportion to which a regression model accounts for the variation (dispersion) of a given data set.
Formula
If ŷ is the predicted target value, y is the corresponding (correct) target value, and Var is the variance (the square of the standard deviation), then the explained variance is estimated as follows:

explained_variance(y, ŷ) = 1 − Var{y − ŷ} / Var{y}

The best possible score is 1.0; lower values are worse.
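As a sanity check, the formula can be computed directly with NumPy. The arrays below are hypothetical values chosen for illustration, not data from this lesson:

```python
import numpy as np

# Hypothetical actual and predicted target values (illustrative only)
y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

# Explained variance = 1 - Var(y_true - y_pred) / Var(y_true)
explained_variance = 1 - np.var(y_true - y_pred) / np.var(y_true)
print(round(explained_variance, 4))  # → 0.9572
```

A score this close to 1.0 means the model accounts for nearly all of the variance in the targets; a score near 0 would mean the residuals vary almost as much as the targets themselves.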
Code
These code examples are taken from the scikit-learn documentation. In all of the code below:
- y_true holds the actual (ground-truth) values.
- y_pred holds the predicted values.
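A minimal sketch of computing the score with scikit-learn's explained_variance_score, following the style of the examples in its documentation (the values below are hypothetical):

```python
from sklearn.metrics import explained_variance_score

# Hypothetical ground-truth and predicted values
y_true = [3, -0.5, 2, 7]
y_pred = [2.5, 0.0, 2, 8]

# Returns 1 - Var(y_true - y_pred) / Var(y_true)
score = explained_variance_score(y_true, y_pred)
print(round(score, 4))  # → 0.9572
```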