The mean squared error (MSE) is a widely used metric for measuring how well an algorithm's predictions match the observed values.
The mean squared error is the average of the squared differences between the observed and predicted values of a variable. The formula to calculate the MSE is as follows:

MSE = (1/n) * Σ (y_i - y_hat_i)^2

where y_i is an observed value, y_hat_i is the corresponding predicted value, and n is the number of observations.
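As a quick worked illustration of the formula, here is the calculation for the small data set used in the code further below (observed values 11, 20, 19, 17, 10; predicted values 12, 18, 19.5, 18, 9):

MSE = ((-1)^2 + 2^2 + (-0.5)^2 + (-1)^2 + 1^2) / 5 = 7.25 / 5 = 1.45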
In Python, the MSE can be calculated quite easily, especially with the use of lists.
Suppose you are given the observed and predicted values and wish to calculate the MSE. The calculation described above can be implemented as follows:
y = [11, 20, 19, 17, 10]        # observed values
y_bar = [12, 18, 19.5, 18, 9]   # predicted values

summation = 0   # variable to store the sum of squared differences
n = len(y)      # total number of items in the list

for i in range(0, n):                            # loop through each element of the lists
    difference = y[i] - y_bar[i]                 # difference between the observed and predicted value
    squared_difference = difference ** 2         # square of the difference
    summation = summation + squared_difference   # add the squared difference to the running total

MSE = summation / n   # divide the sum by the number of values to obtain the average
print("The Mean Square Error is:", MSE)
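Running the snippet above prints a mean squared error of 1.45 for these lists. As a quick cross-check, the same result can be obtained with a single vectorized expression; the sketch below assumes NumPy is installed and mirrors the variable names used above.

import numpy as np

# Observed and predicted values as NumPy arrays
y = np.array([11, 20, 19, 17, 10])
y_bar = np.array([12, 18, 19.5, 18, 9])

# Mean of the element-wise squared differences
mse = np.mean((y - y_bar) ** 2)
print("The Mean Square Error is:", mse)  # 1.45

If scikit-learn is available, the ready-made helper sklearn.metrics.mean_squared_error(y, y_bar) computes the same quantity.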