
Least Squared Error Solution

Explore how to calculate and interpret the least squared error solution in linear regression problems. Learn to minimize the sum of squared errors of a linear system using vectorized notation and matrix operations. Gain hands-on experience applying these concepts with Python functions such as the pseudo-inverse and least-squares solvers to approximate solutions, even for inconsistent systems.
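As a quick preview, here is a minimal sketch of both routes mentioned above, assuming NumPy is the library in use (the 3×2 system `A`, `b` below is made up for illustration):

```python
import numpy as np

# A made-up inconsistent system: 3 equations, 2 unknowns,
# so in general no exact solution exists.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Route 1: least squares via the pseudo-inverse, w = A⁺ b
w_pinv = np.linalg.pinv(A) @ b

# Route 2: NumPy's dedicated least-squares solver
w_lstsq, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)

print(w_pinv)    # both routes return the same minimizer
print(w_lstsq)
```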

Squared error

Squared distance is also known as squared error. Consider a linear equation in the $w_i$'s:

$$w_1a_1 + w_2a_2 + \dots + w_na_n = b$$

The squared error (squared distance) at a given point, $(\hat w_1, \hat w_2, \dots, \hat w_n)$, is defined as:

$$SE(\hat w_1, \hat w_2, \dots, \hat w_n) = (\hat w_1a_1 + \hat w_2a_2 + \dots + \hat w_na_n - b)^2$$

Note: In the case of $\hat w = w$, that is, when the point satisfies the equation exactly, the squared error is $0$. This implies that we're able to find an exact solution.
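For concreteness, here is a small sketch of evaluating the squared error at a point (the equation $2w_1 + 3w_2 = 7$ and the candidate points are made-up examples):

```python
import numpy as np

# Made-up single equation: 2*w1 + 3*w2 = 7
a = np.array([2.0, 3.0])
b = 7.0

def squared_error(w_hat, a, b):
    """SE at a candidate point: (w1*a1 + ... + wn*an - b)^2."""
    return (a @ w_hat - b) ** 2

print(squared_error(np.array([2.0, 1.0]), a, b))  # exact solution, SE = 0.0
print(squared_error(np.array([1.0, 1.0]), a, b))  # misses b by 2, SE = 4.0
```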

Sum of squared errors

Consider a linear system with $m$ equations and $n$ unknowns, and the corresponding squared errors at a point $(\hat w_1, \hat w_2, \dots, \hat w_n)$:

| Linear System | Squared Distance |
| --- | --- |
| $(1)\!: w_1a_{11} + w_2a_{12} + \dots + w_na_{1n} = b_1$ | $(\hat w_1a_{11} + \hat w_2a_{12} + \dots + \hat w_na_{1n} - b_1)^2$ |
| $(2)\!: w_1a_{21} + w_2a_{22} + \dots + w_na_{2n} = b_2$ | $(\hat w_1a_{21} + \hat w_2a_{22} + \dots + \hat w_na_{2n} - b_2)^2$ |
| $\vdots$ | $\vdots$ |
| $(m)\!: w_1a_{m1} + w_2a_{m2} + \dots + w_na_{mn} = b_m$ | $(\hat w_1a_{m1} + \hat w_2a_{m2} + \dots + \hat w_na_{mn} - b_m)^2$ |
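Summing the right-hand column gives the sum of squared errors, which in matrix form $Aw = b$ is the squared norm $\lVert A\hat w - b\rVert^2$. A minimal NumPy sketch of evaluating it (with a made-up $3 \times 2$ system):

```python
import numpy as np

# Made-up m=3, n=2 system written in matrix form A w = b
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

def sum_squared_errors(w_hat, A, b):
    """SSE at w_hat: each equation contributes (row_i . w_hat - b_i)^2;
    their sum is the squared norm ||A w_hat - b||^2."""
    r = A @ w_hat - b          # residual of every equation at once
    return float(r @ r)

w_hat = np.array([1.0, 0.5])
print(sum_squared_errors(w_hat, A, b))  # 0.5 for this candidate point
```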
...