What is singular value decomposition?

Singular value decomposition (SVD) is a mathematical technique that factorizes a matrix into the product of three matrices, allowing us to understand the underlying structure of the original matrix.

Given a matrix A of size m×n, its SVD is represented as:

A = U \Sigma V^{T}

where:

  • U is an m×m orthogonal matrix, i.e., a matrix whose inverse is its transpose (its columns are orthogonal unit vectors).

  • Σ is an m×n diagonal matrix with non-negative real numbers on the diagonal. The diagonal elements, called the singular values, are the square roots of the positive eigenvalues of A^{T}A (equivalently, of AA^{T}).

  • V^{T} is the transpose of an n×n orthogonal matrix V (its rows are orthogonal unit vectors).
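As a quick numerical illustration of these shapes and properties, here is a minimal sketch using NumPy's built-in SVD routine on an arbitrary 4×3 matrix (the matrix is assumed purely for illustration and is unrelated to the example below):

```python
import numpy as np

# An arbitrary 4x3 example matrix (assumed for illustration only).
A = np.array([[3.0, 1.0, 2.0],
              [0.0, 4.0, 1.0],
              [2.0, 2.0, 5.0],
              [1.0, 0.0, 3.0]])

# full_matrices=True returns U as m x m and V^T as n x n.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

print(U.shape, s.shape, Vt.shape)          # (4, 4) (3,) (3, 3)

# Rebuild the m x n diagonal matrix Sigma from the singular values.
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)

print(np.allclose(U @ U.T, np.eye(4)))     # True: U is orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(3)))   # True: V is orthogonal
print(np.allclose(U @ Sigma @ Vt, A))      # True: A = U Sigma V^T
```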

Example

Suppose we have a matrix A and we want to calculate its SVD.

Calculating V

  1. To find a matrix of the eigenvectors of A^{T} \times A, we'll first find A^{T} \times A:

A^{T}A = \begin{bmatrix} 26 & 18\\ 18 & 74 \end{bmatrix}
  2. After getting A^{T}A, we'll find its eigenvalues by solving the characteristic equation:

\det\left(\begin{bmatrix} 26 & 18\\ 18 & 74 \end{bmatrix} - \lambda\begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix}\right) = 0

(26-\lambda)(74-\lambda) - 18 \times 18 = 0

\lambda^{2} - 100\lambda + 1600 = 0

(\lambda - 20)(\lambda - 80) = 0

Hence, the eigenvalues are 20 and 80.

  3. Now that we have the eigenvalues, we need to find the corresponding eigenvectors. Let's suppose v_{1} and v_{2} are the eigenvectors corresponding to the eigenvalues 20 and 80, respectively:

\left(\begin{bmatrix} 26 & 18\\ 18 & 74 \end{bmatrix} - \begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix} \cdot 20\right) \cdot v_{1} = 0

\left(\begin{bmatrix} 26 & 18\\ 18 & 74 \end{bmatrix} - \begin{bmatrix} 20 & 0\\ 0 & 20 \end{bmatrix}\right) \cdot v_{1} = 0

\begin{bmatrix} 6 & 18\\ 18 & 54 \end{bmatrix} \cdot \begin{bmatrix} x_{1}\\ y_{1} \end{bmatrix} = 0

6x_{1} + 18y_{1} = 0 \\ 18x_{1} + 54y_{1} = 0

This system of linear equations has infinitely many solutions. One of them is:

x_{1} = -3 \\ y_{1} = 1

Hence, v_{1} = \begin{bmatrix} -3\\ 1 \end{bmatrix}

Normalizing the eigenvector: v_{1n} = \begin{bmatrix} \frac{-3}{\sqrt{10}}\\ \frac{1}{\sqrt{10}} \end{bmatrix}

\left(\begin{bmatrix} 26 & 18\\ 18 & 74 \end{bmatrix} - \begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix} \cdot 80\right) \cdot v_{2} = 0

\left(\begin{bmatrix} 26 & 18\\ 18 & 74 \end{bmatrix} - \begin{bmatrix} 80 & 0\\ 0 & 80 \end{bmatrix}\right) \cdot v_{2} = 0

\begin{bmatrix} -54 & 18\\ 18 & -6 \end{bmatrix} \cdot \begin{bmatrix} x_{2}\\ y_{2} \end{bmatrix} = 0

-54x_{2} + 18y_{2} = 0 \\ 18x_{2} - 6y_{2} = 0

This system of linear equations has infinitely many solutions. One of them is:

x_{2} = 1 \\ y_{2} = 3

Hence, v_{2} = \begin{bmatrix} 1\\ 3 \end{bmatrix}

Normalizing the eigenvector: v_{2n} = \begin{bmatrix} \frac{1}{\sqrt{10}}\\ \frac{3}{\sqrt{10}} \end{bmatrix}

  4. Now, we can say that:

V = \begin{bmatrix} v_{1n} & v_{2n} \end{bmatrix} = \begin{bmatrix} \frac{-3}{\sqrt{10}} & \frac{1}{\sqrt{10}}\\ \frac{1}{\sqrt{10}} & \frac{3}{\sqrt{10}} \end{bmatrix}
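As a numerical cross-check of this step, the sketch below (assuming NumPy) feeds the same matrix into np.linalg.eigh, which returns eigenvalues in ascending order and normalized eigenvectors as columns; the eigenvectors may differ from the hand-derived ones by a sign:

```python
import numpy as np

# The symmetric matrix A^T A derived above.
AtA = np.array([[26.0, 18.0],
                [18.0, 74.0]])

# eigh is the eigensolver for symmetric matrices; it returns eigenvalues
# in ascending order and the normalized eigenvectors as columns.
eigenvalues, eigenvectors = np.linalg.eigh(AtA)

print(eigenvalues)    # [20. 80.]
print(eigenvectors)   # columns ~ [-3, 1]/sqrt(10) and [1, 3]/sqrt(10), up to sign
```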
Calculating Σ

The eigenvalues we got are 20 and 80. The singular values are their square roots, so our Σ matrix will be:

\Sigma = \begin{bmatrix} \sqrt{20} & 0\\ 0 & \sqrt{80} \end{bmatrix} = \begin{bmatrix} 2\sqrt{5} & 0\\ 0 & 4\sqrt{5} \end{bmatrix}
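The same step in code, as a minimal sketch assuming NumPy:

```python
import numpy as np

eigenvalues = np.array([20.0, 80.0])

# The singular values are the square roots of the eigenvalues of A^T A.
Sigma = np.diag(np.sqrt(eigenvalues))

print(Sigma)   # [[4.472..., 0.], [0., 8.944...]]
```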
Calculating U

To calculate U, we can use the following relation, which gives each column of U from the corresponding column of V and singular value:

U = A V \Sigma^{-1}, \qquad \text{i.e.,} \quad u_{i} = \frac{1}{\sigma_{i}} A v_{i}
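The original matrix A is not reproduced above, so the following sketch assumes A = [[5, 5], [-1, 7]] purely for illustration; its A^T A is exactly the matrix [[26, 18], [18, 74]] used in the earlier steps, so the V and Σ derived above apply to it:

```python
import numpy as np

# Assumed example matrix: its A^T A equals [[26, 18], [18, 74]].
A = np.array([[5.0, 5.0],
              [-1.0, 7.0]])

# V and Sigma as derived in the steps above.
V = np.array([[-3.0, 1.0],
              [1.0, 3.0]]) / np.sqrt(10)
Sigma = np.diag(np.sqrt([20.0, 80.0]))

# U = A V Sigma^{-1}
U = A @ V @ np.linalg.inv(Sigma)

print(U)                                   # ~[[-0.7071, 0.7071], [0.7071, 0.7071]]
print(np.allclose(U.T @ U, np.eye(2)))     # True: U is orthogonal
```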
Combining calculations

The final form of the SVD, after combining all the calculations, is:

A = U \Sigma V^{T}

with the U, Σ, and V computed above.
You can verify the calculations by multiplying U, Σ, and V^{T}; the product should reproduce the original matrix A.
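A programmatic version of this verification, again with the illustrative A = [[5, 5], [-1, 7]] assumed above and NumPy's built-in SVD (note that np.linalg.svd sorts singular values in descending order, so its factors can differ from the hand calculation by column order and sign):

```python
import numpy as np

A = np.array([[5.0, 5.0],
              [-1.0, 7.0]])

# NumPy's SVD returns U, the singular values, and V^T directly.
U, s, Vt = np.linalg.svd(A)

print(s)                                       # [8.944..., 4.472...] = [sqrt(80), sqrt(20)]
print(np.allclose(U @ np.diag(s) @ Vt, A))     # True: the factors reproduce A
```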
Conclusion

SVD has vast applications in machine learning and related fields, such as dimensionality reduction, signal processing, and image compression. It can also be used to analyze the transformation that a matrix applies to an image.
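As a minimal sketch of how this is used in practice (assuming NumPy and a random matrix standing in for real data), a rank-k truncation of the SVD keeps only the k largest singular values and gives the best rank-k approximation of the data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))   # stand-in for a data or image matrix

U, s, Vt = np.linalg.svd(X, full_matrices=False)

k = 10  # number of singular values to keep
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]   # best rank-k approximation of X

# Relative reconstruction error of the rank-k approximation.
error = np.linalg.norm(X - X_k) / np.linalg.norm(X)
print(f"rank-{k} relative error: {error:.3f}")
```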

Note: Read about eigenvalue decomposition.
