...

Alternating Least Squares Regression

Learn how to solve alternating least squares regression using the gradient-solving approach.

Matrix decomposition

Consider a movie recommendation system with $n$ users and $m$ movies. Suppose the matrix $A \in \R^{n \times m}$ represents the rating matrix, where its $(i,j)^{th}$ entry $A_{ij}$ is the rating given by the $i^{th}$ user to the $j^{th}$ movie. We would like to represent/decompose the matrix $A$ as a product of two matrices $X \in \R^{n \times r}$ and $Y \in \R^{m \times r}$ as follows:

$$A \approx XY^\top$$

Here, $X \in \R^{n \times r}$ and $Y \in \R^{m \times r}$ are often referred to as the user and item matrices, respectively. The user matrix $X$ can represent the behavior of all $n$ users in an $r$-dimensional latent space. Similarly, the item matrix $Y$ can represent all $m$ movies in the same $r$-dimensional latent space.
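To make the shapes concrete, here is a minimal NumPy sketch (not the lesson's code) of such a rank-$r$ decomposition. The sizes `n_users`, `n_movies`, and `rank` are illustrative assumptions, and a truncated SVD is used only to obtain one possible factorization into a user matrix $X$ and an item matrix $Y$ so that $A \approx XY^\top$.

```python
import numpy as np

rng = np.random.default_rng(0)

n_users, n_movies, rank = 6, 4, 2          # illustrative sizes (assumed)

# Build a ground-truth low-rank rating matrix from random latent factors.
X_true = rng.normal(size=(n_users, rank))   # n x r user factors
Y_true = rng.normal(size=(n_movies, rank))  # m x r item factors
A = X_true @ Y_true.T                       # n x m rating matrix

# One convenient way to obtain a rank-r factorization is the truncated SVD.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
X = U[:, :rank] * s[:rank]                  # user matrix, n x r
Y = Vt[:rank, :].T                          # item matrix, m x r

# The product X @ Y.T recovers A up to numerical error.
print(np.allclose(A, X @ Y.T))              # True
```

In the alternating least squares setting, $X$ and $Y$ would instead be learned iteratively from the observed ratings; the SVD above is used only to illustrate the shape and meaning of the two factor matrices.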

Furthermore, the computational complexity of multiplying the matrices $X$ and $Y^\top$ is of the order $O(nmr)$ ...