Principal Component Analysis
Gain insights into PCA theory, address dimensionality challenges, and acquire hands-on skills in PCA implementation using real-world data.
Curse of dimensionality
The curse of dimensionality in machine learning refers to the challenges and computational complexities that arise when dealing with a large number of features (high-dimensional data, or a high-dimensional feature space). As the number of features or dimensions increases, the amount of data needed to find reliable and meaningful patterns also increases, which drives up data and computational demands and raises the risk of overfitting.
Example
Consider a product recommendation system where each product is described by multiple features such as price, size, color, brand, and so on. As the number of features increases, the number of possible feature combinations grows exponentially, making it harder to find meaningful relationships between products and user preferences. The data points become sparse in this high-dimensional space, which makes accurate predictions more challenging and requires more data to avoid unreliable results, illustrating the curse of dimensionality.
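To make this concrete, here is a small sketch (not from the lesson; it assumes NumPy and uniformly random synthetic data) showing one well-known symptom of the curse of dimensionality: as the number of features grows, the nearest and farthest points from any reference point end up at nearly the same distance, so notions of similarity lose their meaning.

```python
# Illustrate distance concentration: the relative gap between the
# nearest and farthest neighbor shrinks as dimensionality grows.
import numpy as np

rng = np.random.default_rng(0)

for d in (2, 10, 100, 1000):
    points = rng.random((500, d))                            # 500 random points in d dimensions
    dists = np.linalg.norm(points - points[0], axis=1)[1:]   # distances from the first point
    contrast = (dists.max() - dists.min()) / dists.min()     # relative nearest/farthest gap
    print(f"d={d:4d}  relative distance contrast: {contrast:.3f}")
```

Running this, the contrast drops sharply with increasing d: in 2 dimensions the farthest point is many times farther than the nearest one, while in 1,000 dimensions almost all points sit at roughly the same distance.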
It seems desirable to reduce the number of features while preserving the information. Does the term “compression” ring a bell?
Dimensionality reduction
Dimensionality reduction decreases the number of features, either by selecting the most significant ones or by transforming them into a smaller set of new features; the sketch below contrasts the two approaches. Not all dimensionality reduction methods aim to preserve information (so that the data can be reconstructed, or decompressed); different methods are designed around different objectives.
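As a hedged illustration of these two families (our own sketch, assuming scikit-learn and synthetic data, not code from the lesson): feature selection keeps a subset of the original columns unchanged, while feature transformation, such as PCA, replaces all columns with new combined features.

```python
# Contrast feature selection with feature transformation.
import numpy as np
from sklearn.feature_selection import VarianceThreshold
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:, 4] = 0.01 * rng.normal(size=200)      # a nearly constant, uninformative feature

# Selection: drop low-variance columns; the surviving columns are unchanged.
X_sel = VarianceThreshold(threshold=0.1).fit_transform(X)

# Transformation: replace all columns with new composite features.
X_pca = PCA(n_components=3).fit_transform(X)

print(X_sel.shape, X_pca.shape)            # (200, 4) (200, 3)
```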
PCA
Principal Component Analysis (PCA) is a dimensionality reduction technique that identifies key patterns and relationships within data by projecting it onto a lower-dimensional space while preserving as much variance as possible.
To understand PCA, we first need to understand dimensions. Imagine you’re in a video game where you can move forward, backward, left, and right. These are two dimensions. Now, imagine you can also fly up or dig down. That’s a third dimension. In data science, dimensions are like these directions, but they can be anything: age, height, income, and so on.
Note: We can easily visualize up to three dimensions, but what if we have more? That’s where PCA comes in. It helps us reduce the number of dimensions while keeping the most important information intact, as the sketch below demonstrates.
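As a quick hands-on sketch (assuming scikit-learn and its built-in iris dataset; not code from the lesson), we can project the four-dimensional iris measurements down to two dimensions and check how much of the original variance those two principal components retain.

```python
# Reduce 4-dimensional data to 2 dimensions and measure retained variance.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X, _ = load_iris(return_X_y=True)
X_scaled = StandardScaler().fit_transform(X)     # PCA is sensitive to feature scale

pca = PCA(n_components=2)
X_2d = pca.fit_transform(X_scaled)

print(X_2d.shape)                                # (150, 2): 4 dimensions reduced to 2
print(pca.explained_variance_ratio_.sum())       # ~0.96: most variance is preserved
```

Two components retain about 96% of the variance here, so plotting X_2d gives a faithful two-dimensional picture of the original four-dimensional data.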
Properties of PCA
To explain the essential properties of PCA, let’s take an example of n data points in d-dimensional space arranged as the columns of the matrix X ...
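Since the worked example is cut off here, the following is only a minimal from-scratch sketch of that setup (the names X, d, n, and k are our notation, assuming NumPy): center the columns of X, take the singular value decomposition of the centered matrix, and keep the top principal directions.

```python
# From-scratch PCA for data stored as the columns of a d x n matrix X.
import numpy as np

rng = np.random.default_rng(0)
d, n = 3, 100
X = rng.normal(size=(d, n))                      # each column is one data point

X_centered = X - X.mean(axis=1, keepdims=True)   # center each feature (row)

# The left singular vectors of the centered data are the principal directions.
U, S, _ = np.linalg.svd(X_centered, full_matrices=False)

k = 2                                            # target dimensionality
components = U[:, :k]                            # top-k principal directions (d x k)
X_projected = components.T @ X_centered          # k x n: data in the reduced space

explained = (S[:k] ** 2).sum() / (S ** 2).sum()  # fraction of variance retained
print(X_projected.shape, f"variance retained: {explained:.2f}")
```

Computing the principal directions via the SVD of the centered data is numerically equivalent to taking the eigenvectors of the covariance matrix, but avoids forming the covariance matrix explicitly.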