Kernels

Discover the power of kernels: functions, examples, tricks, and popular types.

Kernels are an important concept in machine learning and pattern recognition. A kernel is a mathematical function that maps input data into a high-dimensional feature space where it’s easier to classify or analyze. Kernels allow us to perform complex computations on data that would otherwise be difficult or impossible to process in its original form.

Figure: Applying a kernel on a 2D dataset

Kernel function

A kernel function can be thought of as a dot product in the feature space defined by the mapping $\phi$. Given two input vectors $\bold x_i$ and $\bold x_j$, the dot product in the feature space can be represented as $\phi(\bold x_i)^T\phi(\bold x_j)$. It’s possible to compute the dot product $\phi(\bold x_i)^T\phi(\bold x_j)$ in the feature space via a kernel function $k(\bold x_i, \bold x_j)$ applied directly to the input vectors.
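To make this identity concrete, here is a minimal NumPy sketch using a degree-2 polynomial kernel $k(\bold x_i, \bold x_j) = (\bold x_i^T\bold x_j)^2$ on 2D inputs, whose explicit feature map is $\phi(\bold x) = \begin{bmatrix}x_1^2 & \sqrt{2}\,x_1x_2 & x_2^2\end{bmatrix}^T$. The particular kernel, feature map, and test vectors are illustrative choices, not part of the original text:

```python
import numpy as np

def phi(x):
    # Explicit feature map for the degree-2 polynomial kernel on 2D inputs:
    # phi(x) = [x1^2, sqrt(2)*x1*x2, x2^2]
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

def poly_kernel(xi, xj):
    # Kernel k(xi, xj) = (xi . xj)^2, computed directly on the input vectors,
    # without ever forming phi(xi) or phi(xj)
    return np.dot(xi, xj) ** 2

xi = np.array([1.0, 2.0])
xj = np.array([3.0, 4.0])

print(np.dot(phi(xi), phi(xj)))  # 121.0 -- dot product in the feature space
print(poly_kernel(xi, xj))       # 121.0 -- same value from the kernel function
```

Both prints give the same number, illustrating that the kernel evaluates the feature-space dot product while only touching the original 2D vectors.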

Polynomial kernel example

Assume two input vectors $\bold x_i=\begin{bmatrix}x_{1i} & x_{2i}\end{bmatrix}^T$ ...