Dual Formulation

Learn the dual formulation and derive the Lagrangian dual function for hard-margin and soft-margin SVM.

Why dual formulation?

The dual formulation of SVM is important because it allows us to kernelize the optimization problem: we can find a separating hyperplane in a high-dimensional feature space without ever computing the feature map explicitly. Additionally, the dual formulation makes the sparsity of the solution explicit, since only the support vectors receive non-zero multipliers, which helps explain SVM’s strong generalization ability in high-dimensional feature spaces.

Lagrangian dual of hard-margin SVM

In the case of hard-margin SVM, we have a constrained optimization problem where we want to find the parameters $\bold w$ that minimize the objective function subject to the constraint that all training samples are classified correctly with a margin of at least 1. This can be written as:

$$\min_{\bold w} \frac{1}{2}\|\bold w\|^2 \quad \text{s.t.} \quad y_i(\bold{w}^{T}\phi(\bold x_i)) \ge 1 \quad \forall i$$
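To make the primal problem concrete, here is a minimal numerical sketch using cvxpy. The toy data, the choice of cvxpy as solver, and taking $\phi$ as the identity map (with no bias term, as in the formulation above) are illustrative assumptions, not part of the original derivation.

```python
import numpy as np
import cvxpy as cp

# Toy, linearly separable data with labels in {-1, +1} (illustrative only)
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

# Primal hard-margin SVM with phi = identity and no bias term:
#   minimize (1/2)||w||^2   s.t.   y_i * (w^T x_i) >= 1 for all i
w = cp.Variable(X.shape[1])
objective = cp.Minimize(0.5 * cp.sum_squares(w))
constraints = [cp.multiply(y, X @ w) >= 1]
cp.Problem(objective, constraints).solve()

print("w =", w.value)
print("margins y_i * w^T x_i:", y * (X @ w.value))  # all >= 1 at the optimum
```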

We introduce a Lagrange multiplier for each constraint to apply the Lagrangian method. We collect these multipliers in a vector $\bold a$, with $a_i \ge 0$, and write the Lagrangian as:

$$L(\bold w, \bold a) = \frac{1}{2}\|\bold w\|^2 - \sum_{i} a_i \left[ y_i \bold{w}^{T}\phi(\bold x_i) - 1 \right]$$
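As a quick sanity check (not stated explicitly above), maximizing this Lagrangian over the multipliers recovers the primal problem: if any constraint is violated, the corresponding $a_i$ can be driven to infinity, while for a feasible $\bold w$ the best choice is $a_i = 0$, leaving only the original objective:

$$\max_{\bold a \ge 0} L(\bold w, \bold a) =
\begin{cases}
\frac{1}{2}\|\bold w\|^2 & \text{if } y_i \bold{w}^{T}\phi(\bold x_i) \ge 1 \;\; \forall i, \\
+\infty & \text{otherwise,}
\end{cases}$$

so $\min_{\bold w} \max_{\bold a \ge 0} L(\bold w, \bold a)$ is exactly the hard-margin primal, and the dual problem is obtained by swapping the order of the $\min$ and $\max$.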