SVM

SVM Introduction

The boundary separating the examples of different classes is called the decision boundary.

In SVM, a hyperplane is used as the decision boundary that classifies a feature vector **x** with the label y = +1 or y = āˆ’1. The hyperplane is expressed with two parameters, **w** and b:

$\mathbf{wx} - b = 0$

where $\mathbf{wx}$ means $w^{(1)}x^{(1)} + w^{(2)}x^{(2)} + \dots + w^{(D)}x^{(D)}$, and D is the number of dimensions of the feature vector $\mathbf{x}$.

A feature input **x** is then classified as y = +1 or y = āˆ’1 by the rule

$y = \operatorname{sign}(\mathbf{wx} - b)$
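For concreteness, here is a minimal Python sketch of this decision rule. The weight vector `w` and bias `b` are made-up values for illustration; in practice they are learned from the training data.

```python
import numpy as np

# Hypothetical parameters of a learned hyperplane (D = 2).
w = np.array([2.0, -1.0])
b = 0.5

def predict(x):
    """Classify feature vector x as +1 or -1 via sign(wx - b)."""
    score = np.dot(w, x) - b
    return 1 if score >= 0 else -1

print(predict(np.array([1.0, 0.0])))   # score =  1.5 -> +1
print(predict(np.array([0.0, 2.0])))   # score = -2.5 -> -1
```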

Classification

Choose a method according to the type of decision boundary the data calls for (a short scikit-learn sketch follows the list):

**Linear (hyperplane) Decision Boundary** of hard-margin classification

  • Maximal Margin Classifier

    • Hard margin classification is very sensitive to outliers in the training data

**Linear Decision Boundary** of soft-margin classification

  • Support Vector Classifier (Linear)

    • Allows misclassification

    • But it cannot handle data that is not linearly separable in the original feature space.

**Non-Linear Decision Boundary**

  • Support Vector Machine Classifier (Non-Linear)

    • Uses a kernel function to map the data into a higher-dimensional space where a linear boundary can separate it.

    • Common kernels: polynomial kernel, Gaussian RBF kernel.

    • Uses the kernel trick to avoid the computational cost of explicitly transforming the data from low to high dimension, as the numeric check after this list shows.
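The kernel trick can be checked with a small numeric example. The sketch below is purely illustrative: it shows that the degree-2 polynomial kernel $K(\mathbf{a}, \mathbf{b}) = (\mathbf{ab})^2$ in 2-D equals a dot product after the explicit map $\phi(\mathbf{x}) = (x_1^2, \sqrt{2}\,x_1 x_2, x_2^2)$, without ever constructing the higher-dimensional vectors.

```python
import numpy as np

a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0])

def phi(x):
    """Explicit degree-2 feature map for a 2-D vector."""
    return np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])

direct = np.dot(a, b) ** 2          # kernel trick: stay in 2-D
explicit = np.dot(phi(a), phi(b))   # explicit 3-D transformation
print(direct, explicit)             # both print 121.0
```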
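And here is a minimal scikit-learn sketch of the three boundary types above; the dataset and the `C` values are illustrative assumptions. `make_circles` produces data that no straight line can separate, so the linear models should score near chance while the RBF-kernel model fits the circular boundary.

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Concentric circles: not linearly separable in 2-D.
X, y = make_circles(n_samples=200, noise=0.1, factor=0.3, random_state=0)

# A hard margin is approximated with a very large C (almost no
# tolerance for misclassification); a soft margin uses a small C.
hard_margin = SVC(kernel="linear", C=1e6).fit(X, y)
soft_margin = SVC(kernel="linear", C=1.0).fit(X, y)

# Non-linear boundary via the kernel trick (Gaussian RBF kernel).
rbf_svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)

for name, model in [("hard-margin linear", hard_margin),
                    ("soft-margin linear", soft_margin),
                    ("RBF kernel", rbf_svm)]:
    print(name, "train accuracy:", model.score(X, y))
```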

Regression

  • Support Vector Regression
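A minimal SVR sketch with illustrative parameters: it fits a noisy sine curve with an RBF kernel. The `epsilon` parameter sets the half-width of the tube around the prediction inside which errors are ignored.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 5, size=(80, 1)), axis=0)
y = np.sin(X).ravel() + rng.normal(0, 0.1, size=80)

# epsilon-insensitive tube of half-width 0.1; C trades off
# flatness against tolerance for points outside the tube.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)
print("Training R^2:", svr.score(X, y))
```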

Method Comparison

Maximal Margin Classifier vs SVC

The maximal margin classifier uses a hard margin: it requires linearly separable data and is very sensitive to outliers. The SVC relaxes this with a soft margin that tolerates some misclassification in exchange for robustness.

SVC vs SVM

The SVC only draws linear decision boundaries. The SVM classifier adds kernel functions (via the kernel trick), so its boundary can be non-linear in the original feature space.

Study plan

Study SVM in the following order:

  1. SVM concept

  2. Maximal margin classifier

  3. Support vector classifier

  4. Support vector machine classifier

  5. Support vector regression
