SVM

SVM Introduction

The boundary separating the examples of different classes is called the decision boundary.

In SVM, a hyperplane is used as the decision boundary that classifies a feature **x** into the label y = +1 or y = −1. The hyperplane is expressed with two parameters, **w** and b:

${\bf wx} - b = 0$

where the expression **wx** denotes the dot product w(1)x(1) + w(2)x(2) + . . . + w(D)x(D), and D is the number of dimensions of the feature vector **x**.

A feature input **x** is classified as y = +1 or y = −1 by

$y = \mathrm{sign}({\bf wx} - b)$
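As a minimal sketch, the decision rule above can be computed directly. The weight vector **w** and bias b below are hypothetical, hand-picked values rather than parameters learned from data:

```python
# Decision rule y = sign(wx - b) for a linear SVM, assuming the
# parameters w and b are already known (here they are hand-picked).
def predict(w, x, b):
    """Classify x as +1 or -1 using the hyperplane wx - b = 0."""
    score = sum(wi * xi for wi, xi in zip(w, x)) - b
    return 1 if score >= 0 else -1

w = [2.0, -1.0]   # hypothetical weight vector
b = 0.5           # hypothetical bias
print(predict(w, [1.0, 0.0], b))   # wx - b = 1.5, so +1
print(predict(w, [0.0, 2.0], b))   # wx - b = -2.5, so -1
```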

Classification

Choose a method according to the type of decision boundary the data allows:

Linear (hyperplane) Decision Boundary with hard-margin classification

  • Maximal Margin Classifier

    • Hard margin classification is very sensitive to outliers in the training data

**Linear Decision Boundary** with soft-margin classification

  • Support Vector Classifier (Linear)

    • Allows misclassification

    • But it may not work for data that is not linearly separable without mapping the features to a higher dimension.
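A sketch of a soft-margin support vector classifier using scikit-learn's `SVC` with a linear kernel; the toy points and labels below are made up for illustration. The regularization parameter `C` controls how much misclassification is tolerated (smaller `C` gives a softer margin):

```python
from sklearn.svm import SVC

# Toy 2-D data, separable by the line x2 = x1.
X = [[1, 0], [2, 0], [2, 1], [3, 1],   # class -1 (below the line)
     [0, 1], [0, 2], [1, 2], [1, 3]]   # class +1 (above the line)
y = [-1, -1, -1, -1, 1, 1, 1, 1]

clf = SVC(kernel="linear", C=1.0)  # smaller C -> softer margin
clf.fit(X, y)
print(clf.predict([[3, 0], [0, 3]]))  # [-1, 1]
```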

Non-Linear Decision Boundary

  • Support Vector Machine Classifier (Non-Linear)

    • Uses a kernel function to map the data to a higher dimension.

    • Common kernels: polynomial kernel, Gaussian RBF kernel.

    • Uses the kernel trick to avoid the cost of explicitly transforming the data from low to high dimension.
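The kernel trick can be illustrated with the degree-2 polynomial kernel K(x, z) = (x·z)² on 2-D inputs: its explicit feature map is φ(x) = (x1², √2·x1x2, x2²), and the kernel returns the same value as a dot product in that 3-D space without ever computing φ. The input values below are arbitrary:

```python
import math

# Degree-2 polynomial kernel K(x, z) = (x . z)^2 on 2-D inputs.
def phi(x):
    """Explicit feature map into 3-D: (x1^2, sqrt(2)*x1*x2, x2^2)."""
    x1, x2 = x
    return (x1 * x1, math.sqrt(2) * x1 * x2, x2 * x2)

def kernel(x, z):
    """Kernel trick: same value as phi(x) . phi(z), computed in 2-D."""
    return (x[0] * z[0] + x[1] * z[1]) ** 2

x, z = (1.0, 2.0), (3.0, 0.5)
lhs = sum(a * b for a, b in zip(phi(x), phi(z)))  # dot product in high dim
rhs = kernel(x, z)                                # kernel trick, low dim only
print(abs(lhs - rhs) < 1e-9)  # True: both give the same value
```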

Regression

  • Support Vector Regression
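A minimal sketch of Support Vector Regression with scikit-learn's `SVR`, fit to a noiseless toy line y = 2x + 1. The `epsilon` parameter defines a tube around the fit inside which errors are ignored; points outside the tube become support vectors:

```python
from sklearn.svm import SVR

X = [[x] for x in range(10)]
y = [2 * x[0] + 1 for x in X]   # noiseless line y = 2x + 1

reg = SVR(kernel="linear", C=100.0, epsilon=0.1)
reg.fit(X, y)
print(reg.predict([[5.0]])[0])  # close to 2*5 + 1 = 11
```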

Method Comparison

Maximal Margin Classifier vs SVC

SVC vs SVM

Study plan

Study SVM in the following order:

  1. SVM concept

  2. Maximal margin classifier

  3. Support vector classifier

  4. Support vector machine classifier

  5. Support vector regression


