# Linear Regression

## Linear Regression

See the Numerical Methods lecture note.

![](https://3698175758-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MAwtzMy_pbrChIExFtN%2Fuploads%2Fgit-blob-db43189f8df7bb13debcfc7c002df5f43aba6a83%2Fimage.png?alt=media)

### Cost Function

![](https://3698175758-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MAwtzMy_pbrChIExFtN%2Fuploads%2Fgit-blob-0b3fe6f941b9e494137b44aa935e984a028dcbbe%2Fimage.png?alt=media)
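Assuming the standard setup from the figure, with hypothesis $h_\theta(x) = X\theta$ and mean-squared-error cost $J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}(h_\theta(x^{(i)}) - y^{(i)})^2$, a minimal sketch of the cost function (variable names are illustrative):

```python
import numpy as np

def cost(theta, X, y):
    """MSE cost J(theta) = (1/2m) * sum((X @ theta - y)**2)."""
    m = len(y)
    residual = X @ theta - y
    return (residual @ residual) / (2 * m)

# Toy data generated from y = 1 + 2x, with a bias column prepended to X
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])
print(cost(np.array([1.0, 2.0]), X, y))  # exact fit, so the cost is 0.0
```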

### Optimization

#### 1. Solve analytically:

* Least-squares solution via the normal equation
* No iteration needed, and no learning rate to design
* Slow when the number of features n is very large (solving the normal equation costs roughly O(n³))

![normal equation](https://3698175758-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MAwtzMy_pbrChIExFtN%2Fuploads%2Fgit-blob-9a43f950a614826afd939cf0c1a46d25ec372ab5%2Fimage.png?alt=media)
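The normal equation $\theta = (X^TX)^{-1}X^Ty$ can be sketched as follows; solving the linear system is preferred over forming the explicit inverse for numerical stability:

```python
import numpy as np

def normal_equation(X, y):
    """Closed-form least squares: solve (X^T X) theta = X^T y."""
    return np.linalg.solve(X.T @ X, X.T @ y)

# Data from y = 1 + 2x; the solver should recover intercept 1 and slope 2
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])
theta = normal_equation(X, y)
```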

#### 2. Gradient Descent:

* Can use batch gradient descent
* Idea: make sure features are on a similar scale (e.g., −1 ≤ 𝑥ᵢ ≤ 1) via normalization
* Check (and tune) the learning rate
* Needs many iterations
* Works well even when n is large

![](https://3698175758-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MAwtzMy_pbrChIExFtN%2Fuploads%2Fgit-blob-32d8db5003e84b71ad7fb3f9b200ea4494bd78f1%2Fimage.png?alt=media)
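The "similar scale" idea in the list above is usually implemented as mean normalization: subtract each feature's mean and divide by its standard deviation. A minimal sketch (function name is illustrative):

```python
import numpy as np

def normalize_features(X):
    """Mean normalization: each column gets mean 0 and standard deviation 1,
    so all features land on a similar scale (roughly [-1, 1])."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    return (X - mu) / sigma, mu, sigma

# Features on very different scales (e.g., house size vs. number of rooms)
X = np.array([[2000.0, 3.0], [1500.0, 2.0], [2500.0, 4.0]])
X_norm, mu, sigma = normalize_features(X)
```

Remember to store `mu` and `sigma`: new inputs must be scaled with the same statistics before prediction.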

Derivation of the gradient of the cost function J:

![](https://3698175758-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MAwtzMy_pbrChIExFtN%2Fuploads%2Fgit-blob-9da0c1ad578619da14907d582389b28ff1dbf27b%2Fimage.png?alt=media)
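Assuming the gradient takes the usual vectorized form $\nabla J(\theta) = \frac{1}{m} X^T(X\theta - y)$, batch gradient descent can be sketched as (the learning rate and iteration count are illustrative defaults):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, n_iters=1000):
    """Batch gradient descent on the MSE cost:
    repeat theta := theta - alpha * (1/m) * X^T (X theta - y)."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(n_iters):
        grad = X.T @ (X @ theta - y) / m  # gradient over the full batch
        theta -= alpha * grad
    return theta

# Data from y = 1 + 2x; iterates should converge toward [1, 2]
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])
theta = gradient_descent(X, y)
```

If the cost increases between iterations, the learning rate `alpha` is too large; if convergence is very slow, it may be too small.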

### Polynomial Regression

![](https://3698175758-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F-MAwtzMy_pbrChIExFtN%2Fuploads%2Fgit-blob-758a34d2302d7c0bf718c948419b3a989fb70d76%2Fimage.png?alt=media)
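Polynomial regression reuses the linear machinery: map the input to polynomial features $[1, x, x^2, \dots, x^d]$ and then solve an ordinary least-squares problem. A minimal sketch:

```python
import numpy as np

def polynomial_features(x, degree):
    """Map a 1-D input to columns [1, x, x^2, ..., x^degree]."""
    return np.vander(x, degree + 1, increasing=True)

# Fit a quadratic model to data generated from y = 2 + 3x^2
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2 + 3 * x**2
X = polynomial_features(x, 2)
theta, *_ = np.linalg.lstsq(X, y, rcond=None)  # should recover [2, 0, 3]
```

Note that powers of x have very different scales, so feature normalization matters even more here if gradient descent is used instead of the closed-form solve.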

## Logistic Regression

The first thing to note is that, despite its name, logistic regression is **not a regression but a classification** learning algorithm.
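Concretely, the model passes a linear score through the sigmoid function to get a probability, and classifies by thresholding it. A minimal sketch (the 0.5 threshold is the conventional default):

```python
import numpy as np

def sigmoid(z):
    """Logistic function: squashes any real value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def predict(theta, X, threshold=0.5):
    """Predict class 1 when the estimated probability sigmoid(X theta)
    meets the threshold, otherwise class 0."""
    return (sigmoid(X @ theta) >= threshold).astype(int)
```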
