Logistic Regression - Maximum Likelihood
Concept
https://youtu.be/vN5cNN2-HWE
https://youtu.be/BfKanl1aSG0
Introduction
Click here for the full note
Logistic regression is a method for solving classification problems.
It predicts the class $k$ with the highest posterior probability $P(Y = k \mid X = x)$.
Q. How do we estimate $P(Y = k \mid X = x)$?
Option 1: Estimate it indirectly using Bayes' rule, e.g. LDA.
Option 2: Estimate it directly, which is what logistic regression does.
Logistic Regression Model of Posterior Probability Function
From now on, we assume binary classification: the classes are $k \in \{0, 1\}$, so the number of classes is $K = 2$.
Let the posterior probability function be $p(x) = P(Y = 1 \mid X = x)$.
The odds and the log-odds are defined as
Odds: $\dfrac{p(x)}{1 - p(x)}$
Log-odds: $\log \dfrac{p(x)}{1 - p(x)}$
Assume the log-odds is a linear function of the data $X$:
$\log \dfrac{p(x)}{1 - p(x)} = \beta_0 + \beta^{\top} x$
Solving for $p(x)$ gives the sigmoid form $p(x) = \dfrac{1}{1 + e^{-(\beta_0 + \beta^{\top} x)}}$.
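As a quick numerical check of this model, here is a minimal sketch in Python; the function name and the example values of $\beta_0$, $\beta$, and $x$ are made up for illustration:

```python
import numpy as np

def posterior_from_log_odds(beta0, beta, x):
    """Invert the linear log-odds model to get p(x) = P(Y=1 | X=x).

    From log(p / (1 - p)) = beta0 + beta^T x it follows that
    p(x) = 1 / (1 + exp(-(beta0 + beta^T x))), i.e. the sigmoid of the log-odds.
    """
    log_odds = beta0 + np.dot(beta, x)
    return 1.0 / (1.0 + np.exp(-log_odds))

# Hypothetical parameter values, purely for illustration
p = posterior_from_log_odds(-1.0, np.array([2.0]), np.array([0.8]))
print(p)                      # ~0.646
print(np.log(p / (1 - p)))    # recovers the log-odds -1.0 + 2.0*0.8 = 0.6
```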
Q) How do we estimate the parameters $\beta_0$ and $\beta$? Use maximum likelihood.
Estimate Posterior Probability Function with Likelihood Function
Given: a training dataset $\{(x_i, y_i)\}_{i=1}^{N}$ with labels $y_i \in \{0, 1\}$.
The probability of the observed data is the product of $P(Y = 1 \mid X = x_i)$ over the samples with $y_i = 1$ and $P(Y = 0 \mid X = x_i)$ over the samples with $y_i = 0$.
This is also known as the likelihood function.
Let $p(x_i) = P(Y = 1 \mid X = x_i)$, and let $\beta$ collect the intercept and the coefficients.
Then the likelihood function is expressed as
$L(\beta) = \prod_{i=1}^{N} p(x_i)^{\,y_i}\,\bigl(1 - p(x_i)\bigr)^{\,1 - y_i}$
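A direct translation of this product into code, as a minimal sketch; the helper name and the ones-column convention for the intercept are my own choices, not from the note:

```python
import numpy as np

def likelihood(beta, X, y):
    """L(beta) = prod_i p(x_i)^{y_i} * (1 - p(x_i))^{1 - y_i}.

    X is the (N, d) matrix of inputs with a leading column of ones
    (so beta[0] is the intercept); y is a length-N vector of 0/1 labels.
    """
    p = 1.0 / (1.0 + np.exp(-X @ beta))              # p(x_i) for every sample
    return np.prod(p ** y * (1.0 - p) ** (1.0 - y))  # product over all N samples
```

In practice this product underflows quickly as $N$ grows, which is one more reason to work with the log-likelihood in the next section.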
Maximizing Likelihood
Goal: estimate the parameter $\beta$ that maximizes the likelihood function.
Since the logarithm is monotonically increasing, maximizing the likelihood is equivalent to maximizing the log-likelihood
$\ell(\beta) = \log L(\beta) = \sum_{i=1}^{N} \Bigl[\, y_i \log p(x_i) + (1 - y_i) \log\bigl(1 - p(x_i)\bigr) \Bigr]$
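The same quantity in log form, sketched with the same conventions as the likelihood function above (the clipping of $p$ is my own addition to avoid numerical issues):

```python
import numpy as np

def log_likelihood(beta, X, y):
    """l(beta) = sum_i [ y_i log p(x_i) + (1 - y_i) log(1 - p(x_i)) ]."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    p = np.clip(p, 1e-12, 1 - 1e-12)   # avoid log(0) in edge cases
    return np.sum(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))
```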
Estimating the Parameter by Maximizing the Log-Likelihood
Expressing with Matrices
Iterative Reweighted Least Squares (IRLS)
Setting the gradient of the log-likelihood, $\sum_{i=1}^{N} x_i\,\bigl(y_i - p(x_i)\bigr) = 0$, gives equations that are nonlinear in $\beta$, so there is no closed-form solution. Instead we solve iteratively (Newton-Raphson), recomputing the fitted probabilities $p$ and the weight matrix $W = \mathrm{diag}\bigl(p(x_i)(1 - p(x_i))\bigr)$ at every step:
$\beta^{\text{new}} = \bigl(X^{\top} W X\bigr)^{-1} X^{\top} W z, \qquad z = X\beta^{\text{old}} + W^{-1}(y - p)$
Here each row of $X$ is an input $x_i$ (with a leading 1 for the intercept), and each update is a weighted least-squares fit, which is why the method is called IRLS.
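A minimal sketch of the IRLS loop in Python; the function name, iteration cap, tolerance, and the clipping of $p$ are my own additions:

```python
import numpy as np

def fit_irls(X, y, n_iter=25, tol=1e-8):
    """Fit logistic regression coefficients by IRLS (Newton-Raphson).

    X: (N, d) inputs with a leading column of ones; y: 0/1 labels of length N.
    Each iteration rebuilds p and W and solves a weighted least-squares problem.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        p = np.clip(p, 1e-12, 1 - 1e-12)   # keep the weights strictly positive
        w = p * (1.0 - p)                   # diagonal of W, stored as a vector
        z = X @ beta + (y - p) / w          # adjusted response
        # Weighted least squares: (X^T W X) beta_new = X^T W z
        beta_new = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```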
Further Reading
Q) How does this compare with the formulation of logistic regression via the sigmoid function (see Andrew Ng's lecture)?
Note:
The two views are equivalent: maximizing the likelihood here corresponds to minimizing the loss function there, because the cross-entropy loss is the negative log-likelihood (scaled by $1/N$).
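A tiny numeric check of this equivalence; the probabilities and labels below are made up:

```python
import numpy as np

p = np.array([0.9, 0.2, 0.7])   # hypothetical predicted probabilities p(x_i)
y = np.array([1.0, 0.0, 1.0])   # observed labels

log_lik = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
cross_entropy_loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

print(log_lik, cross_entropy_loss)   # cross_entropy_loss == -log_lik / 3
```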
Example
Step 1
Step 2
Step 3
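As a concrete illustration of the whole procedure, here is a hedged end-to-end run on a made-up dataset, reusing the fit_irls sketch from the IRLS section above (all data values and the random seed are arbitrary):

```python
import numpy as np

# Made-up data generated from true parameters beta0 = 0.5, beta1 = 2.0
rng = np.random.default_rng(0)
x = rng.normal(size=200)
p_true = 1.0 / (1.0 + np.exp(-(0.5 + 2.0 * x)))
y = (rng.uniform(size=200) < p_true).astype(float)

X = np.column_stack([np.ones_like(x), x])   # leading column of ones = intercept
beta_hat = fit_irls(X, y)                    # from the IRLS sketch above
print(beta_hat)                              # roughly [0.5, 2.0] up to sampling noise
```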
Next
Multinomial Logistic Regression:
Fitting Data with Logistic Regression: