CSE 802 Spring 2017: Logistic Regression
Inci M. Baytas, Computer Science, Michigan State University, March 29, 2017
Introduction

Consider a two-class classification problem. The posterior probability of class C1 can be written as

p(C1 | Φ) = σ(w^T Φ) = 1 / (1 + exp(−w^T Φ)),

where Φ is the feature vector, w is the weight vector, and σ(·) is the logistic sigmoid.
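A minimal sketch of this posterior in NumPy; the weight and feature vectors below are hypothetical, chosen only to illustrate the computation:

```python
import numpy as np

def sigmoid(a):
    """Logistic sigmoid: sigma(a) = 1 / (1 + exp(-a))."""
    return 1.0 / (1.0 + np.exp(-a))

def posterior_c1(w, phi):
    """Posterior p(C1 | phi) = sigma(w^T phi) for a two-class problem."""
    return sigmoid(w @ phi)

# Hypothetical 3-dimensional weight and feature vectors.
w = np.array([0.5, -1.0, 2.0])
phi = np.array([1.0, 0.2, 0.3])
p = posterior_c1(w, phi)  # p(C1|phi); p(C2|phi) = 1 - p
```

Because σ maps any real activation into (0, 1), the output is a valid probability.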
◮ We estimate the parameter vector w directly.
◮ Logistic regression: M adjustable parameters for an M-dimensional feature space.
◮ Generative models: if we fit Gaussian class-conditional densities with a shared covariance, the parameter count grows quadratically in M (2M for the two class means, M(M+1)/2 for the covariance, and 1 for the class prior: M(M+5)/2 + 1 in total).
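The parameter counts above can be checked with a short sketch (assuming, as is standard, two Gaussian class-conditionals sharing one covariance matrix):

```python
def logistic_params(M):
    """Logistic regression: one weight per feature dimension."""
    return M

def generative_params(M):
    """Shared-covariance Gaussian generative model:
    2M mean entries + M(M+1)/2 covariance entries + 1 class prior."""
    return 2 * M + M * (M + 1) // 2 + 1

for M in (10, 100):
    print(M, logistic_params(M), generative_params(M))
```

For M = 100 the generative model already needs 5251 parameters versus 100 for logistic regression, which is the motivation for estimating w directly.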
◮ The error function is convex, so there is a single global minimum.
◮ We can therefore use an iterative approach (e.g., gradient descent or Newton's method).
For K > 2 classes, the posterior probabilities are given by

p(C_k | Φ) = exp(w_k^T Φ) / Σ_j exp(w_j^T Φ),

which is called the softmax function.
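A minimal softmax sketch (the 3-class weight matrix below is a hypothetical example; the max-subtraction is a standard numerical-stability trick that does not change the result):

```python
import numpy as np

def softmax_posterior(W, phi):
    """p(C_k|phi) = exp(a_k) / sum_j exp(a_j), with a_k = w_k^T phi."""
    a = W @ phi    # one activation per class
    a -= a.max()   # stability shift; softmax is invariant to it
    e = np.exp(a)
    return e / e.sum()

W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [-1.0, -1.0]])  # hypothetical weights for 3 classes
phi = np.array([2.0, 1.0])
p = softmax_posterior(W, phi)  # a valid distribution over the 3 classes
```

For K = 2 the softmax reduces to the logistic sigmoid of the activation difference, recovering the two-class formula.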
Regularization of the weight vector:
◮ ℓ2 norm (ridge)
◮ ℓ1 norm (Lasso)
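A sketch of ℓ2-regularized logistic regression via gradient descent (the penalty strength, toy data, and step size are illustrative assumptions; an ℓ1 penalty is non-smooth at zero and would instead require subgradient or proximal soft-thresholding steps):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def fit_ridge_logistic(Phi, t, lam=1.0, lr=0.1, n_iter=2000):
    """Gradient descent on E(w) + (lam/2) * ||w||^2 (l2 penalty).
    The penalty adds lam * w to the gradient."""
    N, M = Phi.shape
    w = np.zeros(M)
    for _ in range(n_iter):
        y = sigmoid(Phi @ w)
        grad = Phi.T @ (y - t) / N + lam * w
        w -= lr * grad
    return w

# Toy data: the l2 penalty shrinks the learned weights toward zero.
rng = np.random.default_rng(1)
Phi = rng.normal(size=(100, 2))
t = (Phi[:, 0] > 0).astype(float)
w_l2 = fit_ridge_logistic(Phi, t, lam=1.0)
w_unreg = fit_ridge_logistic(Phi, t, lam=0.0)
```

On separable data the unregularized weights grow without bound, so the penalty also keeps the optimum finite, not just smaller.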