Introduction to Machine Learning: Polynomial Regression Models – PowerPoint PPT Presentation



SLIDE 1

Introduction to Machine Learning Polynomial Regression Models

[Figure: data with fitted polynomial curves f(x) for d = 1 (linear), d = 5, and d = 25.]

Learning goals

  • Understand how to add flexibility to the linear model by using polynomials
  • Understand that this only affects the hypothesis space, not the risk or the optimization
  • Understand that more flexibility is not equivalent to a better model

SLIDE 2

REGRESSION: POLYNOMIALS

We can make linear regression models much more flexible by using polynomials $x_j^d$ – or any other derived features, like $\sin(x_j)$ or $x_j \cdot x_k$ – as additional features. The optimization and the risk of the learner remain the same. Only the hypothesis space of the learner changes: instead of linear functions of only the original features,

$$f\big(x^{(i)} \mid \theta\big) = \theta_0 + \theta_1 x_1^{(i)} + \theta_2 x_2^{(i)} + \ldots,$$

it now includes linear functions of the derived features as well, e.g.

$$f\big(x^{(i)} \mid \theta\big) = \theta_0 + \sum_{k=1}^{d} \theta_{1k} \big(x_1^{(i)}\big)^k + \sum_{k=1}^{d} \theta_{2k} \big(x_2^{(i)}\big)^k + \ldots$$

© Introduction to Machine Learning – 1 / 7
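This feature-map view can be sketched in a few lines of numpy: the optimization (ordinary least squares) is untouched, and only the design matrix grows. The tiny dataset below is invented for illustration, not taken from the slides.

```python
import numpy as np

def poly_features(x, d):
    """Derived-feature map for a single original feature: an intercept
    column plus x^1, ..., x^d, one row per observation."""
    return np.column_stack([x ** k for k in range(d + 1)])

def fit(X, y):
    """Ordinary least squares -- the same optimization as for the plain
    linear model; only the hypothesis space (design matrix) differs."""
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return theta

# tiny synthetic example (assumed data; exactly linear, no noise)
x = np.linspace(0, 2, 20)
y = 1.0 + 2.0 * x

theta_lin = fit(poly_features(x, 1), y)  # recovers theta_0 = 1, theta_1 = 2
theta_cub = fit(poly_features(x, 3), y)  # higher-order coefficients come out ~ 0
print(theta_lin, theta_cub)
```

Because the data are exactly linear, the cubic model finds the same function: the richer hypothesis space still contains the linear one.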
SLIDE 3

REGRESSION: POLYNOMIALS

Polynomial regression example

[Figure: scatterplot of the example data.]

SLIDE 4

REGRESSION: POLYNOMIALS

Polynomial regression example Models of different complexity, i.e., of different polynomial order d, are fitted to the data:

[Figure: data with fitted curves f(x) for d = 1 (linear), d = 5, and d = 25.]

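Fits like the three curves in the figure can be reproduced with numpy's `polyfit`. The data-generating process below is an assumption, since the slides do not specify their example data.

```python
import numpy as np

# synthetic data standing in for the slides' (unspecified) example
rng = np.random.default_rng(42)
x = np.sort(rng.uniform(0, 2, 40))
y = np.sin(3 * x) + rng.normal(0, 0.25, 40)

fits = {}
for d in (1, 5, 25):
    coefs = np.polyfit(x, y, d)      # least-squares fit of a degree-d polynomial
    fits[d] = np.polyval(coefs, x)   # fitted curve f(x) at the training inputs
    train_mse = np.mean((fits[d] - y) ** 2)
    print(f"d = {d:2d}: training MSE = {train_mse:.4f}")
```

The training error can only decrease as d grows, since each lower-order model is contained in the higher-order hypothesis space.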
SLIDE 7

REGRESSION: POLYNOMIALS

The higher d is, the more capacity the learner has to fit complicated functions of x, but this also increases the danger of overfitting: the hypothesis space H contains so many complex functions that we can find one that approximates the training data arbitrarily well. However, predictions on new data are not as successful, because the model has learnt spurious “wiggles” from the random noise in the training data (much, much more on this later).
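The effect can be made concrete by comparing errors on the training data and on freshly drawn test data. Again the data-generating process is an assumption chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample(n):
    """Noisy synthetic target (assumed; the slides do not give the true DGP)."""
    x = rng.uniform(0, 2, n)
    return x, np.sin(3 * x) + rng.normal(0, 0.3, n)

x_train, y_train = sample(30)
x_test, y_test = sample(1000)

def mse(coefs, x, y):
    """Mean squared error of the fitted polynomial on (x, y)."""
    return float(np.mean((np.polyval(coefs, x) - y) ** 2))

for d in (1, 5, 25):
    coefs = np.polyfit(x_train, y_train, d)
    print(f"d = {d:2d}: train MSE = {mse(coefs, x_train, y_train):.3f}, "
          f"test MSE = {mse(coefs, x_test, y_test):.3f}")
```

Training error keeps shrinking as d grows, while test error typically bottoms out at a moderate d and then deteriorates as the high-order fit chases noise.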

SLIDE 8

REGRESSION: POLYNOMIALS

[Figures: the fitted models shown on Training Data (left panel) and Test Data (right panel).]
