NONLINEAR REGRESSION - Sylvain Calinon, Robot Learning & Interaction Group - PowerPoint PPT Presentation



SLIDE 1

EE613 Machine Learning for Engineers

NONLINEAR REGRESSION

Sylvain Calinon, Robot Learning & Interaction Group, Idiap Research Institute

Nov. 11, 2015

SLIDE 2

Outline

  • Locally weighted regression (LWR)
  • Radial basis functions (RBF)
  • Gaussian mixture regression (GMR)
  • Gaussian process regression (GPR)

SLIDE 3

Locally weighted regression (LWR)

demo_LWR01.m

[C. G. Atkeson, A. W. Moore, and S. Schaal. Locally weighted learning for control. Artificial Intelligence Review, 11(1-5):75–113, 1997]

SLIDE 4

Locally weighted regression (LWR)

SLIDE 5

Locally weighted regression (LWR)

LWR can be directly extended to local least squares polynomial fitting by changing the definition of the inputs.
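The demo demo_LWR01.m is in MATLAB and is not reproduced here; the mechanism can be sketched in Python as follows. At each query point, a linear model is fit by weighted least squares, with weights from a Gaussian kernel centered on the query. The bandwidth h and test points are illustrative choices, not the demo's settings:

```python
import numpy as np

def lwr_predict(x_query, X, y, h=0.05):
    """Locally weighted regression: fit a weighted linear model per query point."""
    Phi = np.column_stack([np.ones_like(X), X])  # bias + linear term
    y_hat = []
    for xq in np.atleast_1d(x_query):
        w = np.exp(-0.5 * ((X - xq) / h) ** 2)   # Gaussian kernel weights
        W = np.diag(w)
        # Weighted least squares: theta = (Phi' W Phi)^-1 Phi' W y
        theta = np.linalg.solve(Phi.T @ W @ Phi, Phi.T @ W @ y)
        y_hat.append(theta[0] + theta[1] * xq)
    return np.array(y_hat)

# Noisy nonlinear data
rng = np.random.default_rng(0)
X = np.linspace(0, 1, 100)
y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(100)
print(lwr_predict(np.array([0.25, 0.75]), X, y))  # roughly sin(2*pi*x): ~[1, -1]
```

Replacing the columns of Phi with higher powers of the input gives the local polynomial fitting mentioned above.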

SLIDE 6

Locally weighted regression (LWR)

SLIDE 7

Locally weighted regression (LWR)

SLIDE 8

Gaussian mixture regression (GMR)

demo_GMR01.m, demo_GMR_polyFit01.m

[Z. Ghahramani and M. I. Jordan. Supervised learning from incomplete data via an EM approach. In Advances in Neural Information Processing Systems (NIPS), volume 6, pages 120–127, 1994]

SLIDE 9

Gaussian distribution properties

  • Product of Gaussians
  • Linear combination
  • Conditional probability

SLIDE 10

Product of Gaussians

SLIDE 11

Product of Gaussians

…in a new situation: Frame 1: "This is where I expect the data to be located!" Frame 2: "This is where I expect the data to be located!"

[Calinon, S. (2015). A Tutorial on Task-Parameterized Movement Learning and Retrieval. Intelligent Service Robotics.]
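The fusion intuition on this slide (two frames, each with its own expectation of where the data lies) rests on the product-of-Gaussians property: precisions add, and the mean is the precision-weighted combination of the means. A minimal Python check with made-up numbers:

```python
import numpy as np

def gaussian_product(mu1, S1, mu2, S2):
    """Product of two Gaussian densities is, up to a normalizing constant,
    again Gaussian: precisions add, mean is the precision-weighted average."""
    L1, L2 = np.linalg.inv(S1), np.linalg.inv(S2)  # precision matrices
    S = np.linalg.inv(L1 + L2)
    mu = S @ (L1 @ mu1 + L2 @ mu2)
    return mu, S

mu, S = gaussian_product(np.array([0.0, 0.0]), np.eye(2),
                         np.array([2.0, 2.0]), np.eye(2))
print(mu, S)  # equal covariances: mean is the midpoint [1, 1], covariance halves
```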

SLIDE 12

Linear combination
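The linear-combination property (a sum of independent Gaussian variables is Gaussian, with means and variances adding) can be checked empirically; the parameters in this Monte Carlo sketch are arbitrary examples:

```python
import numpy as np

rng = np.random.default_rng(0)
# Two independent Gaussians: N(1, 2^2) and N(-3, 1.5^2)
x1 = rng.normal(1.0, 2.0, 100_000)
x2 = rng.normal(-3.0, 1.5, 100_000)
s = x1 + x2
# Sum is Gaussian with mean 1 + (-3) = -2 and variance 2^2 + 1.5^2 = 6.25
print(s.mean(), s.var())
```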

SLIDE 13

Conditional probability
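The conditional of a joint Gaussian is again Gaussian; this is the property GMR applies per mixture component. A small Python sketch of the standard conditioning formulas (index sets and example numbers are illustrative):

```python
import numpy as np

def gaussian_condition(mu, Sigma, i_in, i_out, x_in):
    """Condition N(mu, Sigma) on the input block:
    mu_out|in = mu_o + S_oi S_ii^-1 (x_in - mu_i)
    S_out|in  = S_oo - S_oi S_ii^-1 S_io"""
    mu_i, mu_o = mu[i_in], mu[i_out]
    S_ii = Sigma[np.ix_(i_in, i_in)]
    S_oi = Sigma[np.ix_(i_out, i_in)]
    S_oo = Sigma[np.ix_(i_out, i_out)]
    K = S_oi @ np.linalg.inv(S_ii)
    return mu_o + K @ (x_in - mu_i), S_oo - K @ S_oi.T

mu = np.array([0.0, 0.0])
Sigma = np.array([[1.0, 0.8],
                  [0.8, 1.0]])
m, S = gaussian_condition(mu, Sigma, [0], [1], np.array([1.0]))
print(m, S)  # mean 0.8, variance 1 - 0.8^2 = 0.36
```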

SLIDES 14-22

Gaussian mixture regression (GMR)

(step-by-step derivation shown on slides; equations not captured in this extraction)

SLIDE 23

Gaussian mixture regression (GMR)

With expectation-maximization (EM): maximizing the log-likelihood
[Calinon, Guenter and Billard, IEEE Trans. on SMC-B 37(2), 2007]
[Hersch, Guenter, Calinon and Billard, IEEE Trans. on Robotics 24(6), 2008]

With a quadratic programming solver: maximizing the log-likelihood subject to stability constraints
[Khansari-Zadeh and Billard, IEEE Trans. on Robotics 27(5), 2011]

SLIDE 24

Gaussian mixture regression (GMR)

GMR can cover a large spectrum of regression mechanisms, from Nadaraya-Watson kernel regression to least squares linear regression. Both the input and the output can be multidimensional: the joint distribution is encoded in a Gaussian mixture model (GMM), and the conditional estimate is retrieved by Gaussian mixture regression (GMR).
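As a rough sketch of this retrieval step (not the course's MATLAB demos): condition each component on the input, then blend the conditional means with input-dependent responsibilities. The two-component joint GMM below is hand-built, with all parameter values invented for illustration:

```python
import numpy as np

def gmr(x, priors, mus, sigmas):
    """GMR for a joint 2D GMM over (x, y): condition each component on x,
    then blend conditional means with responsibilities h_k(x)."""
    K = len(priors)
    h = np.zeros(K)
    cond_mu = np.zeros(K)
    for k in range(K):
        mx, my = mus[k]
        sxx, sxy = sigmas[k][0]
        _, syy = sigmas[k][1]
        # responsibility: prior_k * N(x | mx, sxx)
        h[k] = priors[k] * np.exp(-0.5 * (x - mx) ** 2 / sxx) / np.sqrt(2 * np.pi * sxx)
        # conditional mean of component k (Gaussian conditioning)
        cond_mu[k] = my + sxy / sxx * (x - mx)
    h /= h.sum()
    return h @ cond_mu

# Hypothetical two-component joint GMM over (x, y)
priors = [0.5, 0.5]
mus = [np.array([-1.0, -1.0]), np.array([1.0, 1.0])]
sigmas = [np.array([[0.5, 0.2], [0.2, 0.5]])] * 2
print(gmr(-1.0, priors, mus, sigmas))  # dominated by the first component: ~ -1
```

Shrinking the input variances recovers Nadaraya-Watson-like kernel behavior; a single component reduces to linear regression, matching the spectrum described above.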

SLIDE 25

Gaussian mixture regression (GMR)

SLIDE 26

Gaussian process regression (GPR)

demo_GPR01.m

[C. K. I. Williams and C. E. Rasmussen. Gaussian processes for regression. In Advances in Neural Information Processing Systems (NIPS), pages 514–520, 1996]
[S. Roberts, M. Osborne, M. Ebden, S. Reece, N. Gibson, and S. Aigrain. Gaussian processes for time-series modelling. Philosophical Trans. of the Royal Society A, 371(1984):1–25, 2012]
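The demo demo_GPR01.m is MATLAB; here is a minimal Python sketch of standard GPR with a squared-exponential kernel. The hyperparameters l, sigma_f, and sigma_n are illustrative values, not the demo's:

```python
import numpy as np

def gpr(X_train, y_train, X_test, l=0.3, sigma_f=1.0, sigma_n=0.1):
    """GP posterior: mean = Ks' (K + sigma_n^2 I)^-1 y,
    cov = Kss - Ks' (K + sigma_n^2 I)^-1 Ks."""
    def kernel(A, B):
        d = A[:, None] - B[None, :]
        return sigma_f ** 2 * np.exp(-0.5 * d ** 2 / l ** 2)
    K = kernel(X_train, X_train) + sigma_n ** 2 * np.eye(len(X_train))
    Ks = kernel(X_train, X_test)
    Kss = kernel(X_test, X_test)
    mean = Ks.T @ np.linalg.solve(K, y_train)
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, cov

X = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * X)
mean, cov = gpr(X, y, np.array([0.25]))
print(mean, cov)  # mean close to sin(pi/2) = 1, small posterior variance
```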

SLIDE 27

Gaussian process regression (GPR)

SLIDE 28

Gaussian process regression (GPR)

Polynomial fitting with least squares and nullspace optimization
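A minimal sketch of what this slide's title suggests, assuming the standard pseudoinverse-plus-nullspace-projector formulation; the polynomial degree, data points, and secondary objective below are invented for illustration:

```python
import numpy as np

# Under-determined fit: a degree-4 polynomial through only 3 points.
# The pseudoinverse gives one least squares solution; the nullspace
# projector N lets a secondary objective (here: pull coefficients toward
# theta0) use the remaining freedom without disturbing the fit.
X = np.array([0.0, 0.5, 1.0])
y = np.array([0.0, 1.0, 0.0])
Phi = np.vander(X, 5)              # 3x5 design matrix, columns x^4 ... x^0
Phi_pinv = np.linalg.pinv(Phi)
N = np.eye(5) - Phi_pinv @ Phi     # projector onto the nullspace of Phi
theta0 = np.ones(5)                # secondary objective target (arbitrary)
theta = Phi_pinv @ y + N @ theta0
print(np.allclose(Phi @ theta, y))  # True: the data points are still matched
```

Because Phi @ N = 0, any theta0 can be projected in without changing the fitted values at the data points.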

SLIDES 29-43

Gaussian process regression (GPR)

(worked derivation and figures shown on slides; not captured in this extraction)

SLIDE 44

Main references

Regression
  • F. Stulp and O. Sigaud. Many regression algorithms, one unified model – a review. Neural Networks, 69:60–79, September 2015

LWR
  • C. G. Atkeson, A. W. Moore, and S. Schaal. Locally weighted learning for control. Artificial Intelligence Review, 11(1-5):75–113, 1997

GMR
  • Z. Ghahramani and M. I. Jordan. Supervised learning from incomplete data via an EM approach. In Advances in Neural Information Processing Systems (NIPS), volume 6, pages 120–127, 1994

GPR
  • C. K. I. Williams and C. E. Rasmussen. Gaussian processes for regression. In Advances in Neural Information Processing Systems (NIPS), pages 514–520, 1996
  • S. Roberts, M. Osborne, M. Ebden, S. Reece, N. Gibson, and S. Aigrain. Gaussian processes for time-series modelling. Philosophical Trans. of the Royal Society A, 371(1984):1–25, 2012