

SLIDE 1

Goal & Overview Introduction FS-LSSVM Robust Nonparametric Methods Correlated Errors Confidence Intervals Conclusions

Least Squares Support Vector Regression with Applications to Large-Scale Data: a Statistical Approach

Kris De Brabanter

Public Defense

April 27, 2011. Promotor: Prof. dr. ir. B. De Moor. Co-promotor: Prof. dr. ir. J. Suykens

1/39

SLIDE 2

Outline

1. Goal & Overview
2. Introduction: parametric vs. nonparametric regression; nonparametric regression estimates, an overview
3. Fixed-Size Least Squares Support Vector Machines: Fixed-Size LS-SVM formulation; selection of support vectors; a practical identification problem
4. Robust Nonparametric Methods: problems with outliers; robust nonparametric regression
5. Correlated Errors: problems with correlation in nonparametric regression; removing correlation effects
6. Confidence Intervals
7. Conclusions


SLIDE 4

Goal of the Thesis

Study the properties of Least Squares Support Vector Machines for regression, with an emphasis on statistical aspects, and develop a framework for large-scale data.

SLIDE 5

Overview

Least Squares Support Vector Machines

Introduction (Chapter 1)
Model Building (Chapter 2)
Model Selection (Chapter 3)
Large-Scale Data Sets (Chapter 4)
Robustness (Chapter 5)
Correlated Errors (Chapter 6)
Confidence Intervals (Chapter 7)
Applications & Case Studies (Chapter 8)
Conclusions & Further Research (Chapter 9)



SLIDE 13

A simple example

[Figure: scatter of data following a clear linear trend]

Y = aX + b

[Figure: scatter of data following a strongly nonlinear trend]

Y = ?

PARAMETRIC FORM IS NOT ALWAYS EASY TO FIND
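When a parametric form such as Y = aX + b does fit, the coefficients follow from ordinary least squares in closed form. A minimal pure-Python sketch; the data below are made up for illustration:

```python
# Ordinary least squares fit of Y = a*X + b (closed form for one predictor).
def fit_line(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx
    return a, my - a * mx

# Noise-free data on Y = 3X - 2: the fit recovers the parameters exactly.
xs = [-5, -3, -1, 1, 3, 5]
ys = [3 * x - 2 for x in xs]
a, b = fit_line(xs, ys)
print(a, b)  # 3.0 -2.0
```

For the nonlinear panel no such closed form exists, which is exactly the motivation for the nonparametric estimates that follow.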


SLIDE 19

Construction of a nonparametric estimate: NW smoother

[Figure: noisy data; the estimate $\hat{m}_n(x)$ at a point x is built from the observations near x]

$$\hat{m}_n(x) = \sum_{i=1}^{n} \frac{K\!\left(\frac{x - X_i}{h}\right)}{\sum_{j=1}^{n} K\!\left(\frac{x - X_j}{h}\right)}\, Y_i$$
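The NW estimate is just a normalized, kernel-weighted average of the responses, so it can be sketched in a few lines. This assumes a Gaussian kernel; the toy data and bandwidth are hypothetical:

```python
import math

def nw_smoother(x, X, Y, h):
    """Nadaraya-Watson estimate m_hat_n(x): kernel-weighted average of the Y_i."""
    w = [math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in X]  # Gaussian kernel
    return sum(wi * yi for wi, yi in zip(w, Y)) / sum(w)

X = [0.0, 0.25, 0.5, 0.75, 1.0]
Y = [0.0, 0.7, 1.0, 0.7, 0.0]
print(nw_smoother(0.5, X, Y, h=0.1))  # close to 1.0: dominated by the nearby point
```

With a small bandwidth only the closest observations get appreciable weight; a large bandwidth averages over nearly all of them.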

SLIDE 20

Other nonparametric regression estimates

Local constant regression (Nadaraya, 1964; Watson, 1964)
Regression trees (Breiman et al., 1984)
Wavelets (Daubechies, 1992)
Nearest neighbors (Devroye et al., 1994)
Local linear regression (Fan & Gijbels, 1996)
Support vector machines (Vapnik, 1995)
Splines (Wahba, 1990; Eubank, 1999)
Partitioning estimates (Györfi et al., 2002)
Least squares support vector machines (Suykens et al., 2002)
...


SLIDE 24

Least squares support vector machines

Primal formulation (LS-SVM formulation for regression):

$$\min_{w,b,e} J_P(w,e) = \frac{1}{2} w^T w + \frac{\gamma}{2} \sum_{k=1}^{n} e_k^2 \quad \text{s.t. } w^T \varphi(X_k) + b + e_k = Y_k,\ k = 1, \ldots, n.$$

[Figure: data in the input space are mapped by the feature map ϕ(·) to a feature space, where the model is linear]


SLIDE 26

LS-SVM: solution + model selection

Data: $D_n = \{(X_k, Y_k) : X_k \in \mathbb{R}^d, Y_k \in \mathbb{R};\ k = 1, \ldots, n\}$, i.i.d. $\sim (X, Y)$.

Primal formulation:

$$\min_{w,b,e} J_P(w,e) = \frac{1}{2} w^T w + \frac{\gamma}{2} \sum_{k=1}^{n} e_k^2 \quad \text{s.t. } Y_k = w^T \varphi(X_k) + b + e_k,\ k = 1, \ldots, n.$$

Dual formulation:

$$\begin{bmatrix} 0 & 1_n^T \\ 1_n & \Omega + I_n/\gamma \end{bmatrix} \begin{bmatrix} b \\ \alpha \end{bmatrix} = \begin{bmatrix} 0 \\ Y \end{bmatrix}$$

with $\Omega_{kl} = \varphi(X_k)^T \varphi(X_l) = K(X_k, X_l) = (2\pi)^{-d/2} \exp\!\left(-\frac{\|X_k - X_l\|^2}{2h^2}\right)$.

K has to be positive definite, i.e. $\int \exp(-j\omega x) K(x)\, dx \ge 0$.

Model in the dual space:

$$\hat{m}_n(x) = \sum_{k=1}^{n} \hat{\alpha}_k K(x, X_k) + \hat{b}$$

γ and h are tuning parameters ⇒ chosen by cross-validation.
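Fitting boils down to one solve of the (n+1)×(n+1) dual linear system. A NumPy sketch for one-dimensional inputs; the kernel's normalizing constant is dropped, and the values of γ and h are arbitrary illustrative choices, not tuned:

```python
import numpy as np

def lssvm_fit(X, Y, gamma, h):
    """Solve the LS-SVM dual system [[0, 1^T], [1, Omega + I/gamma]][b; alpha] = [0; Y]."""
    n = len(X)
    Omega = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * h ** 2))  # RBF kernel matrix
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = Omega + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], Y)))
    return sol[0], sol[1:]                       # b, alpha

def lssvm_predict(x, X, b, alpha, h):
    """Dual-space model: sum_k alpha_k K(x, X_k) + b."""
    return np.exp(-(x - X) ** 2 / (2 * h ** 2)) @ alpha + b

X = np.linspace(-3.0, 3.0, 40)
Y = np.sinc(X)                                   # noise-free toy target
b, alpha = lssvm_fit(X, Y, gamma=100.0, h=0.5)
print(abs(lssvm_predict(0.0, X, b, alpha, 0.5) - 1.0))  # small in-sample error
```

Note the first dual equation enforces $1_n^T \alpha = 0$, which is a useful sanity check on any solver.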


SLIDE 31

Effect of the tuning parameters

[Figure: fit with h too small — wiggly, overfits the noise]

[Figure: fit with h too large — oversmooths the trend]

Model selection criteria are ABSOLUTELY needed
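Cross-validation makes the choice of h concrete: score each candidate bandwidth by leave-one-out prediction error and keep the minimizer. A sketch for the NW smoother; the grid, noise level, and test function are illustrative:

```python
import math, random

def nw(x, X, Y, h, skip=None):
    """NW smoother; optionally leave out observation `skip`."""
    num = den = 0.0
    for j in range(len(X)):
        if j == skip:
            continue
        w = math.exp(-0.5 * ((x - X[j]) / h) ** 2)
        num += w * Y[j]
        den += w
    return num / den

def loo_cv(X, Y, h):
    """Leave-one-out cross-validation score CV(h)."""
    n = len(X)
    return sum((Y[i] - nw(X[i], X, Y, h, skip=i)) ** 2 for i in range(n)) / n

random.seed(0)
X = [i / 100 for i in range(100)]
Y = [math.sin(2 * math.pi * x) + random.gauss(0.0, 0.2) for x in X]
scores = {h: loo_cv(X, Y, h) for h in (0.002, 0.02, 0.08, 0.4)}
print(min(scores, key=scores.get))  # typically an intermediate bandwidth wins
```

Too small an h reproduces the noise (large CV score), too large an h misses the trend; the CV minimum sits in between.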


SLIDE 33

Estimation in Primal Space

LS-SVM formulation for regression (primal):

$$\min_{w,b,e} J_P(w,e) = \frac{1}{2} w^T w + \frac{\gamma}{2} \sum_{k=1}^{n} e_k^2 \quad \text{s.t. } w^T \varphi(X_k) + b + e_k = Y_k,\ k = 1, \ldots, n.$$

Can we solve the LS-SVM in the primal space instead of the dual? An approximation of the feature map ϕ is needed — but is it possible to compute such a mapping? ϕ can be infinite-dimensional. Solution: use a fixed size m of support vectors to approximate ϕ, and solve the problem above as ridge regression in the primal.


SLIDE 35

Problems with Large Scale Data

1. Calculation and/or storage of the kernel matrix Ω:
   n = 1,000 ⇒ Ω ≈ 8 MB;  n = 10,000 ⇒ Ω ≈ 763 MB;  n = 20,000 ⇒ Ω ≈ 3051 MB
2. If it is possible to compute at all, how long would it take?

⇒ Solution: matrix approximations (Nyström, 1930), with m ≪ n:

$$\hat{\varphi}_i(x) = \frac{\sqrt{m}}{\lambda_i^{(m)}} \sum_{k=1}^{m} K(X_k, x)\, u_{ki}^{(m)}$$

SLIDE 36

Fixed Size LS-SVM formulation

Given: an approximation $\hat{\varphi}$ to the feature map.

Primal formulation:

$$\min_{w,b,e} J_P(w,e) = \frac{1}{2} w^T w + \frac{\gamma}{2} \sum_{k=1}^{n} e_k^2 \quad \text{s.t. } w^T \hat{\varphi}(X_k) + b + e_k = Y_k,\ k = 1, \ldots, n.$$

Solution:

$$\begin{bmatrix} w \\ b \end{bmatrix} = \left( \hat{\Phi}_e^T \hat{\Phi}_e + \frac{I_{m+1}}{\gamma} \right)^{-1} \hat{\Phi}_e^T Y,$$

with

$$\hat{\Phi}_e = \begin{pmatrix} \hat{\varphi}_1(X_1) & \cdots & \hat{\varphi}_m(X_1) & 1 \\ \vdots & \ddots & \vdots & \vdots \\ \hat{\varphi}_1(X_n) & \cdots & \hat{\varphi}_m(X_n) & 1 \end{pmatrix}$$
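The two preceding slides combine into a short recipe: build Nyström features from m support vectors, append a constant column, and solve the ridge system above. A NumPy sketch; the feature scaling follows a common Nyström convention (conventions differ by constant factors), and the even subsample of support vectors is a crude stand-in for entropy-based selection:

```python
import numpy as np

def nystrom_features(X, SV, h):
    """Approximate feature map from the eigendecomposition of the small
    kernel matrix on the m support vectors (Nystrom method)."""
    Ksv = np.exp(-(SV[:, None] - SV[None, :]) ** 2 / (2 * h ** 2))
    lam, U = np.linalg.eigh(Ksv)
    keep = lam > 1e-8                       # drop near-zero eigenvalues
    lam, U = lam[keep], U[:, keep]
    Kx = np.exp(-(X[:, None] - SV[None, :]) ** 2 / (2 * h ** 2))
    return (Kx @ U) / np.sqrt(lam)

rng = np.random.default_rng(0)
n, m, h, gamma = 500, 20, 0.3, 100.0        # illustrative values, not tuned
X = np.sort(rng.uniform(-2.0, 2.0, n))
Y = np.sinc(X) + rng.normal(0.0, 0.05, n)
SV = X[:: n // m][:m]                       # even subsample as support vectors
Phi = np.hstack([nystrom_features(X, SV, h), np.ones((n, 1))])
w = np.linalg.solve(Phi.T @ Phi + np.eye(Phi.shape[1]) / gamma, Phi.T @ Y)
pred = Phi @ w                              # fixed-size LS-SVM fit (primal)
print(np.sqrt(np.mean((pred - np.sinc(X)) ** 2)))  # RMSE against the truth
```

The point of the construction: only an m×m eigendecomposition and an (m+1)-dimensional ridge solve are needed, instead of the n×n dual system.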

SLIDE 37

Selection of support vectors: Rényi entropy

Maximize the quadratic Rényi entropy:

$$H_{R2} = -\log \int f(x)^2\, dx$$

Theorem (Maximizing Entropy). The Rényi entropy on a closed interval [a, b], with a, b ∈ ℝ and no additional moment constraints, is maximized by the uniform density 1/(b − a).

[Figures: selecting support vectors; effect of a wrong bandwidth]
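For a Gaussian kernel density estimate, $\int f^2$ has a closed form (a Gaussian convolution), so the quadratic Rényi entropy of a candidate support-vector set can be evaluated cheaply and increased by greedy swaps. The swap scheme below is an illustrative stand-in for the selection procedure, not the exact algorithm:

```python
import math, random

def quad_renyi_entropy(S, h):
    """Plug-in estimate of H_R2 = -log(integral of f^2) for a Gaussian KDE on S.
    The integral of the squared KDE has a closed form via Gaussian convolution."""
    m = len(S)
    c = 1.0 / (2.0 * math.sqrt(math.pi) * h)
    tot = sum(c * math.exp(-((a - b) ** 2) / (4.0 * h * h)) for a in S for b in S)
    return -math.log(tot / (m * m))

random.seed(1)
data = [random.gauss(0.0, 1.0) for _ in range(400)]
m, h = 30, 0.3                      # illustrative working-set size and bandwidth
S = data[:m]                        # start from an arbitrary subset
H = quad_renyi_entropy(S, h)
for _ in range(600):                # greedy random swaps that raise the entropy
    i, cand = random.randrange(m), random.choice(data)
    trial = S[:i] + [cand] + S[i + 1:]
    Ht = quad_renyi_entropy(trial, h)
    if Ht > H:
        S, H = trial, Ht
print(H)                            # entropy only goes up: the SVs spread out
```

Because the uniform density maximizes the entropy, accepted swaps push the working set toward an even coverage of the input region, which is what makes it a good support-vector subsample.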


SLIDE 42

Identification of a pilot scale distillation column

Joint work with Bart Huyck (CIT). Task: identify the bottom temperature of the column with LS-SVM.

[Figure: bottom temperature (≈77–79.5 °C) vs. time (s), over roughly 8000 s]



SLIDE 48

Problems with outliers in parametric regression

Model: $Y_k = a X_k + b + e_k$, $k = 1, \ldots, n$; (a, b) estimated from the data.

LS principle:

$$(\hat{a}, \hat{b}) = \arg\min_{(a,b) \in \mathbb{R}^2} \frac{1}{n} \sum_{k=1}^{n} \left[ Y_k - (a X_k + b) \right]^2$$

[Figure: scatter of X vs. Y with one outlier dragging the LS line away from the bulk of the data]

The LS principle is NOT robust. Solution — least absolute deviations (LAD):

$$(\hat{a}, \hat{b}) = \arg\min_{(a,b) \in \mathbb{R}^2} \frac{1}{n} \sum_{k=1}^{n} \left| Y_k - (a X_k + b) \right|$$
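The two criteria are easy to compare numerically. LAD is computed here by iteratively reweighted least squares (IRLS) with weights $1/|r_k|$, a common approximation to the LAD solution; the data set with one outlier is made up:

```python
def fit_line_ls(xs, ys, weights=None):
    """(Weighted) least-squares fit of y = a*x + b, closed form."""
    w = weights or [1.0] * len(xs)
    sw = sum(w)
    mx = sum(wi * x for wi, x in zip(w, xs)) / sw
    my = sum(wi * y for wi, y in zip(w, ys)) / sw
    a = (sum(wi * (x - mx) * (y - my) for wi, x, y in zip(w, xs, ys))
         / sum(wi * (x - mx) ** 2 for wi, x in zip(w, xs)))
    return a, my - a * mx

def fit_line_lad(xs, ys, iters=50, eps=1e-6):
    """LAD fit via iteratively reweighted least squares, w_k = 1/|r_k|."""
    a, b = fit_line_ls(xs, ys)
    for _ in range(iters):
        w = [1.0 / max(abs(y - (a * x + b)), eps) for x, y in zip(xs, ys)]
        a, b = fit_line_ls(xs, ys, w)
    return a, b

xs = [1, 2, 3, 4, 5]
ys = [2.0, 4.0, 6.0, 8.0, 40.0]     # last point is a gross outlier
print(fit_line_ls(xs, ys))           # (8.0, -12.0): dragged far from slope 2
print(fit_line_lad(xs, ys))          # close to the uncontaminated line y = 2x
```

A single outlier quadruples the LS slope, while the LAD fit essentially ignores it — exactly the non-robustness the slide illustrates.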


SLIDE 51

Problems with outliers in nonparametric regression

[Figure: nonparametric fit on data containing outliers; the outliers pull the estimate away from f(X)]

The LS principle is sensitive to outliers (and leverage points).
A linear/polynomial kernel yields non-robust methods.
Using an appropriate CV ⇒ robust CV.


SLIDE 54

Iterative Reweighting & Weight Functions

Weighted primal formulation:

$$\min_{w,b,e} J_P(w,e) = \frac{1}{2} w^T w + \frac{\gamma}{2} \sum_{k=1}^{n} v_k e_k^2 \quad \text{s.t. } Y_k = w^T \varphi(X_k) + b + e_k,\ k = 1, \ldots, n.$$

Weight functions $V(r)$ and associated losses $L(r)$:

Huber: $V(r) = 1$ if $|r| < \beta$, $\beta/|r|$ if $|r| \ge \beta$;  $L(r) = r^2$ if $|r| < \beta$, $\beta|r| - \frac{1}{2}\beta^2$ if $|r| \ge \beta$.

Hampel: $V(r) = 1$ if $|r| < b_1$, $\frac{b_2 - |r|}{b_2 - b_1}$ if $b_1 \le |r| \le b_2$, $0$ if $|r| > b_2$;  $L(r) = r^2$ if $|r| < b_1$, $\frac{b_2 r^2 - |r|^3}{b_2 - b_1}$ if $b_1 \le |r| \le b_2$, $0$ if $|r| > b_2$.

Logistic: $V(r) = \tanh(r)/r$;  $L(r) = r \tanh(r)$.

Myriad: $V(r) = \delta^2/(\delta^2 + r^2)$;  $L(r) = \log(\delta^2 + r^2)$.
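A sketch of the reweighting loop for the weighted LS-SVM: the weights enter the dual system as a per-sample ridge term $I/(\gamma v_k)$; after each solve, a robust residual scale is estimated and the Huber weights are recomputed. Parameter values and the outlier pattern are illustrative:

```python
import numpy as np

def huber_weight(r, beta=1.345):
    return 1.0 if abs(r) < beta else beta / abs(r)

def weighted_lssvm(X, Y, gamma, h, v):
    """Weighted LS-SVM dual: the weights v_k rescale the ridge term per sample."""
    n = len(X)
    Omega = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * h ** 2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = Omega + np.diag(1.0 / (gamma * v))
    sol = np.linalg.solve(A, np.concatenate(([0.0], Y)))
    b, alpha = sol[0], sol[1:]
    return b, alpha, Y - (Omega @ alpha + b)     # fitted residuals

rng = np.random.default_rng(2)
X = np.linspace(0.0, 1.0, 80)
Y = np.sin(2 * np.pi * X) + rng.normal(0.0, 0.05, 80)
Y[::16] += 3.0                                   # inject five gross outliers
v = np.ones(80)
for _ in range(5):                               # iterative reweighting
    b, alpha, resid = weighted_lssvm(X, Y, 50.0, 0.1, v)
    s = 1.4826 * np.median(np.abs(resid - np.median(resid)))  # robust scale (MAD)
    v = np.array([huber_weight(r / s) for r in resid])
fit = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * 0.1 ** 2)) @ alpha + b
print(np.sqrt(np.mean((fit - np.sin(2 * np.pi * X)) ** 2)))
```

After a few iterations the outliers carry tiny weights, and the fit tracks the underlying sine much more closely than the unweighted solution.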



SLIDE 60

Properties of the Myriad

If δ → ∞, the myriad converges to the sample mean; if δ → 0, it converges to the sample mode.

[Figure: myriad cost over samples X1, …, X5 for δ = 0.75, δ = 2 and δ = 20]
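Both limits are easy to see numerically: the sample myriad minimizes $\sum_i \log(\delta^2 + (x_i - t)^2)$, computed here by brute-force grid search on a toy sample with one gross outlier:

```python
import math

def myriad(xs, delta, grid=2000):
    """Sample myriad: argmin_t sum_i log(delta^2 + (x_i - t)^2), via grid search."""
    lo, hi = min(xs), max(xs)
    best_t, best_c = lo, float("inf")
    for g in range(grid + 1):
        t = lo + (hi - lo) * g / grid
        c = sum(math.log(delta ** 2 + (x - t) ** 2) for x in xs)
        if c < best_c:
            best_t, best_c = t, c
    return best_t

xs = [0.0, 0.1, 0.2, 0.3, 10.0]      # tight cluster plus one gross outlier
print(sum(xs) / len(xs))              # 2.12: the mean is wrecked by the outlier
print(myriad(xs, delta=100.0))        # large delta: close to the sample mean
print(myriad(xs, delta=0.01))         # small delta: stays inside the cluster
```

The small-δ behavior is what makes the myriad loss attractive for heavy-tailed noise: a single distant point barely moves the estimate.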


SLIDE 63

To obtain a fully robust solution...

robust smoother + bounded kernel + robust CV (loss L with bounded L′):

$$RCV(\theta) = \frac{1}{n} \sum_{i=1}^{n} L\!\left( Y_i - \hat{m}_n^{(-i)}(X_i; \theta) \right)$$

[Figures: left, data with outliers and a distorted non-robust fit; right, the fully robust fit recovers f(X)]



SLIDE 68

We have a problem!!!

[Figure: the estimate $\hat{m}_n(x)$ tracks the noise instead of m(x) when the errors are correlated]

VIOLATION OF I.I.D. ASSUMPTION


SLIDE 72

What went wrong?

Model $Y_i = m(x_i) + e_i$: model selection assumes Cov[e_i, e_j] = 0. In the previous example, Cov[e_i, e_j] ≠ 0.

Correlation (covariance): the strength of the relationship between e_i and e_j.

Example 1: Suppose there are two technology stocks. If they are affected by the same industry trends, their prices will tend to rise or fall together.

Example 2: Housing prices — sensitive to supply and demand.

Example 3: In our toy example, e_{i+1} was affected by the value of e_i, and so on...


SLIDE 74

Removing correlation effects: main theorem

Bandwidth selection procedures break down, though the smoother itself remains consistent!

Theorem. Assume $x_i \equiv i/n$, $x \in [0, 1]$, $E[e] = 0$, $\mathrm{Cov}[e_i, e_{i+k}] = E[e_i e_{i+k}] = \gamma_k$ and $\gamma_k \sim k^{-a}$ for some $a > 2$. Assume that $Y_i = m(x_i) + e_i$ and

(C1) K is Lipschitz continuous at x = 0;
(C2) $\int K(u)\, du = 1$, $\lim_{|u| \to \infty} |uK(u)| = 0$, $\int |K(u)|\, du < \infty$, $\sup_u |K(u)| < \infty$;
(C3) $\int |k(u)|\, du < \infty$ and K is symmetric.

Further, assume that boundary effects are ignored and that $h \to 0$ as $n \to \infty$ such that $nh^2 \to \infty$. Then, for the NW smoother,

$$E[\mathrm{CV}(h)] = \frac{1}{n} E\!\left[ \sum_{i=1}^{n} \left( m(x_i) - \hat{m}_n^{(-i)}(x_i) \right)^2 \right] + \sigma^2 - \frac{4K(0)}{nh - K(0)} \sum_{k=1}^{\infty} \gamma_k + o(n^{-1}h^{-1})$$

No prior knowledge about the correlation structure is needed!
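The breakdown the theorem describes is easy to reproduce: generate the same mean function with white noise and with positively correlated AR(1) errors, and compare the bandwidth that leave-one-out CV selects in each case. The AR coefficient and the candidate grid below are illustrative:

```python
import math, random

def nw_loo(X, Y, h, i):
    """NW prediction at X[i] with observation i left out."""
    num = den = 0.0
    for j in range(len(X)):
        if j == i:
            continue
        w = math.exp(-0.5 * ((X[i] - X[j]) / h) ** 2)
        num += w * Y[j]
        den += w
    return num / den

def cv_best_h(X, Y, grid):
    def score(h):
        return sum((Y[i] - nw_loo(X, Y, h, i)) ** 2 for i in range(len(X)))
    return min(grid, key=score)

random.seed(3)
n = 100
X = [i / n for i in range(n)]
m = [math.sin(2 * math.pi * x) for x in X]
white = [random.gauss(0.0, 0.2) for _ in range(n)]
ar = [white[0]]
for k in range(1, n):                    # AR(1) errors with phi = 0.9
    ar.append(0.9 * ar[-1] + math.sqrt(1 - 0.9 ** 2) * white[k])
grid = [0.002, 0.02, 0.08]
h_iid = cv_best_h(X, [a + b for a, b in zip(m, white)], grid)
h_cor = cv_best_h(X, [a + b for a, b in zip(m, ar)], grid)
print(h_iid, h_cor)   # under correlation, CV tends to pick a far smaller h
```

With positively correlated errors, neighboring observations "agree" with each other, so interpolating the noise looks good to CV — the undersmoothing the theorem quantifies through the $\sum_k \gamma_k$ term.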


SLIDE 76

Suitable kernels & drawback

[Figures: kernels suitable under correlated errors, and the resulting fits]

⇒⇒ Decreased Mean Squared Error ⇒⇒


SLIDE 80

Some real life examples

Beveridge index of wheat prices:
[Figure: log(Beveridge wheat price index) vs. year, 1500–1900]

U.S. monthly birth rate:
[Figure: birth rate vs. year, 1940–1948]

slide-81
SLIDE 81

Goal & Overview Introduction FS-LSSVM Robust Nonparametric Methods Correlated Errors Confidence Intervals Conclusions

Some real life examples

Beveridge index of wheat prices

1500 1550 1600 1650 1700 1750 1800 1850 1900 2 2.5 3 3.5 4 4.5 5 5.5 6

Year log(Beveridge wheat prices) U.S. monthly birth rate

1940 1941 1942 1943 1944 1945 1946 1947 1948 1800 2000 2200 2400 2600 2800 3000

Year Birth rate

30/39

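Why series like these are problematic for nonparametric regression can be shown with a small simulation sketch (illustrative settings of my own choosing, not the thesis's data or procedure): when the errors are positively correlated, ordinary leave-one-out cross-validation for a kernel smoother mistakes correlated noise for signal and selects a far too small bandwidth.

```python
import numpy as np

def loo_cv(X, Y, h):
    """Leave-one-out CV score for a Nadaraya-Watson smoother (Gaussian kernel)."""
    W = np.exp(-0.5 * ((X[:, None] - X[None, :]) / h) ** 2)
    np.fill_diagonal(W, 0.0)                  # leave-one-out: drop each point's own weight
    W /= W.sum(axis=1, keepdims=True)
    return np.mean((Y - W @ Y) ** 2)

rng = np.random.default_rng(1)
n = 300
X = np.linspace(0.0, 1.0, n)
m = np.sin(2 * np.pi * X)

e_iid = 0.2 * rng.standard_normal(n)          # independent errors
rho = 0.9                                     # AR(1) errors with the same marginal variance
e_ar = np.empty(n)
e_ar[0] = e_iid[0]
for t in range(1, n):
    e_ar[t] = rho * e_ar[t - 1] + 0.2 * np.sqrt(1 - rho**2) * rng.standard_normal()

hs = np.logspace(-2.3, -0.5, 30)
h_iid = hs[np.argmin([loo_cv(X, m + e_iid, h) for h in hs])]
h_ar = hs[np.argmin([loo_cv(X, m + e_ar, h) for h in hs])]
# with correlated errors, CV selects a much smaller bandwidth (undersmoothing)
```

With independent errors CV lands near a sensible interior bandwidth; with the AR(1) errors it collapses toward the smallest bandwidth in the grid, which is the correlation effect the following sections address.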

slide-83
SLIDE 83

Goal & Overview Introduction FS-LSSVM Robust Nonparametric Methods Correlated Errors Confidence Intervals Conclusions

Outline

1

Goal & Overview

2

Introduction Parametric vs. nonparametric regression Nonparametric regression estimates: an overview

3

Fixed-Size Least Squares Support Vector Machines Fixed Size LS-SVM formulation Selection of Support Vectors Practical identification problem

4

Robust Nonparametric Methods Problems with outliers Robust nonparametric regression

5

Correlated Errors Problems with correlation in nonparametric regression Removing correlation effects

6

Confidence Intervals

7

Conclusions

31/39


slide-85
SLIDE 85

Goal & Overview Introduction FS-LSSVM Robust Nonparametric Methods Correlated Errors Confidence Intervals Conclusions

What are confidence intervals?

How accurate are our nonparametric estimates? Can we say something about the true function m given the estimate m̂n?
We want something of the form: Ln(x) ≤ m(x) ≤ Un(x) ∀x, for some confidence level α.
Pointwise vs. simultaneous/uniform confidence intervals.

[Figure: data Y and estimate m̂n(X) vs. X]

32/39


slide-87
SLIDE 87

Goal & Overview Introduction FS-LSSVM Robust Nonparametric Methods Correlated Errors Confidence Intervals Conclusions

In practice...

In general: these intervals give the user the ability to see how well a certain model explains the true underlying process, while taking the statistical properties of the estimator into account.
Fault detection: confidence intervals are used to reduce the number of false alarms.

33/39
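The false-alarm point can be made concrete with a toy sketch (entirely illustrative; the process model, noise level, and thresholds are assumptions, not from the thesis): alarming whenever a residual exceeds a tight threshold triggers many false alarms on healthy data, while alarming only outside a wider interval-based band keeps the real fault and drops most false positives.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
t = np.linspace(0, 10, n)
m = np.sin(t)                          # assumed known model of the healthy process
sigma = 0.1                            # assumed known noise level
y = m + sigma * rng.standard_normal(n)
y[250] += 1.0                          # inject a single fault

# Tight threshold (2 sigma): frequent false alarms on healthy points
tight = np.abs(y - m) > 2 * sigma
# Wider band (3 sigma, roughly a 99.7% interval): far fewer false alarms
wide = np.abs(y - m) > 3 * sigma
```

The injected fault at index 250 is flagged by both detectors, but the wider band raises only a handful of alarms overall instead of dozens.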

slide-88
SLIDE 88

Goal & Overview Introduction FS-LSSVM Robust Nonparametric Methods Correlated Errors Confidence Intervals Conclusions

Outline

1

Goal & Overview

2

Introduction Parametric vs. nonparametric regression Nonparametric regression estimates: an overview

3

Fixed-Size Least Squares Support Vector Machines Fixed Size LS-SVM formulation Selection of Support Vectors Practical identification problem

4

Robust Nonparametric Methods Problems with outliers Robust nonparametric regression

5

Correlated Errors Problems with correlation in nonparametric regression Removing correlation effects

6

Confidence Intervals

7

Conclusions

34/39

slide-89
SLIDE 89

Goal & Overview Introduction FS-LSSVM Robust Nonparametric Methods Correlated Errors Confidence Intervals Conclusions

Conclusions

Goal of the Thesis
Study the properties of LS-SVM for regression with an emphasis on statistical aspects, and develop a framework for large-scale data.

Main Achievements
Framework for large data sets
Method for minimizing model selection criteria score functions
Robustification of kernel-based methods
Weight function with attractive properties
Framework for correlated errors based on bimodal kernels
Asymptotic normality of linear smoothers
Bias & variance estimators for LS-SVM
Pointwise & simultaneous CI + comparison with bootstrap

35/39

slide-90
SLIDE 90

Goal & Overview Introduction FS-LSSVM Robust Nonparametric Methods Correlated Errors Confidence Intervals Conclusions

LS-SVMLab software

Freely available (for research purposes) Matlab toolbox: http://www.esat.kuleuven.ac.be/sista/lssvmlab/
User's guide with applications (De Brabanter et al., 2010, p. 113)

36/39
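At its core, LS-SVM regression reduces to solving a single linear system in the dual variables, [[0, 1ᵀ], [1, K + I/γ]] [b; α] = [0; y], with predictions m̂(x) = Σᵢ αᵢ K(x, xᵢ) + b. A minimal NumPy sketch of that computation (my own illustrative code, not the toolbox's implementation):

```python
import numpy as np

def rbf(A, B, sig2):
    """RBF kernel matrix: K(a, b) = exp(-||a - b||^2 / (2 * sig2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sig2))

def lssvm_fit(X, Y, gam, sig2):
    """Solve the LS-SVM dual system [[0, 1^T], [1, K + I/gam]] [b; alpha] = [0; Y]."""
    n = len(Y)
    K = rbf(X, X, sig2)
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)), K + np.eye(n) / gam]])
    sol = np.linalg.solve(A, np.concatenate(([0.0], Y)))
    return sol[0], sol[1:]             # bias b, dual weights alpha

def lssvm_predict(x, X, b, alpha, sig2):
    """m_hat(x) = sum_i alpha_i * K(x, x_i) + b."""
    return rbf(x, X, sig2) @ alpha + b

rng = np.random.default_rng(3)
X = np.linspace(0, 1, 100)[:, None]
Y = np.sin(2 * np.pi * X).ravel() + 0.05 * rng.standard_normal(100)
b, alpha = lssvm_fit(X, Y, gam=100.0, sig2=0.05)
rmse = np.sqrt(np.mean((lssvm_predict(X, X, b, alpha, sig2=0.05) - Y) ** 2))
```

Note that every data point contributes a nonzero αᵢ, which is why the thesis's fixed-size approach matters for large-scale data: solving this dense (n+1)×(n+1) system directly does not scale.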


slide-92
SLIDE 92

Goal & Overview Introduction FS-LSSVM Robust Nonparametric Methods Correlated Errors Confidence Intervals Conclusions

Publications

Falck T., Dreesen P., De Brabanter K., Pelckmans K., De Moor B., Suykens J.A.K., Least-Squares Support Vector Machines for the Identification of Wiener-Hammerstein Systems, Submitted, 2011.
De Brabanter K., De Brabanter J., Suykens J.A.K., De Moor B., Kernel Regression in the Presence of Correlated Errors, Submitted, 2011.
De Brabanter K., Karsmakers P., De Brabanter J., Suykens J.A.K., De Moor B., Confidence Bands for Least Squares Support Vector Machine Classifiers: A Regression Approach, Submitted, 2010.
De Brabanter K., De Brabanter J., Suykens J.A.K., De Moor B., Approximate Confidence and Prediction Intervals for Least Squares Support Vector Regression, IEEE Transactions on Neural Networks, 22(1):110–120, 2011.
Sahhaf S., De Brabanter K., Degraeve R., Suykens J.A.K., De Moor B., Groeseneken G., Modelling of Charge Trapping/De-trapping Induced Voltage Instability in High-k Gate Dielectrics, Submitted, 2010.
Karsmakers P., Pelckmans K., De Brabanter K., Van Hamme H., Suykens J.A.K., Sparse Conjugate Directions Pursuit with Application to Fixed-size Kernel Models, Submitted, 2010.
Sahhaf S., Degraeve R., Cho M., De Brabanter K., Roussel Ph.J., Zahid M.B., Groeseneken G., Detailed Analysis of Charge Pumping and Id−Vg Hysteresis for Profiling Traps in SiO2/HfSiO(N), Microelectronic Engineering, 87(12):2614–2619, 2010.

37/39

slide-93
SLIDE 93

Goal & Overview Introduction FS-LSSVM Robust Nonparametric Methods Correlated Errors Confidence Intervals Conclusions

Publications

De Brabanter K., De Brabanter J., Suykens J.A.K., De Moor B., Optimized Fixed-Size Kernel Models for Large Data Sets, Computational Statistics & Data Analysis, 54(6):1484–1504, 2010.
López J., De Brabanter K., Dorronsoro J.R., Suykens J.A.K., Sparse LS-SVMs with L0-Norm Minimization, Accepted for publication in Proc. of the 19th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN), Brugge (Belgium), 2011.
Huyck B., De Brabanter K., Logist F., De Brabanter J., Van Impe J., De Moor B., Identification of a Pilot Scale Distillation Column: A Kernel Based Approach, Accepted for publication in 18th World Congress of the International Federation of Automatic Control (IFAC), 2011.
De Brabanter K., Karsmakers P., De Brabanter J., Pelckmans K., Suykens J.A.K., De Moor B., On Robustness in Kernel Based Regression, NIPS 2010 Workshop on Robust Statistical Learning (ROBUSTML), Whistler, Canada, December 2010.
De Brabanter K., Sahhaf S., Karsmakers P., De Brabanter J., Suykens J.A.K., De Moor B., Nonparametric Comparison of Densities Based on Statistical Bootstrap, in Proc. of the Fourth European Conference on the Use of Modern Information and Communication Technologies (ECUMICT), Gent, Belgium, March 2010, pp. 179–190.

38/39

slide-94
SLIDE 94

Goal & Overview Introduction FS-LSSVM Robust Nonparametric Methods Correlated Errors Confidence Intervals Conclusions

Publications

De Brabanter K., De Brabanter J., Suykens J.A.K., De Moor B., Kernel Regression with Correlated Errors, in Proc. of the 11th International Symposium on Computer Applications in Biotechnology (CAB), Leuven, Belgium, July 2010, pp. 13–18.
De Brabanter K., Pelckmans K., De Brabanter J., Debruyne M., Suykens J.A.K., Hubert M., De Moor B., Robustness of Kernel Based Regression: a Comparison of Iterative Weighting Schemes, in Proc. of the 19th International Conference on Artificial Neural Networks (ICANN), Limassol, Cyprus, September 2009, pp. 100–110.
De Brabanter K., Dreesen P., Karsmakers P., Pelckmans K., De Brabanter J., Suykens J.A.K., De Moor B., Fixed-Size LS-SVM Applied to the Wiener-Hammerstein Benchmark, in Proc. of the 15th IFAC Symposium on System Identification (SYSID 2009), Saint-Malo, France, July 2009, pp. 826–831.
De Brabanter K., Karsmakers P., Ojeda F., Alzate C., De Brabanter J., Pelckmans K., De Moor B., Vandewalle J., Suykens J.A.K., LS-SVMlab Toolbox User's Guide version 1.7, Internal Report 10-146, ESAT-SISTA, K.U.Leuven (Leuven, Belgium), 2010.

39/39