k-Step Ahead Prediction Error Model
SLIDE 1

k-Step Ahead Prediction Error Model

The ARMAX model is the ARMA plus eXogenous signal model:

A(z) y(n) = B(z) u(n − k) + C(z) ξ(n)

u – input, y – output, ξ – white noise, k – delay

  • A, B, C are polynomials in z^−1
  • All delay is factored into k, so the constant terms of A, B, C are not zero
  • Constant terms of A and C are one (that is, A and C are monic)

Digital Control

1

Kannan M. Moudgalya, Autumn 2007
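The ARMAX recursion above can be simulated directly as a difference equation. A minimal sketch (the helper name `simulate_armax` is mine; the polynomials are the A, B, C of the worked example used later in the deck, with k = 2):

```python
import numpy as np

def simulate_armax(A, B, C, k, u, xi):
    """Simulate A(z) y(n) = B(z) u(n - k) + C(z) xi(n).

    A, B, C are coefficient lists in ascending powers of z^-1
    (constant term first); A and C are monic, as the slide assumes.
    """
    N = len(u)
    y = np.zeros(N)
    for n in range(N):
        acc = sum(c * xi[n - i] for i, c in enumerate(C) if n - i >= 0)
        acc += sum(b * u[n - k - i] for i, b in enumerate(B) if n - k - i >= 0)
        acc -= sum(a * y[n - i] for i, a in enumerate(A[1:], 1) if n - i >= 0)
        y[n] = acc
    return y

# Noise-free impulse response: the input cannot affect y before k samples
u = np.zeros(10)
u[0] = 1.0
y = simulate_armax([1, -0.6, -0.16], [1.0], [1, 0.5], k=2, u=u, xi=np.zeros(10))
```

With xi = 0, the first nonzero output appears at n = k = 2, illustrating that a change in u affects y only after k samples.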

SLIDE 2

k-Step Ahead Prediction Error Model

Recall A(z) y(n) = B(z) u(n − k) + C(z) ξ(n)

  • Any change in u can affect y only after k samples
  • But white noise starts affecting the process right away
  • Want to get the best estimate of the output so as to take corrective action, starting now

The above equation can be rewritten as

A(z) y(n + j) = B(z) u(n + j − k) + C(z) ξ(n + j)

Want to predict the output from n + k onwards, i.e. for n + j, j ≥ k

SLIDE 3

k-Step Ahead Prediction Error Model

A(z) y(n) = B(z) u(n − k) + C(z) ξ(n)

y(n + k) = (B(z)/A(z)) u(n) + (C(z)/A(z)) ξ(n + k)

If C = A, the best prediction model is

ŷ(n + k|n) = (B(z)/A(z)) u(n)

If C ≠ A, divide C by A as follows, with j to be specified:

C(z)/A(z) = Ej(z) + z^−j Fj(z)/A(z)

Ej(z) = ej,0 + ej,1 z^−1 + · · · + ej,j−1 z^−(j−1)
Fj(z) = fj,0 + fj,1 z^−1 + · · · + fj,dFj z^−dFj  (dFj = deg Fj)

The noise has past and future terms, to be split.

SLIDE 4

Splitting Noise into Past and Future

y(n + j) = (B(z)/A(z)) u(n + j − k) + (C(z)/A(z)) ξ(n + j)

Substituting the split of C/A:

y(n + j) = (B(z)/A(z)) u(n + j − k)
         + [ (ej,0 + ej,1 z^−1 + · · · + ej,j−1 z^−(j−1))
           + z^−j (fj,0 + fj,1 z^−1 + · · · + fj,dFj z^−dFj) / A(z) ] ξ(n + j)

II  = ej,0 ξ(n + j) + ej,1 ξ(n + j − 1) + · · · + ej,j−1 ξ(n + 1)
      All future terms.

III = (fj,0 + fj,1 z^−1 + · · · + fj,dFj z^−dFj) ξ(n) / A(z)
      The III term is known from previous measurements.

SLIDE 5

Example: Splitting Noise into Past and Future

y(n + j) = u(n + j − 2)/(1 − 0.6z^−1 − 0.16z^−2)
         + [(1 + 0.5z^−1)/(1 − 0.6z^−1 − 0.16z^−2)] ξ(n + j)

Split C/A into Ej and Fj, for j = 2:

(1 + 0.5z^−1)/(1 − 0.6z^−1 − 0.16z^−2)
  = (1 + 1.1z^−1) + z^−2 (0.82 + 0.176z^−1)/(1 − 0.6z^−1 − 0.16z^−2)

Substitute it in the expression for y(n + j), with j = 2:

y(n + 2) = [1/(1 − 0.6z^−1 − 0.16z^−2)] u(n)
         + (1 + 1.1z^−1) ξ(n + 2)
         + z^−2 [(0.82 + 0.176z^−1)/(1 − 0.6z^−1 − 0.16z^−2)] ξ(n + 2)

Second term is unknown; last term is known.
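The split can be checked numerically from the equivalent identity C = E2 A + z^−2 F2. A quick sketch (coefficient arrays in ascending powers of z^−1; not part of the original slides):

```python
import numpy as np

A  = np.array([1.0, -0.6, -0.16])   # 1 - 0.6 z^-1 - 0.16 z^-2
E2 = np.array([1.0, 1.1])           # quotient: 1 + 1.1 z^-1
F2 = np.array([0.82, 0.176])        # remainder: 0.82 + 0.176 z^-1

lhs = np.convolve(E2, A)            # E2(z) A(z) by coefficient convolution
lhs[2:2 + len(F2)] += F2            # add z^-2 F2(z)

# lhs must equal C = 1 + 0.5 z^-1, padded with zeros
assert np.allclose(lhs, [1.0, 0.5, 0.0, 0.0])
```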

SLIDE 6

Splitting Noise into Past and Future

A y(n) = B u(n − k) + C ξ(n)

y(n + j) = (B/A) u(n + j − k) + (C/A) ξ(n + j)
         = (B/A) u(n + j − k) + [Ej + z^−j Fj/A] ξ(n + j)
         = (B/A) u(n + j − k) + (Fj/A) ξ(n) + Ej ξ(n + j)
         = (B/A) u(n + j − k) + (Fj/A) [A y(n) − B u(n − k)]/C + Ej ξ(n + j)
           (substituting ξ(n) = [A y(n) − B u(n − k)]/C from the model)
         = (B/A) u(n + j − k) − (Fj B/(A C)) u(n − k) + (Fj/C) y(n) + Ej ξ(n + j)
         = (B/A) [1 − (Fj/C) z^−j] u(n + j − k) + (Fj/C) y(n) + Ej ξ(n + j)

SLIDE 7

Splitting Noise into Past and Future

From the previous slide,

y(n + j) = (B/A) [1 − (Fj/C) z^−j] u(n + j − k) + (Fj/C) y(n) + Ej ξ(n + j)

C/A = Ej + z^−j Fj/A
  ⇒ C/A − z^−j Fj/A = Ej
  ⇒ (C/A) [1 − z^−j Fj/C] = Ej

Hence (B/A)[1 − (Fj/C) z^−j] = Ej B/C, and

y(n + j) = (Ej B/C) u(n + j − k) + (Fj/C) y(n) + Ej ξ(n + j)

Last term has only future terms. Hence, best prediction model:

ŷ(n + j|n) = (Ej B/C) u(n + j − k) + (Fj/C) y(n)

ˆ means estimate. |n means "using measurements available up to and including n".
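The prediction model can be run as a difference equation by multiplying through by C(z): C(z) ŷ(n + j|n) = Ej(z)B(z) u(n + j − k) + Fj(z) y(n), where I take the shift operator to act on n (an indexing assumption of mine, as is the helper name `predict`). A sketch, checked on the noise-free case, where the predictor should reproduce y(n + j) exactly under zero initial conditions; the polynomial values are the deck's j = 2 example:

```python
import numpy as np

def predict(Ej, B, C, Fj, j, k, u, y):
    """yhat[n] = prediction of y(n + j) made at time n, from
    C(z) yhat(n+j|n) = Ej(z) B(z) u(n+j-k) + Fj(z) y(n)   (C monic).
    Coefficient lists are in ascending powers of z^-1.
    """
    EjB = np.convolve(Ej, B)
    N = len(y)
    yhat = np.zeros(N)
    for n in range(N):
        acc = sum(g * u[n + j - k - i] for i, g in enumerate(EjB)
                  if 0 <= n + j - k - i < N)
        acc += sum(f * y[n - i] for i, f in enumerate(Fj) if n - i >= 0)
        acc -= sum(c * yhat[n - i] for i, c in enumerate(C[1:], 1) if n - i >= 0)
        yhat[n] = acc
    return yhat

# Noise-free data from A(z) y(n) = u(n - 2), A = 1 - 0.6 z^-1 - 0.16 z^-2
u = np.zeros(8)
u[0] = 1.0
y = np.zeros(8)
for n in range(8):
    y[n] = (0.6 * (y[n - 1] if n >= 1 else 0.0)
            + 0.16 * (y[n - 2] if n >= 2 else 0.0)
            + (u[n - 2] if n >= 2 else 0.0))

yhat = predict([1, 1.1], [1.0], [1, 0.5], [0.82, 0.176], j=2, k=2, u=u, y=y)
```

With xi = 0 the prediction is exact: yhat[n] matches y[n + 2].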

SLIDE 8

Example: Splitting C/A into Ej and Fj

(1 + 0.5z^−1)/(1 − 0.6z^−1 − 0.16z^−2) = C/A = Ej + z^−j Fj/A

Long division of C by A, stopped after two quotient terms:

                          1 + 1.1z^−1
                         ___________________________________
1 − 0.6z^−1 − 0.16z^−2 | 1 + 0.5z^−1
                         1 − 0.6z^−1 − 0.16z^−2
                         -----------------------------------
                             1.1z^−1 + 0.16z^−2
                             1.1z^−1 − 0.66z^−2 − 0.176z^−3
                             -----------------------------------
                                       0.82z^−2 + 0.176z^−3

(1 + 0.5z^−1)/(1 − 0.6z^−1 − 0.16z^−2)
  = (1 + 1.1z^−1) + z^−2 (0.82 + 0.176z^−1)/(1 − 0.6z^−1 − 0.16z^−2)

SLIDE 9

Another Method to Split C/A into Ej and Fj

An easier method exists to solve C/A = Ej + z^−j Fj/A. Cross multiply by A:

C = A Ej + z^−j Fj

  • C, A, z^−j are known
  • Ej, Fj are to be calculated
  • Think: how would you solve it?
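One concrete answer: since A is monic, C = A Ej + z^−j Fj can be solved by matching coefficients of z^0, z^−1, … in order, which is exactly ascending-power long division stopped after j terms. A sketch (the function name `split_diophantine` is mine):

```python
def split_diophantine(C, A, j):
    """Solve C(z) = Ej(z) A(z) + z^-j Fj(z) for Ej and Fj.

    Coefficient lists are in ascending powers of z^-1; A must be monic.
    Matching coefficients of z^0 .. z^-(j-1) gives Ej term by term;
    what remains of C carries a factor z^-j and is Fj.
    """
    rem = list(C) + [0.0] * max(0, j + len(A) - 1 - len(C))
    Ej = []
    for i in range(j):
        q = rem[i]                 # e_{j,i}, since A[0] == 1
        Ej.append(q)
        for m, a in enumerate(A):  # subtract q * z^-i * A(z)
            rem[i + m] -= q * a
    Fj = rem[j:]                   # remainder after removing the z^-j factor
    while len(Fj) > 1 and abs(Fj[-1]) < 1e-12:
        Fj.pop()                   # trim trailing zero coefficients
    return Ej, Fj

# Deck's j = 2 example: C = 1 + 0.5 z^-1, A = 1 - 0.6 z^-1 - 0.16 z^-2
Ej, Fj = split_diophantine([1.0, 0.5], [1.0, -0.6, -0.16], 2)
```

Running it reproduces Ej = 1 + 1.1z^−1 and Fj = 0.82 + 0.176z^−1 from the long-division slide.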

SLIDE 10

Different Noise and Prediction Models: ARMAX

ARMAX model: A y(n) = B u(n − k) + C ξ(n)

C = Ej A + z^−j Fj

ŷ(n + j|n) = (Ej B/C) u(n + j − k) + (Fj/C) y(n)

SLIDE 11

Different Noise and Prediction Models: ARIMAX

ARIMAX model, with ∆ = 1 − z^−1:

A y(n) = B u(n − k) + (C/∆) ξ(n)
A∆ y(n) = B∆ u(n − k) + C ξ(n)

Recall the ARMAX model: A y(n) = B u(n − k) + C ξ(n)

Is the solution for the ARMAX model useful? Yes, with A ← A∆, B ← B∆:

C = Ej A∆ + z^−j Fj

ŷ(n + j|n) = (Ej B∆/C) u(n + j − k) + (Fj/C) y(n)

SLIDE 12

Different Noise and Prediction Models: ARIX

Recall the ARIMAX model from the previous slide:

A∆ y(n) = B∆ u(n − k) + C ξ(n)
ŷ(n + j|n) = (Ej B∆/C) u(n + j − k) + (Fj/C) y(n)

ARIX model, obtained with C = 1 in ARIMAX:

A y(n) = B u(n − k) + (1/∆) ξ(n)

1 = Ej A∆ + z^−j Fj

ŷ(n + j|n) = Ej B∆ u(n + j − k) + Fj y(n)

SLIDE 13

Minimum Variance Control: Regulation

ARMAX model: A y(n) = B u(n − k) + C ξ(n)

C = Ej A + z^−j Fj

y(n + j) = (Ej B/C) u(n + j − k) + (Fj/C) y(n) + Ej ξ(n + j)

Minimum variance control: minimize the variations in y at k:

y(n + k) = (Ek B/C) u(n) + (Fk/C) y(n) + Ek ξ(n + k)

To minimize E[y²(n + k)]: ξ(n + k) is independent of u(n), y(n), so set

Ek B u(n) + Fk y(n) = 0

u(n) = − (Fk/(Ek B)) y(n)

SLIDE 14

Example: Minimum Variance Control

y(n) = [0.5/(1 − 0.5z^−1)] u(n − 1) + [1/(1 − 0.9z^−1)] ξ(n)

A = (1 − 0.5z^−1)(1 − 0.9z^−1) = 1 − 1.4z^−1 + 0.45z^−2
B = 0.5(1 − 0.9z^−1)
C = 1 − 0.5z^−1
k = 1

C = Ek A + z^−k Fk
1 − 0.5z^−1 = E1 (1 − 1.4z^−1 + 0.45z^−2) + z^−1 F1

Solving, E1 = 1, F1 = 0.9 − 0.45z^−1
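The solved pair can be checked against the identity C = E1 A + z^−1 F1. A quick numerical sketch (ascending-power coefficient arrays, not part of the original slides):

```python
import numpy as np

A  = np.array([1.0, -1.4, 0.45])   # 1 - 1.4 z^-1 + 0.45 z^-2
C  = np.array([1.0, -0.5, 0.0])    # 1 - 0.5 z^-1, padded to degree 2
E1 = np.array([1.0])
F1 = np.array([0.9, -0.45])        # 0.9 - 0.45 z^-1

lhs = np.convolve(E1, A)           # E1(z) A(z)
lhs[1:1 + len(F1)] += F1           # add z^-1 F1(z)
assert np.allclose(lhs, C)         # recovers C = 1 - 0.5 z^-1
```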

SLIDE 15

Example: Minimum Variance Control

B = 0.5(1 − 0.9z^−1), E1 = 1, F1 = 0.9 − 0.45z^−1

u(n) = − (Fk/(Ek B)) y(n)
     = − [(0.9 − 0.45z^−1)/(0.5(1 − 0.9z^−1))] y(n)
     = −0.9 [(2 − z^−1)/(1 − 0.9z^−1)] y(n)

E[y²(n + k)] = E[(Ek ξ(n + k))²] = E[ξ²(n + 1)] = σ²
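Closing the loop numerically confirms the σ² claim: with the control law above and zero initial conditions, the regulated output reduces to y(n) = ξ(n). A simulation sketch (variable names mine):

```python
import numpy as np

rng = np.random.default_rng(0)
N, pad = 300, 2
xi = np.zeros(N + pad)
xi[pad:] = rng.standard_normal(N)   # white noise, sigma^2 = 1
y = np.zeros(N + pad)
u = np.zeros(N + pad)

for t in range(pad, N + pad):
    # Plant in ARMAX form: A y(n) = B u(n-1) + C xi(n), with
    # A = 1 - 1.4 z^-1 + 0.45 z^-2, B = 0.5 - 0.45 z^-1, C = 1 - 0.5 z^-1
    y[t] = (1.4 * y[t-1] - 0.45 * y[t-2]
            + 0.5 * u[t-1] - 0.45 * u[t-2]
            + xi[t] - 0.5 * xi[t-1])
    # Controller: 0.5 (1 - 0.9 z^-1) u(n) = -(0.9 - 0.45 z^-1) y(n)
    u[t] = 0.9 * u[t-1] - 1.8 * y[t] + 0.9 * y[t-1]

# Under minimum variance control the closed loop gives y(n) = xi(n),
# so the output variance equals the noise variance sigma^2
assert np.allclose(y[pad:], xi[pad:])
```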

SLIDE 16

Minimum Variance Control for ARIX Model

Recall A y(n) = B u(n − k) + (1/∆) ξ(n)

ŷ(n + j|n) = Ej B∆ u(n + j − k) + Fj y(n)
1 = Ej A∆ + z^−j Fj

The minimum variance control law is obtained by forcing ŷ(n + j|n) to be zero, with j = k:

Ek B ∆u(n) = −Fk y(n)

∆u(n) = − (Fk/(Ek B)) y(n)

For nonminimum phase systems, use an alternate approach.
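For the ARIX version of the same example plant, solving 1 = E1 A∆ + z^−1 F1 gives E1 = 1 and F1 = 2.4 − 1.85z^−1 + 0.45z^−2 (my own worked numbers, not from the slides), and the incremental law ∆u(n) = −(F1/(E1 B)) y(n) again drives the closed loop to y(n) = ξ(n). A simulation sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
N, pad = 300, 3
xi = np.zeros(N + pad)
xi[pad:] = rng.standard_normal(N)
y = np.zeros(N + pad)
u = np.zeros(N + pad)

for t in range(pad, N + pad):
    # Plant: A*Delta y(n) = B*Delta u(n-1) + xi(n), with
    # A*Delta = 1 - 2.4 z^-1 + 1.85 z^-2 - 0.45 z^-3,
    # B*Delta = 0.5 - 0.95 z^-1 + 0.45 z^-2
    y[t] = (2.4 * y[t-1] - 1.85 * y[t-2] + 0.45 * y[t-3]
            + 0.5 * u[t-1] - 0.95 * u[t-2] + 0.45 * u[t-3]
            + xi[t])
    # Incremental law: 0.5 (1 - 0.9 z^-1) Delta u(n)
    #                  = -(2.4 - 1.85 z^-1 + 0.45 z^-2) y(n)
    du = 0.9 * (u[t-1] - u[t-2]) - 4.8 * y[t] + 3.7 * y[t-1] - 0.9 * y[t-2]
    u[t] = u[t-1] + du

assert np.allclose(y[pad:], xi[pad:])
```

The controller now acts on ∆u(n), so it carries integral action, which is the practical point of the ARIX noise model.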