SLIDE 1

k-Step Ahead Prediction Error Model
SLIDE 6

1. k-Step Ahead Prediction Error Model

The ARMAX model is an ARMA model plus an eXogenous signal:

A(z) y(n) = B(z) u(n − k) + C(z) ξ(n)

u - input, y - output, ξ - white noise, k - delay

- A, B, C are polynomials in z−1
- All delay is factored into k, so the constant terms of A, B and C are nonzero
- The constant terms of A and C are one (that is, A and C are monic)
Digital Control
1
Kannan M. Moudgalya, Autumn 2007
SLIDE 14

2. k-Step Ahead Prediction Error Model

Recall A(z) y(n) = B(z) u(n − k) + C(z) ξ(n)

- Any change in u can affect y only after k samples
- But white noise starts affecting the process right away
- We want the best estimate of the output, so as to take corrective action starting now

The above equation can be rewritten as

A(z) y(n + j) = B(z) u(n + j − k) + C(z) ξ(n + j)

We want to predict the output from n + k onwards, that is, for n + j with j ≥ k.
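The delay structure described above can be checked numerically. The sketch below (plain Python, variable names are ours) simulates the ARMAX difference equation with the polynomials used in the example later in these slides (A = 1 − 0.6z−1 − 0.16z−2, B = 1, C = 1 + 0.5z−1, k = 2): an impulse in u does not reach y until n = k, while an impulse in ξ shows up immediately.

```python
# A(z) y(n) = B(z) u(n-k) + C(z) xi(n); polynomials are coefficient
# lists in powers of z^-1. Numbers are from the example in these slides.
A = [1.0, -0.6, -0.16]   # A(z) = 1 - 0.6 z^-1 - 0.16 z^-2
B = [1.0]                # B(z) = 1
C = [1.0, 0.5]           # C(z) = 1 + 0.5 z^-1
k = 2

def simulate(u, xi, N):
    """Run the ARMAX difference equation with zero initial conditions."""
    y = [0.0] * N
    for n in range(N):
        acc = 0.0
        for i, bi in enumerate(B):          # B(z) u(n - k)
            if n - k - i >= 0:
                acc += bi * u[n - k - i]
        for i, ci in enumerate(C):          # C(z) xi(n)
            if n - i >= 0:
                acc += ci * xi[n - i]
        for i, ai in enumerate(A[1:], 1):   # move A's past terms to the RHS
            if n - i >= 0:
                acc -= ai * y[n - i]
        y[n] = acc
    return y

N = 6
impulse = [1.0] + [0.0] * (N - 1)
zeros = [0.0] * N
y_u = simulate(impulse, zeros, N)   # input impulse, no noise: silent until n = k
y_xi = simulate(zeros, impulse, N)  # noise impulse, no input: responds at n = 0
```

With these numbers, y_u starts [0, 0, 1, ...] while y_xi starts [1, 1.1, ...].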
SLIDE 24

3. k-Step Ahead Prediction Error Model

A(z) y(n) = B(z) u(n − k) + C(z) ξ(n)

y(n + k) = [B(z)/A(z)] u(n) + [C(z)/A(z)] ξ(n + k)

If C = A, the noise transfer function drops out and the best prediction model is

ŷ(n + k|n) = [B(z)/A(z)] u(n)

If C ≠ A, divide C by A as follows, with j to be specified:

C(z)/A(z) = Ej(z) + z−j Fj(z)/A(z)

Ej(z) = ej,0 + ej,1 z−1 + · · · + ej,j−1 z−(j−1)
Fj(z) = fj,0 + fj,1 z−1 + · · · + fj,dFj z−dFj

The noise term has past and future components, which are to be split.
SLIDE 33

4. Splitting Noise into Past and Future

y(n + j) = [B(z)/A(z)] u(n + j − k) + [C(z)/A(z)] ξ(n + j)

y(n + j) = [B(z)/A(z)] u(n + j − k)
  + [ (ej,0 + ej,1 z−1 + · · · + ej,j−1 z−(j−1))
    + z−j (fj,0 + fj,1 z−1 + · · · + fj,dFj z−dFj) / A(z) ] ξ(n + j)

Call the input term I, the Ej noise term II and the Fj noise term III:

II = ej,0 ξ(n + j) + ej,1 ξ(n + j − 1) + · · · + ej,j−1 ξ(n + 1)

All future terms.

III = (fj,0 + fj,1 z−1 + · · · + fj,dFj z−dFj) ξ(n)/A(z)

Term III is known from previous measurements.
SLIDE 44

5. Example: Splitting Noise into Past and Future

y(n + j) = [1/(1 − 0.6z−1 − 0.16z−2)] u(n + j − 2)
  + [(1 + 0.5z−1)/(1 − 0.6z−1 − 0.16z−2)] ξ(n + j)

Split C/A into Ej and Fj, for j = 2:

(1 + 0.5z−1)/(1 − 0.6z−1 − 0.16z−2)
  = (1 + 1.1z−1) + z−2 (0.82 + 0.176z−1)/(1 − 0.6z−1 − 0.16z−2)

Substitute it in the expression for y(n + j), with j = 2:

y(n + 2) = [1/(1 − 0.6z−1 − 0.16z−2)] u(n)
  + (1 + 1.1z−1) ξ(n + 2)
  + z−2 [(0.82 + 0.176z−1)/(1 − 0.6z−1 − 0.16z−2)] ξ(n + 2)

The second term is unknown; the last term is known.
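The split above can be verified by recombining: multiplying E2 by A and adding z−2 F2 must return C = 1 + 0.5z−1. A minimal check in plain Python, assuming coefficient lists in powers of z−1 (names are ours):

```python
A  = [1.0, -0.6, -0.16]   # 1 - 0.6 z^-1 - 0.16 z^-2
E2 = [1.0, 1.1]           # 1 + 1.1 z^-1
F2 = [0.82, 0.176]        # 0.82 + 0.176 z^-1
j  = 2

def polymul(p, q):
    """Multiply two polynomials given as coefficient lists."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for m, qm in enumerate(q):
            out[i + m] += pi * qm
    return out

recombined = polymul(E2, A)   # E2(z) * A(z)
for m, fm in enumerate(F2):   # add z^-j * F2(z)
    recombined[j + m] += fm
# recombined should now equal C = 1 + 0.5 z^-1, padded with zeros
```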
SLIDE 52

6. Splitting Noise into Past and Future

A y(n) = B u(n − k) + C ξ(n)

y(n + j) = (B/A) u(n + j − k) + (C/A) ξ(n + j)
         = (B/A) u(n + j − k) + [Ej + z−j Fj/A] ξ(n + j)
         = (B/A) u(n + j − k) + (Fj/A) ξ(n) + Ej ξ(n + j)
         = (B/A) u(n + j − k) + (Fj/A) · [A y(n) − B u(n − k)]/C + Ej ξ(n + j)
         = (B/A) u(n + j − k) − (Fj B)/(A C) u(n − k) + (Fj/C) y(n) + Ej ξ(n + j)
         = (B/A) [1 − (Fj/C) z−j] u(n + j − k) + (Fj/C) y(n) + Ej ξ(n + j)
SLIDE 63

7. Splitting Noise into Past and Future

From the previous slide,

y(n + j) = (B/A) [1 − (Fj/C) z−j] u(n + j − k) + (Fj/C) y(n) + Ej ξ(n + j)

From C/A = Ej + z−j Fj/A,

⇒ C/A − z−j Fj/A = Ej
⇒ (C/A) [1 − z−j Fj/C] = Ej

so (B/A) [1 − (Fj/C) z−j] = Ej B/C, and hence

y(n + j) = (Ej B/C) u(n + j − k) + (Fj/C) y(n) + Ej ξ(n + j)

The last term has only future terms. Hence, the best prediction model is

ŷ(n + j|n) = (Ej B/C) u(n + j − k) + (Fj/C) y(n)

ˆ means estimate; |n means “using measurements available up to and including n”.
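The prediction model can be exercised numerically on the example used in these slides (A = 1 − 0.6z−1 − 0.16z−2, B = 1, C = 1 + 0.5z−1, k = j = 2, so E2 = 1 + 1.1z−1 and F2 = 0.82 + 0.176z−1). The sketch below (our own code, not from the slides) runs the plant and the predictor side by side and checks that the prediction error is E2 ξ, i.e. ξ(n + 2) + 1.1 ξ(n + 1): future noise only.

```python
import random

random.seed(1)
N = 100
xi = [random.gauss(0.0, 1.0) for _ in range(N)]
u = [random.gauss(0.0, 1.0) for _ in range(N)]

def past(seq, n):
    """Value of a signal at time n, zero before time 0."""
    return seq[n] if n >= 0 else 0.0

# Plant: A y(n) = u(n-2) + C xi(n), rearranged for y(n)
y = [0.0] * N
for n in range(N):
    y[n] = (0.6 * past(y, n - 1) + 0.16 * past(y, n - 2)
            + past(u, n - 2) + xi[n] + 0.5 * past(xi, n - 1))

# Predictor: C yhat(n+2|n) = E2 B u(n) + F2 y(n), rearranged for p(n)
p = [0.0] * N
for n in range(N):
    p[n] = (-0.5 * past(p, n - 1)
            + u[n] + 1.1 * past(u, n - 1)
            + 0.82 * y[n] + 0.176 * past(y, n - 1))

# Prediction error minus the future-noise term should vanish
errs = [y[n + 2] - p[n] - (xi[n + 2] + 1.1 * xi[n + 1])
        for n in range(N - 2)]
max_err = max(abs(e) for e in errs)
```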
SLIDE 68

8. Example: Splitting C/A into Ej and Fj

(1 + 0.5z−1)/(1 − 0.6z−1 − 0.16z−2) = C/A = Ej + z−j Fj/A

Long division, for j = 2:

                                    1 + 1.1z−1
1 − 0.6z−1 − 0.16z−2 ) 1 + 0.5z−1
                       1 − 0.6z−1  − 0.16z−2
                       ----------------------
                           1.1z−1  + 0.16z−2
                           1.1z−1  − 0.66z−2 − 0.176z−3
                           ----------------------------
                                     0.82z−2 + 0.176z−3

(1 + 0.5z−1)/(1 − 0.6z−1 − 0.16z−2)
  = (1 + 1.1z−1) + z−2 (0.82 + 0.176z−1)/(1 − 0.6z−1 − 0.16z−2)
SLIDE 75

9. Another Method to Split C/A into Ej and Fj

An easier method exists to solve C/A = Ej + z−j Fj/A.

Cross multiply by A:

C = A Ej + z−j Fj

- C, A, z−j are known
- Ej, Fj are to be calculated
- Think: how would you solve it?
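One possible answer to the question above, sketched in Python: match coefficients of C = A Ej + z−j Fj power by power, from z0 down to z−(j−1), which is exactly the long division of the previous slide. The function name and the list convention (coefficients in powers of z−1) are ours.

```python
def split_cj(c, a, j):
    """Solve C = A*Ej + z^-j Fj for Ej (degree j-1) and Fj.

    c, a are coefficient lists in powers of z^-1; a must be monic.
    """
    rem = list(c)
    e = []
    for i in range(j):
        # leading remainder coefficient at power z^-i becomes e_{j,i}
        coef = rem[i] if i < len(rem) else 0.0
        e.append(coef)
        # subtract coef * z^-i * A(z) from the remainder
        for m, am in enumerate(a):
            idx = i + m
            while idx >= len(rem):
                rem.append(0.0)
            rem[idx] -= coef * am
    # everything left carries the common factor z^-j
    f = rem[j:]
    return e, f

# The example from these slides: C = 1 + 0.5 z^-1,
# A = 1 - 0.6 z^-1 - 0.16 z^-2, j = 2
E2, F2 = split_cj([1.0, 0.5], [1.0, -0.6, -0.16], 2)
```

This reproduces E2 = 1 + 1.1z−1 and F2 = 0.82 + 0.176z−1 from the worked example.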
SLIDE 80

10. Different Noise and Prediction Models: ARMAX

ARMAX model: A y(n) = B u(n − k) + C ξ(n)

C = Ej A + z−j Fj

ŷ(n + j|n) = (Ej B/C) u(n + j − k) + (Fj/C) y(n)
SLIDE 91

11. Different Noise and Prediction Models: ARIMAX

ARIMAX model, with ∆ = 1 − z−1:

A y(n) = B u(n − k) + (C/∆) ξ(n)
A∆ y(n) = B∆ u(n − k) + C ξ(n)

Recall the ARMAX model: A y(n) = B u(n − k) + C ξ(n)

Is the solution for the ARMAX model useful? Yes, with A ← A∆ and B ← B∆:

C = Ej A∆ + z−j Fj

ŷ(n + j|n) = (Ej B∆/C) u(n + j − k) + (Fj/C) y(n)
SLIDE 98

12. Different Noise and Prediction Models: ARIX

Recall the ARIMAX model from the previous slide:

A∆ y(n) = B∆ u(n − k) + C ξ(n)
ŷ(n + j|n) = (Ej B∆/C) u(n + j − k) + (Fj/C) y(n)

ARIX model, obtained with C = 1 in ARIMAX:

A y(n) = B u(n − k) + (1/∆) ξ(n)

1 = Ej A∆ + z−j Fj

ŷ(n + j|n) = Ej B∆ u(n + j − k) + Fj y(n)
SLIDE 108

13. Minimum Variance Control: Regulation

ARMAX model: A y(n) = B u(n − k) + C ξ(n)

C = Ej A + z−j Fj

y(n + j) = (Ej B/C) u(n + j − k) + (Fj/C) y(n) + Ej ξ(n + j)

Minimum variance control: minimize the variations in y at delay k:

y(n + k) = (Ek B/C) u(n) + (Fk/C) y(n) + Ek ξ(n + k)

We want to minimize E[y²(n + k)]. Since ξ(n + k) is independent of u(n) and y(n), set

Ek B u(n) + Fk y(n) = 0

u(n) = −[Fk/(Ek B)] y(n)
SLIDE 113

14. Example: Minimum Variance Control

y(n) = [0.5/(1 − 0.5z−1)] u(n − 1) + [1/(1 − 0.9z−1)] ξ(n)

A = (1 − 0.5z−1)(1 − 0.9z−1) = 1 − 1.4z−1 + 0.45z−2
B = 0.5(1 − 0.9z−1)
C = 1 − 0.5z−1
k = 1

C = Ek A + z−k Fk:

1 − 0.5z−1 = E1 (1 − 1.4z−1 + 0.45z−2) + z−1 F1

Solving, E1 = 1, F1 = 0.9 − 0.45z−1
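For j = k = 1 the split can be checked by direct coefficient matching: since Ej has degree j − 1 = 0 and A, C are monic, E1 must equal c0 = 1, and F1 is whatever remains after subtracting E1·A and stripping the common z−1 factor. A small sketch (variable names are ours):

```python
# C = E1*A + z^-1 F1 for the example above, coefficient lists in
# powers of z^-1
A = [1.0, -1.4, 0.45]   # (1 - 0.5 z^-1)(1 - 0.9 z^-1)
C = [1.0, -0.5, 0.0]    # 1 - 0.5 z^-1, padded to A's length
E1 = C[0]               # monic C and A force E1 = 1
# subtract E1*A; the z^0 term cancels, the rest carries z^-1
F1 = [C[i] - E1 * A[i] for i in range(1, len(A))]
```

This reproduces F1 = 0.9 − 0.45z−1 from the slide.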
SLIDE 120

15. Example: Minimum Variance Control

B = 0.5(1 − 0.9z−1), E1 = 1, F1 = 0.9 − 0.45z−1

u(n) = −[Fk/(Ek B)] y(n)
     = −[(0.9 − 0.45z−1)/(0.5(1 − 0.9z−1))] y(n)
     = −0.9 [(2 − z−1)/(1 − 0.9z−1)] y(n)

E[y²(n + k)] = E[(Ek ξ(n + k))²] = E[ξ²(n + 1)] = σ²
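The result E[y²(n + k)] = σ² can be tested in closed loop. The sketch below (our own implementation of the plant and controller above) simulates A y(n) = B u(n − 1) + C ξ(n) under the minimum variance law; since E1 = 1, the theory predicts that the regulated output reproduces the noise exactly, y(n) = ξ(n), so the output variance equals the noise variance.

```python
# Plant:      A = 1 - 1.4 z^-1 + 0.45 z^-2, B = 0.5 - 0.45 z^-1,
#             C = 1 - 0.5 z^-1, k = 1
# Controller: u(n) = -0.9 (2 - z^-1)/(1 - 0.9 z^-1) y(n), i.e.
#             u(n) = 0.9 u(n-1) - 1.8 y(n) + 0.9 y(n-1)
import random

random.seed(0)
N = 200
xi = [random.gauss(0.0, 1.0) for _ in range(N)]
y = [0.0] * N
u = [0.0] * N

def past(seq, n):
    """Value of a signal at time n, zero before time 0."""
    return seq[n] if n >= 0 else 0.0

for n in range(N):
    # plant: y(n) = 1.4 y(n-1) - 0.45 y(n-2)
    #             + 0.5 u(n-1) - 0.45 u(n-2) + xi(n) - 0.5 xi(n-1)
    y[n] = (1.4 * past(y, n - 1) - 0.45 * past(y, n - 2)
            + 0.5 * past(u, n - 1) - 0.45 * past(u, n - 2)
            + xi[n] - 0.5 * past(xi, n - 1))
    # minimum variance controller acting on the new measurement y(n)
    u[n] = 0.9 * past(u, n - 1) - 1.8 * y[n] + 0.9 * past(y, n - 1)

# under minimum variance control the regulated output is pure noise
max_err = max(abs(y[n] - xi[n]) for n in range(N))
```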
SLIDE 126

16. Minimum Variance Control for ARIX Model

Recall A y(n) = B u(n − k) + (1/∆) ξ(n)

ŷ(n + j|n) = Ej B∆ u(n + j − k) + Fj y(n)
1 = Ej A∆ + z−j Fj

The minimum variance control law is obtained by forcing ŷ(n + k|n) to be zero:

Ek B∆ u(n) = −Fk y(n)

∆u(n) = −[Fk/(Ek B)] y(n)