SLIDE 1

Adaptive Control – Landau, Lozano, M’Saad, Karimi

Chapter 5: Recursive plant model identification in open loop

SLIDE 2

Chapter 5: Recursive plant model identification in open loop

SLIDE 3

Why review “open loop system identification”?

  • We need information about the complexity of the plant model and its basic dynamic features in order to build an adaptive control scheme
  • Recursive identification algorithms are used in indirect adaptive control
  • It is an introduction to identification in closed loop, which is used for “iterative identification in closed loop and controller redesign” (an adaptation technique)

SLIDE 4

OUTLINE – Open loop system identification

  • Data acquisition
  • Model complexity
  • Parameter estimation
  • Validation
SLIDE 5

Objective of system identification (for control): to extract from experimental data a dynamic model of the plant which allows designing a controller that matches the control specifications.

SLIDE 6

System Identification Methodology (flow):

  • I/O data acquisition under an experimental protocol
  • Model complexity estimation (or selection)
  • Choice of the noise model
  • Parameter estimation
  • Model validation (if validation fails, iterate; once it passes, proceed to control design)

SLIDE 7

I/O Data Acquisition

Signal: a P.R.B.S. (pseudo-random binary sequence)
Magnitude: a few % of the input operating point
Clock frequency: f_clock = f_s / p, p = 1, 2, 3 (f_s = sampling frequency)
Length: a shift register with N cells generates a sequence of (2^N − 1) clock periods; the total length (2^N − 1) p T_s must be smaller than the allowed duration of the experiment
Largest pulse: N p T_s ≥ t_R (rise time of the plant)
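The recipe above can be sketched in Python (the book's own tools are Matlab/Scilab; this stand-alone sketch is only illustrative, with an assumed register length N = 8 and feedback taps (8, 6, 5, 4)):

```python
def prbs(cells, p=1, magnitude=1.0, taps=(8, 6, 5, 4)):
    """Generate one period of a P.R.B.S. from an N-cell shift register.

    cells     - register length N; the sequence repeats after 2**N - 1 clocks
    p         - clock divider (f_clock = f_s / p): each value is held p samples
    magnitude - amplitude around the operating point (keep it to a few %
                of the input range)
    taps      - XOR feedback taps; (8, 6, 5, 4) is a commonly used
                maximal-length choice for N = 8 (assumption of this sketch)
    """
    state = [1] * cells                       # any non-zero initial state
    out = []
    for _ in range(2 ** cells - 1):           # one full period
        bit = 0
        for t in taps:
            bit ^= state[t - 1]
        out.extend([magnitude if state[-1] else -magnitude] * p)
        state = [bit] + state[:-1]            # shift the register
    return out

# one period of (2**8 - 1) = 255 steps, each held for p = 2 sampling periods
seq = prbs(8, p=2, magnitude=0.05)
```

Dividing the clock by p lengthens the largest pulse (helpful when N p T_s ≥ t_R is hard to meet) at the price of a longer experiment.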

SLIDE 8

An I/O File

[diagram: u(t) drives a power amplifier feeding a d.c. motor (M) with an inertia load; the speed y(t) is measured by a tacho generator (TG) through a filter]

SLIDE 9

Spectral density of a P.R.B.S. (N = 8)

[figure: three panels, magnitude (dB) vs. f/f_s]
a) p = 1 (f_clock = f_s)
b) p = 2 (f_clock = f_s/2)
c) p = 3 (f_clock = f_s/3)

SLIDE 10

Data pre-processing: the I/O data files should be centered. Using non-centered data files can cause serious errors.

SLIDE 11

Complexity Estimation from I/O Data

Objective: to get a good estimate of the model complexity (n_A, n_B, d) directly from noisy data.

One minimizes a penalized criterion over the candidate order n̂:

n̂_opt = arg min CJ(n̂),  CJ(n̂) = J(n̂) + S(n̂, N)

where J(n̂) is the error term (it should be unbiased) and S(n̂, N) is the complexity penalty term; the model order is n = max(n_A, n_B + d).

To get a good order estimate, J should tend to its value for noise-free data as N → ∞ (hence the use of instrumental variables).
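The penalized-criterion idea can be illustrated with a stdlib-only Python sketch; the quadratic penalty 4·n·log(N)/N and the noise-free second-order ARX data below are assumptions for illustration, not the book's exact S(n, N):

```python
import math
import random

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        piv = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[piv] = M[piv], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def arx_error_term(y, u, n):
    """Error term J(n): mean squared residual of a least-squares
    ARX(n, n) fit, assuming d = 0 for simplicity."""
    rows, rhs = [], []
    for t in range(n, len(y)):
        phi = [-y[t - i] for i in range(1, n + 1)] + \
              [u[t - i] for i in range(1, n + 1)]
        rows.append(phi)
        rhs.append(y[t])
    dim = 2 * n
    F = [[sum(r[i] * r[j] for r in rows) for j in range(dim)] for i in range(dim)]
    g = [sum(r[i] * v for r, v in zip(rows, rhs)) for i in range(dim)]
    theta = solve(F, g)
    return sum((v - sum(p * th for p, th in zip(r, theta))) ** 2
               for r, v in zip(rows, rhs)) / len(rows)

random.seed(0)
N = 400
u = [random.gauss(0, 1) for _ in range(N)]
y = [0.0, 0.0]
for t in range(2, N):
    # true order-2 plant (noise-free), assumed example
    y.append(1.5 * y[t - 1] - 0.7 * y[t - 2] + u[t - 1] + 0.5 * u[t - 2])

# CJ(n) = J(n) + S(n, N); the penalty 4 n log(N) / N is an illustrative choice
scores = {n: arx_error_term(y, u, n) + 4 * n * math.log(N) / N for n in (1, 2, 3)}
n_opt = min(scores, key=scores.get)
```

With noisy data J(n) no longer drops to (almost) zero at the true order, which is why the slide asks for an unbiased, instrumental-variable based error term.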

SLIDE 12

Parameter Estimation

[diagram: u(t) goes through DAC + ZOH to the plant and, via ADC, the discretized plant output y(t) is compared with the output of an adjustable discrete-time model; the error ε(t) drives the parameter estimation algorithm, which updates the estimated model parameters θ̂(t)]

There does not exist a unique algorithm providing good results in all the situations encountered in practice.

Parameter estimation algorithms: batch (non-recursive) or recursive.

SLIDE 13

Plant Model

G(q^-1) = q^-d B(q^-1) / A(q^-1)

A(q^-1) = 1 + a_1 q^-1 + … + a_nA q^-nA = 1 + q^-1 A*(q^-1)
B(q^-1) = b_1 q^-1 + … + b_nB q^-nB = q^-1 B*(q^-1)

y(t+1) = −A*(q^-1) y(t) + B*(q^-1) u(t−d) = θ^T ψ(t)

θ^T = [a_1, …, a_nA, b_1, …, b_nB]
ψ^T(t) = [−y(t), …, −y(t−nA+1), u(t−d), …, u(t−d−nB+1)]

SLIDE 14

Recursive Parameter Estimation Methods

Plant model:
y(t+1) = −A*(q^-1) y(t) + B*(q^-1) u(t−d) = θ^T ψ(t)
θ – parameter vector; ψ – measurement vector

Estimated (adjustable) model:
ŷ°(t+1) = θ̂^T(t) φ(t)
θ̂ – estimated parameter vector; φ – observation vector

Prediction error (a priori): ε°(t+1) = y(t+1) − ŷ°(t+1)

Parameter adaptation algorithm (P.A.A.):
θ̂(t+1) = θ̂(t) + F(t) Φ(t) ε(t+1)
F(t+1)^-1 = λ_1(t) F(t)^-1 + λ_2(t) Φ(t) Φ^T(t),  0 < λ_1(t) ≤ 1,  0 ≤ λ_2(t) < 2

Φ(t) = f[φ(t)] – regressor vector
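One P.A.A. step can be sketched in Python for the special case λ_2(t) = 1, so that F(t+1)^-1 = λ_1 F(t)^-1 + Φ Φ^T, updated below through the matrix inversion lemma; the first-order plant used to exercise it is an assumed example:

```python
import random

def rls_step(theta, F, phi, y_next, lam1=1.0):
    """One P.A.A. step (lam2 = 1 case: decreasing gain with forgetting lam1)."""
    n = len(theta)
    eps0 = y_next - sum(th * p for th, p in zip(theta, phi))   # a priori error
    Fphi = [sum(F[i][j] * phi[j] for j in range(n)) for i in range(n)]
    denom = lam1 + sum(p * fp for p, fp in zip(phi, Fphi))
    theta = [th + fp * eps0 / denom for th, fp in zip(theta, Fphi)]
    # matrix-inversion-lemma form of F(t+1)^-1 = lam1 F(t)^-1 + phi phi^T
    F = [[(F[i][j] - Fphi[i] * Fphi[j] / denom) / lam1 for j in range(n)]
         for i in range(n)]
    return theta, F, eps0

random.seed(1)
theta = [0.0, 0.0]                     # [a1_hat, b1_hat]
F = [[1000.0, 0.0], [0.0, 1000.0]]     # high initial adaptation gain
y = 0.0
for _ in range(200):
    u = random.gauss(0, 1)
    y_next = -0.6 * y + 1.0 * u        # plant: y(t+1) = -a1 y(t) + b1 u(t), noise-free
    phi = [-y, u]                      # observation vector phi(t)
    theta, F, _ = rls_step(theta, F, phi, y_next)
    y = y_next
```

With noise-free data the estimates converge to the true values a1 = 0.6, b1 = 1.0.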

SLIDE 15

Recursive Least Squares

Plant model:
y(t+1) = −A*(q^-1) y(t) + B*(q^-1) u(t−d) = θ^T ψ(t)
θ – parameter vector; ψ – measurement vector

Estimated model: ŷ°(t+1) = θ̂^T(t) φ(t)
θ̂ – estimated parameter vector; φ – observation vector

θ^T = [a_1, …, a_nA, b_1, …, b_nB]
φ^T(t) = ψ^T(t) = [−y(t), …, −y(t−nA+1), u(t−d), …, u(t−d−nB+1)]

Regressor vector: Φ(t) = φ(t) = ψ(t)

[diagram: the disturbed plant output y(t) is compared with the adjustable predictor output ŷ°(t); the a priori error ε°(t) drives the P.A.A.]

See function rls.sci (.m) on the book website.

SLIDE 16

Effect of stochastic disturbances (measurement noise)

  • Identification algorithms operate at a low signal-to-noise ratio (in order to disturb a plant under operation as little as possible)
  • This often causes an error on the estimated parameters, called “bias”
  • The reason many identification algorithms exist is that there is no unique algorithm giving unbiased estimates in all practical situations

SLIDE 17

Non-recursive least squares

θ̂(N) = F(N) Σ_{i=1..N} φ(i−1) y(i)     (*)

F(N)^-1 = Σ_{i=1..N} φ(i−1) φ^T(i−1)

See function nrls.sci (.m) on the book website.

SLIDE 18

Bias in Least Squares Parameter Estimation

In the presence of measurement noise, the parameter estimates are biased when using the least squares algorithm.

Plant output in the presence of noise (w – noise; φ – regressor/observation vector):
y(t+1) = θ^T ψ(t) + w(t+1) = θ^T φ(t) + w(t+1)     (**)

Replacing y in (*) by (**) gives the bias of the least squares algorithm:
θ̂(N) = θ + [ (1/N) Σ_{t=1..N} φ(t−1) φ^T(t−1) ]^-1 [ (1/N) Σ_{t=1..N} φ(t−1) w(t) ]

Condition for asymptotically unbiased estimation:
lim_{N→∞} (1/N) Σ_{t=1..N} φ(t−1) w(t) = E{ φ(t−1) w(t) } = 0     (***)

It is necessary that φ(t−1) (the regressor) and w(t) be uncorrelated. For least squares this implies w(t) = e(t) (white noise); in all other cases the estimated parameters will be biased.
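The bias mechanism can be checked numerically. This Python sketch (an assumed first-order example, not from the book) fits the same batch least squares to data with white equation noise and with MA(1)-colored noise, where φ(t) and w(t+1) are correlated:

```python
import random

def ls_fit(y, u):
    """Batch least squares for y(t+1) = -a y(t) + b u(t) + w(t+1):
    solve the 2x2 normal equations S theta = g."""
    S = [[0.0, 0.0], [0.0, 0.0]]
    g = [0.0, 0.0]
    for t in range(len(y) - 1):
        phi = (-y[t], u[t])                    # regressor phi(t)
        for i in range(2):
            g[i] += phi[i] * y[t + 1]
            for j in range(2):
                S[i][j] += phi[i] * phi[j]
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    a_hat = (S[1][1] * g[0] - S[0][1] * g[1]) / det
    b_hat = (S[0][0] * g[1] - S[1][0] * g[0]) / det
    return a_hat, b_hat

def simulate(N, c, seed):
    """y(t+1) = -0.5 y(t) + u(t) + e(t+1) + c e(t); c = 0 gives white noise."""
    rng = random.Random(seed)
    y, u, e_prev = [0.0], [], 0.0
    for t in range(N):
        u.append(rng.gauss(0, 1))
        e = rng.gauss(0, 1)
        y.append(-0.5 * y[t] + u[t] + e + c * e_prev)
        e_prev = e
    return y, u

a_white, _ = ls_fit(*simulate(5000, 0.0, seed=2))    # w(t) white: unbiased
a_colored, _ = ls_fit(*simulate(5000, 0.9, seed=2))  # w(t) colored: biased
```

With colored noise, E{φ(t−1) w(t)} ≠ 0 and the estimate of a stays far from the true value 0.5 no matter how much data is used.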

SLIDE 19

Unbiased estimation in the presence of noise

Suppose θ̂ = θ, and we want the algorithm to leave this value unchanged:
ŷ(t+1) = θ^T φ(t)
ε(t+1) = y(t+1) − ŷ(t+1) = w(t+1)

Necessary condition for unbiased estimation:
lim_{N→∞} (1/N) Σ_{t=1..N} φ(t−1, θ) ε(t, θ) = E{ φ(t−1, θ) ε(t, θ) } = 0     (****)

To eliminate the bias (necessary condition):
E{ φ(t) ε(t+1) } = 0   for θ̂ ≡ θ

One modifies the LS algorithm in order to obtain either:
  • ε(t+1) a white noise for θ̂ = θ, or
  • φ(t) and ε(t+1) uncorrelated for θ̂ = θ

SLIDE 20

Parameter Estimation Methods

I – Based on the asymptotic whitening of the prediction error (Recursive Least Squares, Extended Least Squares, Recursive Maximum Likelihood, O.E. with Extended Prediction Model)

II – Based on the asymptotic decorrelation between the prediction error and the observation vector (Output Error, Instrumental Variable)

Remark: one makes assumptions on the “noise” and constructs the appropriate algorithm.

SLIDE 21

«Plant + Noise» Models

S1: A(q^-1) y(t) = q^-d B(q^-1) u(t) + e(t)
Methods: (Recursive) Least Squares  (~64% of use)

S2: A(q^-1) y(t) = q^-d B(q^-1) u(t) + A(q^-1) w(t), with u and w independent
Methods: Output Error (O.E.), Instrumental Variable…  (~33% of use)

S3: A(q^-1) y(t) = q^-d B(q^-1) u(t) + C(q^-1) e(t)
Methods: Extended Least Squares, O.E. with Extended Prediction Model, (Recursive) Maximum Likelihood  (~2% of use)

S4: A(q^-1) y(t) = q^-d B(q^-1) u(t) + [1/C(q^-1)] e(t)
Methods: Generalized Least Squares  (~1% of use)

SLIDE 22

Structure of recursive identification methods

[diagram, labeled in French in the original: the plant (PROCEDE, with perturbation) and an adjustable predictor (PREDICTEUR AJUSTABLE) are both driven by u(t); the prediction error ε(t) between y(t) and ŷ°(t) feeds the P.A.A. (A.A.P.), which updates θ̂(t)]

Characteristic elements:
  • the predictor structure
  • the signals used for the observation (φ) and regressor (Φ) vectors
  • the dimension of the vector of estimated parameters θ̂ and of Φ
  • the generation of the prediction error (ε)
  • all methods use the same parameter adaptation algorithm:

θ̂(t+1) = θ̂(t) + F(t) Φ(t) ε(t+1)     (#)
F(t+1)^-1 = λ_1(t) F(t)^-1 + λ_2(t) Φ(t) Φ^T(t),  0 < λ_1(t) ≤ 1,  0 ≤ λ_2(t) < 2,  F(0) > 0

Types of identification methods:
I) Based on the asymptotic whitening of the prediction error (ε)
II) Based on the asymptotic decorrelation of Φ and ε

SLIDE 23

Extended Least Squares (ELS)

Idea: identify the plant model and the disturbance model (ARMAX) simultaneously in order to obtain a white prediction error.

Plant + disturbance (first-order ARMAX):
y(t+1) = −a_1 y(t) + b_1 u(t) + c_1 e(t) + e(t+1)

Optimal predictor (known parameters):
ŷ(t+1) = −a_1 y(t) + b_1 u(t) + c_1 e(t)

Prediction error (known parameters):
ε(t+1) = y(t+1) − ŷ(t+1) = e(t+1)

One replaces e(t) by ε(t).

Adjustable predictor (unknown parameters):
ŷ°(t+1) = −â_1(t) y(t) + b̂_1(t) u(t) + ĉ_1(t) ε(t) = θ̂^T(t) φ(t)   (a priori)
ŷ(t+1) = −â_1(t+1) y(t) + b̂_1(t+1) u(t) + ĉ_1(t+1) ε(t) = θ̂^T(t+1) φ(t)   (a posteriori)

θ̂^T(t) = [â_1(t), b̂_1(t), ĉ_1(t)];  φ^T(t) = [−y(t), u(t), ε(t)]

Regressor: Φ(t) = φ(t)

SLIDE 24

Extended Least Squares (ELS) – continued

Prediction errors (unknown parameters):
ε°(t+1) = y(t+1) − ŷ°(t+1)   (a priori)
ε(t+1) = y(t+1) − ŷ(t+1)   (a posteriori)

P.A.A.: one uses (#).
Remark: the size of θ̂ and Φ grows with respect to least squares.

Properties:
  • ε(t) tends asymptotically towards a white noise (unbiased parameter estimation if, in addition, the input is persistently exciting)
  • Sufficient convergence condition: 1/C(z^-1) − λ_2/2 must be a strictly positive real (SPR) transfer function, where λ_2 = max_t λ_2(t), 2 > λ_2

This can explain the lack of convergence for some noise models.

General case:
Φ^T(t) = [−y(t), …, −y(t−n_A+1), u(t−d), …, u(t−d−n_B+1), ε(t), …, ε(t−n_C+1)]
θ̂^T(t) = [â_1(t), …, â_nA(t), b̂_1(t), …, b̂_nB(t), ĉ_1(t), …, ĉ_nC(t)]

See function rels.sci (.m) on the book website.
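A first-order ELS recursion can be sketched in Python (decreasing gain, λ_1 = λ_2 = 1; the ARMAX parameter values below are assumed for illustration):

```python
import random

def paa_step(theta, F, phi, eps0):
    """P.A.A. (#) with lam1 = lam2 = 1 (decreasing adaptation gain)."""
    n = len(theta)
    Fphi = [sum(F[i][j] * phi[j] for j in range(n)) for i in range(n)]
    denom = 1.0 + sum(p * fp for p, fp in zip(phi, Fphi))
    theta = [th + fp * eps0 / denom for th, fp in zip(theta, Fphi)]
    F = [[F[i][j] - Fphi[i] * Fphi[j] / denom for j in range(n)]
         for i in range(n)]
    return theta, F

random.seed(3)
a1, b1, c1 = 0.5, 1.0, 0.3          # assumed true ARMAX parameters
theta = [0.0, 0.0, 0.0]             # [a1_hat, b1_hat, c1_hat]
F = [[1000.0 * (i == j) for j in range(3)] for i in range(3)]
y = e = eps = 0.0
for _ in range(4000):
    u = random.gauss(0, 1)
    e_next = random.gauss(0, 0.5)
    y_next = -a1 * y + b1 * u + c1 * e + e_next   # plant + ARMAX disturbance
    phi = [-y, u, eps]                            # e(t) replaced by eps(t)
    eps0 = y_next - sum(th * p for th, p in zip(theta, phi))  # a priori error
    theta, F = paa_step(theta, F, phi, eps0)
    eps = y_next - sum(th * p for th, p in zip(theta, phi))   # a posteriori error
    y, e = y_next, e_next
```

Note that the regressor contains the a posteriori error ε(t), which is why θ̂ and Φ are larger than in plain least squares.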

SLIDE 25

Strictly Positive Real (SPR) Transfer Functions

Discrete case: H(z^-1) is SPR if
  • it is asymptotically stable, and
  • Re H(e^{jω}) > 0 for all ω, −π < ω < π

An SPR transfer function introduces a phase lag of less than 90° at all frequencies.

[figure: continuous time (s = jω) and discrete time (z = e^{jω}) complex planes, with the regions Re H > 0 and Re H < 0 and the locus H(e^{jω})]

SLIDE 26

Output error with extended prediction model (XOLOE)

An extension of output error methods to ARMAX models; it can be viewed as a modification of RELS.

Plant + disturbance (first-order ARMAX):
y(t+1) = −a_1 y(t) + b_1 u(t) + c_1 e(t) + e(t+1)

Adjustable predictor for RELS (adding and subtracting â_1(t) ŷ(t)):
ŷ°(t+1) = −â_1(t) y(t) + b̂_1(t) u(t) + ĉ_1(t) ε(t) ± â_1(t) ŷ(t)

Adjustable predictor for XOLOE:
ŷ°(t+1) = −â_1(t) ŷ(t) + b̂_1(t) u(t) + ĥ_1(t) ε(t) = θ̂^T(t) φ(t)
with ĥ_1(t) = ĉ_1(t) − â_1(t)   (ŷ(t) is used instead of y(t) in RELS)

θ̂^T(t) = [â_1(t), b̂_1(t), ĥ_1(t)];  Φ(t) = φ(t), φ^T(t) = [−ŷ(t), u(t), ε(t)]

Prediction error (a priori): ε°(t+1) = y(t+1) − ŷ°(t+1)
P.A.A.: one uses (#).

See function xoloe.sci (.m) on the book website.

SLIDE 27

Recursive output error (OLOE)

Plant model:
y(t+1) = −A*(q^-1) y(t) + B*(q^-1) u(t−d) = θ^T ψ(t)
θ – parameter vector; ψ – measurement vector
θ^T = [a_1, …, a_nA, b_1, …, b_nB]

Adjustable predictor (estimated model):
ŷ°(t+1) = −Â*(t, q^-1) ŷ(t) + B̂*(t, q^-1) u(t−d) = θ̂^T(t) φ(t)   (a priori)
ŷ(t+1) = θ̂^T(t+1) φ(t)   (a posteriori)
θ̂ – estimated parameter vector; φ – observation vector

θ̂^T(t) = [â_1, …, â_nA, b̂_1, …, b̂_nB]
φ^T(t) = [−ŷ(t), …, −ŷ(t−n_A+1), u(t−d), …, u(t−d−n_B+1)]

Regressor vector: Φ(t) = φ(t)

[diagram: the adjustable predictor is driven by u(t) only; its output ŷ°(t) is compared with the disturbed plant output y(t), and the error ε°(t) drives the P.A.A.]

SLIDE 28

Recursive output error – continued

Prediction errors (unknown parameters):
ε°(t+1) = y(t+1) − ŷ°(t+1)   (a priori);  ε(t+1) = y(t+1) − ŷ(t+1)   (a posteriori)
P.A.A.: one uses (#).

Properties:
  • Unbiased estimation of the plant parameters without identifying the noise model (useful for non-Gaussian noise)
  • Sufficient convergence condition: 1/A(z^-1) − λ_2/2 must be a strictly positive real (SPR) transfer function, where λ_2 = max_t λ_2(t), 2 > λ_2

Remark: the SPR condition can be relaxed by filtering the prediction error or the regressor.

See function oloe.sci (.m) on the book website.
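The output error recursion differs from RLS only in the regressor, which uses the predicted output ŷ instead of the measured one. A Python sketch, on an assumed first-order plant with white output noise:

```python
import random

def paa_step(theta, F, phi, eps0):
    """P.A.A. (#) with lam1 = lam2 = 1."""
    n = len(theta)
    Fphi = [sum(F[i][j] * phi[j] for j in range(n)) for i in range(n)]
    denom = 1.0 + sum(p * fp for p, fp in zip(phi, Fphi))
    theta = [th + fp * eps0 / denom for th, fp in zip(theta, Fphi)]
    F = [[F[i][j] - Fphi[i] * Fphi[j] / denom for j in range(n)]
         for i in range(n)]
    return theta, F

random.seed(4)
theta = [0.0, 0.0]                  # [a1_hat, b1_hat]
F = [[100.0, 0.0], [0.0, 100.0]]
x = 0.0                             # noise-free plant output
yhat = 0.0                          # a posteriori predictor output
for _ in range(5000):
    u = random.gauss(0, 1)
    x_next = -0.5 * x + 1.0 * u                # plant: x(t+1) = -a1 x(t) + b1 u(t)
    y_next = x_next + random.gauss(0, 0.5)     # measured output, noise w(t+1)
    phi = [-yhat, u]                           # predicted (not measured) output
    eps0 = y_next - sum(th * p for th, p in zip(theta, phi))  # a priori error
    theta, F = paa_step(theta, F, phi, eps0)
    yhat = sum(th * p for th, p in zip(theta, phi))           # a posteriori y_hat
    x = x_next
```

Because the regressor is built from ŷ (driven by u only), it is uncorrelated with the output noise, which is what makes the estimates unbiased without any noise model.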

SLIDE 29

Output error with filtered observations (OEFO)

Adjustable predictor (output error):
ŷ°(t+1) = θ̂^T(t) φ(t);  ŷ(t+1) = θ̂^T(t+1) φ(t)

θ̂^T(t) = [â_1, …, â_nA, b̂_1, …, b̂_nB]
φ^T(t) = [−ŷ(t), …, −ŷ(t−n_A+1), u(t−d), …, u(t−d−n_B+1)]

Prediction errors: ε°(t+1) = y(t+1) − ŷ°(t+1);  ε(t+1) = y(t+1) − ŷ(t+1)
P.A.A.: one uses (#) with the filtered observations

Φ(t) = φ_f(t) = [1 / Â(q^-1)] φ(t)

where the filter uses Â(q^-1), an estimate of the polynomial A(q^-1).

  • Sufficient convergence condition: Â(z^-1)/A(z^-1) − λ_2/2 must be a strictly positive real (SPR) transfer function, where λ_2 = max_t λ_2(t), 2 > λ_2

See function foloe.sci (.m) on the book website.

SLIDE 30

Output error with adaptive filtered observations (OEAFO)

Uses an adaptive filter on the observations instead of a fixed one (it takes advantage of the improving estimate of A as time goes on):

Φ(t) = φ_f(t) = [1 / Â(t, q^-1)] φ(t)

The estimate of the polynomial A(q^-1) is provided by the algorithm itself.

Initialization: Â(0, q^-1) = Â(q^-1), or Â(0, q^-1) = 1 (simpler and more efficient).

In most cases this removes the problems related to the SPR condition.

See function afoloe.sci (.m) on the book website.

SLIDE 31

Validation of Identified Models – Statistical Validation

Normalized auto- or cross-correlations of the residuals over N data should satisfy
|RN(i)| ≤ 2.17 / √N,  i ≥ 1
(the 2.17/√N bound corresponds to a 97% confidence level).
For N = 256: |RN(i)| ≤ 0.136; practical value: |RN(i)| ≤ 0.15.

  • Whiteness test – applied to the residuals of an ARMAX (ARARX) predictor
  • Uncorrelation test – applied to the residuals of an output error predictor

SLIDE 32

«Whiteness» test

{ε(t)}: centered sequence of residual prediction errors. One computes:

R(0) = (1/N) Σ_{t=1..N} ε²(t);  RN(0) = R(0)/R(0) = 1
R(i) = (1/N) Σ_{t=1..N} ε(t) ε(t−i);  RN(i) = R(i)/R(0),  i = 1, 2, …, i_max

Theoretical values: RN(i) = 0, i = 1, 2, …, i_max

Real situation:
  • finite number of data
  • residual structural errors (orders, nonlinearities, noise)
  • objective: to obtain «good» simple models

Validation criterion (N = number of data):
|RN(i)| ≤ 2.17 / √N,  i ≥ 1,  or (practical value) |RN(i)| ≤ 0.15,  i = 1, …, i_max
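The whiteness test above is a few lines of Python (a stdlib-only sketch of the statistic, not the book's olvalid routine):

```python
import random

def whiteness_test(eps, i_max=5):
    """Normalized autocorrelations RN(i) of the centered residuals eps,
    plus the |RN(i)| <= 2.17/sqrt(N) validation bound."""
    N = len(eps)
    m = sum(eps) / N
    e = [v - m for v in eps]                  # center the residuals
    R0 = sum(v * v for v in e) / N            # R(0)
    RN = [sum(e[t] * e[t - i] for t in range(i, N)) / N / R0
          for i in range(1, i_max + 1)]       # RN(i) = R(i) / R(0)
    bound = 2.17 / N ** 0.5
    return RN, bound, all(abs(r) <= bound for r in RN)

random.seed(5)
residuals = [random.gauss(0, 1) for _ in range(256)]   # white residuals
RN, bound, passed = whiteness_test(residuals)
```

For N = 256 the bound evaluates to 2.17/16 ≈ 0.136, as on the previous slide; strongly correlated residuals fail the test immediately.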
SLIDE 33

«Uncorrelation» test

{ε(t)}, {ŷ(t)}: centered sequences of residual prediction errors and predictions, with the output error predictor
Â(q^-1) ŷ(t) = q^-d B̂(q^-1) u(t)

One computes:
R(i) = (1/N) Σ_{t=1..N} ε(t) ŷ(t−i),  i = 0, 1, 2, …, i_max
RN(i) = R(i) / [ ((1/N) Σ_{t=1..N} ŷ²(t)) ((1/N) Σ_{t=1..N} ε²(t)) ]^{1/2}
with i_max = max(n_A, n_B + d).  Remark: RN(0) ≠ 1.

Theoretical values: RN(i) = 0, i = 1, 2, …, i_max

Real situation:
  • finite number of data
  • residual structural errors (orders, nonlinearities, noise)
  • objective: to obtain «good» simple models

Validation criterion (N = number of data):
|RN(i)| ≤ 2.17 / √N,  i ≥ 1,  or (practical value) |RN(i)| ≤ 0.15,  i = 1, …, i_max
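The uncorrelation test can be sketched the same way (again a stdlib-only illustration, not the book's olvalid routine):

```python
import random

def uncorrelation_test(eps, yhat, i_max):
    """Normalized cross-correlations RN(i), i = 0..i_max, between the
    centered residuals eps(t) and predictions yhat(t-i), with the
    |RN(i)| <= 2.17/sqrt(N) validation bound."""
    N = len(eps)
    me, my = sum(eps) / N, sum(yhat) / N
    ce = [v - me for v in eps]                 # centered residuals
    cy = [v - my for v in yhat]                # centered predictions
    norm = ((sum(v * v for v in ce) / N) *
            (sum(v * v for v in cy) / N)) ** 0.5
    RN = [sum(ce[t] * cy[t - i] for t in range(i, N)) / N / norm
          for i in range(i_max + 1)]
    bound = 2.17 / N ** 0.5
    return RN, bound, all(abs(r) <= bound for r in RN)

random.seed(6)
eps = [random.gauss(0, 1) for _ in range(400)]     # residuals
yhat = [random.gauss(0, 1) for _ in range(400)]    # independent predictions
RN, bound, passed = uncorrelation_test(eps, yhat, 4)
```

Unlike the autocorrelation of the whiteness test, RN(0) is not 1 by construction here, so the i = 0 term is also checked against the bound.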

SLIDE 34

Matlab/Scilab routines for open loop system identification, to be downloaded from the website: http//:landau-bookic.lag.ensieg.inpg.fr

  • function files (.m and .sci)
  • data files (.mat)
SLIDE 35

List of functions for open loop identification

Scilab / Matlab function : Description
estorderls.sci / estorderls.m : Order estimation with the least squares criterion
estorderiv.sci / estorderiv.m : Order estimation with the instrumental variable criterion
nrls.sci / nrls.m : Non-recursive least squares
rls.sci / rls.m : Recursive least squares
rels.sci / rels.m : Recursive extended least squares
oloe.sci / oloe.m : Output error (recursive)
foloe.sci / foloe.m : Output error with filtered observations
afoloe.sci / afoloe.m : Output error with adaptive filtered observations
xoloe.sci / xoloe.m : Output error with extended prediction model
olvalid.sci / olvalid.m : Validation of plant models identified in open loop

SLIDE 36

>> help estorderiv
estorderiv – order estimation using the instrumental variable method

[V,S,VS] = estorderiv(y, u, nmax)

y – measured output; u – excitation; nmax – upper bound for the order
V – vector of the error criterion, V = [V(0) V(1) V(2) …]^T, where V(0) = 1
S – vector of penalty coefficients (the model complexity n is penalized)
VS – normalized vector of the penalized criterion, VS = V + S

Order of a discrete-time system: n = max(n_A, n_B + d)
The estimated order n corresponds to the minimum of VS(n) (attention: the n axis should start at 0).

SLIDE 37

>> help rls
rls – recursive least squares identification function

[B,A] = rls(y, u, na, nb, d, Fin, lam1, lam0)

y – measured output; u – excitation; na, nb, d – model orders
Fin – initial adaptation gain (Fin = 1000 by default)
lam1 = 1, lam0 = 1 : decreasing gain (default algorithm)
0.95 < lam1 < 1, lam0 = 1 : decreasing gain with fixed forgetting factor
0.95 < lam1, lam0 < 1 : decreasing gain with variable forgetting factor (typical value: 0.97)
B, A – identified model polynomials

SLIDE 38

>> help oloe
oloe – open loop output error identification function

[B,A] = oloe(y, u, na, nb, d, Fin, lam1, lam0)

y – measured output; u – excitation; na, nb, d – model orders
Fin – initial adaptation gain (Fin = 1000 by default)
lam1 = 1, lam0 = 1 : decreasing gain (default algorithm)
0.95 < lam1 < 1, lam0 = 1 : decreasing gain with fixed forgetting factor
0.95 < lam1, lam0 < 1 : decreasing gain with variable forgetting factor (typical value: 0.97)
B, A – identified model polynomials

SLIDE 39

>> help rels
rels – extended least squares identification function

[B,A,C] = rels(y, u, na, nb, nc, d, Fin, lam1, lam0)

y – measured output; u – excitation; na, nb, nc, d – model orders
Fin – initial adaptation gain (Fin = 1000 by default)
lam1 = 1, lam0 = 1 : decreasing gain (default algorithm)
0.95 < lam1 < 1, lam0 = 1 : decreasing gain with fixed forgetting factor
0.95 < lam1, lam0 < 1 : decreasing gain with variable forgetting factor (typical value: 0.97)
B, A, C – identified model polynomials

>> help xoloe
xoloe – output error with extended prediction model
[B,A,C] = xoloe(y, u, na, nb, nc, d, Fin, lam1, lam0)  (same arguments as rels)

SLIDE 40

>> help olvalid
olvalid – validation of plant models identified in open loop

[wlossf,ulossf,wrni,urni,wyhat,uyhat] = olvalid(B, A, C, y, u)

B, A, C – identified model polynomials; y – measured output; u – excitation
Performs the “whiteness test” (w…) and the “uncorrelation test” (u…):
wlossf / ulossf – error variance (ARMAX predictor / output error predictor)
wrni / urni – normalized correlations / cross-correlations
wyhat / uyhat – predicted output (ARMAX predictor / output error predictor)

Attention: if polynomial C is not available, enter [1].

SLIDE 41

Data files for open loop identification

  • T0.mat: simulated example without noise
    d = 1;  A(q^-1) = 1 − 1.5 q^-1 + 0.7 q^-2;  B(q^-1) = q^-1 + 0.5 q^-2
  • T1.mat: same simulated example with noise
  • poulbo1c.mat: flexible transmission (d = 2, n_A = 4, n_B = 2)
  • aeroc.mat: air heater
  • mot3c.mat: D.C. motor
  • rob2.mat: flexible robot arm with two vibration modes
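Data like the T0.mat example can be regenerated from its difference equation; a Python sketch (the PRBS-like ±1 input is an assumption, not the file's actual excitation):

```python
import random

random.seed(7)
# Difference equation behind the T0 simulated example (d = 1, no noise):
#   A(q^-1) y(t) = q^-d B(q^-1) u(t)
#   y(t) = 1.5 y(t-1) - 0.7 y(t-2) + u(t-2) + 0.5 u(t-3)
u = [random.choice([-1.0, 1.0]) for _ in range(100)]   # PRBS-like +/-1 input
y = [0.0] * 100
for t in range(3, 100):
    y[t] = 1.5 * y[t - 1] - 0.7 * y[t - 2] + u[t - 2] + 0.5 * u[t - 3]
```

Feeding such noise-free data to rls should recover A and B exactly; the T1.mat variant adds noise, which is where the bias and validation issues of the earlier slides come into play.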