Observer convergence: From necessary to sufficient conditions
Laurent PRALY and Vincent ANDRIEU
IHP, June 2016
Observation Problem

Model:
dx/dt(t) = ẋ = f(x(t), t, u(t))   (∗)
with exogenous actions u acting on the real system.

Observation problem: Find (= estimate) x(t), a solution of (∗), consistent with the measured data.
Observation Problem

Observer = any device solving this problem. Focus on observers which are dynamical systems of the following form:
ξ̇ = ϕ(ξ, y) ,   x̂ = η(ξ, y)
with y the measurement and x̂ the estimate of x.

Very simplified/degraded form of observer: facing data loss and unable to give any information on confidence. Compare with stochastic filters or set-valued observers. Interested only in convergence:
x̂(t) → x(t) as t → ∞.
The technical context

Model:
ẋ = f(x) ,   y = h(x)   (1)
with x in an open set Sx of Rn and y in Rp. Its solutions are denoted X(x, t).

Observer:
ξ̇ = ϕ(ξ, y) ,   x̂ = η(ξ, y)   (2)
with ξ in an open set Sξ of Rm and x̂ the estimate. Its solutions are denoted Ξy(ξ, t) or Ξ((x, ξ), t).

We omit time dependence to simplify the notation. This entails no loss of generality as long as we do not want to consider families of models indexed by inputs (exogenous actions).

The functions f, h, ϕ and η are assumed to be such that the solutions of (1)-(2) are defined maximally on ]σ−Sx×Sξ(x, ξ), σ+Sx×Sξ(x, ξ)[.
Main property Conv

Property “Conv”: The zero error set
{ (x, ξ) ∈ Sx × Sξ : x = η(ξ, h(x)) }
contains an asymptotically stable subset Zω with Sx × Sξ as domain of attraction. More precisely, there exist
i) a function β of class1 KL,
ii) a continuous function γ : Sx × Sξ → R+,
such that:
σ+Sx×Sξ(x, ξ) = σ+Sx(x)
and, for all t in [0, σ+Sx(x)),
d(x̂(t), X(x, t)) ≤ β(γ(x, ξ), t) .

Notation: Sxω = Cartesian projection of Zω on Sx = set of estimable points x.
1A function β : R+ × R+ → R+ is said to be of class KL if, for each s in R+, the function
r → β(r, s) is continuous, strictly increasing and zero at zero, and, for each r in R+, the
function s → β(r, s) is strictly decreasing and satisfies lims→∞ β(r, s) = 0.
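For a concrete instance of the footnote, the choice β(r, s) = r·e^(−s) is a class-KL function. The following is a purely illustrative numerical sketch (this particular β is not from the slides):

```python
import numpy as np

# A hypothetical class-KL function for illustration: beta(r, s) = r * exp(-s).
# For each fixed s, r -> beta(r, s) is continuous, strictly increasing, zero at zero;
# for each fixed r, s -> beta(r, s) is strictly decreasing and tends to 0.
def beta(r, s):
    return r * np.exp(-s)

r_grid = np.linspace(0.0, 10.0, 50)
s_grid = np.linspace(0.0, 10.0, 50)

assert beta(0.0, 3.0) == 0.0                   # zero at zero in r
assert np.all(np.diff(beta(r_grid, 2.0)) > 0)  # strictly increasing in r
assert np.all(np.diff(beta(5.0, s_grid)) < 0)  # strictly decreasing in s
assert beta(5.0, 50.0) < 1e-6                  # decays to 0 as s grows
```

With such a β, the bound d(x̂(t), X(x, t)) ≤ β(γ(x, ξ), t) forces the estimation error to decay to zero at a rate governed by s → β(·, s).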
Variation of main property Convtune

Property “Convtune”: Given an integer m and an open subset Sξ of Rm, for any compact subset C of Rm and for all pairs (εs, εt) of strictly positive real numbers, we can find
i) a compact subset Γ of Sξ,
ii) a locally Lipschitz function ϕ : Rm × Rp → Rm,
iii) a uniformly continuous function η : Rm × Rp → Sx,
such that, for the observer given by these functions, we have, for all (x, ξ) in Sx × Rm,
σ+Sx×Rm(x, ξ) = σ+Sx(x) ,
and, if σ+Sx(x) > εt, we have:
d(x̂(t), X(x, t)) ≤ εs   for all t in [εt, σ+Sx(x)) .

As fast as we want (εt), to as small an error as we want (εs), but maybe not convergence.
Necessary condition 1 : Detectability

Proposition 1: [V. Andrieu, G. Besançon, U. Serres] Under Assumption Conv, the model is forward detectable, i.e. for all xa and xb in Sx satisfying:
h(X(xa, t)) = h(X(xb, t))   for all t ≥ 0 ,
we have:
lim t→+∞ d(X(xa, t), X(xb, t)) = 0 .
Necessary condition 1 : Detectability

[Figure: two solutions X(xa, t) and X(xb, t) with equal outputs, h(X(xa, t)) = h(X(xb, t)); the distance between them converges to 0.]
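The picture can be reproduced on a toy model (an illustrative example, not the system from the talk): take ẋ1 = −x1, ẋ2 = −x2 with y = x1. It is detectable but not observable: the two initial conditions below are indistinguishable from the output, yet the corresponding solutions converge to each other.

```python
import numpy as np

# Toy forward-detectable model (illustrative): x1' = -x1, x2' = -x2, y = x1,
# so the flow is X(x, t) = exp(-t) * x in closed form.
def X(x, t):
    return np.exp(-t) * np.asarray(x, dtype=float)

def h(x):
    return x[0]

xa, xb = np.array([1.0, 1.0]), np.array([1.0, -1.0])

# Equal outputs along the two solutions, for all sampled times ...
for t in np.linspace(0.0, 20.0, 201):
    assert abs(h(X(xa, t)) - h(X(xb, t))) < 1e-12

# ... and the distance between the solutions converges to 0 (detectability).
assert np.linalg.norm(X(xa, 0.0) - X(xb, 0.0)) == 2.0
assert np.linalg.norm(X(xa, 20.0) - X(xb, 20.0)) < 1e-8
```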
Necessary condition 2 : Right inverse of η

Proposition 2: Under Assumption Conv, there exists an injective set-valued map
x ∈ Sxω → ηinv(x) ⊂ Sξ ,
a right inverse of η, whose graph
{ (x, ξ) ∈ Sxω × Sξ : ξ ∈ ηinv(x) }
coincides with Zω and therefore is asymptotically stable with Sx × Sξ as domain of attraction.
Necessary condition 2 : Right inverse of η

Away from singularities and with sufficient differentiability . . . it is necessary to have:
dimension m of ξ (observer state) ≥ dimension n of x (model state) − dimension p of y (measurement).
Supplementary assumption
Assumption “Obs”: The system:
ξ̇ = ϕ(ξ, y) ,   x̂ = η(ξ, y)
(= observer) with state ξ, input y and output x̂ is instantaneously observable, uniformly in y in h(Sx), i.e. for each continuous time function s → y(s) ∈ h(Sx), there is no pair of distinct points ξa and ξb in Sξ and no σ in [0, min{σ+Sξ(ξa), σ+Sξ(ξb)}) such that we have:
η(Ξy(ξa, s), y(s)) = η(Ξy(ξb, s), y(s))   for all s in [0, σ] .
Necessary condition 2 with Obs :

Proposition 2 continued: If, besides Assumption Conv, Assumption Obs also holds, then ηinv is single-valued. Hence, it is necessary that there exists a function
x ∈ Sxω → ηinv(x) ∈ Sξω .
Necessary condition 2 with Obs :

Luenberger wrote in 1962: “Instead of requiring that the observer reconstructs the state vector . . . [a transformation ηinv] of the state vector . . . It is clear that it would then be possible to reconstruct the state vector itself, provided that the transformation were invertible.”
Necessary condition 3 : Instantaneous Observability

Proposition 3: [V. Andrieu, G. Besançon, U. Serres] Under Assumption Convtune, the model is instantaneously observable at Sx.
Supplementary assumption
Assumption “Inj”: The observer output function η is injective given h, i.e.: there exists a function αη of class1 K such that, for all ξa, ξb in Sξ and all y in h(Sx):
αη(d(ξa, ξb)) ≤ d(η(ξa, y), η(ξb, y)) .

1A function α : R+ → R+ is said to be of class K if it is continuous, strictly increasing and zero
at zero. It is of class K∞ if it is onto R+.
Supplementary assumption
Assumption “Inj”: Away from singularities and with sufficient differentiability . . . Inj requires:
dimension m of ξ (observer state) ≤ dimension n of x (model state).
Supplementary assumption
Assumption “Inj”: Example (Frequency estimation): For the frequency estimation model, an observer satisfying Conv is of the Luenberger type,
ξ̇j = λj ξj + y ,   j = 1, . . . , m ,
where the λj are complex numbers with strictly negative real parts.

Given x̂, the equations to be solved for ξ are of degree 4 in m unknowns, so there are infinitely many solutions when m ≥ 4. Hence the observer output function η is not injective.

Assumption Inj does not hold.
Necessary condition 4 : Weak differential detectability

Proposition 4: [R. Sanfelice, L. P.] If Assumptions Conv and Inj hold and the matrix
( ∂η/∂ξ(ξ, y)   ∂η/∂y(ξ, y) )
is invertible for each (ξ, y) in Sξ × h(Sx), then the model is weakly differentially detectable, i.e. there exists a covariant 2-tensor P : Sxω → Pn+ satisfying1:
vT LfP(x) v ≤ 0   for all (x, v) in Sxω × Sn with (∂h/∂x)(x) v = 0 .

1LfP is the Lie derivative of P along f:
vT LfP(x) v = (∂(vT P v)/∂x)(x) f(x) + vT ( P(x) (∂f/∂x)(x) + (∂f/∂x)(x)T P(x) ) v .
Necessary condition 4 : Differential detectability

[Figure: points x, x + sv, x + tf(x) and (x + sv) + tf(x + sv), with
d(s, t)2 = s2 [v + t (f(x + sv) − f(x))/s]⊤ P(x + tf(x)) [v + t (f(x + sv) − f(x))/s] ;
when h(x) = y, differential detectability gives d(s, t) ≤ d(s, 0) as (s, t) → 0.]
Necessary condition 4 : Differential detectability

Strong differential detectability on Rn: There exists a covariant 2-tensor P : Rn → Pn+ satisfying:
vT LfP(x) v < 0   for all (x, v) in Rn × Sn with (∂h/∂x)(x) v = 0 .

Proposition 5: (link with “tangent systems”) [R. Sanfelice, L. P.] Suppose the model is forward complete and we have strong differential detectability on Rn. Then the model is infinitesimally detectable on Rn, i.e. for each x ∈ Rn, the linear time-varying system given by the linearization of the model along X(x, t) is uniformly detectable, i.e. with the notations
Ax(t) = (∂f/∂x)(X(x, t)) ,   Cx(t) = (∂h/∂x)(X(x, t)) ,
there exists a continuous (maybe unbounded) function t → Kx(t) such that the origin is uniformly stable for the linear system
ξ̇ = ( Ax(t) + Kx(t) Cx(t) ) ξ .
Summary on necessary conditions

Notations: The model is
ẋ = f(x) ,   y = h(x) .
The observer is
ξ̇ = ϕ(ξ, y) ,   x̂ = η(ξ, y) .

Properties:
Conv = The zero error set contains an asymptotically stable set Zω.
Convtune = For all (εs, εt), there exist (ϕ, η) such that, for all t ∈ [εt, σ+Sx(x)), d(x̂(t), X(x, t)) ≤ εs.
Obs = The observer is instantaneously observable.
Inj = The observer output function η is injective given h.
Summary on necessary conditions

Conv ⇒ The model is forward detectable.
Conv ⇒ The graph of ηinv, an injective set-valued map, right inverse of η, is asymptotically stable.
Conv + Obs ⇒ ηinv is single-valued.
Convtune ⇒ The model is instantaneously observable.
Conv + Inj ⇒ Weak differential detectability.
Assumption Backward observability

Backward observability: The model is backward observable at a neighborhood N of Sx if, for any pair of distinct points xa and xb in Sx, there exists t in ] max{σ−N(xa), σ−N(xb)}, 0] such that we have:
h(X(xa, t)) ≠ h(X(xb, t)) .

Reminder: Forward detectability, i.e.
h(X(xa, t)) = h(X(xb, t)) for all t ≥ 0  ⇒  lim t→+∞ d(X(xa, t), X(xb, t)) = 0 ,
is necessary for Conv.
Sufficient condition 1 : Backward observability

Proposition 6: [V. Andrieu, L. P.] (To simplify this presentation.)
Assume the model is backward complete and backward observable at Sx. Assume also the existence of
i) an injective C1 function b : Rp → Cp,
ii) a continuous function M : Sx → R+,
iii) and a negative real number ℓ,
such that, for each x and each t in (σ−Sx(x), 0], we have
|b(h(X(x, t)))| ≤ M(x) exp(ℓ t) .
Sufficient condition 1 : Backward observability

Proposition 7 (continued): Under these conditions, there exists a subset S of Cn+1 of zero Lebesgue measure such that, for any diagonal matrix A with n + 1 complex eigenvalues λi arbitrarily chosen in Cn+1 \ S and with real part strictly smaller than ℓ, there exists a C1 and injective function x ∈ Sx → ηinv(x) solution of
(∂ηinv/∂x)(x) f(x) = A ηinv(x) + ( b(h(x)), . . . , b(h(x)) )⊤ .

Proof: . . . Thank you Jean-Michel.

Corollary: (Luenberger observer) There exists a continuous function η such that Conv is satisfied by
ξ̇ = A ξ + ( b(y), . . . , b(y) )⊤ ,   x̂ = η(ξ) .
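For a linear model the corollary reduces to the classical Luenberger observer, where the estimation error obeys an autonomous Hurwitz linear dynamics. A minimal numerical sketch, with an illustrative system and gain that are not from the slides:

```python
import numpy as np

# Classical linear Luenberger observer (illustrative system and gain).
# Model: x' = A x, y = C x  (harmonic oscillator, position measured).
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
C = np.array([[1.0, 0.0]])

# Observer: xi' = A xi + L (y - C xi), so the error e = x - xi obeys e' = (A - L C) e.
# This L places the eigenvalues of A - L C at -1 and -2 (Hurwitz).
L = np.array([[3.0], [1.0]])
F = A - L @ C
assert np.allclose(sorted(np.linalg.eigvals(F).real), [-2.0, -1.0])

# e(t) = expm(F t) e(0), computed here by diagonalization (F has distinct eigenvalues).
evals, V = np.linalg.eig(F)
def error_at(t, e0):
    return (V @ np.diag(np.exp(evals * t)) @ np.linalg.inv(V) @ e0).real

e0 = np.array([5.0, -2.0])
assert np.linalg.norm(error_at(10.0, e0)) < 1e-3 * np.linalg.norm(e0)
```

Here η is simply the identity on ξ; the nonlinear corollary above replaces this linear error analysis by the injectivity of ηinv.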
35
36
Assumption Differential observability of order m

Differential observability of order m: The model is differentially observable of order m at Sx if the function1
x → Hm(x) = ( h(x), Lf h(x), . . . , Lf^(m−1) h(x) )
is injective.

Reminder: Instantaneous observability is necessary for Convtune.

1Lf h(x) = (∂h/∂x)(x) f(x) , and h(X(x, t)) = Σ_{j=0}^{q} (t^j / j!) Lf^j h(x) + o(t^q).
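The footnote's Taylor expansion can be checked numerically on the harmonic oscillator f(x) = (x2, −x1) with h(x) = x1 (an illustrative model chosen here, not the one on the slide). Its Lie derivatives are Lf h = x2, Lf^2 h = −x1, Lf^3 h = −x2, and H2(x) = (x1, x2) is the identity, hence injective: this model is differentially observable of order 2.

```python
import math
import numpy as np

# Illustrative model: f(x) = (x2, -x1), h(x) = x1 (harmonic oscillator).
def h_along_flow(x, t):
    # h(X(x, t)) in closed form: x1 cos t + x2 sin t.
    return x[0] * np.cos(t) + x[1] * np.sin(t)

def taylor(x, t, q=3):
    # sum_{j<=q} (t^j / j!) Lf^j h(x), with Lie derivatives h, x2, -x1, -x2.
    lie = [x[0], x[1], -x[0], -x[1]]
    return sum(t**j / math.factorial(j) * lie[j] for j in range(q + 1))

x = np.array([0.7, -1.3])
t = 1e-2
err = abs(h_along_flow(x, t) - taylor(x, t, q=3))
assert err < 1e-8  # o(t^3): the remainder is dominated by the t^4/4! term
```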
Sufficient condition 2 : Differential observability of order m

Proposition 8: (High gain observer) [H. Khalil, A. Saberi], [S. Emelyanov, S. Korovin, S. Nikitin, M. Nikitina], [J. Gauthier, H. Hammouri, S. Othman], [Tornambè], . . . , [P. Bernard, V. Andrieu, L. P.] If the model is differentially observable of order m, there exist continuous functions ϕm and η such that the corresponding observer satisfies Convtune when tuning the real numbers (k1, . . . , km). We have even Conv when Hm is an immersion. This can be extended to the case with control
ẋ = f(x, u) ,
when we have forward observability uniformly in u.
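A numerical sketch of the high-gain pattern on an illustrative second-order system (the system, the gains k1 = 2, k2 = 1, and the scaling by k, k^2 are assumptions made here for illustration; they follow the usual high-gain recipe, not a formula from the slide):

```python
import numpy as np

# Illustrative model in observability form: x1' = x2, x2' = -x1, y = x1.
def f(x):
    return np.array([x[1], -x[0]])

k = 10.0          # high-gain parameter: larger k gives faster convergence
dt, T = 1e-4, 2.0

x = np.array([1.0, 0.0])     # true state
xh = np.array([-2.0, 3.0])   # observer state

for _ in range(int(T / dt)):
    y = x[0]
    innov = y - xh[0]
    x = x + dt * f(x)
    # Output injection scaled by k and k^2 (gains k1 = 2, k2 = 1).
    xh = xh + dt * (f(xh) + np.array([2.0 * k * innov, k**2 * innov]))

assert np.linalg.norm(x - xh) < 1e-4
```

The error dynamics here has eigenvalues with real part −k, so the estimate catches up at a rate set by the tuning parameter, as Convtune requires.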
Assumption Uniform infinitesimal backward observability

Uniform infinitesimal backward observability: The model is uniformly infinitesimally backward observable on Rn if there exist strictly positive real numbers τ and ǫ such that, for all x, we have1:
∫_{−τ}^{0} Φx(s, 0)⊤ Cx(s)⊤ Cx(s) Φx(s, 0) ds ≥ ǫ I .

1Φx is the transition matrix of ξ̇ = Ax(t) ξ.
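For a linear time-invariant model the condition is a backward observability Gramian bound, since Φx(s, 0) = expm(A s) does not depend on x. An illustrative numerical check (the matrices A, C, the horizon τ and the threshold are assumptions chosen here):

```python
import numpy as np

# Illustrative LTI model: A rotation, C measuring the first coordinate.
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
C = np.array([[1.0, 0.0]])
tau = 1.0

def Phi(s):
    # expm(A s) for this A is a rotation by angle s (d/ds Phi = A Phi, Phi(0) = I).
    return np.array([[np.cos(s), np.sin(s)], [-np.sin(s), np.cos(s)]])

# Quadrature approximation of  G = int_{-tau}^{0} Phi(s,0)^T C^T C Phi(s,0) ds.
s_grid = np.linspace(-tau, 0.0, 2001)
vals = np.array([Phi(s).T @ C.T @ C @ Phi(s) for s in s_grid])
G = np.trapz(vals, s_grid, axis=0)

# Uniform infinitesimal backward observability asks G >= eps * I for some eps > 0.
eps = np.linalg.eigvalsh(G).min()
assert eps > 0.05
```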
Uniform infinitesimal backward observability

Proposition 9: If the model is uniformly infinitesimally backward observable then it is strongly differentially detectable.
Uniform infinitesimal backward observability

Proposition 10 (local convergence): (Riemannian observer)
If the model is uniformly infinitesimally backward observable then, with P given by the strong differential detectability, there exist strictly positive real numbers k̄, ε and r such that, with the observer given by1
x̂˙ = f(x̂) + k gradP h(x̂) ( y − h(x̂) )
and k ≥ k̄, the following holds23:
D+ d(x̂, x) ≤ −ε d(x̂, x)   whenever d(x̂, x) ≤ r .

1gradP h(ξ) = P(ξ)−1 (∂h/∂x)(ξ)⊤.
2d(x̂, x) is the Riemannian distance, induced by P, between x̂ and x.
3D+ d(x̂, x) = lim sup t→0+ [ d(X̂(x̂, t), X(x, t)) − d(x̂, x) ] / t.
Supplementary assumption
Assumption “GM”: The function h : Rn → Rp is geodesically monotonic if there exists a C2 function δ : Rp × Rp → R+ such that, for any ya and any minimal P-geodesic γ∗ between xa = γ∗(sa) and xb = γ∗(sb), with sa ≤ sb, the function s → δ(h(γ∗(s)), ya) is monotonic.

Property: Geodesic monotonicity of h implies its level sets are strongly geodesically convex and totally geodesic, and conversely when p = dim(y) = 1.
Uniform infinitesimal backward observability

Proposition 11 (semi-global convergence): If the model is uniformly infinitesimally backward observable and the function h : Rn → Rp is geodesically monotonic then, for any positive real number E, there exists a continuous gain function such that the observer estimate converges whenever d(x̂(0), x(0)) ≤ E.
Summary on sufficient conditions

Backward observability implies a Luenberger observer satisfies Conv.
Differential observability of order m implies a high gain observer satisfies Convtune.
Uniform infinitesimal backward observability implies a Riemannian observer satisfies Conv locally, and semi-globally whenever the function h is geodesically monotonic.
Example Non uniformly contracting observer

“Proposition”: [Aeyels], [Takens], [J-P. Gauthier, H. Hammouri, I. Kupka], [J-M. Coron], [E. Sontag]
“Generically”, from the measurement y evaluated at 2n + 1 different time instants we can reconstruct the model state.

Idea: By processing the measurement with 2n + 1 different systems with bounded forward solutions, we hope to be able to reconstruct the model state.
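The idea above, in its simplest linear instance (illustrative; the slides use a nonlinear pattern system): feed the same measurement y into a contracting filter ξ̇ = −a ξ + b y from two different initial conditions. The filter forgets its initial condition, so its state becomes a function of the measurement history alone, which is the raw material for reconstructing the model state.

```python
import numpy as np

# Two copies of a uniformly contracting filter driven by the same measurement
# (a, b and the signal y are illustrative choices).
a, b = 2.0, 1.0
dt, T = 1e-3, 10.0

xi1, xi2 = 5.0, -5.0
for n in range(int(T / dt)):
    y = np.sin(0.5 * n * dt)          # any bounded measurement signal
    xi1 += dt * (-a * xi1 + b * y)
    xi2 += dt * (-a * xi2 + b * y)

# The difference decays like exp(-a t): the initial condition is forgotten.
assert abs(xi1 - xi2) < 1e-6
```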
Example Non uniformly contracting observer

The model is a planar oscillator with (x1, x2) ∈ Sx = R2 \ {(0, 0)}. All its solutions are periodic. The pattern system used to build the observer is a scalar system driven by y, with a and b strictly positive real numbers. Its flow is contracting (uniformly in y) only when ξ > a/(3b).

To each model periodic solution with small amplitude r correspond 3 periodic solutions of the pattern system.
Example Non uniformly contracting observer

[Figure: pattern-system states versus t, from 0 to 50, for r = 0.2.]

The graph of ηinv is exponentially attractive.
Example Non uniformly contracting observer

[Figure: modulus of injectivity with m patterns; δξ versus δx.]
Example Non uniformly contracting observer

The observer consists of copies of the pattern system, each driven by the measurement y, together with
x̂ = η(ξ) ,
where η is given by a numerical table.
Example Non uniformly contracting observer

[Figure: for r = 0.2, the observer states ξ̂1, ξ̂2, ξ̂3, ξ̂4 versus t from 0 to 50, and x2 together with x̂2 versus t. Simulation with various initial conditions ξ.]