SLIDE 1

Summary: BAYESian (Multi-) Sensor Tracking

  • Basis: In the course of time one or several sensors produce measurements of targets of interest. Each target is characterized by its current state vector, which is expected to change with time.
  • Objective: Learn as much as possible about the individual target states at each time by analyzing the ‘time series’ constituted by the sensor data.

Sensor Data Fusion - Methods and Applications, 2nd Lecture on October 23, 2019 — slide 1

SLIDE 2

Summary: BAYESian (Multi-) Sensor Tracking

  • Basis: In the course of time one or several sensors produce measurements of targets of interest. Each target is characterized by its current state vector, which is expected to change with time.
  • Objective: Learn as much as possible about the individual target states at each time by analyzing the ‘time series’ constituted by the sensor data.
  • Problem: Imperfect sensor information: inaccurate, incomplete, and possibly ambiguous. Moreover, the targets’ temporal evolution is usually not well known.

SLIDE 3

Summary: BAYESian (Multi-) Sensor Tracking

  • Basis: In the course of time one or several sensors produce measurements of targets of interest. Each target is characterized by its current state vector, which is expected to change with time.
  • Objective: Learn as much as possible about the individual target states at each time by analyzing the ‘time series’ constituted by the sensor data.
  • Problem: Imperfect sensor information: inaccurate, incomplete, and possibly ambiguous. Moreover, the targets’ temporal evolution is usually not well known.
  • Approach: Interpret measurements and target state vectors as random variables (RVs). Describe by probability density functions (pdfs) what is known about them.

SLIDE 4

Summary: BAYESian (Multi-) Sensor Tracking

  • Basis: In the course of time one or several sensors produce measurements of targets of interest. Each target is characterized by its current state vector, which is expected to change with time.
  • Objective: Learn as much as possible about the individual target states at each time by analyzing the ‘time series’ constituted by the sensor data.
  • Problem: Imperfect sensor information: inaccurate, incomplete, and possibly ambiguous. Moreover, the targets’ temporal evolution is usually not well known.
  • Approach: Interpret measurements and target state vectors as random variables (RVs). Describe by probability density functions (pdfs) what is known about them.
  • Solution: Derive iteration formulae for calculating the pdfs! Develop a mechanism for initiation! In doing so, exploit all available background information! Derive state estimates from the pdfs along with appropriate quality measures!

SLIDE 5

Elements for multisensor situation pictures: tracks of temporally evolving objects

Which object properties are of interest? → state Xk at time tk

  • road-moving vehicle, odometer count xk: Xk = (xk, ẋk)
  • position, speed, acceleration: Xk = (rk, ṙk, r̈k)
  • joint state of several objects: Xk = (x¹k, x²k, …)
  • attributes, e.g. radar cross section xk ∈ R⁺: Xk = (xk, xk)
  • maneuvering phase, object class ik ∈ N: Xk = (xk, ik)

SLIDE 6

Elements for multisensor situation pictures: tracks of temporally evolving objects

Which object properties are of interest? → state Xk at time tk
How to learn states Xk? → from sensor data Zk = {Zk, Zk−1} and context
How to describe imprecise information? → e.g. by conditional pdfs p(Xk|Zk)
What does “learning” mean in this context? → iterative calculation of p(Xk|Zk)

SLIDE 7

The general tracking equations

prediction:

p(Xk|Zk−1) = ∫dXk−1 p(Xk|Xk−1) p(Xk−1|Zk−1)
             [evolution model] [filtering at tk−1]

filtering:

p(Xk|Zk) = p(Zk|Xk) p(Xk|Zk−1) / ∫dXk p(Zk|Xk) p(Xk|Zk−1)
           [sensor model] [prediction]

retrodiction:

p(Xl|Zk) = p(Xl|Zl) ∫dXl+1 p(Xl+1|Xl) p(Xl+1|Zk) / p(Xl+1|Zl)
           [filtering at tl] [evolution model] [retrodiction at tl+1] [prediction at tl+1]
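The three iteration formulae above can be made concrete with a minimal discrete-state sketch, where the integrals become sums over grid cells. The 5-cell grid, the transition table, and the likelihood values below are illustrative assumptions, not taken from the lecture:

```python
# Minimal discrete-state Bayes filter: prediction + filtering on a small grid.
# The state is one of 5 grid cells; all model numbers below are illustrative.

def predict(prior, transition):
    # p(x_k | Z^{k-1}) = sum_{x_{k-1}} p(x_k | x_{k-1}) p(x_{k-1} | Z^{k-1})
    n = len(prior)
    return [sum(transition[j][i] * prior[j] for j in range(n)) for i in range(n)]

def filter_update(predicted, likelihood):
    # p(x_k | Z^k) is proportional to p(z_k | x_k) p(x_k | Z^{k-1}),
    # normalized over the grid
    unnorm = [l * p for l, p in zip(likelihood, predicted)]
    s = sum(unnorm)
    return [u / s for u in unnorm]

# Assumed toy evolution model: the object tends to move one cell to the right.
T = [[0.2 if i == j else 0.8 if i == j + 1 else 0.0 for i in range(5)]
     for j in range(5)]
T[4][4] = 1.0  # right boundary absorbs

prior = [1.0, 0.0, 0.0, 0.0, 0.0]            # certain start in cell 0
pred = predict(prior, T)                     # mass moves toward cell 1
likelihood = [0.05, 0.8, 0.1, 0.025, 0.025]  # sensor 'sees' cell 1
post = filter_update(pred, likelihood)
```

The same structure carries over to continuous states, where the sums become the integrals on the slide.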

SLIDE 8

Elements for situation pictures: tracks of temporally evolving objects

Which object properties are of interest? → state Xk at time tk
How to learn states Xk? → from sensor data Zk = {Zk, Zk−1} and context
How to describe imprecise information? → e.g. by conditional pdfs p(Xk|Zk)
What does “learning” mean in this context? → iterative calculation of p(Xk|Zk)
What is needed for this? → evolution / sensor models p(Xk|Xk−1), p(Zk|Xk)
How to initiate / terminate tracking processes? → sequential decisions

SLIDE 9

Why is ‘target tracking’ a key function?

Infer secondary quantities from incomplete measurement data. Eliminate the fluctuating false-return background (clutter). Create a time basis for classification from attribute data. … The ‘shape’ of objects / object groups is relevant in many applications.

SLIDE 10

Track-based inference of object properties

  • Velocity history: vehicle, helicopter, plane
  • Acceleration history: threat: no under-wing weapons
  • Rare events: truck by night on a dirt road near a border
  • Object interrelations: resulting from formation, convoy
  • Object sources / sinks: classification by origin / designation
  • Classification: road-moving vehicle, ‘on-road’ → ‘off-road’

SLIDE 11

SLIDE 12

How to deal with probability density functions?

  • pdf p(x): Extract probability statements about the RV x by integration!
  • naïvely: positive and normalized functions (p(x) ≥ 0, ∫dx p(x) = 1)
  • conditional pdf p(x|y) = p(x, y) / p(y): impact of information on y on the RV x?
  • marginal density p(x) = ∫dy p(x, y) = ∫dy p(x|y) p(y): enter y!
  • BAYES: p(x|y) = p(y|x) p(x) / p(y) = p(y|x) p(x) / ∫dx p(y|x) p(x): p(x|y) ← p(y|x), p(x)!
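The conditional, marginal, and BAYES rules above can be checked numerically for a discrete pair of RVs, where the integrals reduce to sums. The joint table below is an illustrative assumption:

```python
# Numeric illustration of the pdf rules for a discrete pair (x, y).
# Joint table p(x, y); all numbers are illustrative.
joint = {('x1', 'y1'): 0.10, ('x1', 'y2'): 0.30,
         ('x2', 'y1'): 0.40, ('x2', 'y2'): 0.20}

def marginal_x(x):
    # p(x) = sum_y p(x, y)
    return sum(p for (xi, _), p in joint.items() if xi == x)

def marginal_y(y):
    # p(y) = sum_x p(x, y)
    return sum(p for (_, yi), p in joint.items() if yi == y)

def conditional(x, y):
    # p(x | y) = p(x, y) / p(y)
    return joint[(x, y)] / marginal_y(y)

def bayes(x, y):
    # p(x | y) = p(y | x) p(x) / sum_x p(y | x) p(x)
    def lik(xi):  # p(y | xi)
        return joint[(xi, y)] / marginal_x(xi)
    num = lik(x) * marginal_x(x)
    den = sum(lik(xi) * marginal_x(xi) for xi in ('x1', 'x2'))
    return num / den
```

Both routes to p(x|y) agree, which is exactly what BAYES' rule asserts.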

SLIDE 13

Recapitulation: The Multivariate GAUSSian Pdf

– wanted: probabilities ‘concentrated’ around a center x̄
– quadratic distance: q(x) = ½ (x − x̄)⊤ P⁻¹ (x − x̄); q(x) defines an ellipsoid around x̄, its volume and orientation being determined by a matrix P (symmetric: P⊤ = P, positive definite: all eigenvalues > 0)
– first attempt: p(x) = e^(−q(x)) / ∫dx e^(−q(x)) (normalized!)

p(x) = N(x; x̄, P) = |2πP|^(−1/2) exp(−½ (x − x̄)⊤ P⁻¹ (x − x̄))

– GAUSSian mixtures: p(x) = Σi pi N(x; x̄i, Pi) (weighted sums)
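A minimal sketch of evaluating N(x; x̄, P) in two dimensions, with the 2×2 determinant and inverse written out by hand; the center and covariance below are illustrative:

```python
import math

# Evaluate the bivariate Gaussian N(x; xbar, P) from the formula above.
# The center xbar and covariance P are illustrative assumptions.

def gauss2(x, xbar, P):
    # |2*pi*P| = (2*pi)^2 * det(P) for a 2x2 covariance matrix P
    det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
    inv = [[ P[1][1] / det, -P[0][1] / det],
           [-P[1][0] / det,  P[0][0] / det]]
    d = [x[0] - xbar[0], x[1] - xbar[1]]
    # q(x) = 0.5 * (x - xbar)^T P^{-1} (x - xbar)
    q = 0.5 * (d[0] * (inv[0][0] * d[0] + inv[0][1] * d[1])
             + d[1] * (inv[1][0] * d[0] + inv[1][1] * d[1]))
    return math.exp(-q) / math.sqrt((2 * math.pi) ** 2 * det)

P = [[2.0, 0.5], [0.5, 1.0]]   # symmetric, positive definite
xbar = [1.0, -1.0]
peak = gauss2(xbar, xbar, P)   # maximum value: 1 / sqrt(|2*pi*P|)
```

A GAUSSian mixture is then just a weighted sum of such calls with weights pi summing to one.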

SLIDE 14

pdf at tk−1, prediction for tk

Exploit imprecise knowledge of the dynamical behavior of the object.

p(xk|Zk−1) = ∫dxk−1 N(xk; Fxk−1, D) N(xk−1; xk−1|k−1, Pk−1|k−1)
[prediction]         [dynamics]       [old knowledge]

SLIDE 15

A Useful Product Formula for GAUSSians

N(z; Fx, D) N(x; y, P) = N(z; Fy, S) N(x; y + Wν, P − WSW⊤)
                         [independent of x]

with ν = z − Fy, S = FPF⊤ + D, W = PF⊤S⁻¹.
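The product formula can be verified numerically in the scalar case, where F, D, P, S, and W are plain numbers; the values chosen below are illustrative:

```python
import math

# 1D check of the product formula: N(z; F x, D) N(x; y, P)
# equals N(z; F y, S) N(x; y + W*nu, P - W*S*W) with
# nu = z - F*y, S = F*P*F + D, W = P*F/S (all quantities scalar here).
# The parameter values F, D, y, P are illustrative.

def N(x, mean, var):
    # 1D Gaussian density with the given mean and variance
    return math.exp(-0.5 * (x - mean) ** 2 / var) / math.sqrt(2 * math.pi * var)

F, D, y, P = 2.0, 0.5, 1.0, 3.0
S = F * P * F + D
W = P * F / S

def lhs(z, x):
    return N(z, F * x, D) * N(x, y, P)

def rhs(z, x):
    nu = z - F * y
    return N(z, F * y, S) * N(x, y + W * nu, P - W * S * W)
```

Evaluating both sides at arbitrary points (z, x) gives identical values, confirming that the factor N(z; Fy, S) carries all z-information that is independent of x.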

SLIDE 16

Kalman filter: xk = (rk⊤, ṙk⊤)⊤, Zk = {zk, Zk−1}

initiation: p(x0) = N(x0; x0|0, P0|0), initial ignorance: P0|0 ‘large’

prediction: N(xk−1; xk−1|k−1, Pk−1|k−1) → N(xk; xk|k−1, Pk|k−1)
(dynamics model: Fk|k−1, Dk|k−1)

xk|k−1 = Fk|k−1 xk−1|k−1
Pk|k−1 = Fk|k−1 Pk−1|k−1 Fk|k−1⊤ + Dk|k−1
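The prediction step above can be sketched for a 1D constant-velocity model with state (position, velocity); the time step and the noise covariance D below are illustrative assumptions:

```python
# Prediction step of the Kalman filter for a 1D constant-velocity model,
# x = (position, velocity); dt and the process noise D are illustrative.

dt = 1.0
F = [[1.0, dt], [0.0, 1.0]]            # constant-velocity transition matrix
D = [[0.1, 0.0], [0.0, 0.1]]           # process noise covariance

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_add(A, B):
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

def kf_predict(x, P):
    # x_{k|k-1} = F x_{k-1|k-1};  P_{k|k-1} = F P F^T + D
    x_pred = [F[0][0] * x[0] + F[0][1] * x[1],
              F[1][0] * x[0] + F[1][1] * x[1]]
    P_pred = mat_add(mat_mul(mat_mul(F, P), transpose(F)), D)
    return x_pred, P_pred

x_pred, P_pred = kf_predict([0.0, 1.0], [[1.0, 0.0], [0.0, 1.0]])
```

Note how the predicted covariance grows: uncertainty dissipates between measurements, exactly as the prediction plots in the following slides show.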

SLIDE 17

pdf at tk−1; tk: no plot

missing sensor detection: ‘data processing’ = prediction (though not always: ‘negative’ sensor evidence can also be exploited)

SLIDE 18

pdf at tk−1, pdf at tk, prediction for tk+1

missing sensor information: increasing knowledge dissipation

SLIDE 19

pdf at tk−1, pdf at tk; tk+1: one plot

sensor information on the kinematical object state

SLIDE 20

pdf at tk−1, pdf at tk, prediction for tk+1, likelihood (sensor model)

BAYES’ formula:

p(xk+1|Zk+1) = p(zk+1|xk+1) p(xk+1|Zk) / ∫dxk+1 p(zk+1|xk+1) p(xk+1|Zk)
[new knowledge] [likelihood (plot)]     [prediction]

SLIDE 21

pdf at tk−1, pdf at tk, prediction for tk+1, likelihood (sensor model)

BAYES’ formula:

p(xk+1|Zk+1) = N(zk+1; Hxk+1, R) N(xk+1; xk+1|k, Pk+1|k) / ∫dxk+1 N(zk+1; Hxk+1, R) N(xk+1; xk+1|k, Pk+1|k)
[new knowledge] [likelihood]     [prediction]

SLIDE 22

A Useful Product Formula for GAUSSians

N(z; Hx, R) N(x; y, P) = N(z; Hy, S) N(x; y + Wν, P − WSW⊤)
                         [independent of x]

with ν = z − Hy, S = HPH⊤ + R, W = PH⊤S⁻¹.

SLIDE 23

Kalman filter: xk = (rk⊤, ṙk⊤)⊤, Zk = {zk, Zk−1}

initiation: p(x0) = N(x0; x0|0, P0|0), initial ignorance: P0|0 ‘large’

prediction: N(xk−1; xk−1|k−1, Pk−1|k−1) → N(xk; xk|k−1, Pk|k−1)
(dynamics model: Fk|k−1, Dk|k−1)

xk|k−1 = Fk|k−1 xk−1|k−1
Pk|k−1 = Fk|k−1 Pk−1|k−1 Fk|k−1⊤ + Dk|k−1

filtering: N(xk; xk|k−1, Pk|k−1) → N(xk; xk|k, Pk|k)
(current measurement zk, sensor model: Hk, Rk)

xk|k = xk|k−1 + Wk|k−1 νk|k−1,  νk|k−1 = zk − Hk xk|k−1
Pk|k = Pk|k−1 − Wk|k−1 Sk|k−1 Wk|k−1⊤,  Sk|k−1 = Hk Pk|k−1 Hk⊤ + Rk
Wk|k−1 = Pk|k−1 Hk⊤ Sk|k−1⁻¹  (‘KALMAN gain matrix’)
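The filtering step above can be sketched for a scalar position measurement with H = (1, 0), so that S is a scalar and no matrix inversion is needed; the measurement variance R and the numbers fed in are illustrative:

```python
# Filtering step of the Kalman filter with a scalar position measurement,
# H = (1, 0); R and the example numbers are illustrative assumptions.

def kf_filter(x_pred, P_pred, z, R):
    # nu = z - H x, S = H P H^T + R, W = P H^T / S  (S is scalar here)
    nu = z - x_pred[0]
    S = P_pred[0][0] + R
    W = [P_pred[0][0] / S, P_pred[1][0] / S]   # Kalman gain, 2x1
    x_upd = [x_pred[0] + W[0] * nu, x_pred[1] + W[1] * nu]
    # P_{k|k} = P_{k|k-1} - W S W^T
    P_upd = [[P_pred[i][j] - W[i] * S * W[j] for j in range(2)]
             for i in range(2)]
    return x_upd, P_upd

x_upd, P_upd = kf_filter([1.0, 1.0], [[2.0, 1.0], [1.0, 1.0]], 1.5, 0.5)
```

The posterior position variance is smaller than the predicted one: processing a measurement always tightens the GAUSSian track.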

SLIDE 24

pdf at tk−1, pdf at tk, pdf at tk+1 (BAYES)

filtering = sensor data processing

SLIDE 25

How to deal with probability density functions?

  • pdf p(x): Extract probability statements about the RV x by integration!
  • naïvely: positive and normalized functions (p(x) ≥ 0, ∫dx p(x) = 1)
  • conditional pdf p(x|y) = p(x, y) / p(y): impact of information on y on the RV x?
  • marginal density p(x) = ∫dy p(x, y) = ∫dy p(x|y) p(y): enter y!
  • BAYES: p(x|y) = p(y|x) p(x) / p(y) = p(y|x) p(x) / ∫dx p(y|x) p(x): p(x|y) ← p(y|x), p(x)!
  • certain knowledge on x: p(x) = δ(x − y) ‘=’ lim σ→0 (1/(√(2π) σ)) e^(−(x−y)²/(2σ²))

SLIDE 26

Why is ‘tracking’ a key function for ‘understanding’ a situation?

Infer secondary quantities from incomplete measurement data. Eliminate the fluctuating false-return background (clutter). Create a time basis for classification from attribute data. … The ‘shape’ of objects / object groups is relevant in many applications.

SLIDE 27

modeling: sensor data produced by extended objects

  • the actual measurement errors of individual scattering centers are unimportant
  • the ‘message’ of individual plots is dominated by the object extension
  • individual plots are to be interpreted as measurements of the object center
  • the related ‘measurement error’, proportional to the extension, is to be estimated

SLIDE 28

  • object extension: ‘covariance-type’ matrices
  • state augmentation by a random matrix

– object extension exceeding the sensor resolution
– closely spaced vehicle convoys
– collectively moving object clouds

SLIDE 29

Elements for Situation Pictures: Tracks of Time-varying Objects

Which object properties are of interest? → state Xk at time tk

  • road-moving vehicle, odometer count xk: Xk = (xk, ẋk)
  • position, speed, acceleration: Xk = (rk, ṙk, r̈k)
  • joint state of several objects: Xk = (x¹k, x²k, …)
  • attributes, e.g. radar cross section xk ∈ R⁺: Xk = (xk, xk)
  • maneuvering phase, object class ik ∈ N: Xk = (xk, ik)
  • object shape, SPD random matrices Xk: Xk = (xk, Xk)

SLIDE 30

extended objects: simplified description

  • kinematical state at time tk: xk = (position, velocity, …)
  • object extension at time tk: approximated by an ellipse
  • size: volume; shape: ratio of semi-axes; spatial orientation
  • extension: SPD matrix Xk (Symmetric, Positive Definite)

augmented state: kinematical state vector xk, extension matrix Xk

SLIDE 31

Generalize BAYESian tracking to extended objects.

nk plots Zk = {zk^j, j = 1, …, nk} at time tk, accumulated data Zk = {Zk, nk, Zk−1}

The conditional pdf p(xk, Xk|Zk) describes what is known about the extended object state xk, Xk based on all sensor data up to time tk. ‘Extended object tracking’: iterative calculation of p(xk, Xk|Zk).

SLIDE 32

Generalize BAYESian tracking to extended objects.

nk plots Zk = {zk^j, j = 1, …, nk} at time tk, accumulated data Zk = {Zk, nk, Zk−1}

The conditional pdf p(xk, Xk|Zk) describes what is known about the extended object state xk, Xk based on all sensor data up to time tk. ‘Extended object tracking’: iterative calculation of p(xk, Xk|Zk).

p(xk, Xk|Zk) = p(Zk, nk|xk, Xk) p(xk, Xk|Zk−1) / ∫dxk dXk p(Zk, nk|xk, Xk) p(xk, Xk|Zk−1)

  • p(xk, Xk|Zk) = p(xk|Xk, Zk) p(Xk|Zk): extended object ‘track’
  • p(Zk, nk|xk, Xk): sensor output to be processed, i.e. the likelihood
  • p(xk|Zk) = ∫dXk p(xk, Xk|Zk): kinematics of the extended object

SLIDE 33

Generalize BAYESian tracking to extended objects.

nk plots Zk = {zk^j, j = 1, …, nk} at time tk, accumulated data Zk = {Zk, nk, Zk−1}

The conditional pdf p(xk, Xk|Zk) describes what is known about the extended object state xk, Xk based on all sensor data up to time tk. ‘Extended object tracking’: iterative calculation of p(xk, Xk|Zk).

p(xk, Xk|Zk) = p(Zk, nk|xk, Xk) p(xk, Xk|Zk−1) / ∫dxk dXk p(Zk, nk|xk, Xk) p(xk, Xk|Zk−1)

extended object ‘track’ p(xk, Xk|Zk) = p(xk|Xk, Zk) p(Xk|Zk):
  • p(xk|Xk, Zk): assume a GAUSSian density
  • p(Xk|Zk): assume an inverse Wishart density; details later!

SLIDE 34

Characterize an object by quantitatively describable properties: object state

Examples:
– object position x on a straight line: x ∈ R
– kinematic state x = (r⊤, ṙ⊤, r̈⊤)⊤ ∈ R⁹; position r = (x, y, z)⊤, velocity ṙ, acceleration r̈
– joint state of two objects: x = (x1⊤, x2⊤)⊤
– kinematic state x, object extension X; e.g. an ellipsoid: symmetric, positive definite matrix
– kinematic state x, object class; e.g. bird, sailplane, helicopter, passenger jet, …

Learn unknown object states from imperfect measurements, and describe imprecise knowledge mathematically precisely by functions p(x)!

SLIDE 35

Interpret unknown object states as random variables, x [1D] or x, X [vector- / matrix-variate], characterized by corresponding probability density functions (pdfs). The concrete shape of the pdf p(x) contains the full knowledge on x!

SLIDE 36

Information on a random variable (RV) can be extracted by integration from the corresponding pdf. At present: the one-dimensional case.

How probable is it that x ∈ (a, b) ⊆ R holds? Answer: P{x ∈ (a, b)} = ∫_a^b dx p(x) ⇒ p(x) ≥ 0

in particular: P{x ∈ R} = ∫_{−∞}^{∞} dx p(x) = 1 (normalization)

intuitive interpretation: “the object is somewhere in R”
loosely: p(x) dx is the probability that x has a value between x and x + dx

SLIDE 37

How to characterize the properties of a pdf?

specifically: How to associate a single “expected” value to a RV? The maximum of the pdf is sometimes, but not always, useful!

SLIDE 38

How to characterize the properties of a pdf?

specifically: How to associate a single “expected” value to a RV? The maximum of the pdf is sometimes, but not always, useful! (→ examples)

instead: Calculate the centroid of the pdf! E[x] = ∫_{−∞}^{∞} dx x p(x) = x̄ “expectation value”

more generally: Consider functions g: x → g(x) of the RV x! E[g(x)] = ∫_{−∞}^{∞} dx g(x) p(x), “expectation value of the observable g”

Example: Consider the observable ½ m x² (kinetic energy, x = speed)

SLIDE 39

An important observable: the “error” of an estimate

  • Quality: How useful is an expectation value x̄ = E[x]? Consider special observables as distance measures: g(x) = |x − x̄| or g(x) = (x − x̄)²; quadratic measures are computationally more comfortable!
  • ‘expected error’ of the expectation value x̄: V[x] = E[(x − x̄)²], σx = √V[x] (variance, standard deviation)

Exercise 2.1

Show that V[x] = E[x²] − E[x]² holds. The expectation value of the observable x² is also called the “2nd moment” of the pdf of x.
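Exercise 2.1 can be checked numerically on an arbitrary discrete distribution; the values and weights below are illustrative:

```python
# Numeric check of V[x] = E[x^2] - E[x]^2 on a discrete distribution.
# Values and probabilities are illustrative assumptions.
values = [-1.0, 0.5, 2.0, 4.0]
probs = [0.1, 0.4, 0.3, 0.2]

mean = sum(p * v for v, p in zip(values, probs))            # E[x]
second_moment = sum(p * v * v for v, p in zip(values, probs))  # E[x^2]
var_direct = sum(p * (v - mean) ** 2 for v, p in zip(values, probs))
var_identity = second_moment - mean ** 2
```

Both routes to the variance agree, which is the content of the exercise; the analytic proof expands E[(x − x̄)²] and uses linearity of E.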

SLIDE 40

Exercise 2.2

Calculate the expectation and variance of the uniform density of a RV x ∈ R on the interval [a, b]:

p(x) = U(x; a, b) = 1/(b − a) for x ∈ [a, b], 0 otherwise

Is the pdf correctly normalized? ∫_{−∞}^{∞} dx U(x; a, b) = (1/(b − a)) ∫_a^b dx = 1

E[x] = ∫_{−∞}^{∞} dx x U(x; a, b) = (b + a)/2

V[x] = (1/(b − a)) ∫_a^b dx x² − E[x]² = (b − a)²/12
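A numeric cross-check of Exercise 2.2 by midpoint-rule integration; the interval [a, b] = [2, 5] and the grid size are illustrative:

```python
# Numeric check: for x uniform on [a, b], E[x] = (a + b)/2 and
# V[x] = (b - a)^2 / 12. Midpoint-rule integration; a, b, n illustrative.
a, b, n = 2.0, 5.0, 10000
dx = (b - a) / n
xs = [a + (i + 0.5) * dx for i in range(n)]   # midpoints of the grid cells
p = 1.0 / (b - a)                             # U(x; a, b) on [a, b]

norm = sum(p * dx for _ in xs)                       # should be 1
mean = sum(x * p * dx for x in xs)                   # should be (a + b)/2
var = sum((x - mean) ** 2 * p * dx for x in xs)      # should be (b - a)^2/12
```

With [a, b] = [2, 5] this reproduces E[x] = 3.5 and V[x] = 0.75 up to discretization error.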

SLIDE 41

Important example: x normally distributed over R (GAUSS)

– wanted: probabilities concentrated around µ
– quadratic distance: ‖x − µ‖² = ½ (x − µ)²/σ² (mathematically convenient!)
– the parameter σ is a measure of the “width” of the pdf: at x = µ ± σ, ‖x − µ‖² = ½
– for ‘large’ distances, i.e. ‖x − µ‖² ≫ ½, the pdf shall decay quickly
– simplest approach: p̃(x) = e^(−‖x−µ‖²) (> 0 for all x ∈ R; normalization?)
– normalized for p(x) = p̃(x)/∫dx p̃(x); a formula collection delivers: ∫_{−∞}^{∞} dx p̃(x) = √(2π) σ

An admissible pdf with the required properties is obviously given by:

N(x; µ, σ) = (1/(√(2π) σ)) exp(−(x − µ)²/(2σ²))

SLIDE 42

Exercise 2.3

Show for the GAUSSian density p(x) = N(x; µ, σ): E[x] = µ, V[x] = σ²

E[x] = ∫_{−∞}^{∞} dx x N(x; µ, σ) = µ

V[x] = E[x²] − E[x]² = σ²

Use substitution and partial integration! Use ∫_{−∞}^{∞} dx e^(−x²/2) = √(2π)!
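A numeric cross-check of Exercise 2.3 by midpoint-rule integration over µ ± 8σ; the parameters µ, σ and the grid are illustrative, and the truncated tails carry negligible probability mass:

```python
import math

# Numeric check: E[x] = mu and V[x] = sigma^2 for N(x; mu, sigma).
# Midpoint-rule integration over +-8 sigma; mu, sigma, n are illustrative.
mu, sigma, n = 1.5, 0.7, 20000
lo, hi = mu - 8 * sigma, mu + 8 * sigma
dx = (hi - lo) / n

def N(x):
    # 1D Gaussian density N(x; mu, sigma)
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (math.sqrt(2 * math.pi) * sigma)

xs = [lo + (i + 0.5) * dx for i in range(n)]
norm = sum(N(x) * dx for x in xs)                   # should be ~1
mean = sum(x * N(x) * dx for x in xs)               # should be ~mu
var = sum((x - mean) ** 2 * N(x) * dx for x in xs)  # should be ~sigma^2
```

The exact result follows analytically by the substitution u = (x − µ)/σ and one partial integration, together with ∫ dx e^(−x²/2) = √(2π).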