Introduction to Sensor Data Fusion - Methods and Applications
2nd Lecture, April 17, 2019


SLIDE 1

Introduction to Sensor Data Fusion Methods and Applications

  • Last lecture: Why Sensor Data Fusion?

– Motivation, general context
– Discussion of examples

  • Today: Steep climb to a first algorithm.
  • Oral examination: 6 credit points after the end of the semester
  • Prerequisite: participate in the exercises, explain a good program
  • Job opportunities as research assistant in ongoing projects, practicum
  • Subsequently: Bachelor at Fraunhofer FKIE, Master / PhD possible
  • Slides/script: email to wolfgang.koch@fkie.fraunhofer.de, or download

SLIDE 2

Sensor & Information Fusion: Basic Task

information sources: defined by operational requirements

SLIDE 3

Sensor & Information Fusion: Basic Task

information to be fused: imprecise, incomplete, ambiguous, unresolved, false, deceptive, hard to formalize, contradictory ...

SLIDE 4

Sensor & Information Fusion: Basic Task

information to be fused: imprecise, incomplete, ambiguous, unresolved, false, deceptive, hard to formalize, contradictory ...
information sources: defined by operational requirements

SLIDE 5

Create your own ground truth generator!

Exercise 2.1

Consider an object that moves in two dimensions on the trajectory

    r(t) = (x(t), y(t))⊤ = A (sin(ωt), sin(2ωt))⊤   with   A = v²/q,   ω = q/(2v)

and speed and acceleration parameters v = 300 m/s, q = 9 m/s².

1. Plot the trajectory. Why is it periodic? What is its period T = T(v, q)?
2. Show for the velocity and acceleration vectors:

       ṙ(t) = v (cos(ωt)/2, cos(2ωt))⊤,     r̈(t) = −q (sin(ωt)/4, sin(2ωt))⊤

3. Calculate for each instant of time t the tangential and normal vectors at r(t):

       t(t) = (ẋ(t), ẏ(t))⊤ / |ṙ(t)|,     n(t) = (−ẏ(t), ẋ(t))⊤ / |ṙ(t)|

4. Plot |ṙ(t)|, |r̈(t)|, r̈(t)·t(t), and r̈(t)·n(t) over a period T.
5. Discuss the temporal behaviour based on the trajectory r(t).
6. What are the maximum speed and acceleration, v_max and q_max?
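A minimal 'ground truth generator' for this exercise could look like the following sketch (Python/NumPy; the number of sample points per period and the omission of the plotting calls are my own choices):

```python
import numpy as np

# trajectory parameters from Exercise 2.1
v = 300.0            # speed parameter [m/s]
q = 9.0              # acceleration parameter [m/s^2]
A = v**2 / q         # amplitude [m]
omega = q / (2*v)    # angular frequency [rad/s]
T = 2*np.pi / omega  # period of the trajectory [s]

def position(t):
    """r(t) = A (sin(wt), sin(2wt))^T"""
    return A * np.stack([np.sin(omega*t), np.sin(2*omega*t)], axis=-1)

def velocity(t):
    """r'(t) = v (cos(wt)/2, cos(2wt))^T"""
    return v * np.stack([np.cos(omega*t)/2, np.cos(2*omega*t)], axis=-1)

def acceleration(t):
    """r''(t) = -q (sin(wt)/4, sin(2wt))^T"""
    return -q * np.stack([np.sin(omega*t)/4, np.sin(2*omega*t)], axis=-1)

t = np.linspace(0.0, T, 2000)
rdot, rddot = velocity(t), acceleration(t)

speed = np.linalg.norm(rdot, axis=-1)
tang = rdot / speed[:, None]                                           # tangential unit vectors
norm = np.stack([-rdot[:, 1], rdot[:, 0]], axis=-1) / speed[:, None]   # normal unit vectors

a_t = np.sum(rddot * tang, axis=-1)   # tangential acceleration component
a_n = np.sum(rddot * norm, axis=-1)   # normal acceleration component

print("v_max [m/s]  :", speed.max())
print("q_max [m/s^2]:", np.linalg.norm(rddot, axis=-1).max())
```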

SLIDE 6

[Figure: pdf at t_{k−1}]

'Probability density functions (pdf)' p(x_{k−1}|Z^{k−1}) represent imprecise knowledge of the 'state' x_{k−1}, based on imprecise measurements Z^{k−1}.

SLIDE 7

Characterize an object by quantitatively describable properties: the object state.

Examples:
– object position x on a straight line: x ∈ R
– kinematic state x = (r⊤, ṙ⊤, r̈⊤)⊤ ∈ R⁹, with position r = (x, y, z)⊤, velocity ṙ, acceleration r̈
– joint state of two objects: x = (x₁⊤, x₂⊤)⊤
– kinematic state x plus object extension X, e.g. an ellipsoid: a symmetric, positive definite matrix
– kinematic state x plus object class, e.g. bird, sailplane, helicopter, passenger jet, ...

Learn unknown object states from imperfect measurements and describe the imprecise knowledge mathematically precisely by functions p(x)!

SLIDE 8

How to deal with probability density functions?

  • pdf p(x): Extract probability statements about the RV x by integration!
  • naïvely: positive and normalized functions (p(x) ≥ 0, ∫dx p(x) = 1)

SLIDE 9

[Figure: pdf at t_{k−1}; prediction at t_k]

Exploit imprecise knowledge of the dynamical behavior of the object:

    p(x_k|Z^{k−1})  =  ∫dx_{k−1}  p(x_k|x_{k−1})  p(x_{k−1}|Z^{k−1})
    [prediction]                  [dynamics]      [old knowledge]

SLIDE 10

How to deal with probability density functions?

  • pdf p(x): Extract probability statements about the RV x by integration!
  • naïvely: positive and normalized functions (p(x) ≥ 0, ∫dx p(x) = 1)
  • conditional pdf p(x|y) = p(x, y) / p(y): impact of information on y on the RV x?
  • marginal density p(x) = ∫dy p(x, y) = ∫dy p(x|y) p(y): enter y!

    p(x_k|Z^{k−1}) = ∫dx_{k−1} p(x_k, x_{k−1}|Z^{k−1})
                   = ∫dx_{k−1} p(x_k|x_{k−1}, Z^{k−1}) p(x_{k−1}|Z^{k−1})
                   = ∫dx_{k−1} p(x_k|x_{k−1}) p(x_{k−1}|Z^{k−1})

    (the last step uses the Markov assumption p(x_k|x_{k−1}, Z^{k−1}) = p(x_k|x_{k−1}))
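As an illustration of this prediction integral, here is a minimal sketch on a one-dimensional grid (my own toy setup: a Gaussian 'old knowledge' density and a Gaussian random-walk transition density; none of the numbers come from the lecture):

```python
import numpy as np

# 1-D state grid
x = np.linspace(-10.0, 10.0, 401)
dx = x[1] - x[0]

# old knowledge p(x_{k-1}|Z^{k-1}): a Gaussian centered at 1.0
p_old = np.exp(-0.5 * ((x - 1.0) / 0.5)**2)
p_old /= p_old.sum() * dx                        # normalize: integral = 1

# dynamics p(x_k|x_{k-1}): Gaussian random walk with std 1.0
trans = np.exp(-0.5 * (x[:, None] - x[None, :])**2)
trans /= trans.sum(axis=0, keepdims=True) * dx   # each column integrates to 1

# prediction: p(x_k|Z^{k-1}) = integral dx_{k-1} p(x_k|x_{k-1}) p(x_{k-1}|Z^{k-1})
p_pred = trans @ p_old * dx

std = lambda p: np.sqrt(np.sum(p * x**2) * dx - (np.sum(p * x) * dx)**2)
print("normalization:", p_pred.sum() * dx)                     # ~ 1
print("std old -> predicted:", std(p_old), "->", std(p_pred))  # knowledge dissipates
```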

SLIDE 11

[Figure: pdf at t_{k−1}; t_k: no plot]

missing sensor detection: ‘data processing’ = prediction (not always: exploitation of ‘negative’ sensor evidence)

SLIDE 12

[Figure: pdfs at t_{k−1}, t_k; prediction at t_{k+1}]

missing sensor information: increasing knowledge dissipation

SLIDE 13

[Figure: pdfs at t_{k−1}, t_k; t_{k+1}: one plot]

sensor information on the kinematical object state

SLIDE 14

[Figure: pdfs at t_{k−1}, t_k; prediction at t_{k+1}; likelihood (sensor model)]

BAYES' formula:

    p(x_{k+1}|Z^{k+1})  =  p(z_{k+1}|x_{k+1}) p(x_{k+1}|Z^k)  /  ∫dx_{k+1} p(z_{k+1}|x_{k+1}) p(x_{k+1}|Z^k)
    [new knowledge]        [plot]             [prediction]

SLIDE 15

How to deal with probability density functions?

  • pdf p(x): Extract probability statements about the RV x by integration!
  • naïvely: positive and normalized functions (p(x) ≥ 0, ∫dx p(x) = 1)
  • conditional pdf p(x|y) = p(x, y) / p(y): impact of information on y on the RV x?
  • marginal density p(x) = ∫dy p(x, y) = ∫dy p(x|y) p(y): enter y!
  • Bayes: p(x|y) = p(y|x) p(x) / p(y) = p(y|x) p(x) / ∫dx p(y|x) p(x):

    obtain p(x|y) from p(y|x) and p(x), since  p(x|y) p(y) = p(x, y) = p(y, x) = p(y|x) p(x)
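Continuing the one-dimensional grid sketch from above (again a toy setup of my own; the measurement value and noise standard deviation are arbitrary), the Bayes update is a pointwise product with the likelihood followed by renormalization:

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 401)
dx = x[1] - x[0]

# prior / predicted density p(x|Z^{k-1}) on the grid
p_prior = np.exp(-0.5 * ((x - 1.0) / 1.2)**2)
p_prior /= p_prior.sum() * dx

# likelihood p(z|x) of a position measurement z with noise std 0.8
z, sigma_r = 2.5, 0.8
likelihood = np.exp(-0.5 * ((z - x) / sigma_r)**2)

# Bayes: posterior = likelihood * prior, normalized by its integral
p_post = likelihood * p_prior
p_post /= p_post.sum() * dx

print("prior mean    :", np.sum(x * p_prior) * dx)
print("posterior mean:", np.sum(x * p_post) * dx)   # pulled towards the measurement z
```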

SLIDE 16

[Figure: pdfs at t_{k−1}, t_k, t_{k+1} (Bayes)]

filtering = sensor data processing

SLIDE 17

Target or Object Tracking: Basic Idea

Iterative updating of conditional probability densities!

kinematic target state x_k at time t_k, accumulated sensor data Z^k
a priori knowledge: target dynamics models, sensor model

  • prediction:    p(x_{k−1}|Z^{k−1})  --[dynamics model]-->  p(x_k|Z^{k−1})
  • filtering:     p(x_k|Z^{k−1})  --[sensor data Z_k, sensor model]-->  p(x_k|Z^k)
  • retrodiction:  p(x_{l−1}|Z^k)  <--[filtering output, dynamics model]--  p(x_l|Z^k)

SLIDE 18

[Figure: pdf at t_{k−1}; prediction at t_k]

Exploit imprecise knowledge of the dynamical behavior of the object:

    p(x_k|Z^{k−1})  =  ∫dx_{k−1}  p(x_k|x_{k−1})  p(x_{k−1}|Z^{k−1})
    [prediction]                  [dynamics]      [old knowledge]

SLIDE 19

The Multivariate GAUSSian Pdf

– wanted: probabilities 'concentrated' around a center x̄
– quadratic distance: q(x) = ½ (x − x̄)⊤ P⁻¹ (x − x̄)

  q(x) defines an ellipsoid around x̄, its volume and orientation being determined by a matrix P (symmetric: P⊤ = P, positive definite: all eigenvalues > 0).

– first attempt: p(x) = e^{−q(x)} / ∫dx e^{−q(x)} (normalized!)

    p(x) = N(x; x̄, P) = |2πP|^{−1/2} e^{−½ (x − x̄)⊤ P⁻¹ (x − x̄)}

– GAUSSian Mixtures: p(x) = Σ_i p_i N(x; x̄_i, P_i) (weighted sums)
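A small sketch of these definitions in Python/NumPy (the example mean, covariance, and mixture parameters are arbitrary illustration values):

```python
import numpy as np

def gaussian_pdf(x, mean, P):
    """N(x; mean, P) = |2*pi*P|^(-1/2) * exp(-1/2 (x-mean)^T P^(-1) (x-mean))"""
    d = x - mean
    return np.exp(-0.5 * d @ np.linalg.solve(P, d)) / np.sqrt(np.linalg.det(2*np.pi*P))

def mixture_pdf(x, weights, means, covs):
    """Gaussian mixture: sum_i p_i N(x; mean_i, P_i)"""
    return sum(w * gaussian_pdf(x, m, P) for w, m, P in zip(weights, means, covs))

# a 2-D Gaussian ...
mean = np.array([1.0, 2.0])
P = np.array([[2.0, 0.5],
              [0.5, 1.0]])
print(gaussian_pdf(np.array([0.0, 1.0]), mean, P))

# ... and a two-component mixture
weights = [0.7, 0.3]
means = [mean, np.array([-1.0, 0.0])]
covs = [P, np.eye(2)]
print(mixture_pdf(np.array([0.0, 1.0]), weights, means, covs))
```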

SLIDE 20

[Figure: pdf at t_{k−1}; prediction at t_k]

Exploit imprecise knowledge of the dynamical behavior of the object:

    p(x_k|Z^{k−1})  =  ∫dx_{k−1}  N(x_k; F x_{k−1}, D)  N(x_{k−1}; x_{k−1|k−1}, P_{k−1|k−1})
    [prediction]                  [dynamics]            [old knowledge]

SLIDE 21

A Useful Product Formula for GAUSSians

    N(z; Fx, D) · N(x; y, P)  =  N(z; Fy, S) · N(x; y + Wν, P − WSW⊤)
                                 [independent of x]

    with   ν = z − Fy,   S = FPF⊤ + D,   W = PF⊤S⁻¹.
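This identity can be checked numerically at arbitrary test points; the sketch below does so with randomly chosen matrices and vectors (SciPy's multivariate_normal serves as the Gaussian density):

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

rng = np.random.default_rng(0)

nx, nz = 3, 2                       # arbitrary dimensions for the test
F = rng.normal(size=(nz, nx))
D = 0.5 * np.eye(nz)                # 'noise' covariance of N(z; Fx, D)
P = 2.0 * np.eye(nx)                # covariance of N(x; y, P)
y = rng.normal(size=nx)
x = rng.normal(size=nx)             # test point for x
z = rng.normal(size=nz)             # test point for z

# quantities appearing on the right-hand side of the product formula
nu = z - F @ y
S = F @ P @ F.T + D
W = P @ F.T @ np.linalg.inv(S)

lhs = mvn.pdf(z, mean=F @ x, cov=D) * mvn.pdf(x, mean=y, cov=P)
rhs = mvn.pdf(z, mean=F @ y, cov=S) * mvn.pdf(x, mean=y + W @ nu, cov=P - W @ S @ W.T)

print(lhs, rhs)   # both products agree up to numerical rounding
```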

SLIDE 22

Kalman filter:  x_k = (r_k⊤, ṙ_k⊤)⊤,   Z^k = {z_k, Z^{k−1}}

initiation:   p(x_0) = N(x_0; x_{0|0}, P_{0|0}),    initial ignorance: P_{0|0} 'large'

prediction:   N(x_{k−1}; x_{k−1|k−1}, P_{k−1|k−1})  --[dynamics model F_{k|k−1}, D_{k|k−1}]-->  N(x_k; x_{k|k−1}, P_{k|k−1})

    x_{k|k−1} = F_{k|k−1} x_{k−1|k−1}
    P_{k|k−1} = F_{k|k−1} P_{k−1|k−1} F_{k|k−1}⊤ + D_{k|k−1}
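A sketch of the prediction step for a constant-velocity model in one spatial dimension (the time step and the process-noise scaling below are illustration choices of mine, not values from the lecture):

```python
import numpy as np

dt = 1.0    # time step t_k - t_{k-1} [s]
qd = 1.0    # process noise intensity

# state x = (position, velocity); constant-velocity dynamics
F = np.array([[1.0, dt],
              [0.0, 1.0]])
# white-acceleration process noise covariance
D = qd * np.array([[dt**3/3, dt**2/2],
                   [dt**2/2, dt     ]])

def predict(x_est, P_est):
    """x_{k|k-1} = F x_{k-1|k-1},  P_{k|k-1} = F P_{k-1|k-1} F^T + D"""
    return F @ x_est, F @ P_est @ F.T + D

# initiation with a 'large' covariance (initial ignorance)
x0, P0 = np.zeros(2), 1e4 * np.eye(2)
print(predict(x0, P0))
```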

SLIDE 23

[Figure: pdf at t_{k−1}; t_k: no plot]

missing sensor detection: ‘data processing’ = prediction (not always: exploitation of ‘negative’ sensor evidence)

SLIDE 24

[Figure: pdfs at t_{k−1}, t_k; prediction at t_{k+1}]

missing sensor information: increasing knowledge dissipation

SLIDE 25

[Figure: pdfs at t_{k−1}, t_k; t_{k+1}: one plot]

sensor information on the kinematical object state

SLIDE 26

[Figure: pdfs at t_{k−1}, t_k; prediction at t_{k+1}; likelihood (sensor model)]

BAYES' formula:

    p(x_{k+1}|Z^{k+1})  =  p(z_{k+1}|x_{k+1}) p(x_{k+1}|Z^k)  /  ∫dx_{k+1} p(z_{k+1}|x_{k+1}) p(x_{k+1}|Z^k)
    [new knowledge]        [plot]             [prediction]

SLIDE 27

[Figure: pdfs at t_{k−1}, t_k; prediction at t_{k+1}; likelihood (sensor model)]

BAYES' formula:

    p(x_{k+1}|Z^{k+1})  =  N(z_{k+1}; H x_{k+1}, R) N(x_{k+1}; x_{k+1|k}, P_{k+1|k})  /  ∫dx_{k+1} N(z_{k+1}; H x_{k+1}, R) N(x_{k+1}; x_{k+1|k}, P_{k+1|k})
    [new knowledge]        [likelihood]             [prediction]

SLIDE 28

A Useful Product Formula for GAUSSians

    N(z; Hx, R) · N(x; y, P)  =  N(z; Hy, S) · N(x; y + Wν, P − WSW⊤)
                                 [independent of x]

    with   ν = z − Hy,   S = HPH⊤ + R,   W = PH⊤S⁻¹.

SLIDE 29

Kalman filter:  x_k = (r_k⊤, ṙ_k⊤)⊤,   Z^k = {z_k, Z^{k−1}}

initiation:   p(x_0) = N(x_0; x_{0|0}, P_{0|0}),    initial ignorance: P_{0|0} 'large'

prediction:   N(x_{k−1}; x_{k−1|k−1}, P_{k−1|k−1})  --[dynamics model F_{k|k−1}, D_{k|k−1}]-->  N(x_k; x_{k|k−1}, P_{k|k−1})

    x_{k|k−1} = F_{k|k−1} x_{k−1|k−1}
    P_{k|k−1} = F_{k|k−1} P_{k−1|k−1} F_{k|k−1}⊤ + D_{k|k−1}

filtering:    N(x_k; x_{k|k−1}, P_{k|k−1})  --[current measurement z_k, sensor model H_k, R_k]-->  N(x_k; x_{k|k}, P_{k|k})

    x_{k|k} = x_{k|k−1} + W_{k|k−1} ν_{k|k−1},              ν_{k|k−1} = z_k − H_k x_{k|k−1}
    P_{k|k} = P_{k|k−1} − W_{k|k−1} S_{k|k−1} W_{k|k−1}⊤,   S_{k|k−1} = H_k P_{k|k−1} H_k⊤ + R_k
    W_{k|k−1} = P_{k|k−1} H_k⊤ S_{k|k−1}⁻¹    ('KALMAN gain matrix')
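Putting prediction and filtering together gives a complete, minimal Kalman filter loop. The sketch below uses a constant-velocity model with position-only measurements; the model matrices, noise levels, and the simulated measurement sequence are illustration assumptions of mine, not values from the lecture:

```python
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])               # constant-velocity dynamics
D = np.array([[dt**3/3, dt**2/2], [dt**2/2, dt]])   # process noise covariance
H = np.array([[1.0, 0.0]])                          # position-only measurement
R = np.array([[4.0]])                               # measurement noise covariance

def kalman_step(x_est, P_est, z):
    # prediction
    x_pred = F @ x_est
    P_pred = F @ P_est @ F.T + D
    # filtering (Bayes update via the Gaussian product formula)
    nu = z - H @ x_pred                    # innovation
    S = H @ P_pred @ H.T + R               # innovation covariance
    W = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    return x_pred + W @ nu, P_pred - W @ S @ W.T

# initiation with a 'large' covariance, then a few simulated position measurements
rng = np.random.default_rng(1)
x_est, P_est = np.zeros(2), 1e4 * np.eye(2)
truth = np.array([0.0, 10.0])                        # true position and velocity
for k in range(10):
    truth = F @ truth
    z = H @ truth + rng.normal(scale=2.0, size=1)    # noisy position measurement
    x_est, P_est = kalman_step(x_est, P_est, z)

print("estimate:", x_est)
print("truth   :", truth)
```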

SLIDE 30

[Figure: pdfs at t_{k−1}, t_k, t_{k+1} (Bayes)]

filtering = sensor data processing

SLIDE 31

[Figure: pdfs at t_{k−1}, t_k; t_{k+1}: three plots]

ambiguities by false plots: 1 + 3 data interpretation hypotheses (‘detection probability’, false alarm statistics)

SLIDE 32

The Multivariate GAUSSian Pdf

– wanted: probabilities 'concentrated' around a center x̄
– quadratic distance: q(x) = ½ (x − x̄)⊤ P⁻¹ (x − x̄)

  q(x) defines an ellipsoid around x̄, its volume and orientation being determined by a matrix P (symmetric: P⊤ = P, positive definite: all eigenvalues > 0).

– first attempt: p(x) = e^{−q(x)} / ∫dx e^{−q(x)} (normalized!)

    p(x) = N(x; x̄, P) = |2πP|^{−1/2} e^{−½ (x − x̄)⊤ P⁻¹ (x − x̄)}

– GAUSSian Mixtures: p(x) = Σ_i p_i N(x; x̄_i, P_i) (weighted sums)

SLIDE 33

[Figure: pdfs at t_{k−1}, t_k; t_{k+1}: three plots]

    p(Z_k, m_k|x_k) = const. · [ (1 − P_D) ρ_F + P_D Σ_{j=1}^{m_k} N(z_k^j; H x_k, R_k^j) ]
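A sketch of how this likelihood could be evaluated (up to the constant) for a set of m_k received plots; the detection probability, false-alarm density, sensor matrices, and example plots are illustration assumptions of mine:

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

P_D = 0.9            # detection probability
rho_F = 1e-4         # spatial false-alarm density
H = np.eye(2)        # position measurement of a 2-D position state
R = 4.0 * np.eye(2)  # measurement noise covariance (same for all plots here)

def likelihood(plots, x):
    """(1 - P_D) * rho_F + P_D * sum_j N(z_j; H x, R), i.e. p(Z_k, m_k | x_k) up to a constant."""
    return (1.0 - P_D) * rho_F + P_D * sum(mvn.pdf(z, mean=H @ x, cov=R) for z in plots)

# three plots: one near the hypothesised target position, two false
plots = [np.array([10.2, 4.8]), np.array([2.0, -7.0]), np.array([25.0, 11.0])]
x = np.array([10.0, 5.0])
print(likelihood(plots, x))
```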

SLIDE 34

[Figure: pdfs at t_{k−1}, t_k, t_{k+1}]

Multimodal pdfs reflect ambiguities inherent in the data.

SLIDE 35

[Figure: pdfs at t_{k−1}, t_k, t_{k+1}; prediction at t_{k+2}]

temporal propagation: dissipation of the probability densities

SLIDE 36

[Figure: pdfs at t_{k−1}, t_k, t_{k+1}; t_{k+2}: one plot]

association tasks: sensor data ↔ interpretation hypotheses

SLIDE 37

[Figure: pdfs at t_{k−1}, t_k, t_{k+1}; prediction at t_{k+2}; likelihood]

    BAYES:  p(x_{k+2}|Z^{k+2}) = p(z_{k+2}|x_{k+2}) p(x_{k+2}|Z^{k+1})  /  ∫dx_{k+2} p(z_{k+2}|x_{k+2}) p(x_{k+2}|Z^{k+1})

SLIDE 38

[Figure: pdfs at t_{k−1}, t_k, t_{k+1}, t_{k+2}]

in particular: re-calculation of the hypothesis weights

SLIDE 39

[Figure: pdfs at t_{k−1}, t_k, t_{k+1}, t_{k+2}]

How does new knowledge affect the knowledge of a past state?

SLIDE 40

[Figure: pdfs at t_{k−1}, t_k; retrodiction at t_{k+1}; pdf at t_{k+2}]

‘retrodiction’: a retrospective analysis of the past

SLIDE 41

[Figure: t_{k−1}, t_k, t_{k+1}, t_{k+2}]

  • Optimal information processing at present and for the past

SLIDE 42

Multiple Hypothesis Tracking: Basic Idea

Iterative updating of conditional probability densities!

kinematic target state x_k at time t_k, accumulated sensor data Z^k
a priori knowledge: target dynamics models, sensor model, road maps

  • prediction:    p(x_{k−1}|Z^{k−1})  --[dynamics model, road maps]-->  p(x_k|Z^{k−1})
  • filtering:     p(x_k|Z^{k−1})  --[sensor data Z_k, sensor model]-->  p(x_k|Z^k)
  • retrodiction:  p(x_{l−1}|Z^k)  <--[filtering output, dynamics model]--  p(x_l|Z^k)

– finite mixture: inherent ambiguity (data, model, road network)
– optimal estimators: e.g. minimum mean squared error (MMSE); see the sketch below
– initiation of pdf iteration: multiple hypothesis track extraction
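For a finite Gaussian-mixture density the MMSE estimate is the posterior mean, i.e. the weight-averaged component mean. A minimal sketch (the hypothesis weights and component means are arbitrary illustration values):

```python
import numpy as np

# hypothesis weights p_i and component means x_i of a mixture p(x) = sum_i p_i N(x; x_i, P_i)
weights = np.array([0.6, 0.3, 0.1])
means = np.array([[10.0,  5.0],
                  [12.0,  4.0],
                  [ 3.0, -1.0]])

# MMSE estimate = E[x] = sum_i p_i x_i  (the component covariances do not enter the mean)
x_mmse = weights @ means
print("MMSE estimate:", x_mmse)
```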
