Introduction to Sensor Data Fusion Methods and Applications

SLIDE 1

Introduction to Sensor Data Fusion Methods and Applications

  • Last lecture: Why Sensor Data Fusion?
    – Motivation, general context
    – Discussion of examples
  • Today: Steep climb to a first algorithm.
  • oral examination: 6 credit points after the end of the semester
  • prerequisite: participate in the exercises, explain a good program
  • job opportunities as research assistant in ongoing projects, practicum
  • subsequently: bachelor thesis at Fraunhofer FKIE, master / PhD possible
  • slides/script: email to wolfgang.koch@fkie.fraunhofer.de, download

Sensor Data Fusion - Methods and Applications, 2nd Lecture on October 24, 2018

SLIDE 2

A Generic Tracking and Sensor Data Fusion System

[Block diagram] Sensing Hardware: Received Waveforms → Signal Processing: Parameter Estimation → Detection Process: Data Rate Reduction → Track Initiation: Multiple Frame → Track Maintenance: Prediction, Filtering, Retrodiction → Track File Storage; Sensor Data to Track Association

A Priori Knowledge:

  • Object Environment
  • Object Characteristics
  • Sensor Performance

Track Processing:

  • Track Cancellation
  • Object Classification / ID
  • Track-to-Track Fusion

Man-Machine Interface:

  • Interaction Facilities
  • Displaying Functions
  • Object Representation

Tracking & Fusion System: Sensor Systems (Sensor Data in, Sensor Control out), Track Extraction

SLIDE 3

SLIDE 4

Tracking Application: Ground Picture Production

GMTI Radar: Ground Moving Target Indicator

wide area, all-weather, day/night, real-time surveillance of a dynamically evolving ground or near-to-ground situation

GMTI Tracking: Some Characteristic Aspects

backbone of a ground picture: moving target tracks

  • airborne, dislocated, mobile sensor platforms
  • vehicles, ships, ‘low-flyers’, radars, convoys
  • occlusions: Doppler-blindness, topography
  • road maps, terrain information, tactical rules
  • dense target / dense clutter situations: MHT

SLIDE 5

Examples of GMTI Tracks (live exercise)

SLIDE 6

Exploit Heterogeneous Multiple Sensor Systems. Covert & Automated Surveillance of a Person Stream: Identification of Anomalous Behavior

General Task → Towards a Solution

Sensor Data Fusion for Surveillance:

  • Attributes (person classification): What? When?
  • Kinematics: Where? When?

Multiple Sensor Security Assistance Systems

SLIDE 7

On Characterizing Tracking / Fusion Performance

a well-understood paradigm: air surveillance with multiple radars

Many results can be transferred to other sensors (IR, E/O, sonar, acoustics).

Sensor Data Fusion: ‘tracks’ represent the available information on the targets associated with them, with appropriate quality measures, thus providing answers to: When? Where? How many? In which direction? How fast, accelerating? What?

SLIDE 8

On Characterizing Tracking / Fusion Performance

By sensor data fusion we wish to establish one-to-one associations between targets in the field of view and identified tracks in the tracking computer. Strictly speaking, this is possible only under ideal conditions regarding the sensor performance and the underlying target situation. Tracking/fusion performance can thus be measured by its deficiencies when compared with this ideal goal.

SLIDE 9
1. Let a target first be detected by a sensor at time ta. Usually, a delay is involved until a confirmed track has finally been established at time te (track extraction). A ‘measure of deficiency’ is thus:

  • extraction delay te − ta.

SLIDE 10
2. Unavoidably, false tracks will be extracted in case of a high false return density (e.g. clutter, jamming/deception), i.e. tracks related to unreal or unwanted targets. Corresponding ‘deficiencies’ are:

  • mean number of falsely extracted targets per time,
  • mean lifetime of a false track before its deletion.

SLIDE 11
3. A target should be represented by one and the same track until leaving the field of view. Related performance measures/deficiencies:

  • mean lifetime of tracks related to true targets,
  • probability of an ‘identity switch’ between targets,
  • probability of a target not being represented by a track.

SLIDE 12
4. The track inaccuracy (error covariance of a state estimate) should be as small as possible. The deviations between estimated and actual target states should at least correspond to the error covariances produced (consistency). If this is not the case, we speak of a ‘track loss’.

SLIDE 13
  • A track must really represent a target!

Challenges:

  • low detection probability
  • high clutter density
  • low update rate
  • agile targets
  • dense target situations
  • formations, convoys
  • target-split events (formation, weapons)
  • jamming, deception

Basic Tasks:

  • models: sensor, target, environment → physics
  • data association problems → combinatorics
  • estimation problems → probability, statistics
  • process control, realization → computer science

SLIDE 14

[Figure: pdf at time t_{k-1}]

‘Probability density functions (pdf)’ p(x_{k-1} | Z^{k-1}) represent imprecise knowledge on the ‘state’ x_{k-1} based on imprecise measurements Z^{k-1}.

SLIDE 15

Characterize an object by quantitatively describable properties: object state

Examples:

  – object position x on a straight line: x ∈ R
  – kinematic state x = (r^T, ṙ^T, r̈^T)^T ∈ R^9, with position r = (x, y, z)^T, velocity ṙ, acceleration r̈
  – joint state of two objects: x = (x_1^T, x_2^T)^T
  – kinematic state x plus object extension X, e.g. an ellipsoid: a symmetric, positive definite matrix
  – kinematic state x plus object class, e.g. bird, glider, helicopter, passenger jet, ...

Learn unknown object states from imperfect measurements, and describe imprecise knowledge mathematically precisely by density functions p(x)!

SLIDE 16

How to deal with probability density functions?

  • pdf p(x): Extract probability statements about the RV x by integration!
  • naïvely: positive and normalized functions (p(x) ≥ 0, ∫ dx p(x) = 1)

SLIDE 17

[Figure: pdf at t_{k-1} → prediction at t_k]

Exploit imprecise knowledge on the dynamical behavior of the object:

  p(x_k | Z^{k-1})  =  ∫ dx_{k-1}  p(x_k | x_{k-1})  p(x_{k-1} | Z^{k-1})
    (prediction)                     (dynamics)        (old knowledge)

SLIDE 18

How to deal with probability density functions?

  • conditional pdf p(x|y) = p(x, y) / p(y): impact of information on y on the RV x
  • marginal density p(x) = ∫ dy p(x, y) = ∫ dy p(x|y) p(y): enter y!

Application to the prediction step:

  p(x_k | Z^{k-1}) = ∫ dx_{k-1} p(x_k, x_{k-1} | Z^{k-1})
                   = ∫ dx_{k-1} p(x_k | x_{k-1}, Z^{k-1}) p(x_{k-1} | Z^{k-1})
                   = ∫ dx_{k-1} p(x_k | x_{k-1}) p(x_{k-1} | Z^{k-1})   (Markov dynamics)

SLIDE 19

[Figure: pdf at t_{k-1}; t_k: no plot]

missing sensor detection: ‘data processing’ = prediction (though not always: ‘negative’ sensor evidence can also be exploited)

SLIDE 20

[Figure: pdf at t_{k-1}, pdf at t_k → prediction at t_{k+1}]

missing sensor information: increasing knowledge dissipation

SLIDE 21

[Figure: pdf at t_{k-1}, pdf at t_k; t_{k+1}: one plot]

sensor information on the kinematical object state

SLIDE 22

[Figure: pdf at t_{k-1}, pdf at t_k → prediction at t_{k+1}; likelihood (sensor model)]

BAYES’ formula:

  p(x_{k+1} | Z^{k+1})  =  p(z_{k+1} | x_{k+1}) p(x_{k+1} | Z^k)  /  ∫ dx_{k+1} p(z_{k+1} | x_{k+1}) p(x_{k+1} | Z^k)
    (new knowledge)          (likelihood × prediction, normalized)
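The prediction / Bayes-update recursion can be made concrete on a discretized one-dimensional state space. The random-walk dynamics, Gaussian likelihood, and all numbers below are illustrative assumptions, not lecture material:

```python
import numpy as np

# 1-D grid-based illustration of the prediction / Bayes-filtering recursion.
x = np.linspace(-10.0, 10.0, 401)       # discretized state space
dx = x[1] - x[0]

def gauss(u, mean, sigma):
    return np.exp(-0.5 * ((u - mean) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

prior = gauss(x, 0.0, 2.0)              # p(x_k | Z^k), old knowledge

# prediction: p(x_{k+1} | Z^k) = sum_j p(x_{k+1} | x_j) p(x_j | Z^k) dx
trans = gauss(x[:, None], x[None, :], 1.0)   # random-walk transition kernel
pred = trans @ prior * dx

# filtering (BAYES): multiply by the likelihood p(z_{k+1} | x_{k+1}), renormalize
z = 3.0
lik = gauss(z, x, 1.5)
post = lik * pred
post /= post.sum() * dx

mean_post = (x * post).sum() * dx
var_prior = (x**2 * prior).sum() * dx - ((x * prior).sum() * dx) ** 2
var_pred = (x**2 * pred).sum() * dx - ((x * pred).sum() * dx) ** 2
```

The prediction broadens the density (knowledge dissipation), while the Bayes update pulls it toward the measurement z.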

SLIDE 23

How to deal with probability density functions?

  • Bayes: p(x|y) = p(y|x) p(x) / p(y) = p(y|x) p(x) / ∫ dx p(y|x) p(x): obtain p(x|y) from p(y|x) and p(x)!

    since p(x|y) p(y) = p(x, y) = p(y|x) p(x)

SLIDE 24

[Figure: pdf at t_{k-1}, pdf at t_k, pdf at t_{k+1} (Bayes)]

filtering = sensor data processing

SLIDE 25

Target or Object Tracking: Basic Idea

Iterative updating of conditional probability densities!

kinematic target state x_k at time t_k, accumulated sensor data Z^k; a priori knowledge: target dynamics model, sensor model

  • prediction:    p(x_{k-1} | Z^{k-1})  --[dynamics model]-->  p(x_k | Z^{k-1})

  • filtering:     p(x_k | Z^{k-1})  --[sensor data z_k, sensor model]-->  p(x_k | Z^k)

  • retrodiction:  p(x_{l-1} | Z^k)  <--[dynamics model, filtering output]--  p(x_l | Z^k)

SLIDE 26

[Figure: pdf at t_{k-1} → prediction at t_k]

Exploit imprecise knowledge on the dynamical behavior of the object:

  p(x_k | Z^{k-1})  =  ∫ dx_{k-1}  p(x_k | x_{k-1})  p(x_{k-1} | Z^{k-1})
    (prediction)                     (dynamics)        (old knowledge)

SLIDE 27

The Multivariate GAUSSian Pdf

  – wanted: probabilities ‘concentrated’ around a center x̄
  – quadratic distance: q(x) = ½ (x − x̄)^T P^{-1} (x − x̄)

q(x) defines an ellipsoid around x̄, its volume and orientation being determined by the matrix P (symmetric: P^T = P, positive definite: all eigenvalues > 0).

  – first attempt: p(x) = e^{-q(x)} / ∫ dx e^{-q(x)}  (normalized!)

  N(x; x̄, P) = |2πP|^{-1/2} exp( -½ (x − x̄)^T P^{-1} (x − x̄) )

  – GAUSSian mixtures: p(x) = Σ_i p_i N(x; x̄_i, P_i)  (weighted sums)
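As a small numerical sketch (the centers, covariances, and weights are illustrative assumptions), the density N(x; x̄, P) and a two-component mixture can be evaluated directly from the formula above:

```python
import numpy as np

# Evaluate the multivariate Gaussian N(x; xbar, P) and a Gaussian mixture.
def gauss_nd(x, mean, P):
    d = x - mean
    norm = np.sqrt(np.linalg.det(2 * np.pi * P))    # |2 pi P|^(1/2)
    return np.exp(-0.5 * d @ np.linalg.solve(P, d)) / norm

xbar = np.array([1.0, -1.0])
P = np.array([[2.0, 0.3], [0.3, 1.0]])
at_center = gauss_nd(xbar, xbar, P)                 # peak value |2 pi P|^(-1/2)

# mixture p(x) = sum_i p_i N(x; x_i, P_i), weights summing to 1
weights = [0.7, 0.3]
means = [np.array([0.0, 0.0]), np.array([3.0, 1.0])]
covs = [np.eye(2), 0.5 * np.eye(2)]

def mixture(x):
    return sum(w * gauss_nd(x, m, C) for w, m, C in zip(weights, means, covs))

val = mixture(np.array([0.0, 0.0]))
```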

SLIDE 28

[Figure: pdf at t_{k-1} → prediction at t_k]

Exploit imprecise knowledge on the dynamical behavior of the object:

  p(x_k | Z^{k-1})  =  ∫ dx_{k-1}  N(x_k; F x_{k-1}, D)  N(x_{k-1}; x_{k-1|k-1}, P_{k-1|k-1})
    (prediction)           (dynamics)        (old knowledge)

SLIDE 29

A Useful Product Formula for GAUSSians

  N(z; Fx, D) N(x; y, P)  =  N(z; Fy, S) N(x; y + Wν, P − WSW^T)
                              (first factor independent of x)

with  ν = z − Fy,   S = FPF^T + D,   W = PF^T S^{-1}.
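The product formula can be checked numerically in the scalar case, where F, D, P, S, W are plain numbers (the concrete values below are arbitrary):

```python
import numpy as np

# Scalar check of: N(z; F x, D) N(x; y, P) = N(z; F y, S) N(x; y + W nu, P - W S W)
def gauss(u, mean, var):
    return np.exp(-0.5 * (u - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

F, D, y, P = 2.0, 0.5, 1.0, 1.5   # model and prior parameters (arbitrary)
z, x = 3.0, 0.7                   # evaluation points (arbitrary)

S = F * P * F + D                 # innovation covariance
W = P * F / S                     # gain
nu = z - F * y                    # innovation

lhs = gauss(z, F * x, D) * gauss(x, y, P)
rhs = gauss(z, F * y, S) * gauss(x, y + W * nu, P - W * S * W)
```

Both sides agree to machine precision for any choice of the evaluation points z and x, which is what makes the formula so useful: the first factor on the right no longer depends on x.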

SLIDE 30

Kalman filter:  x_k = (r_k^T, ṙ_k^T)^T,   Z^k = {z_k, Z^{k-1}}

initiation:  p(x_0) = N(x_0; x_{0|0}, P_{0|0}),   initial ignorance: P_{0|0} ‘large’

prediction:  N(x_{k-1}; x_{k-1|k-1}, P_{k-1|k-1})  --[dynamics model: F_{k|k-1}, D_{k|k-1}]-->  N(x_k; x_{k|k-1}, P_{k|k-1})

  x_{k|k-1} = F_{k|k-1} x_{k-1|k-1}
  P_{k|k-1} = F_{k|k-1} P_{k-1|k-1} F_{k|k-1}^T + D_{k|k-1}

SLIDE 31

[Figure: pdf at t_{k-1}; t_k: no plot]

missing sensor detection: ‘data processing’ = prediction (though not always: ‘negative’ sensor evidence can also be exploited)

SLIDE 32

[Figure: pdf at t_{k-1}, pdf at t_k → prediction at t_{k+1}]

missing sensor information: increasing knowledge dissipation

SLIDE 33

[Figure: pdf at t_{k-1}, pdf at t_k; t_{k+1}: one plot]

sensor information on the kinematical object state

SLIDE 34

[Figure: pdf at t_{k-1}, pdf at t_k → prediction at t_{k+1}; likelihood (sensor model)]

BAYES’ formula:

  p(x_{k+1} | Z^{k+1})  =  p(z_{k+1} | x_{k+1}) p(x_{k+1} | Z^k)  /  ∫ dx_{k+1} p(z_{k+1} | x_{k+1}) p(x_{k+1} | Z^k)
    (new knowledge)          (likelihood × prediction, normalized)

SLIDE 35

[Figure: pdf at t_{k-1}, pdf at t_k → prediction; likelihood (sensor model)]

BAYES’ formula with GAUSSians:

  p(x_k | Z^k)  =  N(z_k; H x_k, R) N(x_k; x_{k|k-1}, P_{k|k-1})  /  ∫ dx_k N(z_k; H x_k, R) N(x_k; x_{k|k-1}, P_{k|k-1})
    (new knowledge)    (likelihood × prediction, normalized)

SLIDE 36

A Useful Product Formula for GAUSSians

  N(z; Hx, R) N(x; y, P)  =  N(z; Hy, S) N(x; y + Wν, P − WSW^T)
                              (first factor independent of x)

with  ν = z − Hy,   S = HPH^T + R,   W = PH^T S^{-1}.

SLIDE 37

Kalman filter:  x_k = (r_k^T, ṙ_k^T)^T,   Z^k = {z_k, Z^{k-1}}

initiation:  p(x_0) = N(x_0; x_{0|0}, P_{0|0}),   initial ignorance: P_{0|0} ‘large’

prediction:  N(x_{k-1}; x_{k-1|k-1}, P_{k-1|k-1})  --[dynamics model: F_{k|k-1}, D_{k|k-1}]-->  N(x_k; x_{k|k-1}, P_{k|k-1})

  x_{k|k-1} = F_{k|k-1} x_{k-1|k-1}
  P_{k|k-1} = F_{k|k-1} P_{k-1|k-1} F_{k|k-1}^T + D_{k|k-1}

filtering:  N(x_k; x_{k|k-1}, P_{k|k-1})  --[current measurement z_k; sensor model: H_k, R_k]-->  N(x_k; x_{k|k}, P_{k|k})

  x_{k|k} = x_{k|k-1} + W_{k|k-1} ν_{k|k-1},   ν_{k|k-1} = z_k − H_k x_{k|k-1}
  P_{k|k} = P_{k|k-1} − W_{k|k-1} S_{k|k-1} W_{k|k-1}^T,   S_{k|k-1} = H_k P_{k|k-1} H_k^T + R_k
  W_{k|k-1} = P_{k|k-1} H_k^T S_{k|k-1}^{-1}   (‘KALMAN gain matrix’)
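A minimal numerical sketch of this recursion, with a state x = (r, ṙ) in one spatial dimension and position-only measurements; the concrete F, D, H, R values and measurement sequence are illustrative assumptions:

```python
import numpy as np

# Kalman filter sketch: nearly-constant-velocity model, position measurements.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])        # dynamics F_{k|k-1}
D = 0.01 * np.array([[dt**3 / 3, dt**2 / 2],
                     [dt**2 / 2, dt]])       # process noise D_{k|k-1}
H = np.array([[1.0, 0.0]])                   # measurement matrix H_k
R = np.array([[0.25]])                       # measurement covariance R_k

def predict(x, P):
    return F @ x, F @ P @ F.T + D

def update(x, P, z):
    nu = z - H @ x                           # innovation
    S = H @ P @ H.T + R                      # innovation covariance
    W = P @ H.T @ np.linalg.inv(S)           # Kalman gain matrix
    return x + W @ nu, P - W @ S @ W.T

# initiation with 'large' initial covariance (initial ignorance)
x, P = np.zeros(2), 100.0 * np.eye(2)
for z in [np.array([0.9]), np.array([2.1]), np.array([2.9])]:
    x, P = predict(x, P)
    x, P = update(x, P, z)
```

After a few updates the position uncertainty P[0, 0] has collapsed from the large initial value toward the measurement accuracy, and the estimate tracks the measurements.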

SLIDE 38

[Figure: pdf at t_{k-1}, pdf at t_k, pdf at t_{k+1} (Bayes)]

filtering = sensor data processing

SLIDE 39

Summary: BAYESian (Multi-) Sensor Tracking

  • Basis: In the course of time, one or several sensors produce measurements of targets of interest. Each target is characterized by its current state vector, which is expected to change with time.

  • Objective: Learn as much as possible about the individual target states at each time by analyzing the ‘time series’ constituted by the sensor data.

SLIDE 40

Summary: BAYESian (Multi-) Sensor Tracking

  • Problem: Imperfect sensor information: inaccurate, incomplete, and possibly ambiguous. Moreover, the targets’ temporal evolution is usually not well-known.

SLIDE 41

Summary: BAYESian (Multi-) Sensor Tracking

  • Approach: Interpret measurements and target vectors as random variables (RVs). Describe by probability density functions (pdf) what is known about them.

SLIDE 42

Summary: BAYESian (Multi-) Sensor Tracking

  • Solution: Derive iteration formulae for calculating the pdfs! Develop a mechanism for initiation! In doing so, exploit all background information available! Derive state estimates from the pdfs along with appropriate quality measures!

SLIDE 43

How to deal with probability density functions?

SLIDE 44

How to deal with probability density functions?

  • certain knowledge on x:  p(x) = δ(x − y)  ‘=’  lim_{σ→0} (1/(√(2π) σ)) e^{-(x−y)²/(2σ²)}

SLIDE 45

How to deal with probability density functions?

  • transformed RV y = t[x]:  p(y) = ∫ dx p(y, x) = ∫ dx p(y|x) p_x(x)

SLIDE 46

How to deal with probability density functions?

  • transformed RV y = t[x]:
      p(y) = ∫ dx p(y, x) = ∫ dx p(y|x) p_x(x) = ∫ dx δ(y − t[x]) p_x(x) =: [T p_x](y)
      (T : p_x → p, “transfer operator”)

SLIDE 47

Create your own ground truth generator!

Exercise 2.1  Consider an object that moves in two dimensions on the trajectory

  r(t) = (x(t), y(t))^T = A (sin(ωt), sin(2ωt))^T   with   A = v²/q,   ω = q/(2v)

and speed and acceleration parameters v = 300 m/s, q = 9 m/s².

  1. Plot the trajectory. Why is it periodic? What is its period T = T(v, q)?
  2. Show for the velocity and acceleration vectors:
       ṙ(t) = v (cos(ωt)/2, cos(2ωt))^T,    r̈(t) = −q (sin(ωt)/4, sin(2ωt))^T
  3. Calculate for each instant of time t the tangential and normal vectors in r(t):
       t(t) = (ẋ(t), ẏ(t))^T / |ṙ(t)|,    n(t) = (−ẏ(t), ẋ(t))^T / |ṙ(t)|
  4. Plot |ṙ(t)|, |r̈(t)|, r̈(t)·t(t), and r̈(t)·n(t) over a period T.
  5. Discuss the temporal behaviour based on the trajectory r(t).
  6. What are the maximum speed and acceleration, vmax, qmax?
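A sketch of such a ground-truth generator (no plotting; the grid resolution is an arbitrary choice), using the analytic derivatives from items 2-3 and answering item 6 numerically:

```python
import numpy as np

# Ground truth for Exercise 2.1: r(t) = A (sin(w t), sin(2 w t)),
# A = v^2/q, w = q/(2 v).
v, q = 300.0, 9.0                      # speed (m/s), acceleration (m/s^2)
A, w = v**2 / q, q / (2 * v)
T = 2 * np.pi / w                      # period of the trajectory

t = np.linspace(0.0, T, 20001)
r = np.stack([A * np.sin(w * t), A * np.sin(2 * w * t)])
rdot = np.stack([v * np.cos(w * t) / 2, v * np.cos(2 * w * t)])
rddot = -q * np.stack([np.sin(w * t) / 4, np.sin(2 * w * t)])

speed = np.linalg.norm(rdot, axis=0)
accel = np.linalg.norm(rddot, axis=0)
vmax, qmax = speed.max(), accel.max()  # item 6, numerically
```

The maximum speed is attained at t = 0, where both cosines equal one, giving vmax = v·√(5)/2; the maximum acceleration comes out a few percent above q.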

SLIDE 48

Characterize an object by quantitatively describable properties: object state

SLIDE 49

Interpret unknown object states as random variables, x [1D] or x, X [vector / matrix variate], characterized by corresponding probability density functions (pdf). The concrete shape of the pdf p(x) contains the full knowledge on x!

SLIDE 50

Information on a random variable (RV) can be extracted by integration from the corresponding pdf (at present: the one-dimensional case).

How probable is it that x ∈ (a, b) ⊆ R holds?

  P{x ∈ (a, b)} = ∫_a^b dx p(x)   ⇒   p(x) ≥ 0

in particular:  P{x ∈ R} = ∫_{−∞}^{∞} dx p(x) = 1   (normalization)

intuitive interpretation: “the object is somewhere in R”; loosely, p(x) dx is the probability that x has a value between x and x + dx.

SLIDE 51

How to characterize the properties of a pdf?

specifically: How to associate a single “expected” value to a RV? The maximum of the pdf is sometimes but not always useful!

SLIDE 52

How to characterize the properties of a pdf?

specifically: How to associate a single “expected” value to a RV? The maximum of the pdf is sometimes but not always useful! (→ examples)

instead: Calculate the centroid of the pdf!

  E[x] = ∫_{−∞}^{∞} dx x p(x) = x̄   “expectation value”

more generally: Consider functions g : x → g(x) of the RV x!

  E[g(x)] = ∫_{−∞}^{∞} dx g(x) p(x)   “expectation value of the observable g”

Example: the observable ½ m x²  (kinetic energy, x = speed)

SLIDE 53

An important observable: the “error” of an estimate

Quality: How useful is an expectation value x̄ = E[x]? Consider special observables as distance measures:

  g(x) = |x − x̄|   or   g(x) = (x − x̄)²

quadratic measures: computationally more comfortable!

‘expected error’ of the expectation value x̄:

  V[x] = E[(x − x̄)²],   σ_x = √V[x]   (variance, standard deviation)

Exercise 2.2  Show that V[x] = E[x²] − E[x]² holds. The expectation value of the observable x² is also called the “2nd moment” of the pdf of x.

SLIDE 54

Exercise 2.3  Calculate expectation and variance of the uniform density of a RV x ∈ R on the interval [a, b]:

  p(x) = U(x; a, b) = 1/(b − a) for x ∈ [a, b], 0 otherwise

Pdf correctly normalized?

  ∫_{−∞}^{∞} dx U(x; a, b) = (1/(b − a)) ∫_a^b dx = 1

  E[x] = ∫_{−∞}^{∞} dx x U(x; a, b) = (b + a)/2

  V[x] = (1/(b − a)) ∫_a^b dx x² − E[x]² = (b − a)²/12
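The closed-form results can be cross-checked numerically with a simple Riemann sum (the interval endpoints a, b below are arbitrary):

```python
import numpy as np

# Numerical check of Exercise 2.3: mean and variance of U(x; a, b).
a, b = 2.0, 5.0
x = np.linspace(a, b, 200001)
dx = x[1] - x[0]
p = np.full_like(x, 1.0 / (b - a))      # uniform density on [a, b]

norm = p.sum() * dx                     # should be ~1
mean = (x * p).sum() * dx               # should be (a + b)/2
var = (x**2 * p).sum() * dx - mean**2   # should be (b - a)^2 / 12
```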

SLIDE 55

Important example: x normally distributed over R (GAUSS)

  – wanted: probabilities concentrated around µ
  – quadratic distance: ‖x − µ‖² = (x − µ)²/(2σ²)   (mathematically convenient!)
  – the parameter σ is a measure of the “width” of the pdf: ‖x − µ‖² = ½ at |x − µ| = σ
  – for ‘large’ distances, i.e. ‖x − µ‖² ≫ ½, the pdf shall decay quickly
  – simplest approach: p̃(x) = e^{−‖x−µ‖²}   (> 0 ∀ x ∈ R; normalization?)
  – normalized for p(x) = p̃(x) / ∫_{−∞}^{∞} dx p̃(x); a formula collection delivers: ∫_{−∞}^{∞} dx p̃(x) = √(2π) σ

An admissible pdf with the required properties is thus given by:

  N(x; µ, σ) = (1/(√(2π) σ)) exp( −(x − µ)²/(2σ²) )
SLIDE 56

Exercise 2.4  Show for the GAUSSian density p(x) = N(x; µ, σ):

  E[x] = ∫_{−∞}^{∞} dx x N(x; µ, σ) = µ,    V[x] = E[x²] − E[x]² = σ²

Use substitution and partial integration! Use ∫_{−∞}^{∞} dx e^{−x²/2} = √(2π)!
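As with the uniform case, the result of the analytic calculation can be verified numerically (µ, σ values arbitrary; the grid spans ±10σ so the truncated tails are negligible):

```python
import numpy as np

# Numerical check of Exercise 2.4: E[x] = mu, V[x] = sigma^2 for N(x; mu, sigma).
mu, sigma = 1.5, 0.8
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 400001)
dx = x[1] - x[0]
p = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

mean = (x * p).sum() * dx               # should be ~mu
var = (x**2 * p).sum() * dx - mean**2   # should be ~sigma^2
```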
