slide-1
SLIDE 1

Multiple Sensor Target Tracking: Basic Idea

Iterative updating of conditional probability densities!

kinematic target state x_k at time t_k, accumulated sensor data Z^k
a priori knowledge: target dynamics models, sensor model

  • prediction:    p(x_{k−1}|Z^{k−1})  −−[dynamics model]−−→  p(x_k|Z^{k−1})

  • filtering:     p(x_k|Z^{k−1})  −−[sensor data z_k, sensor model]−−→  p(x_k|Z^k)

  • retrodiction:  p(x_{l−1}|Z^k)  ←−−[dynamics model, filtering output]−−  p(x_l|Z^k)

Sensor Data Fusion - Methods and Applications, 3rd Lecture on October 30, 2019 — slide 1

slide-2
SLIDE 2

The Multivariate GAUSSian Pdf

– wanted: probabilities 'concentrated' around a center x̄

– quadratic distance: q(x) = ½ (x − x̄)⊤ P⁻¹ (x − x̄)

  q(x) defines an ellipsoid around x̄, its volume and orientation being determined
  by a matrix P (symmetric: P⊤ = P, positive definite: all eigenvalues > 0).

– first attempt: p(x) = e^{−q(x)} / ∫dx e^{−q(x)}  (normalized!)

  p(x) = N(x; x̄, P) = |2πP|^{−1/2} e^{−½ (x−x̄)⊤ P⁻¹ (x−x̄)}

– GAUSSian mixtures: p(x) = Σ_i p_i N(x; x̄_i, P_i)  (weighted sums)

slide-3
SLIDE 3

Very First Look at an Important Data Fusion Algorithm

KALMAN filter: x_k = (r_k⊤, ṙ_k⊤)⊤,  Z^k = {z_k, Z^{k−1}}

initiation:  p(x_0) = N(x_0; x_{0|0}, P_{0|0}),  initial ignorance: P_{0|0} 'large'

prediction:  N(x_{k−1}; x_{k−1|k−1}, P_{k−1|k−1})  −−[dynamics model F_{k|k−1}, D_{k|k−1}]−−→  N(x_k; x_{k|k−1}, P_{k|k−1})

  x_{k|k−1} = F_{k|k−1} x_{k−1|k−1}
  P_{k|k−1} = F_{k|k−1} P_{k−1|k−1} F_{k|k−1}⊤ + D_{k|k−1}

filtering:  N(x_k; x_{k|k−1}, P_{k|k−1})  −−[current measurement z_k, sensor model H_k, R_k]−−→  N(x_k; x_{k|k}, P_{k|k})

  x_{k|k} = x_{k|k−1} + W_{k|k−1} ν_{k|k−1},   ν_{k|k−1} = z_k − H_k x_{k|k−1}
  P_{k|k} = P_{k|k−1} − W_{k|k−1} S_{k|k−1} W_{k|k−1}⊤,   S_{k|k−1} = H_k P_{k|k−1} H_k⊤ + R_k
  W_{k|k−1} = P_{k|k−1} H_k⊤ S_{k|k−1}⁻¹   ('KALMAN gain matrix')

A deeper look into the dynamics and sensor models is necessary!

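The prediction and filtering equations translate directly into code. The following numpy sketch is an illustration added here, not part of the lecture material; the function names `kf_predict` and `kf_filter` are our own choice:

```python
import numpy as np

def kf_predict(x, P, F, D):
    """Prediction: x_{k|k-1} = F x, P_{k|k-1} = F P F^T + D."""
    return F @ x, F @ P @ F.T + D

def kf_filter(x_pred, P_pred, z, H, R):
    """Filtering: update the predicted estimate with measurement z."""
    nu = z - H @ x_pred                     # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    W = P_pred @ H.T @ np.linalg.inv(S)     # KALMAN gain matrix
    x_filt = x_pred + W @ nu
    P_filt = P_pred - W @ S @ W.T
    return x_filt, P_filt
```

For a position/velocity state this pair is iterated over k; note that the filtering step always shrinks the covariance, since W S W⊤ is positive semidefinite.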

slide-4
SLIDE 4

Create your own ground truth generator!

Exercise 3.1

Consider a car moving on a mountain pass road modeled by:

  r(t) = (x(t), y(t), z(t))⊤ = ( v t,  a_y sin(4π v t / a_x),  a_z sin(π v t / a_x) )⊤

  v = 20 km/h,  a_y = a_z = 1 km,  t ∈ [0, a_x/v].

  1. Plot the trajectory. Are the parameters reasonable? Try alternatives.
  2. Calculate and plot the velocity and acceleration vectors ṙ(t) = (ẋ(t), ẏ(t), ż(t))⊤ and r̈(t) = (ẍ(t), ÿ(t), z̈(t))⊤.
  3. Calculate for each instant of time t the tangential vector in r(t): t(t) = ṙ(t)/|ṙ(t)|.
  4. Plot |ṙ(t)|, |r̈(t)|, and r̈(t)⊤t(t) over the time interval.
  5. Discuss the temporal behaviour based on the trajectory r(t)!

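A minimal ground-truth generator for this exercise might look as follows. The road-length parameter a_x is not fixed in the exercise; the value a_x = 10 km below is our own assumption, chosen only so the script runs:

```python
import numpy as np

v = 20.0         # speed [km/h]
ax = 10.0        # assumed road-length parameter [km]; not fixed in the exercise
ay = az = 1.0    # amplitudes [km]

def r(t):
    """Ground-truth position r(t) = (x, y, z)^T; t in hours."""
    return np.array([v * t,
                     ay * np.sin(4.0 * np.pi * v * t / ax),
                     az * np.sin(np.pi * v * t / ax)])

def rdot(t, h=1e-7):
    """Velocity via central differences (analytic derivatives work as well)."""
    return (r(t + h) - r(t - h)) / (2.0 * h)

t_grid = np.linspace(0.0, ax / v, 501)   # t in [0, a_x/v]
traj = np.array([r(t) for t in t_grid])
speed = np.array([np.linalg.norm(rdot(t)) for t in t_grid])
```

Plotting `traj` and `speed` (e.g. with matplotlib) then answers items 1 and 4; since ẋ = v is constant, |ṙ(t)| never drops below v.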


slide-8
SLIDE 8

How to deal with probability density functions?

  • pdf p(x): extract probability statements about the RV x by integration!

  • naïvely: positive and normalized functions (p(x) ≥ 0, ∫dx p(x) = 1)

  • conditional pdf p(x|y) = p(x, y)/p(y): impact of information on y on RV x?

  • marginal density p(x) = ∫dy p(x, y) = ∫dy p(x|y) p(y): enter y!

  • BAYES: p(x|y) = p(y|x) p(x)/p(y) = p(y|x) p(x) / ∫dx p(y|x) p(x):   p(x|y) ← p(y|x), p(x)!

  • certain knowledge on x: p(x) = δ(x − y) '=' lim_{σ→0} (2πσ²)^{−1/2} e^{−(x−y)²/(2σ²)}

  • transformed RV y = t[x]:  p(y) = ∫dx p(y, x) = ∫dx p(y|x) p_x(x) = ∫dx δ(y − t[x]) p_x(x) =: [T p_x](y)

    (T : p_x → p, "transfer operator")

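The transfer operator has a simple Monte Carlo reading: pushing samples of p_x through t[·] yields samples of [T p_x]. A small illustration with the arbitrarily chosen transform t[x] = x²:

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.normal(0.0, 1.0, size=200_000)   # samples of p_x = N(x; 0, 1)
y = x**2                                 # samples of [T p_x] for t[x] = x^2

# moments of the transformed RV follow without ever writing p(y) down;
# here E[y] = E[x^2] = 1 for a standard GAUSSian
print(y.mean())
```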

slide-9
SLIDE 9

The Multivariate GAUSSian Pdf

p(x) = N(x; x̄, P) = |2πP|^{−1/2} e^{−½ (x−x̄)⊤ P⁻¹ (x−x̄)},   q(x) = ½ (x − x̄)⊤ P⁻¹ (x − x̄)

Exercise 3.2

Show:  ∫dx e^{−q(x)} = √|2πP|,   E[x] = x̄,   E[(x − x̄)(x − x̄)⊤] = P.

Trick: symmetric, positive definite matrices can be diagonalized by an orthogonal coordinate transform.

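The normalization claim of Exercise 3.2 can be checked numerically before proving it. The grid integration below (with an arbitrarily chosen P and x̄ = 0) approximates ∫dx e^{−q(x)} in two dimensions:

```python
import numpy as np

P = np.array([[2.0, 0.5],
              [0.5, 1.0]])               # arbitrary symmetric, pos. def. matrix
Pinv = np.linalg.inv(P)

xs = np.linspace(-10.0, 10.0, 401)       # grid comfortably covering the density
dx = xs[1] - xs[0]
X, Y = np.meshgrid(xs, xs, indexing="ij")
pts = np.stack([X, Y], axis=-1)

# q(x) = 1/2 x^T P^{-1} x  (center xbar = 0)
q = 0.5 * np.einsum("...i,ij,...j->...", pts, Pinv, pts)

integral = np.exp(-q).sum() * dx * dx    # ≈ ∫ dx e^{-q(x)}
expected = np.sqrt(np.linalg.det(2.0 * np.pi * P))
print(integral, expected)
```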


slide-11
SLIDE 11

The Multivariate GAUSSian Pdf

p(x) = N(x; x̄, P) = |2πP|^{−1/2} e^{−½ (x−x̄)⊤ P⁻¹ (x−x̄)}

E[x] = x̄,   E[(x − x̄)(x − x̄)⊤] = P  (covariance)

– Covariance matrix: the expected error of the expectation.

slide-12
SLIDE 12

A Useful Product Formula for GAUSSians

N(z; Hx, R) N(x; y, P) = N(z; Hy, S) × N(x; y + Wν, P − WSW⊤)
                         [the first factor is independent of x]

with N(x; y + Wν, P − WSW⊤) = N(x; Q(P⁻¹y + H⊤R⁻¹z), Q) and

  ν = z − Hy,   S = HPH⊤ + R,   W = PH⊤S⁻¹,   Q⁻¹ = P⁻¹ + H⊤R⁻¹H.

Sketch of the proof (done in an exercise later!):

  • Interpret N(z; Hx, R) N(x; y, P) as a joint pdf p(z|x) p(x) = p(z, x).
  • Show that p(z, x) is a GAUSSian:  p(z, x) = N( (z, x)⊤ ; (Hy, y)⊤, [[S, HP], [PH⊤, P]] ).
  • Calculate from p(z, x) the marginal and conditional pdfs p(z) and p(x|z).
  • From p(z, x) = p(z|x) p(x) = p(x|z) p(z) we obtain the result.
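Since the product formula is an exact identity between densities, it can be verified pointwise. A numpy check with arbitrarily chosen H, R, P, y, z and evaluation point x:

```python
import numpy as np

def gauss(x, mean, cov):
    """Density N(x; mean, cov) of a multivariate GAUSSian."""
    d = np.atleast_1d(x - mean)
    cov = np.atleast_2d(cov)
    norm = np.sqrt(np.linalg.det(2.0 * np.pi * cov))
    return np.exp(-0.5 * d @ np.linalg.inv(cov) @ d) / norm

# arbitrary test quantities (2-d state, 1-d measurement)
H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])
P = np.array([[2.0, 0.3], [0.3, 1.0]])
y = np.array([1.0, -1.0])
z = np.array([1.4])
x = np.array([0.7, -0.2])

# right-hand-side quantities of the product formula
nu = z - H @ y
S = H @ P @ H.T + R
W = P @ H.T @ np.linalg.inv(S)

lhs = gauss(z, H @ x, R) * gauss(x, y, P)
rhs = gauss(z, H @ y, S) * gauss(x, y + W @ nu, P - W @ S @ W.T)
print(lhs, rhs)   # identical up to rounding
```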


slide-16
SLIDE 16

Affine Transforms of GAUSSian Random Variables

N(x; E[x], C[x])  −−[y = t + Tx]−−→  N(y; t + T E[x], T C[x] T⊤)

p(y) = ∫dx p(x, y) = ∫dx p(y|x) p(x) = ∫dx δ(y − t − Tx) p(x)

A possible representation: δ(x − y) = N(x; y, D) with D → O!

p(y) = ∫dx N(y; t + Tx, D) N(x; E[x], C[x])   for D → O
     = N(y; t + T E[x], T C[x] T⊤ + D)   for D → O;   product formula!

Also true if dim(x) ≠ dim(y)!
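A sample-based check of the affine-transform rule, deliberately using a non-square T (dim(y) = 1, dim(x) = 2); all numerical values are arbitrary illustration choices:

```python
import numpy as np

rng = np.random.default_rng(1)

xbar = np.array([1.0, 2.0])                    # E[x]
C = np.array([[1.0, 0.4], [0.4, 2.0]])         # C[x]
t = np.array([0.5])
T = np.array([[2.0, -1.0]])                    # non-square: maps 2-d -> 1-d

samples = rng.multivariate_normal(xbar, C, size=300_000)
y = t + samples @ T.T                          # y = t + T x per sample

print(y.mean(axis=0), t + T @ xbar)            # sample vs. predicted mean
print(float(np.cov(y.T)), float(T @ C @ T.T))  # sample vs. predicted covariance
```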

slide-17
SLIDE 17

A popular model for object evolutions

Piecewise Constant White Acceleration Model

Consider state vectors x_k = (r_k⊤, ṙ_k⊤)⊤ (position, velocity). For known x_{k−1}
and without external influences we have, with ∆T_k = t_k − t_{k−1}:

  x_k = [[I, ∆T_k I], [O, I]] (r_{k−1}⊤, ṙ_{k−1}⊤)⊤ =: F_{k|k−1} x_{k−1}   (see blackboard!)

Assume during the interval ∆T_k a constant acceleration a_k causing the state evolution:

  G_k a_k,   G_k = [[½ ∆T_k² I], [∆T_k I]]   (a linear transform!)

Let a_k be a GAUSSian RV with pdf p(a_k) = N(a_k; o, Σ_k² I); we therefore have:

  p(G_k a_k) = N(G_k a_k; o, Σ_k² G_k G_k⊤).

slide-18
SLIDE 18

Therefore:  p(x_k|x_{k−1}) = N(x_k; F_{k|k−1} x_{k−1}, D_{k|k−1})   with

  F_{k|k−1} = [[I, ∆T_k I], [O, I]],   D_{k|k−1} = Σ_k² [[¼ ∆T_k⁴ I, ½ ∆T_k³ I], [½ ∆T_k³ I, ∆T_k² I]]
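F_{k|k−1} and D_{k|k−1} are easy to assemble in code; the sketch below also checks the equivalent form D = Σ² G G⊤ with G = (½∆T² I, ∆T I)⊤. The helper name `evolution_model` is our own:

```python
import numpy as np

def evolution_model(dT, sigma, dim=3):
    """F and D of the piecewise constant white acceleration model."""
    I = np.eye(dim)
    O = np.zeros((dim, dim))
    F = np.block([[I, dT * I], [O, I]])
    G = np.vstack([0.5 * dT**2 * I, dT * I])
    # D = Sigma^2 [[dT^4/4 I, dT^3/2 I], [dT^3/2 I, dT^2 I]] = Sigma^2 G G^T
    D = sigma**2 * (G @ G.T)
    return F, D

F, D = evolution_model(dT=2.0, sigma=0.5)
```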


slide-20
SLIDE 20

Sensor Fusion: Gain in Localization Accuracy

If a stationary target is observed by N sensors, we naïvely expect an improvement
in accuracy ∝ 1/√N.

A closer look: the error of each measurement z_i is described by a related
measurement error covariance matrix R_i ('error ellipsoids'). In 2 dimensions:

[figure: error ellipses of three sensors s1, s2, s3 observing the same target]

R_i can strongly depend on the underlying sensor-to-target geometry!


slide-23
SLIDE 23

More Realistic: Range, Azimuth Measurements

  • measurements in polar coordinates: z_k = (r_k, ϕ_k)⊤,
    measurement error: R = diag(σ_r², σ_ϕ²),   r, ϕ independent

Likelihood function in polar coordinates: p(z_k|x_k) = N(z_k; x_k^p, R^p)

  • What is the likelihood function in Cartesian coordinates?

    t[z_k] = r_k (cos ϕ_k, sin ϕ_k)⊤


slide-27
SLIDE 27

More Realistic: Range, Azimuth Measurements

  • measurements in polar coordinates: z_k = (r_k, ϕ_k)⊤,
    measurement error: R = diag(σ_r², σ_ϕ²),   r, ϕ independent

  • in Cartesian coordinates: expand around r_{k|k−1} = (r_{k|k−1}, ϕ_{k|k−1})⊤:

    t[z_k] = r_k (cos ϕ_k, sin ϕ_k)⊤ ≈ t[r_{k|k−1}] + T (z_k − r_{k|k−1})

    (constant and linear term of a Taylor series only, blackboard!)

    T = ∂t[r_{k|k−1}]/∂r_{k|k−1}
      = [[cos ϕ_{k|k−1}, −r_{k|k−1} sin ϕ_{k|k−1}], [sin ϕ_{k|k−1}, r_{k|k−1} cos ϕ_{k|k−1}]]
      = [[cos ϕ, −sin ϕ], [sin ϕ, cos ϕ]] · diag(1, r) = D_ϕ S_r   (rotation D_ϕ, dilation S_r)

  • Cartesian error covariance (time dependent):

    T R T⊤ = D_ϕ S_r R S_r D_ϕ⊤ = D_ϕ diag(σ_r², (r σ_ϕ)²) D_ϕ⊤

  • sensor fusion: the sensor-to-target geometry enters into T R T⊤
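The geometry dependence is easy to see numerically: the Cartesian covariance below has eigenvalues σ_r² and (r σ_ϕ)², and the ellipse is rotated by ϕ. The numerical values are arbitrary illustration choices:

```python
import numpy as np

def cartesian_measurement_cov(r, phi, sigma_r, sigma_phi):
    """T R T^T = D_phi S_r R S_r D_phi^T for a range/azimuth sensor."""
    D_phi = np.array([[np.cos(phi), -np.sin(phi)],
                      [np.sin(phi),  np.cos(phi)]])   # rotation
    S_r = np.diag([1.0, r])                           # dilation
    R = np.diag([sigma_r**2, sigma_phi**2])           # polar covariance
    return D_phi @ S_r @ R @ S_r @ D_phi.T

# e.g. a radar at 10 km range: 20 m range error, 10 mrad azimuth error
C = cartesian_measurement_cov(r=10_000.0, phi=np.pi / 3,
                              sigma_r=20.0, sigma_phi=0.01)
```

The cross-range error r·σ_ϕ (here 100 m) dominates at long range, which is exactly why the sensor-to-target geometry matters for fusion.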

slide-28
SLIDE 28

[figure: error ellipses of sensors s1, s2, s3]

sensor fusion: the sensor-to-target geometry enters into T R T⊤



slide-33
SLIDE 33

Sk Sensors Producing Target Measurement at the Same Time

One possibility:

  H_k x_k = [H_k^1; … ; H_k^{S_k}] x_k   (stacked),   R_k = diag[R_k^1, …, R_k^{S_k}]

Alternatively, provided that H_k^i = H_k, i = 1, …, S_k (here S_k = 2):

  p(z_k^1, z_k^2|x_k) = p(z_k^1|x_k) p(z_k^2|x_k)   (independent sensors)

                      = N(z_k^1; H_k x_k, R_k^1) N(z_k^2; H_k x_k, R_k^2)

                      = N(H_k x_k; z_k^1, R_k^1) N(z_k^2; H_k x_k, R_k^2)

                      ∝ N(H_k x_k; z_k, R_k) = N(z_k; H_k x_k, R_k)

  with the fused quantities

    z_k = R_k [ (R_k^1)⁻¹ z_k^1 + (R_k^2)⁻¹ z_k^2 ],   R_k = [ (R_k^1)⁻¹ + (R_k^2)⁻¹ ]⁻¹.
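The proportionality can be verified numerically: the ratio of the two-sensor likelihood to the fused likelihood must not depend on H_k x_k. A check with arbitrary values (writing x for H_k x_k):

```python
import numpy as np

def gauss(x, mean, cov):
    """Density N(x; mean, cov) of a multivariate GAUSSian."""
    d = x - mean
    return (np.exp(-0.5 * d @ np.linalg.inv(cov) @ d)
            / np.sqrt(np.linalg.det(2.0 * np.pi * cov)))

R1 = np.array([[4.0, 0.0], [0.0, 1.0]])
R2 = np.array([[1.0, 0.5], [0.5, 2.0]])
z1 = np.array([1.0, 2.0])
z2 = np.array([1.5, 1.5])

# fused measurement and covariance
Rf = np.linalg.inv(np.linalg.inv(R1) + np.linalg.inv(R2))
zf = Rf @ (np.linalg.inv(R1) @ z1 + np.linalg.inv(R2) @ z2)

def ratio(x):
    """Two-sensor likelihood over fused likelihood; constant in x."""
    return gauss(z1, x, R1) * gauss(z2, x, R2) / gauss(zf, x, Rf)

print(ratio(np.array([0.0, 0.0])), ratio(np.array([2.0, -1.0])))
```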

slide-34
SLIDE 34

Exercise 3.3

Generalize to the case S_k > 2 (induction argument)!

One possible fusion strategy: create a single effective measurement by preprocessing
of the individual measurements!

  z_k = R_k Σ_{s=1}^{S_k} (R_k^s)⁻¹ z_k^s    (weighted arithmetic mean of the measurements)

  R_k = [ Σ_{s=1}^{S_k} (R_k^s)⁻¹ ]⁻¹    (harmonic mean of the measurement covariances)

A typical structure for fusion equations!
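A sketch of this preprocessing step (the helper name `fuse` and the numerical values are our own): for equal covariances it reduces, as expected, to the arithmetic mean of the measurements with covariance R/S_k.

```python
import numpy as np

def fuse(zs, Rs):
    """Effective measurement: weighted mean of zs, harmonic mean of Rs."""
    infos = [np.linalg.inv(R) for R in Rs]          # information matrices
    R_eff = np.linalg.inv(sum(infos))               # harmonic mean
    z_eff = R_eff @ sum(I @ z for I, z in zip(infos, zs))
    return z_eff, R_eff

zs = [np.array([1.0, 0.0]), np.array([3.0, 2.0]), np.array([2.0, 4.0])]
z_eff, R_eff = fuse(zs, [np.eye(2) * 2.0] * 3)
print(z_eff, R_eff)
```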