Introduction to Mobile Robotics: Bayes Filter – Extended Kalman Filter
Wolfram Burgard, Cyrill Stachniss, Maren Bennewitz, Kai Arras


SLIDE 1

Introduction to Mobile Robotics
Bayes Filter – Extended Kalman Filter

Wolfram Burgard, Cyrill Stachniss, Maren Bennewitz, Kai Arras

SLIDE 2

Bayes Filter Reminder

§ Prediction:

\overline{bel}(x_t) = \int p(x_t \mid u_t, x_{t-1}) \, bel(x_{t-1}) \, dx_{t-1}

§ Correction:

bel(x_t) = \eta \, p(z_t \mid x_t) \, \overline{bel}(x_t)
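On a finite state space the integral above becomes a sum, and one prediction + correction step can be sketched in a few lines of NumPy. The 3-state motion and sensor models below are invented purely for illustration:

```python
import numpy as np

def bayes_filter_step(bel, motion_model, likelihood):
    """One discrete Bayes filter step.

    bel:           prior belief over states, shape (n,)
    motion_model:  p(x_t | u_t, x_{t-1}) as an (n, n) matrix, columns indexed by x_{t-1}
    likelihood:    p(z_t | x_t) for the observed z_t, shape (n,)
    """
    bel_bar = motion_model @ bel      # prediction: the sum replaces the integral
    bel_new = likelihood * bel_bar    # correction: weight by the measurement likelihood
    return bel_new / bel_new.sum()    # eta is just the normalizer

# Toy 3-state example (columns of `motion` sum to 1)
bel = np.array([1/3, 1/3, 1/3])
motion = np.array([[0.8, 0.1, 0.0],
                   [0.2, 0.8, 0.2],
                   [0.0, 0.1, 0.8]])
lik = np.array([0.1, 0.7, 0.2])
posterior = bayes_filter_step(bel, motion, lik)
```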

SLIDE 3

Discrete Kalman Filter

Estimates the state x of a discrete-time controlled process that is governed by the linear stochastic difference equation

x_t = A_t x_{t-1} + B_t u_t + \varepsilon_t

with a measurement

z_t = C_t x_t + \delta_t

SLIDE 4

Components of a Kalman Filter

§ A_t: matrix (n×n) that describes how the state evolves from t-1 to t without controls or noise.
§ B_t: matrix (n×l) that describes how the control u_t changes the state from t-1 to t.
§ C_t: matrix (k×n) that describes how to map the state x_t to an observation z_t.
§ \varepsilon_t, \delta_t: random variables representing the process and measurement noise, assumed to be independent and normally distributed with covariance Q_t and R_t, respectively.

SLIDE 5

Kalman Filter Update Example

[Figure: prediction, measurement, and correction densities; the correction is a weighted mean!]
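In one dimension the "weighted mean" is easy to see: the corrected mean weights the prediction and the measurement by their inverse variances. A minimal sketch with invented numbers:

```python
def fuse(mu_pred, var_pred, z, var_meas):
    """Fuse a 1-D Gaussian prediction with a 1-D Gaussian measurement."""
    K = var_pred / (var_pred + var_meas)  # scalar Kalman gain
    mu = mu_pred + K * (z - mu_pred)      # weighted mean of prediction and measurement
    var = (1 - K) * var_pred              # variance always shrinks
    return mu, var

# Equal variances: the corrected mean lands halfway between 0.0 and 2.0
mu, var = fuse(mu_pred=0.0, var_pred=1.0, z=2.0, var_meas=1.0)
```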

SLIDE 6

Kalman Filter Update Example

[Figure: prediction, measurement, and correction densities.]

SLIDE 7

Kalman Filter Algorithm

1. Algorithm Kalman_filter(\mu_{t-1}, \Sigma_{t-1}, u_t, z_t):
2. Prediction:
3.   \bar{\mu}_t = A_t \mu_{t-1} + B_t u_t
4.   \bar{\Sigma}_t = A_t \Sigma_{t-1} A_t^T + Q_t
5. Correction:
6.   K_t = \bar{\Sigma}_t C_t^T (C_t \bar{\Sigma}_t C_t^T + R_t)^{-1}
7.   \mu_t = \bar{\mu}_t + K_t (z_t - C_t \bar{\mu}_t)
8.   \Sigma_t = (I - K_t C_t) \bar{\Sigma}_t
9. Return \mu_t, \Sigma_t
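The algorithm translates almost line by line into NumPy. This is a sketch for a 1-D constant-position state with noisy position measurements; all matrices and numbers below are made up for illustration:

```python
import numpy as np

def kalman_filter(mu, Sigma, u, z, A, B, C, Q, R):
    # Prediction (lines 3-4 of the algorithm)
    mu_bar = A @ mu + B @ u
    Sigma_bar = A @ Sigma @ A.T + Q
    # Correction (lines 6-8)
    K = Sigma_bar @ C.T @ np.linalg.inv(C @ Sigma_bar @ C.T + R)
    mu_new = mu_bar + K @ (z - C @ mu_bar)
    Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_bar
    return mu_new, Sigma_new

# Toy 1-D system: state moves by the control, sensor observes the state directly
A = np.array([[1.0]]); B = np.array([[1.0]]); C = np.array([[1.0]])
Q = np.array([[0.1]]); R = np.array([[0.5]])
mu, Sigma = np.array([0.0]), np.array([[1.0]])
mu, Sigma = kalman_filter(mu, Sigma, np.array([1.0]), np.array([1.2]), A, B, C, Q, R)
```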

SLIDE 8

Nonlinear Dynamic Systems

§ Most realistic robotic problems involve nonlinear functions: instead of the linear model

x_t = A_t x_{t-1} + B_t u_t + \varepsilon_t,   z_t = C_t x_t + \delta_t

the state transition and measurement are governed by

x_t = g(u_t, x_{t-1}),   z_t = h(x_t)

SLIDE 9

Linearity Assumption Revisited

SLIDE 10

Non-Linear Function

[Figure: a Gaussian pushed through a non-linear function; the result is non-Gaussian!]

SLIDE 11

Non-Gaussian Distributions

§ The non-linear functions lead to non-Gaussian distributions
§ The Kalman filter is not applicable anymore!
What can be done to resolve this?
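The claim is easy to verify numerically: pushing Gaussian samples through a non-linear function (here the toy choice g(x) = x², chosen for this sketch) produces a strongly skewed, clearly non-Gaussian distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 100_000)  # samples from N(0, 1)
y = x**2                           # non-linear transform (chi-squared with 1 dof)

# A Gaussian has zero skewness; y is heavily skewed (theory: sqrt(8) ~ 2.83)
skew = np.mean((y - y.mean())**3) / y.std()**3
```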

SLIDE 12

Non-Gaussian Distributions

§ The non-linear functions lead to non-Gaussian distributions
§ The Kalman filter is not applicable anymore!
What can be done to resolve this? Local linearization!

SLIDE 13

EKF Linearization: First Order Taylor Expansion

§ Prediction:

g(u_t, x_{t-1}) \approx g(u_t, \mu_{t-1}) + \frac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}} (x_{t-1} - \mu_{t-1})
              = g(u_t, \mu_{t-1}) + G_t (x_{t-1} - \mu_{t-1})

§ Correction:

h(x_t) \approx h(\bar{\mu}_t) + \frac{\partial h(\bar{\mu}_t)}{\partial x_t} (x_t - \bar{\mu}_t)
       = h(\bar{\mu}_t) + H_t (x_t - \bar{\mu}_t)

G_t and H_t are the Jacobian matrices.

SLIDE 14

Reminder: Jacobian Matrix

§ Given a vector-valued function f: R^n → R^m
§ The Jacobian matrix is the m×n matrix of all first-order partial derivatives, J = [\partial f_i / \partial x_j]
§ It is a non-square matrix in general

SLIDE 15

Reminder: Jacobian Matrix

§ It is the orientation of the tangent plane to the vector-valued function at a given point
§ Generalizes the gradient of a scalar-valued function
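A quick way to build intuition (and to debug hand-derived Jacobians in EKF code) is to compare an analytic Jacobian against central finite differences. The function f(x, y) = (x·y, x + y²) below is a hypothetical example, not from the slides:

```python
import numpy as np

def f(p):
    x, y = p
    return np.array([x * y, x + y**2])

def jacobian_analytic(p):
    """Hand-derived 2x2 Jacobian of f."""
    x, y = p
    return np.array([[y,   x],
                     [1.0, 2 * y]])

def jacobian_numeric(func, p, eps=1e-6):
    """Central-difference approximation of the Jacobian, column by column."""
    m, n = len(func(p)), len(p)
    J = np.zeros((m, n))
    for i in range(n):
        d = np.zeros(n); d[i] = eps
        J[:, i] = (func(p + d) - func(p - d)) / (2 * eps)
    return J

p = np.array([2.0, 3.0])
```

Comparing the two at a few test points is a cheap sanity check before plugging G_t or H_t into a filter.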

SLIDE 16

EKF Linearization: First Order Taylor Expansion

§ Prediction:

g(u_t, x_{t-1}) \approx g(u_t, \mu_{t-1}) + G_t (x_{t-1} - \mu_{t-1})

§ Correction:

h(x_t) \approx h(\bar{\mu}_t) + H_t (x_t - \bar{\mu}_t)

In both cases the approximation is a linear function!

SLIDE 17

Linearity Assumption Revisited

SLIDE 18

Non-Linear Function

SLIDE 19

EKF Linearization (1)

SLIDE 20

EKF Linearization (2)

SLIDE 21

EKF Linearization (3)

SLIDE 22

EKF Algorithm

1. Extended_Kalman_filter(\mu_{t-1}, \Sigma_{t-1}, u_t, z_t):
2. Prediction:
3.   \bar{\mu}_t = g(u_t, \mu_{t-1})
4.   \bar{\Sigma}_t = G_t \Sigma_{t-1} G_t^T + Q_t
5. Correction:
6.   K_t = \bar{\Sigma}_t H_t^T (H_t \bar{\Sigma}_t H_t^T + R_t)^{-1}
7.   \mu_t = \bar{\mu}_t + K_t (z_t - h(\bar{\mu}_t))
8.   \Sigma_t = (I - K_t H_t) \bar{\Sigma}_t
9. Return \mu_t, \Sigma_t

with the Jacobians

G_t = \frac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}},   H_t = \frac{\partial h(\bar{\mu}_t)}{\partial x_t}

For comparison, the corresponding lines of the linear Kalman filter are

\bar{\mu}_t = A_t \mu_{t-1} + B_t u_t,   \bar{\Sigma}_t = A_t \Sigma_{t-1} A_t^T + Q_t
K_t = \bar{\Sigma}_t C_t^T (C_t \bar{\Sigma}_t C_t^T + R_t)^{-1}
\mu_t = \bar{\mu}_t + K_t (z_t - C_t \bar{\mu}_t),   \Sigma_t = (I - K_t C_t) \bar{\Sigma}_t
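A minimal scalar sketch of the algorithm above, using toy models x_t = g(u, x) = x + sin(u) and z_t = h(x) = x² (these models and all numbers are invented for illustration, not from the slides):

```python
import numpy as np

def ekf_step(mu, Sigma, u, z, g, h, G_of, H_of, Q, R):
    """One EKF step for a scalar state (all quantities are floats)."""
    # Prediction: nonlinear g, Jacobian G evaluated at the old mean
    mu_bar = g(u, mu)
    G = G_of(u, mu)
    Sigma_bar = G * Sigma * G + Q
    # Correction: nonlinear h, Jacobian H evaluated at the predicted mean
    H = H_of(mu_bar)
    K = Sigma_bar * H / (H * Sigma_bar * H + R)
    mu_new = mu_bar + K * (z - h(mu_bar))
    Sigma_new = (1.0 - K * H) * Sigma_bar
    return mu_new, Sigma_new

g = lambda u, x: x + np.sin(u)   # motion model
G_of = lambda u, x: 1.0          # dg/dx
h = lambda x: x**2               # measurement model
H_of = lambda x: 2 * x           # dh/dx

mu, Sigma = ekf_step(1.0, 0.5, 0.0, 1.1, g, h, G_of, H_of, 0.01, 0.1)
```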

SLIDE 23

Example: EKF Localization

§ EKF localization with landmarks (point features)

SLIDE 24

1. EKF_localization(\mu_{t-1}, \Sigma_{t-1}, u_t, z_t, m):
2. Prediction:
3.   G_t = \frac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}}
       = \begin{pmatrix}
           \partial x'/\partial \mu_{t-1,x} & \partial x'/\partial \mu_{t-1,y} & \partial x'/\partial \mu_{t-1,\theta} \\
           \partial y'/\partial \mu_{t-1,x} & \partial y'/\partial \mu_{t-1,y} & \partial y'/\partial \mu_{t-1,\theta} \\
           \partial \theta'/\partial \mu_{t-1,x} & \partial \theta'/\partial \mu_{t-1,y} & \partial \theta'/\partial \mu_{t-1,\theta}
         \end{pmatrix}    (Jacobian of g w.r.t. location)
4.   V_t = \frac{\partial g(u_t, \mu_{t-1})}{\partial u_t}
       = \begin{pmatrix}
           \partial x'/\partial v_t & \partial x'/\partial \omega_t \\
           \partial y'/\partial v_t & \partial y'/\partial \omega_t \\
           \partial \theta'/\partial v_t & \partial \theta'/\partial \omega_t
         \end{pmatrix}    (Jacobian of g w.r.t. control)
5.   Q_t = \begin{pmatrix} (\alpha_1 |v_t| + \alpha_2 |\omega_t|)^2 & 0 \\ 0 & (\alpha_3 |v_t| + \alpha_4 |\omega_t|)^2 \end{pmatrix}    (motion noise)
6.   \bar{\mu}_t = g(u_t, \mu_{t-1})    (predicted mean)
     \bar{\Sigma}_t = G_t \Sigma_{t-1} G_t^T + V_t Q_t V_t^T    (predicted covariance; V_t maps Q_t into state space)
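A sketch of this prediction step for the standard velocity motion model (assuming \omega != 0; the function and variable names below are mine, and V_t is approximated here by finite differences rather than derived analytically):

```python
import numpy as np

def g(u, mu, dt=1.0):
    """Velocity motion model: pose (x, y, th) advanced by control (v, w)."""
    v, w = u
    x, y, th = mu
    return np.array([
        x - v/w * np.sin(th) + v/w * np.sin(th + w*dt),
        y + v/w * np.cos(th) - v/w * np.cos(th + w*dt),
        th + w*dt,
    ])

def G_t(u, mu, dt=1.0):
    """Jacobian of g w.r.t. the pose, evaluated at the old mean."""
    v, w = u
    _, _, th = mu
    return np.array([
        [1, 0, -v/w * np.cos(th) + v/w * np.cos(th + w*dt)],
        [0, 1, -v/w * np.sin(th) + v/w * np.sin(th + w*dt)],
        [0, 0, 1],
    ])

def predict(mu, Sigma, u, alphas=(0.1, 0.01, 0.01, 0.1), dt=1.0):
    v, w = u
    G = G_t(u, mu, dt)
    # V_t maps control noise into state space (numeric approximation of dg/du)
    eps = 1e-6
    V = np.column_stack([
        (g((v + eps, w), mu, dt) - g((v - eps, w), mu, dt)) / (2 * eps),
        (g((v, w + eps), mu, dt) - g((v, w - eps), mu, dt)) / (2 * eps),
    ])
    a1, a2, a3, a4 = alphas
    Q = np.diag([(a1*abs(v) + a2*abs(w))**2, (a3*abs(v) + a4*abs(w))**2])
    mu_bar = g(u, mu, dt)
    Sigma_bar = G @ Sigma @ G.T + V @ Q @ V.T
    return mu_bar, Sigma_bar

mu_bar, Sigma_bar = predict(np.array([0., 0., 0.]), np.eye(3) * 0.01, (1.0, 0.5))
```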

SLIDE 25

1. EKF_localization(\mu_{t-1}, \Sigma_{t-1}, u_t, z_t, m):
2. Correction:
3.   \hat{z}_t = \begin{pmatrix} \sqrt{(m_x - \bar{\mu}_{t,x})^2 + (m_y - \bar{\mu}_{t,y})^2} \\ \mathrm{atan2}(m_y - \bar{\mu}_{t,y}, m_x - \bar{\mu}_{t,x}) - \bar{\mu}_{t,\theta} \end{pmatrix}    (predicted measurement mean; depends on observation type)
4.   H_t = \frac{\partial h(\bar{\mu}_t, m)}{\partial x_t}
       = \begin{pmatrix}
           \partial r_t/\partial \bar{\mu}_{t,x} & \partial r_t/\partial \bar{\mu}_{t,y} & \partial r_t/\partial \bar{\mu}_{t,\theta} \\
           \partial \varphi_t/\partial \bar{\mu}_{t,x} & \partial \varphi_t/\partial \bar{\mu}_{t,y} & \partial \varphi_t/\partial \bar{\mu}_{t,\theta}
         \end{pmatrix}    (Jacobian of h w.r.t. location)
5.   R_t = \begin{pmatrix} \sigma_r^2 & 0 \\ 0 & \sigma_\varphi^2 \end{pmatrix}    (measurement noise)
6.   S_t = H_t \bar{\Sigma}_t H_t^T + R_t    (innovation covariance)
7.   K_t = \bar{\Sigma}_t H_t^T S_t^{-1}    (Kalman gain)
8.   \mu_t = \bar{\mu}_t + K_t (z_t - \hat{z}_t)    (updated mean)
     \Sigma_t = (I - K_t H_t) \bar{\Sigma}_t    (updated covariance)
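The correction step above, sketched for a single range-bearing landmark observation. The Jacobian H_t here is the standard analytic form for this measurement model; the poses, landmark, and noise numbers are invented:

```python
import numpy as np

def expected_measurement(mu_bar, m):
    """Predicted range and bearing z_hat to landmark m from pose mu_bar."""
    dx, dy = m[0] - mu_bar[0], m[1] - mu_bar[1]
    r = np.sqrt(dx**2 + dy**2)
    phi = np.arctan2(dy, dx) - mu_bar[2]
    return np.array([r, phi])

def H_t(mu_bar, m):
    """Jacobian of h w.r.t. the pose (x, y, theta) at the predicted mean."""
    dx, dy = m[0] - mu_bar[0], m[1] - mu_bar[1]
    q = dx**2 + dy**2
    return np.array([
        [-dx/np.sqrt(q), -dy/np.sqrt(q),  0.0],
        [ dy/q,          -dx/q,          -1.0],
    ])

mu_bar = np.array([0.0, 0.0, 0.0])          # predicted pose
Sigma_bar = np.eye(3) * 0.1                 # predicted covariance
m = np.array([3.0, 4.0])                    # landmark position
z = np.array([5.1, np.arctan2(4, 3) + 0.02])  # noisy observation

z_hat = expected_measurement(mu_bar, m)
H = H_t(mu_bar, m)
R = np.diag([0.1**2, 0.05**2])              # sigma_r^2, sigma_phi^2
S = H @ Sigma_bar @ H.T + R                 # innovation covariance
K = Sigma_bar @ H.T @ np.linalg.inv(S)      # Kalman gain
mu = mu_bar + K @ (z - z_hat)               # updated mean
Sigma = (np.eye(3) - K @ H) @ Sigma_bar     # updated covariance
```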

SLIDE 26

EKF Prediction Step Examples

[Figure: four example predictions, each showing the control noise V_t Q_t V_t^T mapped into state space.]

SLIDE 27

EKF Observation Prediction Step

[Figure: predicted observations with measurement noise R_t.]

SLIDE 28

EKF Correction Step

SLIDE 29

Estimation Sequence (1)

SLIDE 30

Estimation Sequence (2)

SLIDE 31

Comparison to Ground Truth

SLIDE 32

Extended Kalman Filter Summary

§ Ad hoc solution to deal with non-linearities
§ Performs local linearizations in each step
§ Works well in practice for moderate non-linearities
§ Example: landmark localization
§ There exist better ways of dealing with non-linearities, such as the unscented Kalman filter (UKF)