CSE-571 Probabilistic Robotics Bayes Filter Implementations - - PowerPoint PPT Presentation

SLIDE 1

CSE-571 Probabilistic Robotics

Bayes Filter Implementations Particle filters

SLIDE 2

Motivation

  • So far, we discussed the
    • Kalman filter: Gaussian, linearization problems
    • Discrete filter: high memory complexity
  • Particle filters are a way to efficiently represent non-Gaussian distributions
  • Basic principle
    • Set of state hypotheses (“particles”)
    • Survival-of-the-fittest

SLIDE 3

Sample-based Localization (sonar)

Probabilistic Robotics, 1/21/12

SLIDE 4

Function Approximation

  • Particle sets can be used to approximate functions
  • The more particles fall into an interval, the higher the probability of that interval
  • How to draw samples from a function/distribution?

SLIDE 5

Rejection Sampling

  • Let us assume that f(x) < 1 for all x
  • Sample x from a uniform distribution
  • Sample c from [0, 1]
  • If f(x) > c, keep the sample; otherwise, reject the sample
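The procedure above can be sketched directly in Python; a minimal sketch, where the test density f(x) = 0.9·x on [0, 1] is an illustrative assumption (any f with f(x) < 1 works):

```python
import random

def rejection_sample(f, n, lo=0.0, hi=1.0):
    """Draw n samples from a density f on [lo, hi], assuming f(x) < 1.

    A candidate x is drawn uniformly and kept only if f(x) > c for a
    uniform c in [0, 1], so the acceptance probability equals f(x).
    """
    samples = []
    while len(samples) < n:
        x = random.uniform(lo, hi)    # sample x from a uniform distribution
        c = random.uniform(0.0, 1.0)  # sample c from [0, 1]
        if f(x) > c:                  # keep the sample
            samples.append(x)
        # otherwise: reject the sample
    return samples

random.seed(0)
# Illustrative target: f(x) = 0.9 x on [0, 1]; the accepted samples then
# follow the normalized density 2x, whose mean is 2/3.
xs = rejection_sample(lambda x: 0.9 * x, 5000)
```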

SLIDE 6

Importance Sampling Principle

  • We can even use a different distribution g to generate samples from f
  • By introducing an importance weight w, we can account for the “differences between g and f”
  • w = f / g
  • f is often called the target
  • g is often called the proposal
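A numerical sketch of the principle (the Gaussian target and uniform proposal are illustrative assumptions): draw samples from g, weight each one by w = f/g, and weighted averages then approximate expectations under f.

```python
import math
import random

def f(x):
    # Target: unnormalized Gaussian with mean 2, standard deviation 1
    return math.exp(-0.5 * (x - 2.0) ** 2)

def g_density(x):
    # Proposal: uniform density on [-5, 5]
    return 0.1

random.seed(0)
xs = [random.uniform(-5.0, 5.0) for _ in range(20000)]  # draw from g
ws = [f(x) / g_density(x) for x in xs]                  # importance weights w = f/g

# Self-normalized importance-sampling estimate of E_f[x]
# (should be close to the target mean, 2.0):
est_mean = sum(w * x for w, x in zip(ws, xs)) / sum(ws)
```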

SLIDE 7

Importance Sampling with Resampling: Landmark Detection Example

SLIDE 8

Distributions

Wanted: samples distributed according to p(x| z1, z2, z3)

SLIDE 9

This is Easy!

We can draw samples from p(x|zl) by adding noise to the detection parameters.

SLIDE 10

Importance Sampling with Resampling

Target distribution f:
  p(x | z1, z2, ..., zn) = ∏_k p(zk | x) · p(x) / p(z1, z2, ..., zn)

Sampling distribution g:
  p(x | zl) = p(zl | x) · p(x) / p(zl)

Importance weights w:
  w = f / g = p(x | z1, ..., zn) / p(x | zl)
            = p(zl) · ∏_{k≠l} p(zk | x) / p(z1, ..., zn)

Weighted samples After resampling

SLIDE 11

Importance Sampling with Resampling

Target distribution f:
  p(x | z1, z2, ..., zn) = ∏_k p(zk | x) · p(x) / p(z1, z2, ..., zn)

Sampling distribution g:
  p(x | zl) = p(zl | x) · p(x) / p(zl)

Importance weights w:
  w = f / g = p(x | z1, ..., zn) / p(x | zl)
            = p(zl) · ∏_{k≠l} p(zk | x) / p(z1, ..., zn)

SLIDE 12

Importance Sampling with Resampling

Weighted samples After resampling

SLIDE 13

Particle Filter Projection

SLIDE 14

Density Extraction

SLIDE 15

Sampling Variance

SLIDE 16

Particle Filters

SLIDE 17

Sensor Information: Importance Sampling

  Bel(x) ← α p(z | x) Bel⁻(x)
  w ← Bel(x) / Bel⁻(x) = α p(z | x)

SLIDE 18

Robot Motion

  Bel⁻(x) ← ∫ p(x | u, x′) Bel(x′) dx′
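In a particle filter this integral is never computed explicitly: each particle is simply pushed through a sample of p(x | u, x′). A 1-D sketch, where the additive-Gaussian motion model is an illustrative assumption:

```python
import random

def sample_motion(x_prev, u, noise_std=0.1):
    """Draw x ~ p(x | u, x') for an assumed 1-D model: the commanded
    displacement u plus zero-mean Gaussian noise."""
    return x_prev + u + random.gauss(0.0, noise_std)

random.seed(0)
# Propagating the particle set realizes Bel⁻(x) = ∫ p(x|u,x') Bel(x') dx'.
particles = [0.0] * 2000                          # Bel(x'): all mass at x' = 0
particles = [sample_motion(x, u=1.0) for x in particles]
```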

SLIDE 19

Sensor Information: Importance Sampling

  Bel(x) ← α p(z | x) Bel⁻(x)
  w ← Bel(x) / Bel⁻(x) = α p(z | x)

SLIDE 20

Robot Motion

  Bel⁻(x) ← ∫ p(x | u, x′) Bel(x′) dx′

SLIDE 21
Particle Filter Algorithm

  1.  Algorithm particle_filter(S_{t-1}, u_{t-1}, z_t):
  2.    S_t = ∅, η = 0
  3.    For i = 1 … n                    (generate new samples)
  4.      Sample index j(i) from the discrete distribution given by w_{t-1}
  5.      Sample x_t^i from p(x_t | x_{t-1}, u_{t-1}) using x_{t-1}^{j(i)} and u_{t-1}
  6.      Compute importance weight w_t^i = p(z_t | x_t^i)
  7.      Update normalization factor η = η + w_t^i
  8.      Insert S_t = S_t ∪ {⟨x_t^i, w_t^i⟩}
  9.    For i = 1 … n
  10.     Normalize weights w_t^i = w_t^i / η
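The algorithm above as a runnable sketch: the 1-D Gaussian motion and measurement models at the bottom are illustrative assumptions, and `random.choices` performs step 4's draw from the discrete distribution given by w_{t-1}.

```python
import math
import random

def particle_filter(S_prev, u_prev, z, motion_sample, likelihood):
    """One update of the basic particle filter.

    S_prev: list of (x, w) pairs with normalized weights.
    motion_sample(x, u): draws from p(x_t | x_{t-1}, u_{t-1}).
    likelihood(z, x): evaluates p(z_t | x_t).
    """
    states = [x for x, _ in S_prev]
    weights = [w for _, w in S_prev]
    S, eta = [], 0.0
    for _ in range(len(S_prev)):
        # Sample index j(i) from the discrete distribution given by w_{t-1}
        x_prev = random.choices(states, weights=weights, k=1)[0]
        x = motion_sample(x_prev, u_prev)   # sample from p(x_t | x_{t-1}, u_{t-1})
        w = likelihood(z, x)                # importance weight w_t^i = p(z_t | x_t^i)
        eta += w                            # update normalization factor
        S.append((x, w))                    # insert <x_t^i, w_t^i>
    return [(x, w / eta) for x, w in S]     # normalize weights

random.seed(0)
# Illustrative models: motion adds u plus noise; measurement is z = x + noise.
motion = lambda x, u: x + u + random.gauss(0.0, 0.2)
like = lambda z, x: math.exp(-0.5 * ((z - x) / 0.5) ** 2)

S0 = [(random.uniform(-5.0, 5.0), 1.0 / 500) for _ in range(500)]
S1 = particle_filter(S0, u_prev=1.0, z=2.0, motion_sample=motion, likelihood=like)
```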

SLIDE 22

Particle Filter Algorithm

  • draw x_{t-1}^i from Bel(x_{t-1})
  • draw x_t^i from p(x_t | x_{t-1}^i, u_{t-1})
  • Importance factor for x_t^i:

  w_t^i = target distribution / proposal distribution
        = η p(z_t | x_t) p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) / [ p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) ]
        ∝ p(z_t | x_t)

with

  Bel(x_t) = η p(z_t | x_t) ∫ p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) dx_{t-1}

SLIDE 23

Resampling

  • Given: set S of weighted samples.
  • Wanted: random sample, where the probability of drawing x_i is given by w_i.
  • Typically done n times with replacement to generate the new sample set S′.

SLIDE 24

Resampling

(Roulette-wheel figures with segments w_1, …, w_n)

  • Roulette wheel: binary search, O(n log n)
  • Stochastic universal sampling (systematic resampling): linear time complexity; easy to implement, low variance
SLIDE 25
Resampling Algorithm

  1.  Algorithm systematic_resampling(S, n):
  2.    S′ = ∅, c_1 = w^1
  3.    For i = 2 … n                    (generate cdf)
  4.      c_i = c_{i-1} + w^i
  5.    u_1 ~ U[0, 1/n], i = 1           (initialize threshold)
  6.    For j = 1 … n                    (draw samples)
  7.      While (u_j > c_i)              (skip until next threshold reached)
  8.        i = i + 1
  9.      Insert S′ = S′ ∪ {⟨x^i, 1/n⟩}
  10.     Increment threshold: u_{j+1} = u_j + 1/n
  11.   Return S′

Also called stochastic universal sampling
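A direct transcription of the algorithm in Python, assuming the weights in S are normalized to sum to 1:

```python
import random

def systematic_resampling(S, n):
    """Resample n particles from the weighted set S = [(x, w), ...] in O(n).

    A single random offset u_1 ~ U[0, 1/n] positions n equally spaced
    thresholds on the weight cdf; each threshold selects one particle.
    """
    c, acc = [], 0.0
    for _, w in S:                      # generate cdf: c_i = c_{i-1} + w^i
        acc += w
        c.append(acc)
    u = random.uniform(0.0, 1.0 / n)    # initialize threshold u_1
    i = 0
    S_new = []
    for _ in range(n):                  # draw samples
        while u > c[i]:                 # skip until next threshold reached
            i += 1
        S_new.append((S[i][0], 1.0 / n))  # insert <x^i, 1/n>
        u += 1.0 / n                    # increment threshold
    return S_new

random.seed(0)
out = systematic_resampling([("a", 0.5), ("b", 0.3), ("c", 0.2)], 10)
```

The low variance comes from using one random number for all n draws: a particle with normalized weight w is selected either ⌊nw⌋ or ⌈nw⌉ times, so in the example "a" is drawn 5 times, "b" 3 times, and "c" twice.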

SLIDE 26

Motion Model Reminder

SLIDE 27

Proximity Sensor Model Reminder

Laser sensor Sonar sensor

SLIDES 28–45

(Image-only slides.)

SLIDE 46

Using Ceiling Maps for Localization

[Dellaert et al. 99]

SLIDE 47

Vision-based Localization

P(z|x) h(x) z

SLIDE 48

Under a Light

Measurement z: P(z|x):

SLIDE 49

Next to a Light

Measurement z: P(z|x):

SLIDE 50

Elsewhere

Measurement z: P(z|x):

SLIDE 51

Global Localization Using Vision

SLIDE 52

Recovery from Failure

SLIDE 53

Localization for AIBO robots

SLIDE 54

Adaptive Sampling

SLIDE 55
KLD-sampling

  • Idea:
    • Assume we know the true belief.
    • Represent this belief as a multinomial distribution.
    • Determine the number of samples such that we can guarantee that, with probability (1 − δ), the KL-distance between the true posterior and the sample-based approximation is less than ε.
  • Observation: for fixed δ and ε, the number of samples only depends on the number k of bins with support:

  n = (1/(2ε)) χ²_{k−1, 1−δ} ≈ ((k−1)/(2ε)) { 1 − 2/(9(k−1)) + √(2/(9(k−1))) · z_{1−δ} }³
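The bound is cheap to evaluate at run time; a sketch using the same Wilson–Hilferty approximation of the chi-square quantile as the formula above (`statistics.NormalDist` supplies z_{1−δ}):

```python
from statistics import NormalDist

def kld_sample_size(k, epsilon, delta):
    """Samples n needed so that, with probability 1 - delta, the KL-distance
    between the true posterior (k bins with support) and the sample-based
    approximation stays below epsilon."""
    if k <= 1:
        return 1
    z = NormalDist().inv_cdf(1.0 - delta)             # z_{1-delta}
    d = 2.0 / (9.0 * (k - 1))
    chi2 = (k - 1) * (1.0 - d + (d ** 0.5) * z) ** 3  # approx. chi^2_{k-1, 1-delta}
    return int(chi2 / (2.0 * epsilon)) + 1

# More occupied bins k means more particles; the growth is roughly linear in k.
```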

SLIDE 56
Adaptive Particle Filter Algorithm

  1.  Algorithm adaptive_particle_filter(S_{t-1}, u_{t-1}, z_t, Δ, ε, δ):
  2.    S_t = ∅, η = 0, n = 0, k = 0, b = ∅
  3.    Do                               (generate new samples)
  4.      Sample index j(n) from the discrete distribution given by w_{t-1}
  5.      Sample x_t^n from p(x_t | x_{t-1}, u_{t-1}) using x_{t-1}^{j(n)} and u_{t-1}
  6.      Compute importance weight w_t^n = p(z_t | x_t^n)
  7.      Update normalization factor η = η + w_t^n
  8.      Insert S_t = S_t ∪ {⟨x_t^n, w_t^n⟩}
  9.      If (x_t^n falls into an empty bin b)   (update bins with support)
  10.       k = k + 1, b = non-empty
  11.     n = n + 1
  12.   While (n < (1/(2ε)) χ²_{k−1, 1−δ})
  13.   For i = 1 … n
  14.     Normalize weights w_t^i = w_t^i / η

SLIDE 57

Example Run Sonar

SLIDE 58

Example Run Laser

SLIDE 59

Evaluation

SLIDE 60

Localization Algorithms - Comparison

                    | Kalman filter | Multi-hypothesis tracking | Topological maps | Grid-based (fixed/variable) | Particle filter
Sensors             | Gaussian      | Gaussian                  | Features         | Non-Gaussian                | Non-Gaussian
Posterior           | Gaussian      | Multi-modal               | Piecewise const. | Piecewise constant          | Samples
Efficiency (memory) | ++            | ++                        | ++               | -/o                         | +/++
Efficiency (time)   | ++            | ++                        | ++               | o/+                         | +/++
Implementation      | +             | o                         | +                | +/o                         | ++
Accuracy            | ++            | ++                        | -                | +/++                        | ++
Robustness          | -             | +                         | +                | ++                          | +/++
Global localization | No            | Yes                       | Yes              | Yes                         | Yes