
CSE-571 Probabilistic Robotics

Bayes Filter Implementations Particle filters


Motivation

- So far, we discussed the:
  - Kalman filter: Gaussian, linearization problems
  - Discrete filter: high memory complexity
- Particle filters are a way to efficiently represent non-Gaussian distributions
- Basic principle:
  - Set of state hypotheses ("particles")
  - Survival-of-the-fittest

Sample-based Localization (sonar)


Function Approximation

- Particle sets can be used to approximate functions
- The more particles fall into an interval, the higher the probability of that interval
- How to draw samples from a function/distribution?


Rejection Sampling

- Let us assume that f(x) < 1 for all x
- Sample x from a uniform distribution
- Sample c from [0, 1]
- If f(x) > c, keep the sample; otherwise, reject the sample

[Figure: a pair (x, c) with c below f(x) is accepted ("OK"); a pair (x′, c′) with c′ above f(x′) is rejected]
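The recipe above can be sketched in a few lines of Python. The function name, the interval, and the triangular test density are illustrative choices, not part of the slide:

```python
import random

def rejection_sample(f, lo, hi, n, seed=0):
    """Draw n samples from a density f on [lo, hi], assuming f(x) < 1
    for all x (as on the slide)."""
    rng = random.Random(seed)
    samples = []
    while len(samples) < n:
        x = rng.uniform(lo, hi)      # sample x from a uniform distribution
        c = rng.uniform(0.0, 1.0)    # sample c from [0, 1]
        if f(x) > c:                 # keep the sample ...
            samples.append(x)
        # ... otherwise reject it and try again
    return samples

# Example: a triangular density on [0, 1], scaled so f(x) < 1 everywhere
tri = lambda x: 0.9 * (1.0 - abs(2.0 * x - 1.0))
xs = rejection_sample(tri, 0.0, 1.0, 2000)
```

The accepted x-values pile up where f is large, which is exactly the "more particles in an interval, higher probability" picture from the previous slide.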


Importance Sampling Principle

- We can even use a different distribution g to generate samples from f
- By introducing an importance weight w, we can account for the "differences between g and f"
- w = f / g
- f is often called the target
- g is often called the proposal
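A minimal sketch of the principle: draw from the proposal g, weight each draw by w = f/g, and use the weighted draws to estimate an expectation under the target f. The function names and the specific target/proposal pair are illustrative assumptions:

```python
import random

def self_normalized_is(f, g_sample, g_pdf, n=20000, seed=0):
    """Estimate E_f[x] by drawing from proposal g and weighting
    each sample with w = f / g (self-normalized estimator)."""
    rng = random.Random(seed)
    xs = [g_sample(rng) for _ in range(n)]
    ws = [f(x) / g_pdf(x) for x in xs]            # importance weights w = f / g
    return sum(w * x for w, x in zip(ws, xs)) / sum(ws)

# Target f(x) = 2x on [0, 1] (true mean 2/3); proposal g = Uniform(0, 1)
est = self_normalized_is(lambda x: 2.0 * x,
                         lambda rng: rng.random(),
                         lambda x: 1.0)
```

Even though every sample is drawn from the uniform proposal, the weights tilt the estimate toward the target's mean of 2/3.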

Importance Sampling with Resampling: Landmark Detection Example

Distributions

Wanted: samples distributed according to p(x | z1, z2, z3)


This is Easy!

We can draw samples from p(x | z_l) by adding noise to the detection parameters.

Importance Sampling with Resampling

Target distribution f:

  p(x | z_1, z_2, ..., z_n) = p(x) ∏_k p(z_k | x) / p(z_1, z_2, ..., z_n)

Sampling distribution g:

  p(x | z_l) = p(z_l | x) p(x) / p(z_l)

Importance weights w = f / g:

  p(x | z_1, z_2, ..., z_n) / p(x | z_l) = p(z_l) ∏_{k≠l} p(z_k | x) / p(z_1, z_2, ..., z_n)

[Figure: weighted samples; after resampling]



Particle Filters: Projection, Density Extraction, Sampling Variance


Sensor Information: Importance Sampling

  Bel(x) ← p(z | x) Bel(x)
  w ← p(z | x) Bel(x) / Bel(x) = p(z | x)

Robot Motion

  Bel(x) ← ∫ p(x | u, x′) Bel(x′) dx′

Particle Filter Algorithm

1. Algorithm particle_filter(S_{t−1}, u_{t−1}, z_t):
2.   S_t = ∅, η = 0
3.   For i = 1 ... n                        (generate new samples)
4.     Sample index j(i) from the discrete distribution given by w_{t−1}
5.     Sample x_t^i from p(x_t | x_{t−1}, u_{t−1}) using x_{t−1}^{j(i)} and u_{t−1}
6.     Compute importance weight w_t^i = p(z_t | x_t^i)
7.     Update normalization factor η = η + w_t^i
8.     Insert S_t = S_t ∪ {⟨x_t^i, w_t^i⟩}
9.   For i = 1 ... n
10.    Normalize weights w_t^i = w_t^i / η

- draw x_{t−1}^i from Bel(x_{t−1})
- draw x_t^i from p(x_t | x_{t−1}^i, u_{t−1})
- Importance factor for x_t^i:

  w_t^i = target distribution / proposal distribution
        = p(z_t | x_t) p(x_t | x_{t−1}, u_{t−1}) Bel(x_{t−1}) / [p(x_t | x_{t−1}, u_{t−1}) Bel(x_{t−1})]
        ∝ p(z_t | x_t)

This implements the Bayes filter recursion

  Bel(x_t) ∝ p(z_t | x_t) ∫ p(x_t | x_{t−1}, u_{t−1}) Bel(x_{t−1}) dx_{t−1}
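As a sketch, one update of the algorithm can be written in Python for a 1-D robot. The Gaussian motion and measurement models, the parameter values, and the `(x, w)` pair representation are illustrative assumptions, not the course's models:

```python
import math
import random

def particle_filter_step(particles, u, z, motion_noise=0.1, meas_noise=0.5, seed=0):
    """One particle_filter(S_{t-1}, u_{t-1}, z_t) update for a 1-D robot.
    particles: list of (x, w) pairs; u: commanded displacement;
    z: noisy position measurement."""
    rng = random.Random(seed)
    n = len(particles)
    xs = [x for x, _ in particles]
    ws = [w for _, w in particles]
    new, eta = [], 0.0
    for _ in range(n):
        x_prev = rng.choices(xs, weights=ws)[0]           # step 4: sample index j(i) from w_{t-1}
        x = x_prev + u + rng.gauss(0.0, motion_noise)     # step 5: sample from p(x_t | x_{t-1}, u_{t-1})
        w = math.exp(-0.5 * ((z - x) / meas_noise) ** 2)  # step 6: w = p(z_t | x_t), unnormalized Gaussian
        eta += w                                          # step 7: update normalization factor
        new.append((x, w))                                # step 8: insert
    return [(x, w / eta) for x, w in new]                 # steps 9-10: normalize weights

# Usage: a spread-out prior concentrates near the measurement after one update
rng = random.Random(1)
S0 = [(rng.uniform(-5.0, 5.0), 1.0 / 100) for _ in range(100)]
S1 = particle_filter_step(S0, u=1.0, z=2.0)
```

Note that steps 4 and 5 together realize "draw from Bel(x_{t−1}), then from the motion model," so the importance weight reduces to p(z_t | x_t) exactly as in the derivation above.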

Resampling

- Given: set S of weighted samples.
- Wanted: random sample, where the probability of drawing x_i is given by w_i.
- Typically done n times with replacement to generate the new sample set S′.

[Figure: roulette wheel with slices of size w_1, w_2, w_3, ..., w_{n−1}, w_n]

- Roulette wheel
  - Binary search, O(n log n)
- Stochastic universal sampling (systematic resampling)
  - Linear time complexity
  - Easy to implement, low variance
Resampling Algorithm

1. Algorithm systematic_resampling(S, n):
2.   S′ = ∅, c_1 = w^1
3.   For i = 2 ... n                        (generate cdf)
4.     c_i = c_{i−1} + w^i
5.   u_1 ~ U[0, 1/n], i = 1                 (initialize threshold)
6.   For j = 1 ... n                        (draw samples ...)
7.     While (u_j > c_i)                    (skip until next threshold reached)
8.       i = i + 1
9.     Insert S′ = S′ ∪ {⟨x^i, 1/n⟩}
10.    u_{j+1} = u_j + 1/n                  (increment threshold)
11. Return S′

Also called stochastic universal sampling
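A direct transcription of the algorithm above, as a sketch. Here `samples` is a list of `(state, weight)` pairs with normalized weights, and the fixed seed is an illustrative choice:

```python
import random

def systematic_resampling(samples, seed=0):
    """systematic_resampling(S, n): one uniform draw in [0, 1/n], then
    thresholds spaced 1/n apart walk the weight cdf. Weights must sum to 1."""
    rng = random.Random(seed)
    n = len(samples)
    c, acc = [], 0.0                    # steps 2-4: generate cdf c_i
    for _, w in samples:
        acc += w
        c.append(acc)
    u = rng.uniform(0.0, 1.0 / n)       # step 5: initialize threshold u_1 ~ U[0, 1/n]
    i, out = 0, []
    for _ in range(n):                  # step 6: draw n samples
        while u > c[i]:                 # step 7: skip until next threshold reached
            i += 1
        out.append((samples[i][0], 1.0 / n))  # step 9: insert <x^i, 1/n>
        u += 1.0 / n                    # step 10: increment threshold
    return out

out = systematic_resampling([('a', 0.1), ('b', 0.6), ('c', 0.3)])
```

Because the thresholds are evenly spaced, only one random number is drawn per resampling pass, which is what gives the method its linear runtime and low variance.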


Motion Model Reminder

Proximity Sensor Model Reminder

Laser sensor Sonar sensor


Sample-based Localization (sonar)

Using Ceiling Maps for Localization

[Dellaert et al. 99]

Vision-based Localization

[Figure: measurement z, expected measurement h(x), and likelihood P(z | x)]


Under a Light

Measurement z: P(z|x):

Next to a Light

Measurement z: P(z|x):

Elsewhere

Measurement z: P(z|x):

Global Localization Using Vision


Recovery from Failure

Localization for AIBO robots

Adaptive Sampling

- Idea:
  - Assume we know the true belief.
  - Represent this belief as a multinomial distribution.
  - Determine the number of samples such that we can guarantee that, with probability (1 − δ), the KL-distance between the true posterior and the sample-based approximation is less than ε.
- Observation:
  - For fixed δ and ε, the number of samples only depends on the number k of bins with support:

KLD-sampling

  n = (1 / 2ε) χ²_{k−1, 1−δ} ≈ ((k−1) / 2ε) (1 − 2/(9(k−1)) + √(2/(9(k−1))) z_{1−δ})³
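The sample bound can be computed directly; a sketch, where the function name is illustrative and `z_quantile` is the 1−δ quantile of the standard normal (about 2.3263 for δ = 0.01):

```python
import math

def kld_sample_bound(k, epsilon, z_quantile):
    """Number of particles n such that, with probability 1 - delta, the
    KL-distance between the sample-based approximation and the true
    posterior stays below epsilon. Uses the Wilson-Hilferty form of the
    chi-square quantile, as in the formula above; k = bins with support."""
    if k <= 1:
        return 1
    a = 2.0 / (9.0 * (k - 1))
    chi2 = (k - 1) * (1.0 - a + math.sqrt(a) * z_quantile) ** 3
    return int(math.ceil(chi2 / (2.0 * epsilon)))
```

As the observation on the slide says: for fixed δ and ε, the bound grows with the number k of occupied bins, so a focused belief needs far fewer particles than a spread-out one.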

Adaptive Particle Filter Algorithm

1. Algorithm adaptive_particle_filter(S_{t−1}, u_{t−1}, z_t, ε, δ):
2.   S_t = ∅, η = 0, n = 0, k = 0, all bins empty
3.   Do                                     (generate new samples)
4.     Sample index j(n) from the discrete distribution given by w_{t−1}
5.     Sample x_t^n from p(x_t | x_{t−1}, u_{t−1}) using x_{t−1}^{j(n)} and u_{t−1}
6.     Compute importance weight w_t^n = p(z_t | x_t^n)
7.     Update normalization factor η = η + w_t^n
8.     Insert S_t = S_t ∪ {⟨x_t^n, w_t^n⟩}
9.     If (x_t^n falls into an empty bin b)  (update bins with support)
10.      k = k + 1, b = non-empty
11.    n = n + 1
12.  While ( n < (1 / 2ε) χ²_{k−1, 1−δ} )
13.  For i = 1 ... n
14.    Normalize weights w_t^i = w_t^i / η
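Combining the two previous sketches gives a KLD-sampling update loop. As before, the 1-D Gaussian motion/measurement models, the bin size, and all parameter values are illustrative assumptions:

```python
import math
import random

def chi2_quantile(k_minus_1, z):
    """Wilson-Hilferty approximation of the chi-square 1-delta quantile."""
    if k_minus_1 < 1:
        return 0.0
    a = 2.0 / (9.0 * k_minus_1)
    return k_minus_1 * (1.0 - a + math.sqrt(a) * z) ** 3

def adaptive_particle_filter_step(particles, u, z, epsilon=0.05, z_quantile=2.3263,
                                  bin_size=0.5, n_min=10, n_max=100000, seed=0):
    """KLD-sampling update for a 1-D robot: keep drawing particles until
    n reaches (1 / 2*epsilon) * chi2_{k-1, 1-delta}, where k counts the
    occupied (non-empty) bins so far."""
    rng = random.Random(seed)
    xs = [x for x, _ in particles]
    ws = [w for _, w in particles]
    new, bins = [], set()
    eta, n, k = 0.0, 0, 0
    while True:
        x_prev = rng.choices(xs, weights=ws)[0]       # sample index j(n) from w_{t-1}
        x = x_prev + u + rng.gauss(0.0, 0.1)          # sample from p(x_t | x_{t-1}, u_{t-1})
        w = math.exp(-0.5 * ((z - x) / 0.5) ** 2)     # importance weight p(z_t | x_t)
        eta += w
        new.append((x, w))
        b = int(math.floor(x / bin_size))
        if b not in bins:                             # x_t^n falls into an empty bin
            bins.add(b)
            k += 1
        n += 1
        # stop once n exceeds the KLD bound (n_min/n_max guard the extremes)
        if n >= n_min and (n >= n_max or
                           n >= chi2_quantile(k - 1, z_quantile) / (2.0 * epsilon)):
            break
    return [(x, w / eta) for x, w in new]

# A tight prior occupies few bins (few particles); a spread prior needs many
rng = random.Random(1)
spread = [(rng.uniform(-5.0, 5.0), 0.01) for _ in range(100)]
few = adaptive_particle_filter_step([(2.0, 1.0)], u=1.0, z=3.0)
many = adaptive_particle_filter_step(spread, u=1.0, z=3.0)
```

The sample set automatically shrinks during position tracking and grows again during global localization, which is the point of adaptive sampling.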

Evaluation: Example Run (Sonar), Example Run (Laser)


Localization Algorithms - Comparison

|                     | Kalman filter | Multi-hypothesis tracking | Topological maps | Grid-based (fixed/variable) | Particle filter |
| Sensors             | Gaussian      | Gaussian                  | Features         | Non-Gaussian                | Non-Gaussian    |
| Posterior           | Gaussian      | Multi-modal               | Piecewise constant | Piecewise constant        | Samples         |
| Efficiency (memory) | ++            | ++                        | ++               | -/o                         | +/++            |
| Efficiency (time)   | ++            | ++                        | ++               | o/+                         | +/++            |
| Implementation      | +             | o                         | +                | +/o                         | ++              |
| Accuracy            | ++            | ++                        | -                | +/++                        | ++              |
| Robustness          | -             | +                         | +                | ++                          | +/++            |
| Global localization | No            | Yes                       | Yes              | Yes                         | Yes             |