CSE-571 Probabilistic Robotics: Bayes Filter Implementations - Particle Filters


  1. CSE-571 Probabilistic Robotics Bayes Filter Implementations Particle filters

  2. Motivation
      • So far, we discussed the Kalman filter (Gaussian, linearization problems) and the discrete filter (high memory complexity).
      • Particle filters are a way to efficiently represent non-Gaussian distributions.
      • Basic principle: a set of state hypotheses ("particles") and survival of the fittest.

  3. Sample-based Localization (sonar)

  4. Function Approximation
      • Particle sets can be used to approximate functions.
      • The more particles fall into an interval, the higher the probability of that interval.
      • How to draw samples from a function/distribution?

  5. Rejection Sampling
      • Let us assume that f(x) < 1 for all x.
      • Sample x from a uniform distribution.
      • Sample c from [0, 1].
      • If f(x) > c, keep the sample; otherwise reject it.
      (Figure: a candidate x with c < f(x) is accepted, a candidate x' with c' > f(x') is rejected.)
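
As a concrete illustration, here is a minimal Python sketch of the rejection-sampling loop above; the density f and the interval bounds are illustrative placeholders, and f is assumed to be bounded by 1 as on the slide.

    import math
    import random

    def rejection_sample(f, lo, hi, n):
        """Draw n samples from a density f with f(x) <= 1 on [lo, hi]."""
        samples = []
        while len(samples) < n:
            x = random.uniform(lo, hi)    # sample x from a uniform distribution
            c = random.uniform(0.0, 1.0)  # sample c from [0, 1]
            if f(x) > c:                  # keep the sample, otherwise reject it
                samples.append(x)
        return samples

    # Example: an (unnormalized) bimodal density bounded by 1
    f = lambda x: 0.6 * math.exp(-(x - 1.0) ** 2) + 0.4 * math.exp(-2.0 * (x + 2.0) ** 2)
    xs = rejection_sample(f, -5.0, 5.0, 1000)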

  6. Importance Sampling Principle
      • We can even use a different distribution g to generate samples from f.
      • By introducing an importance weight w, we can account for the "differences between g and f":  w = f / g.
      • f is often called the target, g the proposal.
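
A small Python sketch of the importance-weighting idea, assuming we can evaluate the target density f (possibly unnormalized) and both sample from and evaluate a proposal g; all function names here are illustrative.

    import math
    import random

    def importance_sample(f, g_sample, g_pdf, n):
        """Draw n samples from the proposal g and weight each by w = f(x) / g(x)."""
        samples, weights = [], []
        for _ in range(n):
            x = g_sample()          # sample from the proposal g
            w = f(x) / g_pdf(x)     # importance weight accounts for the difference between f and g
            samples.append(x)
            weights.append(w)
        total = sum(weights)
        weights = [w / total for w in weights]   # normalize the weights
        return samples, weights

    # Example: target f = N(0, 1) (unnormalized), proposal g = N(0, 2^2)
    f = lambda x: math.exp(-0.5 * x * x)
    g_sample = lambda: random.gauss(0.0, 2.0)
    g_pdf = lambda x: math.exp(-0.5 * (x / 2.0) ** 2) / (2.0 * math.sqrt(2.0 * math.pi))
    xs, ws = importance_sample(f, g_sample, g_pdf, 200)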

  7. Importance Sampling with Resampling: Landmark Detection Example

  8. Distributions. Wanted: samples distributed according to p(x | z_1, z_2, z_3).

  9. This is Easy! We can draw samples from p(x | z_l) by adding noise to the detection parameters.

  10. Importance Sampling with Resampling
      Target distribution f:  p(x | z_1, z_2, ..., z_n) = ∏_k p(z_k | x) p(x) / p(z_1, z_2, ..., z_n)
      Sampling distribution g:  p(x | z_l) = p(z_l | x) p(x) / p(z_l)
      Importance weights:  w = f / g = p(x | z_1, ..., z_n) / p(x | z_l) = p(z_l) ∏_{k ≠ l} p(z_k | x) / p(z_1, ..., z_n)
      (Figures: weighted samples; after resampling.)

  11. Importance Sampling with Resampling
      Target distribution f:  p(x | z_1, z_2, ..., z_n) = ∏_k p(z_k | x) p(x) / p(z_1, z_2, ..., z_n)
      Sampling distribution g:  p(x | z_l) = p(z_l | x) p(x) / p(z_l)
      Importance weights:  w = f / g = p(x | z_1, ..., z_n) / p(x | z_l) = p(z_l) ∏_{k ≠ l} p(z_k | x) / p(z_1, ..., z_n)

  12. Importance Sampling with Resampling (figures: weighted samples; after resampling)

  13. Particle Filter Projection

  14. Density Extraction

  15. Sampling Variance

  16. Particle Filters

  17. Sensor Information: Importance Sampling
      Bel(x) ← α p(z | x) Bel⁻(x)
      w ← α p(z | x) Bel⁻(x) / Bel⁻(x) = α p(z | x)

  18. Robot Motion
      Bel⁻(x) ← ∫ p(x | u, x') Bel(x') dx'

  19. Sensor Information: Importance Sampling
      Bel(x) ← α p(z | x) Bel⁻(x)
      w ← α p(z | x) Bel⁻(x) / Bel⁻(x) = α p(z | x)

  20. Robot Motion
      Bel⁻(x) ← ∫ p(x | u, x') Bel(x') dx'

  21. Particle Filter Algorithm
      1. Algorithm particle_filter(S_{t-1}, u_{t-1}, z_t):
      2.   S_t = ∅, η = 0
      3.   For i = 1 ... n                                                     Generate new samples
      4.     Sample index j(i) from the discrete distribution given by w_{t-1}
      5.     Sample x_t^i from p(x_t | x_{t-1}, u_{t-1}) using x_{t-1}^{j(i)} and u_{t-1}
      6.     w_t^i = p(z_t | x_t^i)                                            Compute importance weight
      7.     η = η + w_t^i                                                     Update normalization factor
      8.     S_t = S_t ∪ {<x_t^i, w_t^i>}                                      Insert
      9.   For i = 1 ... n
      10.    w_t^i = w_t^i / η                                                 Normalize weights
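
The algorithm translates almost line by line into Python. The sketch below is one possible rendering, with sample_motion_model and measurement_likelihood standing in for p(x_t | x_{t-1}, u_{t-1}) and p(z_t | x_t); both are placeholders that the user must supply.

    import random

    def particle_filter(S_prev, u, z, sample_motion_model, measurement_likelihood):
        """One update of the basic particle filter.
        S_prev: list of (state, weight) pairs; u: control; z: measurement."""
        states  = [x for x, _ in S_prev]
        weights = [w for _, w in S_prev]
        S, eta = [], 0.0
        n = len(S_prev)
        for _ in range(n):
            # Resampling: draw a state with probability proportional to its weight w_{t-1}
            x_prev = random.choices(states, weights=weights, k=1)[0]
            # Prediction: sample x_t ~ p(x_t | x_{t-1}, u_{t-1})
            x = sample_motion_model(x_prev, u)
            # Correction: importance weight w_t = p(z_t | x_t)
            w = measurement_likelihood(z, x)
            eta += w
            S.append((x, w))
        # Normalize weights (assumes at least one non-zero likelihood)
        return [(x, w / eta) for x, w in S]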

  22. Particle Filter Algorithm
      Bel(x_t) = η p(z_t | x_t) ∫ p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) dx_{t-1}
      • Draw x_{t-1}^i from Bel(x_{t-1})
      • Draw x_t^i from p(x_t | x_{t-1}^i, u_{t-1})
      • Importance factor for x_t^i:
        w_t^i = target distribution / proposal distribution
              = η p(z_t | x_t) p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) / [ p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) ]
              ∝ p(z_t | x_t)

  23. Resampling
      • Given: a set S of weighted samples.
      • Wanted: a random sample, where the probability of drawing x_i is given by w_i.
      • Typically done n times with replacement to generate the new sample set S'.

  24. Resampling
      (Figure: two resampling wheels over the weights w_1, w_2, w_3, ..., w_{n-1}, w_n.)
      • Roulette wheel: binary search, O(n log n).
      • Stochastic universal sampling (systematic resampling): linear time complexity, easy to implement, low variance.

  25. Resampling Algorithm
      1. Algorithm systematic_resampling(S, n):
      2.   S' = ∅, c_1 = w^1
      3.   For i = 2 ... n                               Generate cdf
      4.     c_i = c_{i-1} + w^i
      5.   u_1 ~ U[0, n^{-1}], i = 1                     Initialize threshold
      6.   For j = 1 ... n                               Draw samples ...
      7.     While (u_j > c_i)                           Skip until next threshold reached
      8.       i = i + 1
      9.     S' = S' ∪ {<x^i, n^{-1}>}                   Insert
      10.    u_{j+1} = u_j + n^{-1}                      Increment threshold
      11.  Return S'
      Also called stochastic universal sampling.
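
A Python rendering of the systematic resampling pseudocode, assuming the input weights are already normalized; the small index guard is an addition for floating-point safety.

    import random

    def systematic_resampling(states, weights):
        """Low-variance resampling with a single random offset; weights must sum to 1."""
        n = len(states)
        # Generate the cumulative distribution c_1 ... c_n
        cdf, c = [], 0.0
        for w in weights:
            c += w
            cdf.append(c)
        u = random.uniform(0.0, 1.0 / n)   # initial threshold u_1 ~ U[0, 1/n]
        i = 0
        resampled = []
        for _ in range(n):
            # Skip until the next threshold is reached (guard against rounding errors)
            while i < n - 1 and u > cdf[i]:
                i += 1
            resampled.append(states[i])    # each inserted sample carries weight 1/n
            u += 1.0 / n                   # increment threshold
        return resampled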

  26. Motion Model Reminder (figure; "Start" marks the initial pose)

  27. Proximity Sensor Model Reminder (figures: sonar sensor, laser sensor)

  28.-45. (Figure-only slides; no recoverable text.)

  46. Using Ceiling Maps for Localization [Dellaert et al. 99]

  47. Vision-based Localization (figure: z, h(x), P(z|x))

  48. Under a Light (figures: measurement z and likelihood P(z|x))

  49. Next to a Light (figures: measurement z and likelihood P(z|x))

  50. Elsewhere (figures: measurement z and likelihood P(z|x))

  51. Global Localization Using Vision

  52. Recovery from Failure

  53. Localization for AIBO robots

  54. Adaptive Sampling

  55. KLD-sampling
      • Idea:
        • Assume we know the true belief.
        • Represent this belief as a multinomial distribution.
        • Determine the number of samples such that we can guarantee that, with probability (1 - δ), the KL-distance between the true posterior and the sample-based approximation is less than ε.
      • Observation: for fixed δ and ε, the number of samples depends only on the number k of bins with support:
        n = (1 / (2ε)) χ²_{k-1, 1-δ} ≅ ((k - 1) / (2ε)) { 1 - 2 / (9(k - 1)) + √(2 / (9(k - 1))) · z_{1-δ} }³
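
The bound can be evaluated directly with the Wilson-Hilferty approximation shown on the slide. The following sketch uses only the Python standard library (NormalDist supplies the z_{1-δ} quantile); the default ε and δ values are illustrative.

    from statistics import NormalDist

    def kld_sample_count(k, epsilon=0.05, delta=0.01):
        """Number of samples n such that, with probability 1 - delta, the KL-distance
        between the true posterior and its sample-based approximation is below epsilon.
        k is the number of (multinomial) bins with support."""
        if k <= 1:
            return 1
        z = NormalDist().inv_cdf(1.0 - delta)              # upper 1-delta quantile of N(0, 1)
        a = 2.0 / (9.0 * (k - 1))
        chi2 = (k - 1) * (1.0 - a + (a ** 0.5) * z) ** 3   # approx. chi^2_{k-1, 1-delta}
        return int(chi2 / (2.0 * epsilon) + 0.5)           # n = chi^2 / (2 epsilon)

    # e.g. k = 100 occupied bins with the default epsilon and delta
    # print(kld_sample_count(100))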

  56. Adaptive Particle Filter Algorithm
      1. Algorithm adaptive_particle_filter(S_{t-1}, u_{t-1}, z_t, Δ, ε, δ):
      2.   S_t = ∅, η = 0, n = 0, k = 0, b = ∅
      3.   Do                                                                  Generate new samples
      4.     Sample index j(n) from the discrete distribution given by w_{t-1}
      5.     Sample x_t^n from p(x_t | x_{t-1}, u_{t-1}) using x_{t-1}^{j(n)} and u_{t-1}
      6.     w_t^n = p(z_t | x_t^n)                                            Compute importance weight
      7.     η = η + w_t^n                                                     Update normalization factor
      8.     S_t = S_t ∪ {<x_t^n, w_t^n>}                                      Insert
      9.     If (x_t^n falls into an empty bin b)                              Update bins with support
      10.      k = k + 1, b = non-empty
      11.    n = n + 1
      12.  While (n < (1 / (2ε)) χ²_{k-1, 1-δ})
      13.  For i = 1 ... n
      14.    w_t^i = w_t^i / η                                                 Normalize weights
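
A Python sketch of the adaptive loop, combining the sample-size bound above with the basic particle filter update. The bin_index function (which discretizes the state space into bins of size Δ), the n_min/n_max guards, and the motion and measurement model callbacks are assumptions for illustration, not part of the slide.

    import random
    from statistics import NormalDist

    def required_samples(k, epsilon, delta):
        """KLD bound n = chi^2_{k-1, 1-delta} / (2 epsilon), Wilson-Hilferty approximation."""
        if k <= 1:
            return 1
        z = NormalDist().inv_cdf(1.0 - delta)
        a = 2.0 / (9.0 * (k - 1))
        return int((k - 1) / (2.0 * epsilon) * (1.0 - a + (a ** 0.5) * z) ** 3 + 0.5)

    def adaptive_particle_filter(S_prev, u, z, sample_motion_model,
                                 measurement_likelihood, bin_index,
                                 epsilon=0.05, delta=0.01, n_min=20, n_max=100000):
        """One KLD-sampling update; S_prev is a list of (state, weight) pairs."""
        states  = [x for x, _ in S_prev]
        weights = [w for _, w in S_prev]
        S, eta = [], 0.0
        occupied = set()      # bins b with support
        n = 0
        while True:
            x_prev = random.choices(states, weights=weights, k=1)[0]
            x = sample_motion_model(x_prev, u)        # prediction step
            w = measurement_likelihood(z, x)          # importance weight
            eta += w
            S.append((x, w))
            occupied.add(bin_index(x))                # update bins with support
            n += 1
            k = len(occupied)
            if n >= max(n_min, required_samples(k, epsilon, delta)) or n >= n_max:
                break
        return [(x, w / eta) for x, w in S]           # normalize weights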

  57. Example Run Sonar

  58. Example Run Laser

  59. Evaluation

  60. Localization Algorithms - Comparison
      | Criterion           | Kalman filter | Multi-hypothesis tracking | Topological maps   | Grid-based (fixed/variable) | Particle filter |
      | Sensors             | Gaussian      | Gaussian                  | Features           | Non-Gaussian                | Non-Gaussian    |
      | Posterior           | Gaussian      | Multi-modal               | Piecewise constant | Piecewise constant          | Samples         |
      | Efficiency (memory) | ++            | ++                        | ++                 | -/o                         | +/++            |
      | Efficiency (time)   | ++            | ++                        | ++                 | o/+                         | +/++            |
      | Implementation      | +             | o                         | +                  | +/o                         | ++              |
      | Accuracy            | ++            | ++                        | -                  | +/++                        | ++              |
      | Robustness          | -             | +                         | +                  | ++                          | +/++            |
      | Global localization | No            | Yes                       | Yes                | Yes                         | Yes             |
