

Slide 1

Introduction Basic Algorithms Example Summary

Sampling Methods

Henrik I. Christensen

Robotics & Intelligent Machines @ GT Georgia Institute of Technology, Atlanta, GA 30332-0280 hic@cc.gatech.edu

Henrik I. Christensen (RIM@GT) Sampling Methods 1 / 23

Slide 2

Outline

1. Introduction
2. Basic Algorithms
3. Small Robot Example
4. Summary

Slide 3

Introduction

Last time we talked about approximating distributions using deterministic approximations. It is not clear that such solutions can always be generated; it might not even be possible to generate an estimate of the distribution. How can we find the distribution / expectation of a function with respect to a probability distribution p(z)? I.e., can we evaluate

E[f] = ∫ f(z) p(z) dz

Slide 4

Introduction

The general idea with sampling:

Collect a set of samples z^(l) from p(z) and evaluate

f̂ = (1/L) Σ_{l=1}^{L} f(z^(l))

f̂ is a good estimate of the mode/expectation, and the variance of the estimator can be computed. The main challenge: how can we generate samples from p(z)?
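As a concrete illustration, a minimal sketch of this estimator, assuming (purely for illustration) that p(z) is a unit Gaussian and f(z) = z², so the true expectation is E[f] = 1:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw L samples z^(l) from p(z); here p(z) is taken to be N(0, 1)
# purely for illustration.
L = 100_000
z = rng.standard_normal(L)

# f(z) = z**2, so E[f] = Var(z) + E[z]^2 = 1 for a standard normal.
f_hat = np.mean(z**2)
print(f_hat)  # close to 1.0
```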

Slide 5

Introduction

In some cases a graphical model is available. From earlier (8.1.2) we have

p(z) = Π_{i=1}^{M} p(z_i | pa_i)

where pa_i is the set of parents of node i. We can then sample the conditional distributions: do a pass through the tree and compute the distribution.
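The pass through the graph can be sketched as ancestral sampling on a toy two-node network; the network (Rain → WetGrass) and its probabilities are invented for illustration:

```python
import random

random.seed(1)

# A toy two-node network: Rain -> WetGrass (names are illustrative).
# p(rain) and p(wet | rain) are assumed numbers, not from the slides.
p_rain = 0.2
p_wet_given_rain = {True: 0.9, False: 0.1}

def ancestral_sample():
    # Visit nodes in topological order, sampling each conditional
    # p(z_i | pa_i) given the already-sampled parent values.
    rain = random.random() < p_rain
    wet = random.random() < p_wet_given_rain[rain]
    return rain, wet

samples = [ancestral_sample() for _ in range(100_000)]
# Marginal p(wet) = 0.2*0.9 + 0.8*0.1 = 0.26
print(sum(wet for _, wet in samples) / len(samples))  # close to 0.26
```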

Slide 6

Outline

1. Introduction
2. Basic Algorithms
3. Small Robot Example
4. Summary

Slide 7

Standard distributions

It is common to generate pseudo-random sequences using a standard random number generator. If we want to transform z according to a function f(·) so that y = f(z), we have

p(y) = p(z) |dz/dy|

Integrating, we have

z = h(y) = ∫_{−∞}^{y} p(ŷ) dŷ

so we are after y = h^{−1}(z).

Slide 8

Standard Distributions

Consider the standard exponential distribution p(y) = λ exp(−λy). Then h(y) = 1 − exp(−λy) ⇒ y = −λ^{−1} ln(1 − z). Now a uniform z will yield an exponentially distributed y.
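A minimal sketch of this inverse-CDF transformation, with an arbitrary rate λ = 2 chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0  # rate parameter, chosen arbitrarily

# Uniform samples z in [0, 1), pushed through the inverse CDF
# y = -ln(1 - z) / lam, give exponentially distributed y.
z = rng.random(200_000)
y = -np.log(1.0 - z) / lam

print(y.mean())  # close to 1/lam = 0.5
```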

Slide 9

Uniform Gaussian (in 2D)

Consider the mapping f(z) = 2z − 1 applied to uniform variates, giving pairs (z1, z2) uniform on the square [−1, 1]². Reject all values where r² = z1² + z2² > 1 and assign p(z1, z2) = 1/π; this is a uniform distribution inside the unit circle. Generate the mapping

y_i = z_i (−2 ln r² / r²)^{1/2}

The joint distribution is then

p(y1, y2) = [ (1/√(2π)) exp(−y1²/2) ] [ (1/√(2π)) exp(−y2²/2) ]

i.e. two independent unit Gaussians.
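The rejection-plus-mapping scheme above (the polar form of the Box-Muller transform) can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)

def polar_gaussian(n):
    """Polar form of the Box-Muller transform, as on the slide:
    sample uniformly inside the unit circle, then map each accepted
    pair to two independent standard normals."""
    out = []
    while len(out) < n:
        z1, z2 = 2.0 * rng.random(2) - 1.0   # uniform in [-1, 1]^2
        r2 = z1 * z1 + z2 * z2
        if r2 >= 1.0 or r2 == 0.0:           # reject points outside the circle
            continue
        factor = np.sqrt(-2.0 * np.log(r2) / r2)
        out.extend([z1 * factor, z2 * factor])
    return np.array(out[:n])

y = polar_gaussian(100_000)
print(y.mean(), y.std())  # close to 0 and 1
```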

Slide 10

Standard Distributions

Transformation methods depend on the ability to calculate, and then invert, the indefinite integral. Sometimes more general approaches are required for sampling:

1. Rejection Sampling
2. Importance Sampling

Slide 11

Rejection Sampling

Assume we have a distribution p(z) which is difficult to access, but that we can evaluate a scaled version p̃(z) which is the same up to a normalization constant, p(z) = (1/Z_p) p̃(z). We can propose a distribution q(z) that is easy to sample from, and choose k so that k q(z) ≥ p̃(z) for all z.
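A minimal sketch of rejection sampling; the unnormalized bimodal target p̃(z) and the uniform proposal on [−5, 5] are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized target p~(z): a bimodal shape (illustrative assumption).
def p_tilde(z):
    return np.exp(-0.5 * (z - 2) ** 2) + 0.5 * np.exp(-0.5 * (z + 2) ** 2)

# Proposal q(z): uniform on [-5, 5]; k chosen so k*q(z) >= p~(z) everywhere.
lo, hi = -5.0, 5.0
q = 1.0 / (hi - lo)
k = 1.2 / q  # p~ peaks just above 1, so k*q = 1.2 dominates it

def rejection_sample(n):
    samples = []
    while len(samples) < n:
        z = rng.uniform(lo, hi)          # draw z0 from q(z)
        u = rng.uniform(0, k * q)        # draw u0 uniformly on [0, k q(z0)]
        if u <= p_tilde(z):              # accept if u0 falls under p~(z0)
            samples.append(z)
    return np.array(samples)

s = rejection_sample(50_000)
print(s.mean())  # roughly 2/3 for this bimodal target
```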

Slide 12

Rejection Sampling - Idea

[Figure: the envelope k q(z) bounding p̃(z); a sample z0 is drawn from q(z) and u0 uniformly from [0, k q(z0)].]

Slide 13

Rejection Sampling - Example

[Figure: target distribution p(z) over z ∈ 0..30.]

Slide 14

Rejection Sampling - Example

[Figure: target distribution p(z) over z ∈ −5..5.]

Slide 15

Importance Sampling

Assume we could generate a uniform tessellation of the space. We could then evaluate

E[f] ≈ Σ_l p(z^(l)) f(z^(l))

Assume instead we have a simpler distribution q(z) that is easier to sample:

E[f] = ∫ f(z) p(z) dz
     = ∫ f(z) (p(z)/q(z)) q(z) dz
     ≈ (1/L) Σ_l (p(z^(l))/q(z^(l))) f(z^(l))

The ratio r_l = p(z^(l))/q(z^(l)) is known as the importance weight. We can normalize to obtain weights w_l = r_l / Σ_m r_m.
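A minimal sketch of (self-normalized) importance sampling, assuming a unit-Gaussian target, a wider Gaussian proposal, and f(z) = z², all chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target p(z): standard normal. Proposal q(z): wider normal N(0, 2^2).
# Estimate E[f] for f(z) = z**2 (true value 1 under the target).
def p(z):
    return np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)

def q_pdf(z):
    return np.exp(-0.5 * (z / 2.0) ** 2) / (2.0 * np.sqrt(2 * np.pi))

L = 200_000
z = 2.0 * rng.standard_normal(L)      # samples from q
r = p(z) / q_pdf(z)                   # importance weights r_l
w = r / r.sum()                       # normalized weights w_l
estimate = np.sum(w * z**2)
print(estimate)  # close to 1.0
```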

Slide 16

Sampling Importance Resampling

The sampling can be re-evaluated over time to avoid committing to a uniform distribution of samples up front, i.e. we should sample more densely close to the major probability masses.

Slide 17

Outline

1. Introduction
2. Basic Algorithms
3. Small Robot Example
4. Summary

Slide 18

Monte-Carlo Based Localization

Monte-Carlo based methods use a sample model to approximate the pose estimate, in contrast to the grid model presented earlier. Assume we have a number of particles in a collection

S_t = { (s_t^(i), π_t^(i)) | i = 1..N }

where each particle s_t^(i) is a hypothesis for the position of the robot and π_t^(i) is an associated weight. We can then approximate p(s_t | z_0, z_1, ..., z_t) for any distribution of the pose hypotheses.

Slide 19

Monte-Carlo Strategy

1. Draw N samples from an initial PDF, typically a uniform distribution. Give each sample a weight of 1/N.

2. Propagate the motion information and draw a new sample from the distribution p(s_{t+1}^(i) | s_t^(i), o_t).

3. Based on the sensory input, set the weight of the sample to π_{t+1}^(i) = p(z_{t+1} | s_{t+1}^(i)) · π_t^(i).

4. Generate a new sample set by drawing samples from the current set and a basis distribution (typically uniform). Normalize the weights.

5. Go back to step 2.
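The five steps above can be sketched as a 1-D toy particle filter; the motion command, noise levels, and sensor model are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# A 1-D toy version of Monte-Carlo localization: the robot moves along
# a line and a noisy sensor measures its position directly.  All noise
# levels and the motion command are illustrative assumptions.
N = 5_000
true_pos = 0.0
particles = rng.uniform(-10, 10, N)      # step 1: samples from a uniform prior
weights = np.full(N, 1.0 / N)            #         each with weight 1/N

for _ in range(20):
    # Step 2: propagate each particle through the motion model.
    true_pos += 1.0
    particles += 1.0 + 0.2 * rng.standard_normal(N)

    # Step 3: reweight by the measurement likelihood p(z | s).
    z = true_pos + 0.5 * rng.standard_normal()
    weights *= np.exp(-0.5 * ((z - particles) / 0.5) ** 2)
    weights /= weights.sum()             # normalize the weights

    # Step 4: resample particles in proportion to their weights.
    idx = rng.choice(N, size=N, p=weights)
    particles = particles[idx]
    weights = np.full(N, 1.0 / N)        # step 5: repeat from step 2

print(particles.mean())  # close to the true position, 20.0
```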

Slide 20

Monte-Carlo Example

Example of the particle distribution around the estimate of the position. Sonar readings are used for the update of the position. Video of the system in operation.

Slide 21

Monte-Carlo Example

Slide 22

Outline

1. Introduction
2. Basic Algorithms
3. Small Robot Example
4. Summary

Slide 23

Summary

Sampling-based methods are efficient for many applications. The key is typically in the sampling/re-sampling strategy. There are quite a few ways sampling can be applied. Next time we will cover the popular MCMC strategy.
