A brief Introduction to Particle Filters, Michael Pfeiffer (PowerPoint PPT presentation)


SLIDE 1

A brief Introduction to Particle Filters

Michael Pfeiffer pfeiffer@igi.tugraz.at

18.05.2004

SLIDE 2

Agenda

  • Problem Statement
  • Classical Approaches
  • Particle Filters

    – Theory
    – Algorithms

  • Applications
SLIDE 3

Problem Statement

  • Tracking the state of a system as it evolves over time
  • We have: sequentially arriving (noisy or ambiguous) observations
  • We want to know: the best possible estimate of the hidden variables

SLIDE 4

Illustrative Example: Robot Localization

[Figure: probability distribution over positions, t = 0]

Sensor model: never more than 1 mistake. Motion model: the robot may fail to execute an action with small probability.

SLIDE 5

Illustrative Example: Robot Localization

[Figure: probability distribution over positions, t = 1]

SLIDE 6

Illustrative Example: Robot Localization

[Figure: probability distribution over positions, t = 2]

SLIDE 7

Illustrative Example: Robot Localization

[Figure: probability distribution over positions, t = 3]

SLIDE 8

Illustrative Example: Robot Localization

[Figure: probability distribution over positions, t = 4]

SLIDE 9

Illustrative Example: Robot Localization

[Figure: trajectory through positions 1-4; probability distribution over positions, t = 5]

SLIDE 10

Applications

  • Tracking of aircraft positions from radar
  • Estimating communications signals from noisy measurements
  • Predicting economic data
  • Tracking of people or cars in surveillance videos

SLIDE 11

Bayesian Filtering / Tracking Problem

  • Unknown state vector x0:t = (x0, …, xt)
  • Observation vector z1:t
  • Find the pdf p(x0:t | z1:t) … posterior distribution,
    or p(xt | z1:t) … filtering distribution
  • Prior information given:
    – p(x0) … prior on the state distribution
    – p(zt | xt) … sensor model
    – p(xt | xt-1) … Markovian state-space model

SLIDE 12

Sequential Update

  • Storing all incoming measurements is inconvenient
  • Recursive filtering:
    – Predict the next state pdf from the current estimate
    – Update the prediction using sequentially arriving new measurements
  • Optimal Bayesian solution: recursively calculate the exact posterior density

SLIDE 13

Bayesian Update and Prediction

  • Prediction:

    p(xt | z1:t-1) = ∫ p(xt | xt-1) p(xt-1 | z1:t-1) dxt-1

  • Update:

    p(xt | z1:t) = p(zt | xt) p(xt | z1:t-1) / p(zt | z1:t-1)

    with normalizing constant p(zt | z1:t-1) = ∫ p(zt | xt) p(xt | z1:t-1) dxt

SLIDE 14

Agenda

  • Problem Statement
  • Classical Approaches
  • Particle Filters

    – Theory
    – Algorithms

  • Applications
SLIDE 15

Kalman Filter

  • Optimal solution for the linear-Gaussian case
  • Assumptions:
    – State model is a known linear function of the last state plus Gaussian noise
    – Sensor model is a known linear function of the state plus Gaussian noise
    – Posterior density is Gaussian

SLIDE 16

Kalman Filter: Update Equations

State-space model (Ft, Ht known matrices):

  xt = Ft xt-1 + vt-1,   vt-1 ~ N(0, Qt)
  zt = Ht xt + nt,       nt ~ N(0, Rt)

All densities remain Gaussian:

  p(xt-1 | z1:t-1) = N(xt-1; mt-1|t-1, Pt-1|t-1)
  p(xt | z1:t-1)   = N(xt; mt|t-1, Pt|t-1)
  p(xt | z1:t)     = N(xt; mt|t, Pt|t)

Prediction:

  mt|t-1 = Ft mt-1|t-1
  Pt|t-1 = Qt + Ft Pt-1|t-1 FtT

Update:

  St = Ht Pt|t-1 HtT + Rt
  Kt = Pt|t-1 HtT St-1
  mt|t = mt|t-1 + Kt (zt − Ht mt|t-1)
  Pt|t = Pt|t-1 − Kt Ht Pt|t-1
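One predict/update cycle of these equations, sketched in NumPy; the scalar random-walk model at the bottom is an illustrative assumption:

```python
import numpy as np

def kalman_step(m, P, z, F, Q, H, R):
    """One predict/update cycle of the Kalman filter.

    m, P: posterior mean and covariance at time t-1
    z:    new measurement z_t
    F, Q: state transition matrix and process noise covariance
    H, R: observation matrix and measurement noise covariance
    """
    # Prediction
    m_pred = F @ m
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R                 # innovation covariance S_t
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain K_t
    m_new = m_pred + K @ (z - H @ m_pred)
    P_new = P_pred - K @ H @ P_pred
    return m_new, P_new

# 1-D random walk observed directly (illustrative parameters)
m, P = np.zeros(1), np.eye(1)
F = Q = H = R = np.eye(1)
m, P = kalman_step(m, P, np.array([2.0]), F, Q, H, R)
print(m, P)  # mean moves toward the measurement, covariance shrinks
```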

SLIDE 17

Limitations of Kalman Filtering

  • Assumptions are too strong. We often find:
    – Non-linear models
    – Non-Gaussian noise or posterior
    – Multi-modal distributions
    – Skewed distributions
  • Extended Kalman Filter:
    – local linearization of non-linear models
    – still limited to Gaussian posteriors

SLIDE 18

Grid-based Methods

  • Optimal for discrete and finite state spaces
  • Keep and update an estimate of the posterior pdf for every single state
  • No constraints on the (discrete) posterior density

SLIDE 19

Limitations of Grid-based Methods

  • Computationally expensive
  • Only for finite state sets
  • Approximate grid-based filter:
    – divide the continuous state space into a finite number of cells
    – Hidden Markov Model filter
    – dimensionality increases computational cost dramatically

SLIDE 20

Agenda

  • Problem Statement
  • Classical Approaches
  • Particle Filters

    – Theory
    – Algorithms

  • Applications
SLIDE 21

Many different names…

Particle Filters

  • Interacting Particle Approximations
  • Survival of the fittest
  • (Sequential) Monte Carlo filters
  • Bootstrap filters
  • Condensation
SLIDE 22

Sample-based PDF Representation

  • Monte Carlo characterization of a pdf:
    – Represent the posterior density by a set of random i.i.d. samples (particles) from the pdf p(x0:t | z1:t)
    – For a large number N of particles, this is equivalent to a functional description of the pdf
    – For N → ∞ it approaches the optimal Bayesian estimate

SLIDE 23

Sample-based PDF Representation

  • Regions of high density:
    – Many particles
    – Large particle weights
  • Uneven partitioning
  • Discrete approximation of a continuous pdf:

    P_N(x0:t | z1:t) = Σi=1..N wt(i) δ(x0:t − x0:t(i))

SLIDE 24

Importance Sampling

  • Draw N samples x0:t(i) from an importance sampling distribution π(x0:t | z1:t)
  • Importance weight:

    w(x0:t) = p(x0:t | z1:t) / π(x0:t | z1:t)

  • Estimation of arbitrary functions ft:

    Î_N(ft) = Σi=1..N ft(x0:t(i)) w̃t(i),   with normalized weights w̃t(i) = wt(i) / Σj wt(j)

    Î_N(ft) → I(ft) = ∫ ft(x0:t) p(x0:t | z1:t) dx0:t   almost surely as N → ∞
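As a concrete illustration of the estimator above, the sketch below approximates E[x²] under a standard normal target using samples from a wider Gaussian proposal; both densities are illustrative choices, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target p: standard normal.  Proposal pi: zero-mean normal with std 2.
def p(x):
    return np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)

def proposal(x):
    return np.exp(-0.5 * (x / 2) ** 2) / (2 * np.sqrt(2 * np.pi))

N = 100_000
x = rng.normal(0.0, 2.0, N)      # draw from the proposal
w = p(x) / proposal(x)           # importance weights w(x) = p(x) / pi(x)
w_norm = w / w.sum()             # normalized weights w~

est = np.sum(w_norm * x**2)      # estimate of E_p[x^2]; true value is 1
print(est)
```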

SLIDE 25

Sequential Importance Sampling (SIS)

  • Augmenting the samples: the importance density factorizes,

    π(x0:t | z1:t) = π(x0:t-1 | z1:t-1) π(xt | x0:t-1, z1:t)

    so at each step sample xt(i) ~ π(xt | x0:t-1(i), z1:t)

  • Weight update:

    wt(i) = wt-1(i) · p(zt | xt(i)) p(xt(i) | xt-1(i)) / π(xt(i) | x0:t-1(i), z1:t)
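When the prior p(xt | xt-1) is used as importance density, the weight update collapses to multiplication by the likelihood p(zt | xt). A sketch of one SIS step for a hypothetical 1-D random walk observed in Gaussian noise (all parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def sis_step(particles, weights, z, sigma_motion=1.0, sigma_sensor=0.5):
    """One SIS step for a 1-D random walk observed in Gaussian noise.

    Uses the prior p(x_t | x_{t-1}) as importance density, so the
    weight update reduces to w_t = w_{t-1} * p(z_t | x_t).
    """
    # Propagate each particle through the motion model (sample from the prior)
    particles = particles + rng.normal(0.0, sigma_motion, len(particles))
    # Multiply weights by the likelihood of the new observation
    likelihood = np.exp(-0.5 * ((z - particles) / sigma_sensor) ** 2)
    weights = weights * likelihood
    return particles, weights / weights.sum()   # normalized weights

particles = rng.normal(0.0, 1.0, 1000)          # samples from p(x0)
weights = np.full(1000, 1.0 / 1000)             # uniform initial weights
particles, weights = sis_step(particles, weights, z=0.8)
print(np.sum(weights * particles))              # posterior mean estimate
```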

SLIDE 26

Degeneracy Problem

  • After a few iterations, all but one particle will have negligible weight
  • Measure for degeneracy: effective sample size

    Neff = N / (1 + Var(wt*(i))),   wt* … true weights at time t

  • Small Neff indicates severe degeneracy
  • Brute-force solution: use a very large N
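In practice Neff is usually approximated from the normalized weights as 1 / Σi (w̃t(i))², a common estimate derived from the expression above. A sketch:

```python
import numpy as np

def effective_sample_size(weights):
    """Estimate N_eff from importance weights: N_eff ≈ 1 / sum_i (w~_i)^2.

    Equals N for uniform weights and approaches 1 when a single
    particle carries nearly all the weight (severe degeneracy).
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                 # normalize
    return 1.0 / np.sum(w ** 2)

ess_uniform = effective_sample_size([0.25, 0.25, 0.25, 0.25])
ess_skewed = effective_sample_size([0.97, 0.01, 0.01, 0.01])
print(ess_uniform)   # 4.0: no degeneracy
print(ess_skewed)    # close to 1: severe degeneracy
```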

SLIDE 27

Choosing Importance Density

  • Choose π to minimize the variance of the weights
  • Optimal solution:

    π(xt | x0:t-1(i), z1:t) = p(xt | xt-1(i), zt)   ⇒   wt(i) ∝ wt-1(i) p(zt | xt-1(i))

  • Practical solution: importance density = prior

    π(xt | x0:t-1(i), z1:t) = p(xt | xt-1(i))   ⇒   wt(i) ∝ wt-1(i) p(zt | xt(i))

SLIDE 28

Resampling

  • Eliminate particles with small importance weights
  • Concentrate on particles with large weights
  • Sample N times with replacement from the set of particles xt(i), according to the importance weights wt(i)
  • "Survival of the fittest"
SLIDE 29

Sampling Importance Resampling (SIR) Filter: Basic Algorithm

  1. INIT, t = 0
     – for i = 1, …, N: sample x0(i) ~ p(x0); set t := 1
  2. IMPORTANCE SAMPLING
     – for i = 1, …, N: sample xt(i) ~ p(xt | xt-1(i)) and set x0:t(i) := (x0:t-1(i), xt(i))
     – for i = 1, …, N: evaluate the importance weights wt(i) = p(zt | xt(i))
     – normalize the importance weights
  3. SELECTION / RESAMPLING
     – resample with replacement N particles x0:t(i) according to the importance weights
     – set t := t + 1 and go to step 2
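The three steps can be sketched end-to-end for a hypothetical 1-D random-walk model with Gaussian sensor noise; the model and all parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def sir_filter(observations, N=500, sigma_motion=1.0, sigma_sensor=1.0):
    """Bootstrap (SIR) particle filter for a 1-D random walk.

    Returns the posterior mean estimate at each time step.
    """
    particles = rng.normal(0.0, 5.0, N)          # 1. INIT: sample from p(x0)
    estimates = []
    for z in observations:
        # 2. IMPORTANCE SAMPLING: propagate through the motion model,
        #    weight by the likelihood p(z_t | x_t), then normalize
        particles = particles + rng.normal(0.0, sigma_motion, N)
        w = np.exp(-0.5 * ((z - particles) / sigma_sensor) ** 2)
        w /= w.sum()
        estimates.append(np.sum(w * particles))  # posterior mean estimate
        # 3. SELECTION / RESAMPLING: draw N particles with replacement
        particles = rng.choice(particles, size=N, p=w)
    return estimates

true_path = np.cumsum(rng.normal(0.0, 1.0, 30))   # hidden random walk
obs = true_path + rng.normal(0.0, 1.0, 30)        # noisy measurements
err = np.mean(np.abs(np.array(sir_filter(obs)) - true_path))
print(err)  # mean absolute tracking error stays small
```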

SLIDE 30

Variations

  • Auxiliary particle filter:
    – resample at time t-1 with one-step lookahead (re-evaluate with new sensory information)
  • Regularisation:
    – resample from a continuous approximation of the posterior p(xt | z1:t)
SLIDE 31

Visualization of Particle Filter

[Figure: one cycle of the particle filter]
unweighted measure → compute importance weights (weighted approximation of p(xt-1 | z1:t-1)) → resampling → move particles / predict → p(xt | z1:t-1)

SLIDE 32

Particle Filter Demo 1

moving Gaussian + uniform, N= 100 particles

SLIDE 33

Particle Filter Demo 2

moving Gaussian + uniform, N= 1000 particles

SLIDE 34

Particle Filter Demo 3

moving (sharp) Gaussian + uniform, N= 100 particles

SLIDE 35

Particle Filter Demo 4

moving (sharp) Gaussian + uniform, N= 1000 particles

SLIDE 36

Particle Filter Demo 5

mixture of two Gaussians, filter loses track of smaller and less pronounced peaks

SLIDE 37

Obtaining state estimates from particles

  • Any estimate of a function f(xt) can be calculated from the discrete pdf approximation:

    E[f(xt)] ≈ Σj=1..N wt(j) f(xt(j))

  • Mean:

    E[xt] ≈ Σj=1..N wt(j) xt(j)

  • MAP estimate: particle with largest weight
  • Robust mean: mean within a window around the MAP estimate
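All three estimators can be read off the weighted particle set directly; a sketch, where the window width for the robust mean is an illustrative choice:

```python
import numpy as np

def particle_estimates(particles, weights, window=1.0):
    """Mean, MAP, and robust-mean estimates from a weighted particle set."""
    mean = np.sum(weights * particles)
    map_est = particles[np.argmax(weights)]        # particle with largest weight
    near = np.abs(particles - map_est) < window    # window around MAP estimate
    robust = np.sum(weights[near] * particles[near]) / weights[near].sum()
    return mean, map_est, robust

particles = np.array([0.0, 1.0, 1.1, 5.0])
weights = np.array([0.1, 0.4, 0.3, 0.2])           # already normalized
mean, map_est, robust = particle_estimates(particles, weights)
print(mean, map_est, robust)
```

Note how the robust mean ignores the outlier particle at 5.0 that pulls the plain mean away from the main mode.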

SLIDE 38

Pros and Cons of Particle Filters

+ Estimation of full pdfs
+ Non-Gaussian distributions (e.g. multi-modal)
+ Non-linear state and observation models
+ Parallelizable

– Degeneracy problem
– High number of particles needed
– Computationally expensive
– Linear-Gaussian assumption is often sufficient

SLIDE 39

Agenda

  • Problem Statement
  • Classical Approaches
  • Particle Filters

    – Theory
    – Algorithms

  • Applications
SLIDE 40

Mobile Robot Localization

  • Animation by Sebastian Thrun, Stanford
  • http://robots.stanford.edu

SLIDE 41

Positioning Systems [1]

  • Track car position in a given road map
  • Track car position from radio frequency measurements
  • Track aircraft position from estimated terrain elevation
  • Collision avoidance (prediction)
  • Replacement for GPS

[1] Gustafsson et al.: Particle Filters for Positioning, Navigation and Tracking. IEEE Transactions on Signal Processing, Vol. 50, 2002

SLIDE 42

Model Estimation

  • Tracking with multiple motion models
    – discrete hidden variable indicates the active model (maneuver)
  • Recovery of a signal from noisy measurements
    – even if the signal may be absent (e.g. synaptic currents)
    – mixture model of several hypotheses
  • Neural network model selection [1]
    – estimate parameters and architecture of an RBF network from input-output pairs
    – on-line classification (time-varying classes)

[1] de Freitas et al.: Sequential Monte Carlo Methods for Neural Networks. In: Doucet et al.: Sequential Monte Carlo Methods in Practice, Springer Verlag, 2001

SLIDE 43

Other Applications

  • Visual tracking
    – e.g. human motion (body parts)
  • Prediction of (financial) time series
    – e.g. mapping gold price → stock price
  • Quality control in the semiconductor industry
  • Military applications
    – target recognition from single or multiple images
    – guidance of missiles

SLIDE 44

Possible Uses for our Group

  • Reinforcement Learning
    – POMDPs
    – Estimating opponent states
  • RoboCup: multi-robot localization and tracking
  • Representation of pdfs
  • Prediction tasks
  • Preprocessing of visual input
  • Identifying neural network layouts or other hidden system parameters
  • Applications in computational neuroscience (?)
  • Other suggestions?
SLIDE 45

Sources

  • Doucet, de Freitas, Gordon: Sequential Monte Carlo Methods in Practice, Springer Verlag, 2001
  • Arulampalam, Maskell, Gordon, Clapp: A Tutorial on Particle Filters for On-line Non-linear/Non-Gaussian Bayesian Tracking, IEEE Transactions on Signal Processing, Vol. 50, 2002

SLIDE 46

Thank you!