SLIDE 1

Maximum A Posteriori (MAP) Estimation

Pieter Abbeel, UC Berkeley EECS

SLIDE 2

n Filtering: n Smoothing: n MAP:

Overview

Xt-1 Xt X0 zt-1 zt z0 Xt-1 Xt Xt+1 XT X0 zt-1 zt zt+1 zT z0 Xt-1 Xt Xt+1 XT X0 zt-1 zt zt+1 zT z0
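As a hedged reference (the slide itself only names the tasks and shows the chain diagrams), the three inference targets in the notation used later in the deck are:

```latex
\begin{align*}
\text{Filtering:}  \quad & P(x_t \mid z_{0:t}) \\
\text{Smoothing:}  \quad & P(x_t \mid z_{0:T}), \quad t < T \\
\text{MAP:}        \quad & x^{*}_{0:T} = \arg\max_{x_{0:T}} P(x_{0:T} \mid z_{0:T})
\end{align*}
```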

SLIDE 3

n Generally:

MAP

Naively solving by enumerating all possible combinations

  • f x_0,…,x_T is

exponential in T !
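The blow-up comes from counting joint assignments. A minimal sketch of the reasoning (the HMM factorization below is the standard one; it is implied by the chain diagrams on the overview slide but not written out in this extraction):

```latex
\arg\max_{x_{0:T}} P(x_{0:T} \mid z_{0:T})
  = \arg\max_{x_{0:T}} \; P(x_0)\, P(z_0 \mid x_0) \prod_{t=1}^{T} P(x_t \mid x_{t-1})\, P(z_t \mid x_t)
```

With n possible values per state there are n^{T+1} joint assignments to enumerate, hence the exponential cost; the chain structure of this product is what the dynamic-programming algorithm on the next slide exploits.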

SLIDE 4

MAP: Complete Algorithm

• O(T n²), where n is the number of possible values each state x_t can take

SLIDE 5

Kalman Filter (aka Linear Gaussian) setting

• Summations → integrals
• But: can't enumerate over all instantiations
• However, we can still find the solution efficiently:
  • the joint conditional P(x0:T | z0:T) is a multivariate Gaussian
  • for a multivariate Gaussian, the most likely instantiation equals the mean → we just need to find the mean of P(x0:T | z0:T)
  • the marginal conditionals P(xt | z0:T) are Gaussians with mean equal to the mean of xt under the joint conditional, so it suffices to find all marginal conditionals
  • we already know how to do so: marginal conditionals can be computed by running the Kalman smoother
• Alternatively: solve a convex optimization problem
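Putting the Kalman-smoother route into code: a minimal NumPy sketch, assuming the usual linear-Gaussian model x_t = A x_{t-1} + w_t, z_t = C x_t + v_t with w_t ~ N(0, Q), v_t ~ N(0, R), x_0 ~ N(mu0, Sigma0) (the matrix names and the function kalman_smoother_means are my own, not from the slides). The smoothed means it returns are, per the slide, exactly the MAP estimate of x_{0:T}:

```python
import numpy as np

def kalman_smoother_means(A, C, Q, R, mu0, Sigma0, zs):
    """Kalman filter + Rauch-Tung-Striebel smoother.

    Returns the smoothed means E[x_t | z_{0:T}] for t = 0 .. T, which in this
    linear-Gaussian setting coincide with the MAP estimate of x_{0:T}.
    """
    T1 = len(zs)
    n = mu0.shape[0]
    mu_f = np.zeros((T1, n))                     # filtered means
    Sig_f = np.zeros((T1, n, n))                 # filtered covariances
    mu_p = np.zeros((T1, n))                     # predicted means
    Sig_p = np.zeros((T1, n, n))                 # predicted covariances

    mu, Sig = mu0, Sigma0
    for t in range(T1):
        if t > 0:                                # predict step
            mu, Sig = A @ mu, A @ Sig @ A.T + Q
        mu_p[t], Sig_p[t] = mu, Sig
        S = C @ Sig @ C.T + R                    # innovation covariance
        K = Sig @ C.T @ np.linalg.inv(S)         # Kalman gain
        mu = mu + K @ (zs[t] - C @ mu)           # measurement update
        Sig = (np.eye(n) - K @ C) @ Sig
        mu_f[t], Sig_f[t] = mu, Sig

    mu_s = mu_f.copy()                           # backward (RTS) pass
    for t in range(T1 - 2, -1, -1):
        J = Sig_f[t] @ A.T @ np.linalg.inv(Sig_p[t + 1])
        mu_s[t] = mu_f[t] + J @ (mu_s[t + 1] - mu_p[t + 1])
    return mu_s
```

The "convex optimization" alternative on the slide corresponds to minimizing the negative log posterior, which is a convex quadratic in x_{0:T}; its unique minimizer is the same collection of smoothed means.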