Probabilistic Reasoning - PowerPoint PPT Presentation



slide-1
SLIDE 1 Probabilistic Reasoning: Central Problem in This Course

E_{p(x|y)}[f(x)]
  • f: quantity of interest
  • x: random variable
  • p(x|y): probability density

Examples
  • Self-driving cars: y = pedestrian trajectory, x = future trajectory, f = will pedestrian cross?
  • Diagnosis: y = symptoms, x = conditions, f = outcomes
slide-2
SLIDE 2 Random Variables

Random Variable: A variable with a stochastic outcome (outcomes are mutually exclusive)
  • X: variable, x: outcome
  • x ∈ {1, 2, 3, 4, 5, 6}

Event: A set of outcomes
  • (X ≥ 3) = {3, 4, 5, 6}
  • (X = 4) = {4}

Probability: The chance that an event occurs
  • P(X ≥ 3) = 4/6
  • P(X = 4) = 1/6
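The die example above can be sketched in a few lines of Python. This is a minimal illustration, not part of the slides; the names `outcomes` and `prob` are mine, and `Fraction` keeps the probabilities exact.

```python
from fractions import Fraction

# Outcomes of a fair six-sided die; each outcome has probability 1/6.
outcomes = {1, 2, 3, 4, 5, 6}

def prob(event):
    """P(event) = |event| / |outcomes| under the uniform distribution."""
    return Fraction(len(event & outcomes), len(outcomes))

p_ge3 = prob({x for x in outcomes if x >= 3})   # P(X >= 3) = 4/6
p_eq4 = prob({4})                               # P(X = 4) = 1/6
```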
slide-3
SLIDE 3 Distributions

A distribution maps outcomes to probabilities
  • P(X = x) = 1/6,  x ∈ {1, 2, 3, 4, 5, 6}
  • Commonly used (or abused) shorthand: p(x) = P(X = x)
slide-4
SLIDE 4 Conditional Probabilities

Events
  • A: X ≥ 4
  • B: X ≤ 5

Joint Probability
  • P(A, B) := P(A ∩ B)
  • P(A) = 3/6,  P(B) = 5/6,  P(A, B) = 2/6

Conditional Probability
  • P(A | B) := P(A, B) / P(B)
  • P(A | B) = 2/5,  P(B | A) = 2/3
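The numbers on this slide can be checked directly; a minimal sketch (variable names are mine), again using exact fractions:

```python
from fractions import Fraction

outcomes = {1, 2, 3, 4, 5, 6}            # fair die
A = {x for x in outcomes if x >= 4}      # A: X >= 4  ->  {4, 5, 6}
B = {x for x in outcomes if x <= 5}      # B: X <= 5  ->  {1, 2, 3, 4, 5}

def prob(event):
    return Fraction(len(event), len(outcomes))

p_joint = prob(A & B)             # P(A, B) = P(A ∩ B) = 2/6
p_A_given_B = p_joint / prob(B)   # P(A | B) = 2/5
p_B_given_A = p_joint / prob(A)   # P(B | A) = 2/3
```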
slide-5
SLIDE 5 Sum Rule

General Case
  • P(A ∪ B) = P(A) + P(B) − P(A, B)

Example: P(man is short) = 0.5, P(man has short hair) = 0.6, P(both) = 0.4
  • P(short or short hair) = 0.5 + 0.6 − 0.4 = 0.7

Corollaries (X random variable, x outcome, A event)
  • P(A) = Σ_{x ∈ A} P(X = x)
  • P(Y = y) = Σ_x P(Y = y, X = x)   (most common form: marginalization)
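Both forms of the rule can be sketched numerically. The joint table below is a hypothetical example of my own (the slide only gives the short-hair numbers), used to show the marginalization corollary:

```python
# Sum rule with the slide's numbers: P(short) = 0.5, P(short hair) = 0.6, P(both) = 0.4.
p_A, p_B, p_AB = 0.5, 0.6, 0.4
p_union = p_A + p_B - p_AB                                  # ≈ 0.7

# Marginalization corollary: P(Y = y) = sum over x of P(Y = y, X = x),
# here for a small hypothetical joint table over two binary variables.
joint = {("y0", "x0"): 0.1, ("y0", "x1"): 0.3,
         ("y1", "x0"): 0.2, ("y1", "x1"): 0.4}
p_y0 = sum(p for (y, _), p in joint.items() if y == "y0")   # ≈ 0.4
```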
slide-6
SLIDE 6 Bayes' Rule

Product Rule
  • P(A, B) = P(A | B) P(B)   (from def. of conditional)
  •         = P(B | A) P(A)

Bayes' Rule
  • P(A | B) = P(A, B) / P(B)   (definition)
  •         = P(B | A) P(A) / P(B)   (product rule)
slide-7
SLIDE 7 Example

A: You have a rare disease
  • P(A) = 0.0001,  P(¬A) = 0.9999

B: Test for disease is positive
  • P(B | A) = 0.99,  0.99 · 0.0001 = 0.000099
  • P(B | ¬A) = 0.01,  0.9999 · 0.01 ≈ 0.01
  • P(B) = P(B | A) P(A) + P(B | ¬A) P(¬A) ≈ 0.0101

Question: What is P(A | B)?
  • P(A | B) = P(B | A) P(A) / P(B) = 0.000099 / 0.0101 ≈ 0.01
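The arithmetic of this example is easy to reproduce; a minimal sketch (variable names are mine):

```python
# Slide 7's rare-disease example, computed directly via Bayes' rule.
p_A      = 0.0001     # P(disease)
p_B_A    = 0.99       # P(positive test | disease)
p_B_notA = 0.01       # P(positive test | no disease)

# Marginal probability of a positive test.
p_B = p_B_A * p_A + p_B_notA * (1 - p_A)   # ≈ 0.0101

# Posterior: still only about 1% despite the positive test,
# because the disease is so rare.
p_A_B = p_B_A * p_A / p_B                  # ≈ 0.0098
```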
slide-8
SLIDE 8 Probability Densities

Suppose that X is a continuous variable; then P(X = x) = 0 for any outcome x
  • X ~ Normal(0, 1)
  • P(X = 1) = 0,  P(3 ≤ X ≤ 4) = ∫₃⁴ p_X(x) dx

Define the density function via an event (informal)
  • p_X(x) = lim_{δ→0} P(x ≤ X < x + δ) / δ
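The informal limit definition can be checked numerically for the standard normal: the finite-difference ratio of the CDF approaches the familiar density formula. The helper names below are mine, and `delta` is left finite rather than taken to the limit.

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of Normal(mu, sigma^2), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Interval events have nonzero probability even though P(X = x) = 0.
p_3_4 = normal_cdf(4.0) - normal_cdf(3.0)      # P(3 <= X <= 4) ≈ 0.0013

# Informal definition of the density: p_X(x) ≈ P(x <= X < x + delta) / delta.
def density_estimate(x, delta=1e-6):
    return (normal_cdf(x + delta) - normal_cdf(x)) / delta

# Closed-form Normal(0, 1) density at x = 1 for comparison.
exact_pdf_at_1 = math.exp(-0.5) / math.sqrt(2.0 * math.pi)
```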
slide-9
SLIDE 9 Probability Space (Ω, F, P)

  • Ω: Sample space (set of outcomes)
  • F: Set of events (set of possible subsets of outcomes)
  • P: Probability measure (maps events to probabilities), P : F → [0, 1]

Probabilities satisfy
  • P(∅) = 0   (empty set has probability 0)
  • P(∪ᵢ Eᵢ) = Σᵢ P(Eᵢ) when the Eᵢ are disjoint   (sum rule)
  • P(Ω) = 1   (all of the sample space has probability 1)
slide-10
SLIDE 10 Examples of Measures (not probability measures)

  • Lebesgue measure: μ([a, b]) = b − a   (width of interval)
  • Counting measure: μ({xᵢ}ᵢ₌₁ⁿ) = n   (number of elements)
  • Product measure: μ(E) = μ₁(E₁) μ₂(E₂), E := (E₁, E₂)   (Cartesian product)
slide-11
SLIDE 11 Definition of Probability Measure

P(A) = ∫_{x ∈ A} p(x) dμ(x)
  • p(x): the Radon–Nikodym derivative (density) of P w.r.t. the reference measure μ

Machine Learning Notation
  • P(A) = ∫_{x ∈ A} p(x) dx
  • Implicitly assumes the reference measure μ (Lebesgue)
  • p(x) implicitly refers to the density of X w.r.t. μ
slide-12
SLIDE 12 Expected Values

(X is a random variable, X ~ p(x) with density p(x))
  • E[X] := ∫ x p(x) dx

Conditional Expectation
  • E[f(X, Y) | Y = y] := ∫ f(x, y) p(x | y) dx

Expectation w.r.t. a different distribution (common in machine learning)
  • E_{q(x)}[f(x)] = ∫ f(x) q(x) dx
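The defining integral E[X] = ∫ x p(x) dx can be approximated numerically; a minimal sketch with a midpoint Riemann sum (the `expect` helper and the choice of a Normal(1.5, 1) density are my own, for illustration):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of Normal(mu, sigma^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def expect(f, pdf, lo=-10.0, hi=10.0, n=100_000):
    """Midpoint Riemann sum for E[f(X)] = ∫ f(x) p(x) dx over [lo, hi]."""
    dx = (hi - lo) / n
    mids = (lo + (i + 0.5) * dx for i in range(n))
    return sum(f(x) * pdf(x) for x in mids) * dx

# E[X] for X ~ Normal(1.5, 1) should come out ≈ 1.5.
e_x = expect(lambda x: x, lambda x: normal_pdf(x, mu=1.5))
```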
slide-13
SLIDE 13 Central Problem in This Course

E_{p(x|y)}[f(x)]
  • f: quantity of interest
  • y: things we know
  • x: things we don't know

Examples
  • Self-driving cars: y = past trajectory, x = future trajectory, f = will pedestrian cross?
  • Diagnosis: y = symptoms, x = condition, f = treatment outcome
slide-14
SLIDE 14 Probabilistic Models as Stochastic Simulators

  • z_goal ~ p(z_goal)   (assumptions about likely destinations)
  • z_{1:t} ~ p(z_{1:t} | z_goal, z_0)   (pedestrian simulation)
  • z_{t+1:T} ~ p(z_{t+1:T} | z_{1:t}, z_0)   (inference about trajectories)
slide-15
SLIDE 15 Bayesian Inference is General, But Hard

p(z_{t+1:T} | z_{1:t}, z_0) = p(z_{1:T} | z_0) / p(z_{1:t} | z_0)   (known as the predictive distribution)

p(z_{1:t} | z_0) = ∫ dz_goal p(z_{1:t} | z_0, z_goal) p(z_goal)
  • need to integrate over all possible goal locations (almost always intractable)
slide-16
SLIDE 16 Making Inference Tractable

1. Conditional Independence
2. Conjugacy
3. Approximate Inference
  • Monte Carlo Methods: E_{p(x|y)}[f(x)] ≈ (1/S) Σ_{s=1}^S f(x^s),  x^s ~ p(x | y)
  • Variational Methods: φ* = argmin_φ D(p(x | y) || q_φ(x))
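The Monte Carlo estimator on this slide can be sketched in a couple of lines. The target is a case where the exact answer is known: for p = Normal(0, 1) and f(x) = x², the expectation is E[X²] = Var(X) = 1 (the choice of p and f is mine, for illustration):

```python
import random

random.seed(0)

# Monte Carlo: E_{p(x)}[f(x)] ≈ (1/S) Σ_s f(x^s), with x^s ~ p(x).
# Here p = Normal(0, 1) and f(x) = x², so the exact answer is E[X²] = 1.
S = 200_000
estimate = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(S)) / S
```

The estimate concentrates around 1 as S grows, with error shrinking at the usual O(1/√S) Monte Carlo rate.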
slide-17
SLIDE 17 Conditional Independence Example: Clustering

  • μ_k, Σ_k ~ p(μ, Σ),  k = 1, ..., K
  • z_n ~ p(z),  n = 1, ..., N
  • y_n | z_n = k ~ N(μ_k, Σ_k)

Intractable: p(y_{1:N}) = ∫ dμ dΣ Σ_{z_{1:N}} p(y_{1:N}, z_{1:N}, μ, Σ)   (sum over K^N values of z_{1:N})
Tractable: p(z_{1:N} | y_{1:N}, μ, Σ) = Π_{n=1}^N p(z_n | y_n, μ, Σ)   (only K × N terms)
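The tractable factorization means each point's cluster posterior can be computed independently; a minimal 1-D sketch (the two-cluster parameters and all names are my own illustrative choices, not from the slides):

```python
import math

def normal_pdf(y, mu, sigma):
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def responsibilities(y, weights, mus, sigmas):
    """p(z = k | y, params) ∝ p(z = k) N(y; mu_k, sigma_k), normalized over k."""
    unnorm = [w * normal_pdf(y, m, s) for w, m, s in zip(weights, mus, sigmas)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Two 1-D clusters; each of the N points is handled independently
# (K * N evaluations rather than a K^N sum over joint assignments).
params = ([0.5, 0.5], [-2.0, 2.0], [1.0, 1.0])
per_point = [responsibilities(y, *params) for y in [-2.1, 0.0, 1.9]]
```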
slide-18
SLIDE 18 Conjugacy Example: Biased Coins

  • X ~ Beta(α, β)
  • Y_n ~ Bernoulli(X)

Bayes' Rule
  p(X | Y_1 = y_1, ..., Y_N = y_N) = p(Y_1 = y_1, ..., Y_N = y_N | X) p(X) / p(Y_1 = y_1, ..., Y_N = y_N)
slide-19
SLIDE 19 Prior and Likelihood

Prior
  • Beta(x; α, β) = x^{α−1} (1 − x)^{β−1} / B(α, β)
  • B(α, β) = Γ(α) Γ(β) / Γ(α + β)
  • Note the prior's dependence on x: powers of x and (1 − x), the same form as the likelihood

Likelihood (the y_n are i.i.d.)
  • p(y_{1:N} | x) = Π_{n=1}^N p(y_n | x)
  • p(y_n | x) = x if y_n = 1, (1 − x) if y_n = 0;  i.e. p(y_n | x) = x^{y_n} (1 − x)^{1 − y_n}
slide-20
SLIDE 20 Conjugacy

p(x | y_{1:N}) = p(y_{1:N}, x) / p(y_{1:N}) ∝ p(y_{1:N}, x)

p(y_{1:N}, x) = p(x) p(y_{1:N} | x)
  = [1 / B(α, β)] x^{α−1} (1 − x)^{β−1} Π_{n=1}^N x^{y_n} (1 − x)^{1 − y_n}
  = [1 / B(α, β)] x^{(Σ_n y_n) + α − 1} (1 − x)^{(Σ_n (1 − y_n)) + β − 1}

  • Σ_n y_n: number of heads (out of N)
  • Σ_n (1 − y_n): number of tails (out of N)
slide-21
SLIDE 21 Conjugacy (continued)

Define
  • α̃ := Σ_n y_n + α
  • β̃ := Σ_n (1 − y_n) + β

Then
  p(y_{1:N}, x) = [1 / B(α, β)] x^{α̃−1} (1 − x)^{β̃−1}
               = [B(α̃, β̃) / B(α, β)] Beta(x; α̃, β̃)
slide-22
SLIDE 22 Conjugacy: Posterior and Marginal Likelihood

p(y_{1:N}, x) = [B(α̃, β̃) / B(α, β)] Beta(x; α̃, β̃) = p(y_{1:N}) p(x | y_{1:N})
  (the first factor does not depend on x; the second does)

Posterior
  • p(x | y_{1:N}) = Beta(x; α̃, β̃),  α̃ = α + #heads,  β̃ = β + #tails

Marginal Likelihood
  • p(y_{1:N}) = B(α̃, β̃) / B(α, β)
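The Beta–Bernoulli update reduces to simple counting, and the marginal likelihood is a ratio of Beta functions. A minimal sketch, assuming a Beta(1, 1) (uniform) prior and a made-up dataset of three heads and one tail; `lgamma` gives the log-Gamma function, so log B(a, b) = log Γ(a) + log Γ(b) − log Γ(a + b):

```python
import math

def log_beta(a, b):
    """log B(a, b) = log Gamma(a) + log Gamma(b) - log Gamma(a + b)."""
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def beta_bernoulli_update(alpha, beta, ys):
    """Conjugate posterior Beta(alpha + #heads, beta + #tails) for Bernoulli data ys."""
    heads = sum(ys)
    return alpha + heads, beta + (len(ys) - heads)

# Beta(1, 1) (uniform) prior; observe three heads and one tail.
a_post, b_post = beta_bernoulli_update(1.0, 1.0, [1, 1, 1, 0])   # (4.0, 2.0)

# Marginal likelihood p(y_{1:N}) = B(posterior params) / B(prior params).
marginal_lik = math.exp(log_beta(a_post, b_post) - log_beta(1.0, 1.0))
```

Working in log space avoids overflow for large counts, which is why `log_beta` is used instead of exponentiating Gamma functions directly.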
slide-23
SLIDE 23 Predictive Distribution

p(y_{N+1} | y_{1:N}) = ∫ dx p(y_{N+1}, x | y_{1:N})
                     = ∫ dx p(y_{N+1} | x) p(x | y_{1:N})
                     = E_{p(x | y_{1:N})}[p(y_{N+1} | x)]

Weighted Coin Example (Exercise)
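For the coin, this expectation has a closed form: with posterior Beta(α̃, β̃), E[p(y_{N+1}=1 | x)] = E[X] = α̃ / (α̃ + β̃) (the mean of a Beta distribution). A minimal sketch, not the slide's exercise solution, assuming the posterior (4, 2) from a uniform prior plus three heads and one tail, and cross-checking by Monte Carlo:

```python
import random

random.seed(0)

# Posterior predictive for the coin: p(y_{N+1} = 1 | y_{1:N}) = E_{posterior}[x].
# For a Beta(a_post, b_post) posterior this is a_post / (a_post + b_post).
a_post, b_post = 4.0, 2.0                  # e.g. Beta(1,1) prior + 3 heads, 1 tail
p_next_head = a_post / (a_post + b_post)   # = 2/3

# Cross-check by Monte Carlo: sample x from the posterior, average p(y = 1 | x) = x.
mc = sum(random.betavariate(a_post, b_post) for _ in range(100_000)) / 100_000
```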
slide-24
SLIDE 24 Approximate Inference
slide-25
SLIDE 25