
Probabilistic Models: Bayes' Nets (CS 4100: Artificial Intelligence)



1. Probabilistic Models CS 4100: Artificial Intelligence, Bayes' Nets

• Models describe how (a portion of) the world works
• Models are always simplifications
  • May not account for every variable
  • May not account for all interactions between variables
  • "All models are wrong; but some are useful." – George E. P. Box
• What do we do with probabilistic models?
  • We (or our agents) need to reason about unknown variables, given evidence
  • Example: explanation (diagnostic reasoning)
  • Example: prediction (causal reasoning)
  • Example: value of information

Jan-Willem van de Meent, Northeastern University
[These slides were created by Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley. All CS188 materials are available at http://ai.berkeley.edu.]

Independence

• Two variables X and Y are independent if: P(X, Y) = P(X) P(Y)
  • This says that their joint distribution factors into a product of two simpler distributions
  • Another form: P(X | Y) = P(X)
  • We write: X ⫫ Y
• Independence is a simplifying modeling assumption
  • Empirical joint distributions: at best "close" to independent
  • What could we assume for {Weather, Traffic, Cavity, Toothache}?

Example: Independence?

• Compare the joint distribution P(T, W) to the product of its marginals P(T) P(W):

    P(T, W):              Marginals:                  Product P(T) P(W):
    T     W     P         T     P        W     P      T     W     P
    hot   sun   0.4       hot   0.5      sun   0.6    hot   sun   0.3
    hot   rain  0.1       cold  0.5      rain  0.4    hot   rain  0.2
    cold  sun   0.2                                   cold  sun   0.3
    cold  rain  0.3                                   cold  rain  0.2

• The joint does not equal the product of the marginals (e.g. 0.4 ≠ 0.5 × 0.6 = 0.3), so T and W are not independent.

Example: Independence

• N fair, independent coin flips: each flip has distribution P(Xi) = {H: 0.5, T: 0.5}, and the joint distribution over all N flips factors into a product of these single-flip tables.

Conditional Independence

• Joint distribution: P(Toothache, Cavity, Catch)
• If I have a cavity, the probability that the probe catches in it doesn't depend on whether I have a toothache:
    P(+catch | +toothache, +cavity) = P(+catch | +cavity)
• The same independence holds if I don't have a cavity:
    P(+catch | +toothache, -cavity) = P(+catch | -cavity)
• Catch is conditionally independent of Toothache given Cavity:
    P(Catch | Toothache, Cavity) = P(Catch | Cavity)
• Equivalent statements:
    P(Toothache | Catch, Cavity) = P(Toothache | Cavity)
    P(Toothache, Catch | Cavity) = P(Toothache | Cavity) P(Catch | Cavity)
  • One can be derived from the other easily

Conditional Independence

• Unconditional (absolute) independence is very rare (why?)
• Conditional independence is our most basic and robust form of knowledge about uncertain environments
• X is conditionally independent of Y given Z if and only if:
    P(X, Y | Z) = P(X | Z) P(Y | Z)
  or, equivalently, if and only if
    P(X | Y, Z) = P(X | Z)
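To make the independence check above concrete, here is a minimal Python sketch using the numbers from the "Example: Independence?" slide. The dictionary encoding and variable names are illustrative choices, not anything prescribed by the slides: it marginalizes the P(T, W) table and compares the joint to the product of its marginals.

    # Minimal sketch: check whether T (temperature) and W (weather) are independent
    # by comparing the joint P(T, W) from the slide to the product of its marginals.
    from itertools import product

    # Joint distribution P(T, W) from the "Example: Independence?" slide
    joint = {
        ("hot", "sun"): 0.4,
        ("hot", "rain"): 0.1,
        ("cold", "sun"): 0.2,
        ("cold", "rain"): 0.3,
    }

    # Marginalize out each variable to get P(T) and P(W)
    p_t, p_w = {}, {}
    for (t, w), p in joint.items():
        p_t[t] = p_t.get(t, 0.0) + p
        p_w[w] = p_w.get(w, 0.0) + p

    # Compare the joint to the product of marginals, entry by entry
    independent = all(
        abs(joint[(t, w)] - p_t[t] * p_w[w]) < 1e-9
        for t, w in product(p_t, p_w)
    )

    print("P(T):", p_t)   # {'hot': 0.5, 'cold': 0.5}
    print("P(W):", p_w)   # {'sun': 0.6, 'rain': 0.4}
    print("T independent of W?", independent)   # False: 0.4 != 0.5 * 0.6 = 0.3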

2. Conditional Independence

• What about this domain:
  • Traffic
  • Umbrella
  • Raining
    Traffic ⫫ Umbrella
    Traffic ⫫ Umbrella | Raining

• What about this domain:
  • Fire
  • Smoke
  • Alarm
    Alarm ⫫ Fire | Smoke

Conditional Independence and the Chain Rule

• Chain rule: P(X1, X2, ..., Xn) = P(X1) P(X2 | X1) P(X3 | X1, X2) ...
• Trivial decomposition:
    P(Traffic, Rain, Umbrella) = P(Rain) P(Traffic | Rain) P(Umbrella | Rain, Traffic)
• With assumption of conditional independence:
    P(Traffic, Rain, Umbrella) = P(Rain) P(Traffic | Rain) P(Umbrella | Rain)
• Bayes' nets / graphical models help us express conditional independence assumptions

Ghostbusters Chain Rule

• Each sensor depends only on where the ghost is
• That means the two sensors are conditionally independent, given the ghost position
• T: Top square is red
  B: Bottom square is red
  G: Ghost is in the top
• Givens (assumptions):
    P(+g) = 0.5        P(+t | +g) = 0.8        P(+b | +g) = 0.4
    P(-g) = 0.5        P(+t | -g) = 0.4        P(+b | -g) = 0.8
• P(T, B, G) = P(G) P(T | G) P(B | G)

    T    B    G    P(T, B, G)
    +t   +b   +g   0.16
    +t   +b   -g   0.16
    +t   -b   +g   0.24
    +t   -b   -g   0.04
    -t   +b   +g   0.04
    -t   +b   -g   0.24
    -t   -b   +g   0.06
    -t   -b   -g   0.06

Bayes' Nets: Big Picture

• Two problems with using full joint distribution tables as our probabilistic models:
  • Unless there are only a few variables, the joint is WAY too big to represent explicitly
  • Hard to learn (estimate) anything empirically about more than a few variables at a time
• Bayes' nets: Describe complex joint distributions (models) using simple, local distributions (conditional probabilities)
  • More properly called graphical models
  • We describe how variables locally interact
  • Local interactions chain together to give global, indirect interactions
  • For about 10 min, we'll be vague about how these interactions are specified

Example Bayes' Net: Insurance

Example Bayes' Net: Car
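The Ghostbusters table above can be reproduced with a short Python sketch, assuming the factorization P(T, B, G) = P(G) P(T | G) P(B | G) stated on the slide (i.e. T ⫫ B | G). The dictionary encoding and the cond helper are illustrative choices, not part of the original material.

    # Minimal sketch: build the Ghostbusters joint P(T, B, G) = P(G) P(T|G) P(B|G)
    # from the givens on the slide, assuming T and B are independent given G.
    from itertools import product

    p_g = {"+g": 0.5, "-g": 0.5}                          # prior over ghost position
    p_t_given_g = {("+t", "+g"): 0.8, ("+t", "-g"): 0.4}  # P(+t | G); P(-t | G) = 1 - P(+t | G)
    p_b_given_g = {("+b", "+g"): 0.4, ("+b", "-g"): 0.8}  # P(+b | G); P(-b | G) = 1 - P(+b | G)

    def cond(table, plus, value, g):
        """Look up P(value | g) from a table that stores only the '+' outcome."""
        p_plus = table[(plus, g)]
        return p_plus if value == plus else 1.0 - p_plus

    joint = {}
    for t, b, g in product(["+t", "-t"], ["+b", "-b"], ["+g", "-g"]):
        joint[(t, b, g)] = (p_g[g]
                            * cond(p_t_given_g, "+t", t, g)
                            * cond(p_b_given_g, "+b", b, g))

    for (t, b, g), p in joint.items():
        print(t, b, g, round(p, 2))
    # Matches the table on the slide, e.g. P(+t, +b, +g) = 0.5 * 0.8 * 0.4 = 0.16,
    # and the eight entries sum to 1.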
