Reasoning Under Uncertainty: More on BNets structure and construction

Computer Science CPSC 322, Lecture 28 (Textbook Chpt 6.3)


  1. Reasoning Under Uncertainty: More on BNets structure and construction. Computer Science CPSC 322, Lecture 28 (Textbook Chpt 6.3). Nov 14, 2012. CPSC 322, Lecture 28, Slide 1

  2. Belief networks Recap • By considering causal dependencies, we order the variables in the joint. • Apply …………………….. and simplify. • Build a directed acyclic graph (DAG) in which the parents of each var X are those vars on which X directly depends. • By construction, a var is independent of its non-descendants given its parents.

  3. Belief Networks: open issues • Independencies: Does a BNet encode more independencies than the ones specified by construction? • Compactness: We reduce the number of probabilities from the full joint to one CPT per node, but in some domains we need to do better than that! • Still too many, and often there are no data/experts for accurate assessment. Solution: Make stronger (approximate) independence assumptions.

  4. Lecture Overview • Implied Conditional Independence relations in a Bnet • Compactness: Making stronger Independence assumptions • Representation of Compact Conditional Distributions • Network structure (Naïve Bayesian Classifier)

  5. Bnets: Entailed (in)dependencies. Indep(Report, Fire, {Alarm})? Indep(Leaving, SeeSmoke, {Fire})?

  6. Conditional Independencies. Or, blocking paths for probability propagation. Three ways in which a path between X and Y can be blocked (1 and 2 given evidence E): (1) a chain X → Z → Y with evidence on Z; (2) a common cause X ← Z → Y with evidence on Z; (3) a common effect X → Z ← Y with no evidence on Z or its descendants. Note that, in 3, X and Y become dependent as soon as we get evidence on Z or on any of its descendants.

  7. Or …. Conditional Dependencies. [Figure: the same three path structures (chain, common cause, common effect), each shown with the evidence pattern that leaves the path unblocked.]

  8. In/Dependencies in a Bnet: Example 1. [Figure: example belief network.] A simple one.

  9. In/Dependencies in a Bnet: Example 1. [Figure: example belief network.] Is A conditionally independent of I given F?

  10. In/Dependencies in a Bnet: Example 2. [Figure: example belief network.] Is H conditionally independent of E given I?
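The blocking rules from slide 6 can be checked numerically. The sketch below (my own illustration, not part of the slides, with made-up CPT values) builds the joint distribution for a common-effect structure X → Z ← Y and verifies that X and Y are independent with no evidence, but become dependent once Z is observed:

```python
from itertools import product

def cond_indep(joint, given_z):
    """Check numerically whether X and Y are independent in a joint
    distribution joint[(x, z, y)] over three binary variables,
    optionally conditioning on each value of Z."""
    for z in ([0, 1] if given_z else [None]):
        # Restrict to the evidence Z=z (or use the whole table if no evidence)
        sel = {k: v for k, v in joint.items() if z is None or k[1] == z}
        tot = sum(sel.values())
        for x, y in product([0, 1], repeat=2):
            pxy = sum(v for k, v in sel.items() if k[0] == x and k[2] == y) / tot
            px = sum(v for k, v in sel.items() if k[0] == x) / tot
            py = sum(v for k, v in sel.items() if k[2] == y) / tot
            if abs(pxy - px * py) > 1e-9:
                return False
    return True

# Common effect X -> Z <- Y with hypothetical numbers:
# P(X=1)=0.3, P(Y=1)=0.4, and P(Z=1 | X, Y) as below.
pz = {(0, 0): 0.05, (0, 1): 0.6, (1, 0): 0.7, (1, 1): 0.95}
joint = {(x, z, y): (0.3 if x else 0.7) * (0.4 if y else 0.6) *
         (pz[(x, y)] if z else 1 - pz[(x, y)])
         for x, z, y in product([0, 1], repeat=3)}

print(cond_indep(joint, given_z=False))  # True: path blocked without evidence on Z
print(cond_indep(joint, given_z=True))   # False: evidence on Z opens the path
```

This is exactly case 3 on slide 6: the "explaining away" pattern, where observing the common effect makes the causes compete.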

  11. Lecture Overview • Implied Conditional Independence relations in a Bnet • Compactness: Making stronger Independence assumptions • Representation of Compact Conditional Distributions • Network structure (Naïve Bayesian Classifier)

  12. More on Construction and Compactness: Compact Conditional Distributions. Once we have established the topology of a Bnet, we still need to specify the conditional probabilities. How? • From Data • From Experts. To facilitate acquisition, we aim for compact representations for which data/experts can provide accurate assessments.

  13. More on Construction and Compactness: Compact Conditional Distributions. From the full JPD to one CPT per node. But still, a CPT grows exponentially with the number of parents. In a realistic model of internal medicine with 448 nodes and 906 links, 133,931,430 values are required! And often there are no data/experts for accurate assessment.
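To make the counting concrete, here is a quick sketch (my own illustration, not from the slides): a full joint over n binary variables needs 2^n − 1 probabilities, while a Bnet needs only one probability per assignment of each node's parents. The 6-node network and its parent counts below are hypothetical:

```python
def joint_size(n):
    # Full joint over n binary variables: one probability per assignment,
    # minus one because the entries must sum to 1.
    return 2 ** n - 1

def bn_size(parent_counts):
    # One probability (e.g. for X=T) per assignment of each node's parents:
    # a node with k binary parents needs 2**k numbers.
    return sum(2 ** k for k in parent_counts)

# Hypothetical 6-node network; parent counts per node:
print(joint_size(6))                 # 63
print(bn_size([0, 0, 1, 2, 1, 1]))  # 12
```

The gap widens quickly with n, but a single node with many parents still pays the exponential price, which motivates the Noisy-OR representation on the next slides.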

  14. Effect with multiple non-interacting causes. What do we need to specify?

Malaria Flu Cold | P(Fever=T | ..) | P(Fever=F | ..)
  T     T    T   |                 |
  T     T    F   |                 |
  T     F    T   |                 |
  T     F    F   |                 |
  F     T    T   |                 |
  F     T    F   |                 |
  F     F    T   |                 |
  F     F    F   |                 |

What do you think data/experts could easily tell you? More difficult to get info to assess more complex conditioning….

  15. Solution: Noisy-OR Distributions • Models multiple non-interacting causes • Logic OR with a probabilistic twist. • Logic OR Conditional Prob. Table:

Malaria Flu Cold | P(Fever=T | ..) | P(Fever=F | ..)
  T     T    T   |       1         |       0
  T     T    F   |       1         |       0
  T     F    T   |       1         |       0
  T     F    F   |       1         |       0
  F     T    T   |       1         |       0
  F     T    F   |       1         |       0
  F     F    T   |       1         |       0
  F     F    F   |       0         |       1

  16. Solution: Noisy-OR Distributions. The Noisy-OR model allows for uncertainty in the ability of each cause to generate the effect (e.g., one may have a cold without a fever). [Same eight-row CPT for Malaria, Flu, Cold, with values to be filled in.] Two assumptions: 1. All possible causes are listed. 2. For each of the causes, whatever inhibits it from generating the target effect is independent of the inhibitors of the other causes.

  17. Noisy-OR: Derivations. C_1 … C_k → Effect. For each of the causes, whatever inhibits it from generating the target effect is independent of the inhibitors of the other causes. Define q_i, the independent probability of failure for each cause alone: • P(Effect=F | C_i = T, and no other causes) = q_i • P(Effect=F | C_1 = T, .., C_j = T, C_{j+1} = F, .., C_k = F) = ∏_{i=1}^{j} q_i • P(Effect=T | C_1 = T, .., C_j = T, C_{j+1} = F, .., C_k = F) = 1 − ∏_{i=1}^{j} q_i

  18. Noisy-OR: Example. Model of internal medicine: 133,931,430 → 8,254 values.
P(Fever=F | Cold=T, Flu=F, Malaria=F) = 0.6
P(Fever=F | Cold=F, Flu=T, Malaria=F) = 0.2
P(Fever=F | Cold=F, Flu=F, Malaria=T) = 0.1
• P(Effect=F | C_1 = T, .., C_j = T, C_{j+1} = F, .., C_k = F) = ∏_{i=1}^{j} q_i

Malaria Flu Cold | P(Fever=T | ..) | P(Fever=F | ..)
  T     T    T   |     0.988       | 0.1 × 0.2 × 0.6 = 0.012
  T     T    F   |     0.98        | 0.1 × 0.2 = 0.02
  T     F    T   |     0.94        | 0.1 × 0.6 = 0.06
  T     F    F   |     0.9         | 0.1
  F     T    T   |     0.88        | 0.2 × 0.6 = 0.12
  F     T    F   |     0.8         | 0.2
  F     F    T   |     0.4         | 0.6
  F     F    F   |     0.0         | 1.0

• Number of probabilities linear in …..
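The table on this slide can be generated mechanically from the three per-cause failure probabilities. A short sketch (the function name `noisy_or_cpt` is my own) that builds the full noisy-OR CPT:

```python
from itertools import product

def noisy_or_cpt(q):
    """Build the noisy-OR CPT: P(Effect=F | causes) is the product of q_i
    over the causes that are present. q maps cause name -> its failure
    probability when that cause alone is present."""
    names = list(q)
    cpt = {}
    for assignment in product([True, False], repeat=len(names)):
        p_false = 1.0
        for name, present in zip(names, assignment):
            if present:
                p_false *= q[name]
        cpt[assignment] = p_false  # P(Effect=T | ..) is just 1 - p_false
    return cpt

# Failure probabilities from the slide: malaria 0.1, flu 0.2, cold 0.6
cpt = noisy_or_cpt({"malaria": 0.1, "flu": 0.2, "cold": 0.6})
print(round(cpt[(True, True, True)], 3))   # 0.012
print(cpt[(False, False, False)])          # 1.0 (no causes: no fever)
```

Only k numbers are stored (one q_i per cause); the 2^k table rows are all derived from them, which is the whole point of the representation.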

  19. Lecture Overview • Implied Conditional Independence relations in a Bnet • Compactness: Making stronger Independence assumptions • Representation of Compact Conditional Distributions • Network structure (Naïve Bayesian Classifier)

  20. Naïve Bayesian Classifier. A very simple and successful Bnet that allows us to classify entities into a set of classes C, given a set of attributes. Example: • Determine whether an email is spam (only two classes: spam=T and spam=F) • Useful attributes of an email? Assumptions: • The value of each attribute depends on the classification. • (Naïve) The attributes are independent of each other given the classification: P("bank" | "account", spam=T) = P("bank" | spam=T)

  21. Naïve Bayesian Classifier for Email Spam. Assumptions: • The value of each attribute depends on the classification. • (Naïve) The attributes are independent of each other given the classification. • What is the structure? Email Spam → Email contains "free", Email contains "money", Email contains "ubc", Email contains "midterm". Number of parameters? Easy to acquire? If you have a large collection of emails for which you know whether they are spam or not……

  22. NB Classifier for Email Spam: Usage. Most likely class given a set of observations. Is a given Email E spam? "free money for you now". Email Spam → Email contains "free", Email contains "money", Email contains "ubc", Email contains "midterm". Email is spam if…….
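As a concrete illustration of this usage step, here is a minimal sketch with made-up numbers (all priors and likelihoods below are hypothetical placeholders for estimates from a labelled email collection): each class score is P(class) × ∏ P(attribute value | class), and the email is labelled with the class whose normalized score is larger:

```python
# Hypothetical parameters, standing in for counts from labelled emails.
prior = {"spam": 0.4, "ham": 0.6}
# P(word appears in the email | class), one entry per attribute node
likelihood = {
    "spam": {"free": 0.6, "money": 0.5, "ubc": 0.01, "midterm": 0.01},
    "ham":  {"free": 0.1, "money": 0.1, "ubc": 0.2,  "midterm": 0.2},
}

def classify(present):
    """present: dict word -> True/False, the observed attribute values."""
    score = {}
    for c in prior:
        p = prior[c]
        for word, obs in present.items():
            pw = likelihood[c][word]
            p *= pw if obs else 1 - pw   # naive assumption: independent given class
        score[c] = p
    total = sum(score.values())
    return {c: p / total for c, p in score.items()}  # normalized posterior

# "free money for you now": contains "free" and "money", not "ubc"/"midterm"
post = classify({"free": True, "money": True, "ubc": False, "midterm": False})
print(max(post, key=post.get))  # spam
```

Note that absent words also contribute evidence (the `1 - pw` factor): an email without "ubc" or "midterm" looks less like typical course mail.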

  23. For another example of a naïve Bayesian Classifier, see textbook ex. 6.16: a help system that determines what help page a user is interested in based on the keywords they give in a query.

  24. Learning Goals for today's class. You can: • Given a Belief Net, determine whether one variable is conditionally independent of another variable, given a set of observations. • Define and use Noisy-OR distributions. Explain assumptions and benefit. • Implement and use a naïve Bayesian classifier. Explain assumptions and benefit.

  25. Next Class. Bayesian Networks Inference: Variable Elimination. Course Elements • Work on Practice Exercises 6A and 6B • Assignment 3 is due on Monday! • Assignment 4 will be available on Wednesday and due on Nov 28th (last class).
