Probability and Time: Markov Models

Probability and Time: Markov Models - PowerPoint PPT Presentation

Probability and Time: Markov Models. Computer Science cpsc322, Lecture 31 (Textbook Chpt 6.5.1). June 20, 2017. 6/21/2017 CPSC322 Summer 2017 Slide 1


  1. Probability and Time: Markov Models. Computer Science cpsc322, Lecture 31 (Textbook Chpt 6.5.1). June 20, 2017.

  2. Lecture Overview • Recap • Temporal Probabilistic Models • Start Markov Models • Markov Chain • Markov Chains in Natural Language Processing

  3. Big Picture: R&R systems. [Diagram mapping problem types to representations and reasoning techniques. Environment: Deterministic / Stochastic. Static problems — Constraint Satisfaction (Vars + Constraints): Arc Consistency, Search, SLS; Query — Logics: Search; Belief Nets: Var. Elimination. Sequential problems — Planning — STRIPS: Search; Decision Nets: Var. Elimination; Markov Processes: Value Iteration.]

  4. Answering Query under Uncertainty. [Diagram: Probability Theory → Static Belief Network & Variable Elimination; Dynamic Bayesian Network → Hidden Markov Models → Markov Chains. Applications: Student Tracing in tutoring Systems, Monitoring (e.g. credit cards), BioInformatics, Natural Language Processing, Diagnostic Systems (e.g., medicine), Email spam filters.]

  5. Lecture Overview • Recap • Temporal Probabilistic Models • Start Markov Models • Markov Chain • Markov Chains in Natural Language Processing

  6. Modelling static Environments. So far we have used Bnets to perform inference in static environments. • For instance, the system keeps collecting evidence to diagnose the cause of a fault in a system (e.g., a car). • The environment (values of the evidence, the true cause) does not change as I gather new evidence. • What does change? The system's beliefs over possible causes.

  7. Modeling Evolving Environments. • Often we need to make inferences about evolving environments. • Represent the state of the world at each specific point in time via a series of snapshots, or time slices: SolveProblem_{t-1} → SolveProblem_t; Knows-Subtraction_{t-1} → Knows-Subtraction_t; Morale_{t-1} → Morale_t. [Example: a tutoring system tracing student knowledge and morale.]

  8. Lecture Overview • Recap • Temporal Probabilistic Models • Start Markov Models • Markov Chain • Markov Chains in Natural Language Processing

  9. Simplest Possible DBN. • One random variable for each time slice: let's assume S_t represents the state at time t, with domain {v_1 … v_n}. • Each random variable depends only on the previous one. • Thus P(S_{t+1} | S_0, …, S_t) = P(S_{t+1} | S_t). • Intuitively, S_t conveys all of the information about the history that can affect the future states. • "The future is independent of the past given the present."

  10. Simplest Possible DBN (cont'). • How many CPTs do we need to specify? A. 1   C. 2   D. 3   B. 4 • Stationary process assumption: the mechanism that regulates how state variables change over time is stationary, that is, it can be described by a single transition model P(S_t | S_{t-1}).

  11. Stationary Markov Chain (SMC). A stationary Markov Chain: for all t > 0, • P(S_{t+1} | S_0, …, S_t) = P(S_{t+1} | S_t), and • P(S_{t+1} | S_t) is the same for every t. We only need to specify P(S_0) and P(S_{t+1} | S_t). • Simple model, easy to specify. • Often the natural model. • The network can extend indefinitely. • Variations of SMC are at the core of many Natural Language Processing (NLP) applications!

  12. Stationary Markov Chain (SMC). A stationary Markov Chain: for all t > 0, • P(S_{t+1} | S_0, …, S_t) = P(S_{t+1} | S_t), and • P(S_{t+1} | S_t) is the same. So we only need to specify? A. P(S_{t+1} | S_t) and P(S_0)   B. P(S_0)   C. P(S_{t+1} | S_t)   D. P(S_t | S_{t+1})

  13. Stationary Markov-Chain: Example. Domain of variable S_i is {t, q, p, a, h, e}. Probability of initial state P(S_0): t = .6, q = .4 (all others 0). P(S_{t+1} | S_t): Stochastic Transition Matrix. Which of these two is a possible STM?

     Left (rows = S_t, columns = S_{t+1}, order t q p a h e):
       t: 0 .3 0 .3 .4 0
       q: .4 0 .6 0 0 0
       p: 0 0 1 0 0 0
       a: 0 0 .4 .6 0 0
       h: 0 0 0 0 0 1
       e: 1 0 0 0 0 0

     Right (same layout):
       t: 1 0 0 0 0 0
       q: 0 1 0 0 0 0
       p: .3 0 1 0 0 0
       a: 0 0 0 1 0 0
       h: 0 0 0 .2 0 1
       e: 1 0 0 0 0 0

     A. Left one only   B. Right one only   C. Both   D. None
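The defining property of a stochastic transition matrix — every row of P(S_{t+1} | S_t) must be a probability distribution — can be checked mechanically. A minimal Python sketch using the left matrix above (the function name `is_stochastic` is illustrative, not from the slides):

```python
# Sketch: check whether a candidate transition matrix is a valid
# stochastic transition matrix: entries in [0, 1], each row sums to 1.

def is_stochastic(matrix, tol=1e-9):
    """True iff every row is a probability distribution."""
    for row in matrix:
        if any(p < 0 or p > 1 for p in row):
            return False
        if abs(sum(row) - 1.0) > tol:
            return False
    return True

# Left candidate matrix from the slide, states ordered (t, q, p, a, h, e).
left = [
    [0, .3, 0, .3, .4, 0],   # from t
    [.4, 0, .6, 0, 0, 0],    # from q
    [0, 0, 1, 0, 0, 0],      # from p
    [0, 0, .4, .6, 0, 0],    # from a
    [0, 0, 0, 0, 0, 1],      # from h
    [1, 0, 0, 0, 0, 0],      # from e
]

print(is_stochastic(left))  # True: every row sums to 1
```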

  14. Stationary Markov-Chain: Example. Domain of variable S_i is {t, q, p, a, h, e}. We only need to specify… Probability of initial state P(S_0): t = .6, q = .4 (all others 0). Stochastic Transition Matrix P(S_{t+1} | S_t) (rows = S_t, columns = S_{t+1}, order t q p a h e):
       t: 0 .3 0 .3 .4 0
       q: .4 0 .6 0 0 0
       p: 0 0 1 0 0 0
       a: 0 0 .4 .6 0 0
       h: 0 0 0 0 0 1
       e: 1 0 0 0 0 0
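Because the model is fully specified by P(S_0) and a single transition matrix, the chain can be extended indefinitely by sampling. A small Python sketch (function and variable names are illustrative) using the distribution and matrix above:

```python
import random

# Sketch: sample a trajectory from the stationary Markov chain defined by
# the initial distribution P(S_0) and the transition matrix on the slide.
STATES = ["t", "q", "p", "a", "h", "e"]
P0 = [0.6, 0.4, 0, 0, 0, 0]
TRANS = [
    [0, .3, 0, .3, .4, 0],   # from t
    [.4, 0, .6, 0, 0, 0],    # from q
    [0, 0, 1, 0, 0, 0],      # from p
    [0, 0, .4, .6, 0, 0],    # from a
    [0, 0, 0, 0, 0, 1],      # from h
    [1, 0, 0, 0, 0, 0],      # from e
]

def sample_chain(length, seed=0):
    """Draw S_0 from P0, then each next state from its transition row."""
    rng = random.Random(seed)
    state = rng.choices(range(len(STATES)), weights=P0)[0]
    seq = [STATES[state]]
    for _ in range(length - 1):
        state = rng.choices(range(len(STATES)), weights=TRANS[state])[0]
        seq.append(STATES[state])
    return seq

print(sample_chain(6))  # a 6-state trajectory, starting in t or q
```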

  15. Markov-Chain: Inference. Probability of a sequence of states S_0, …, S_T: P(S_0, …, S_T) = P(S_0) ∏_{t=1}^{T} P(S_t | S_{t-1}). Example (with P(S_0): t = .6, q = .4, and the transition matrix from the previous slide): P(t, q, p) = P(S_0 = t) · P(S_1 = q | S_0 = t) · P(S_2 = p | S_1 = q).
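The chain-rule factorization can be computed directly. A minimal Python sketch using the slide's initial distribution and transition matrix (names such as `sequence_prob` are illustrative):

```python
# Sketch: P(S_0, ..., S_T) = P(S_0) * prod_{t>=1} P(S_t | S_{t-1})
# for the chain on states {t, q, p, a, h, e} from the slides.

STATES = ["t", "q", "p", "a", "h", "e"]
IDX = {s: i for i, s in enumerate(STATES)}
P0 = {"t": 0.6, "q": 0.4, "p": 0.0, "a": 0.0, "h": 0.0, "e": 0.0}
TRANS = [
    [0, .3, 0, .3, .4, 0],   # from t
    [.4, 0, .6, 0, 0, 0],    # from q
    [0, 0, 1, 0, 0, 0],      # from p
    [0, 0, .4, .6, 0, 0],    # from a
    [0, 0, 0, 0, 0, 1],      # from h
    [1, 0, 0, 0, 0, 0],      # from e
]

def sequence_prob(seq):
    """Multiply P(S_0) by one transition probability per step."""
    p = P0[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        p *= TRANS[IDX[prev]][IDX[cur]]
    return p

print(sequence_prob(["t", "q", "p"]))  # 0.6 * 0.3 * 0.6 ≈ 0.108
```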

  16. Lecture Overview • Recap • Temporal Probabilistic Models • Markov Models • Markov Chain • Markov Chains in Natural Language Processing

  17. Key problems in NLP. P(w_1, …, w_n) = ?  "Book me a room near UBC". Assign a probability to a sentence: • Part-of-speech tagging • Word-sense disambiguation • Probabilistic Parsing • Summarization, Machine Translation, … Predict the next word: • Speech recognition • Hand-writing recognition • Augmentative communication for the disabled. P(w_1, …, w_n)? Impossible to estimate!

  18. P(w_1, …, w_n)? Impossible to estimate! Assuming 10^5 words and that an average sentence contains 10 words, there are on the order of (10^5)^10 = 10^50 possible sentences. The Google language repository (22 Sept. 2006) contained "only" 95,119,665,584 sentences. ⇒ Most sentences will not appear, or appear only once.

  19. What can we do? Make a strong simplifying assumption! Sentences are generated by a Markov Chain: P(w_1, …, w_n) = P(w_1 | <S>) ∏_{k=2}^{n} P(w_k | w_{k-1}). P(The big red dog barks) = P(The | <S>) · P(big | The) · P(red | big) · P(dog | red) · P(barks | dog).
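The bigram factorization can be sketched as code. The probabilities below are made-up illustrative numbers, not estimates from any real corpus:

```python
# Sketch: P(w_1, ..., w_n) = P(w_1 | <S>) * prod_{k=2..n} P(w_k | w_{k-1}),
# with a toy table of bigram probabilities (illustrative values only).

BIGRAM = {
    ("<S>", "The"): 0.5,
    ("The", "big"): 0.3,
    ("big", "red"): 0.2,
    ("red", "dog"): 0.6,
    ("dog", "barks"): 0.1,
}

def sentence_prob(words):
    """Multiply one bigram probability per word, starting from <S>."""
    p = 1.0
    for prev, cur in zip(["<S>"] + words, words):
        p *= BIGRAM.get((prev, cur), 0.0)  # unseen bigram -> probability 0
    return p

print(sentence_prob(["The", "big", "red", "dog", "barks"]))
```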

  20. Estimates for Bigrams. Silly language repository with only two sentences: "<S> The big red dog barks against the big pink dog" and "<S> The big pink dog is much smaller". P(red | big) = P(big, red) / P(big) = (C(big, red) / N_pairs) / (C(big) / N_words) ≈ C(big, red) / C(big).
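The count-based estimate can be reproduced from the two sentences above. A minimal sketch (helper names are illustrative):

```python
# Sketch: maximum-likelihood bigram estimate P(red | big) ~= C(big, red) / C(big)
# computed from the two "silly" sentences on the slide.
from collections import Counter

corpus = [
    "<S> The big red dog barks against the big pink dog".split(),
    "<S> The big pink dog is much smaller".split(),
]

unigrams = Counter(w for sent in corpus for w in sent)
bigrams = Counter(pair for sent in corpus for pair in zip(sent, sent[1:]))

def cond_prob(word, prev):
    """C(prev, word) / C(prev)."""
    return bigrams[(prev, word)] / unigrams[prev]

print(cond_prob("red", "big"))  # C(big, red) / C(big) = 1 / 3
```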

  21. Bigrams in practice… If you have 10^5 words in your dictionary, P(w_i | w_{i-1}) will contain this many numbers…?? A. 2 × 10^5   B. 10^10   C. 5 × 10^5   D. 2 × 10^10
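As a sanity check on table sizes, the full conditional table has one entry per ordered (previous word, next word) pair; a one-liner, assuming the 10^5-word vocabulary from the slide:

```python
# Sketch: a full bigram table P(w_i | w_{i-1}) over a vocabulary of V word
# types needs one entry per ordered (w_{i-1}, w_i) pair, i.e. V * V entries.
V = 10 ** 5
entries = V * V
print(entries)  # 10_000_000_000, i.e. 10**10
```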

  22. Learning Goals for today's class. You can: • Specify a Markov Chain and compute the probability of a sequence of states. • Justify and apply Markov Chains to compute the probability of a Natural Language sentence.

  23. Markov Models. • Simplest possible: Markov Chains (a Dynamic Bnet). • We cannot observe directly what we care about: Hidden Markov Model. • Add Actions and Values (Rewards): Markov Decision Processes (MDPs).
