
Markov Models and Hidden Markov Models - Robert Platt, Northeastern University (slide transcript)



  1. Markov Models and Hidden Markov Models. Robert Platt, Northeastern University. Some images and slides are used from: 1. CS188, UC Berkeley; 2. Russell & Norvig, Artificial Intelligence: A Modern Approach (AIMA).

  2. Markov Models. We have already seen that an MDP provides a useful framework for modeling stochastic control problems. Markov models can be used to model any kind of temporally dynamic system.

  3. Probability recap
  – Conditional probability: P(x | y) = P(x, y) / P(y)
  – Product rule: P(x, y) = P(x | y) P(y)
  – Chain rule: P(x_1, ..., x_n) = P(x_1) P(x_2 | x_1) P(x_3 | x_1, x_2) ... P(x_n | x_1, ..., x_{n-1})
  – X, Y independent if and only if: P(x, y) = P(x) P(y) for all x, y
  – X and Y are conditionally independent given Z if and only if: P(x, y | z) = P(x | z) P(y | z) for all x, y, z
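As a quick sanity check, these identities can be verified numerically on any small joint distribution. Below is a minimal Python sketch; the rain/umbrella table and all names are made up purely for illustration:

```python
# Minimal sketch: numerically check the product rule and conditional
# probability on a small, made-up joint distribution P(x, y).
P = {
    ("rain", "umbrella"): 0.30,
    ("rain", "no_umbrella"): 0.10,
    ("dry", "umbrella"): 0.15,
    ("dry", "no_umbrella"): 0.45,
}

def p_y(y):
    # Marginal: P(y) = sum_x P(x, y)
    return sum(p for (_, yi), p in P.items() if yi == y)

def p_x_given_y(x, y):
    # Conditional probability: P(x | y) = P(x, y) / P(y)
    return P[(x, y)] / p_y(y)

# Product rule: P(x, y) = P(x | y) P(y)
for (x, y), joint in P.items():
    assert abs(joint - p_x_given_y(x, y) * p_y(y)) < 1e-12
print("product rule holds on this table")
```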

  4. Probability again: Independence. Two random variables, X and Y, are independent when: P(x, y) = P(x) P(y) for all x, y. The outcomes of two different coin flips are usually independent of each other.

  5. Probability again: Independence. If P(x, y) = P(x) P(y), then P(x | y) = P(x). Why? Because P(x | y) = P(x, y) / P(y) = P(x) P(y) / P(y) = P(x).

  6. Probability again: Independence (recap). Two random variables, X and Y, are independent when: P(x, y) = P(x) P(y) for all x, y. The outcomes of two different coin flips are usually independent of each other.

  7. Example: Independence. Joint distribution P(snow, winter):
           winter   !winter
  snow      0.1      0.1
  !snow     0.3      0.5

  8. Example: Independence.
           winter   !winter
  snow      0.1      0.1
  !snow     0.3      0.5
  Are snow and winter independent variables?

  9. Example: Independence.
           winter   !winter
  snow      0.1      0.1
  !snow     0.3      0.5
  Are snow and winter independent variables? P(snow) = 0.2, P(winter) = 0.4.

  10. Example: Independence.
           winter   !winter
  snow      0.1      0.1
  !snow     0.3      0.5
  Are snow and winter independent variables? P(snow) = 0.2, P(winter) = 0.4. What would the distribution look like if snow and winter were independent? Each cell would equal the product of the marginals, e.g. P(snow, winter) = P(snow) P(winter) = 0.2 x 0.4 = 0.08; the actual table has 0.1, so snow and winter are not independent.
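A short sketch of that check in code, using the joint table from the slide (the True/False encoding of snow and winter is just a convenient choice):

```python
# Sketch: is the snow/winter table a product of its marginals?
P = {
    (True, True): 0.1,    # snow, winter
    (True, False): 0.1,   # snow, !winter
    (False, True): 0.3,   # !snow, winter
    (False, False): 0.5,  # !snow, !winter
}

p_snow = sum(p for (s, _), p in P.items() if s)     # 0.2
p_winter = sum(p for (_, w), p in P.items() if w)   # 0.4

for (s, w), joint in P.items():
    product = (p_snow if s else 1 - p_snow) * (p_winter if w else 1 - p_winter)
    print(f"snow={s!s:5} winter={w!s:5} joint={joint:.2f} product={product:.2f}")
# joint (0.10) != product (0.08) in the first row, so NOT independent.
```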

  11. Conditional independence. Independence: P(x, y) = P(x) P(y). Conditional independence: P(x, y | z) = P(x | z) P(y | z). Equivalent statements of conditional independence: P(x | y, z) = P(x | z), and P(y | x, z) = P(y | z).

  12. Conditional independence: example. (Bayes net: cavity -> toothache, cavity -> catch.) P(toothache, catch | cavity) = P(toothache | cavity) P(catch | cavity). Toothache and catch are conditionally independent given cavity – this is the “common cause” scenario covered in Bayes Nets...

  13. Examples of conditional independence What are the conditional independence relationships in the following? – traffic, raining, late for work – snow, cloudy, crash – fire, smoke, alarm

  14. Markov Processes. (diagram: state at time=1 -> state at time=2, connected by transitions.) A Markov model can be used to model any sequential time process – the weather – traffic – the stock market – the news cycle ...

  15. Markov Processes. Since this is a Markov process, we assume transitions are Markov. Process model: P(x_t | x_{t-1}). Markov assumption: P(x_t | x_1, ..., x_{t-1}) = P(x_t | x_{t-1}), i.e. the next state depends only on the current state.

  16-19. Markov Processes. How do we calculate the joint distribution P(x_1, x_2, ..., x_T)? By the chain rule: P(x_1, ..., x_T) = P(x_1) P(x_2 | x_1) P(x_3 | x_1, x_2) ... P(x_T | x_1, ..., x_{T-1}). Can we simplify this expression?

  20-23. Markov Processes. Yes: by the Markov assumption, each term P(x_t | x_1, ..., x_{t-1}) reduces to the process model P(x_t | x_{t-1}). In general: P(x_1, ..., x_T) = P(x_1) prod_{t=2}^{T} P(x_t | x_{t-1}).

  24. Markov Processes: example. Two states: cloudy, sunny. Transition model:
  X_{t-1}   X_t      P(X_t | X_{t-1})
  sun       sun      0.8
  sun       cloudy   0.2
  cloudy    sun      0.3
  cloudy    cloudy   0.7
  (diagram: sun -> sun 0.8, sun -> cloudy 0.2, cloudy -> cloudy 0.7, cloudy -> sun 0.3)
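A minimal sketch of this chain in Python: the dictionary encodes the transition table above, sequence_prob applies the factorization P(x_1, ..., x_T) = P(x_1) prod_t P(x_t | x_{t-1}), and sample draws a trajectory by forward simulation (the function names are my own, not from the slides):

```python
import random

# Transition model from the slide: T[prev][nxt] = P(X_t = nxt | X_{t-1} = prev)
T = {
    "sun":    {"sun": 0.8, "cloudy": 0.2},
    "cloudy": {"sun": 0.3, "cloudy": 0.7},
}

def sequence_prob(states, prior):
    """P(x_1, ..., x_T) = P(x_1) * prod_t P(x_t | x_{t-1})."""
    p = prior[states[0]]
    for prev, nxt in zip(states, states[1:]):
        p *= T[prev][nxt]
    return p

def sample(x1, steps):
    """Draw a trajectory by sampling each transition in turn."""
    traj = [x1]
    for _ in range(steps):
        traj.append("sun" if random.random() < T[traj[-1]]["sun"] else "cloudy")
    return traj

print(sequence_prob(["sun", "sun", "cloudy"], {"sun": 0.5, "cloudy": 0.5}))  # 0.5*0.8*0.2 = 0.08
print(sample("sun", 7))
```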

  25. Simulating dynamics forward. Joint distribution: P(x_1, ..., x_T) = P(x_1) prod_{t=2}^{T} P(x_t | x_{t-1}). But, suppose we want to predict the state at time T, given a prior distribution at time 1? Push the distribution forward one step at a time: P(x_{t+1}) = sum_{x_t} P(x_{t+1} | x_t) P(x_t).

  26. Simulating dynamics forward. Suppose it is sunny on Mon. Then, using the transition model above: Prob sunny Tue = 0.8; Prob sunny Wed = 0.7; Prob sunny Thu = 0.65; Prob sunny Fri = 0.625.

  27. Simulating dynamics forward. Suppose it is cloudy on Mon. Then: Prob sunny Tue = 0.3; Prob sunny Wed = 0.45; Prob sunny Thu = 0.525; Prob sunny Fri = 0.5625.

  28. Simulating dynamics forward. Suppose it is cloudy on Mon: Prob sunny Tue = 0.3; Wed = 0.45; Thu = 0.525; Fri = 0.5625. Either way, the probabilities converge to the same distribution (here P(sun) = 0.6) regardless of starting point – called the “stationary distribution”.
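The Tuesday-Friday numbers can be reproduced by repeatedly pushing the state distribution through the transition model. A minimal sketch, reusing the transition table from slide 24:

```python
# Sketch: forward-simulate the distribution over {sun, cloudy} and watch
# both starting points converge toward the stationary distribution.
T = {"sun": {"sun": 0.8, "cloudy": 0.2},
     "cloudy": {"sun": 0.3, "cloudy": 0.7}}

def step(p):
    # p'(x_{t+1}) = sum_{x_t} P(x_{t+1} | x_t) p(x_t)
    return {s: sum(T[prev][s] * p[prev] for prev in p) for s in T}

for label, start in [("sunny Mon", {"sun": 1.0, "cloudy": 0.0}),
                     ("cloudy Mon", {"sun": 0.0, "cloudy": 1.0})]:
    p = start
    for day in ["Tue", "Wed", "Thu", "Fri"]:
        p = step(p)
        print(f"{label}: P(sun on {day}) = {p['sun']:.4f}")
# Both runs head toward P(sun) = 0.6, the stationary distribution.
```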

  29. An aside: the stationary distribution. How might you calculate the stationary distribution? Let p_t be the distribution over states at time t, and let A be the transition matrix, A_{ij} = P(x_{t+1} = j | x_t = i). Then: p_{t+1} = A^T p_t. The stationary distribution is the value for p such that: p = A^T p.

  30. An aside: the stationary distribution. How might you calculate the stationary distribution? Let p_t be the distribution over states at time t and A the transition matrix; then p_{t+1} = A^T p_t, and the stationary distribution is the p such that p = A^T p. How do we calculate a p that satisfies this equation? It is an eigenvector equation: p is the eigenvector of A^T with eigenvalue 1 (normalized to sum to 1); equivalently, keep applying the update p <- A^T p until it stops changing.
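A sketch of both options using numpy; the state ordering [sun, cloudy] and the weather-chain matrix are carried over from slide 24:

```python
import numpy as np

# Transition matrix with rows summing to 1; order: [sun, cloudy].
A = np.array([[0.8, 0.2],
              [0.3, 0.7]])

# Option 1: eigenvector of A^T with eigenvalue 1, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A.T)
v = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
print(v / v.sum())  # [0.6 0.4]

# Option 2: power iteration -- repeatedly apply p <- A^T p.
p = np.array([1.0, 0.0])
for _ in range(100):
    p = A.T @ p
print(p)  # also converges to [0.6 0.4]
```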

  31. Hidden Markov Models (HMMs). Hidden Markov Models: – an extension of the Markov model – in which the state is assumed to be “hidden”.

  32. Hidden Markov Models (HMMs). The state, X_t, is assumed to be unobserved. However, you get to make one observation, E_t, on each timestep; the observation is called an “emission”. Examples: – speech to text – tracking in computer vision – robot localization.

  33. Hidden Markov Models (HMMs). Sensor Markov Assumption: the current observation depends only on the current state: P(e_t | x_{1:t}, e_{1:t-1}) = P(e_t | x_t).

  34. HMM example. You live underground. Every day, your boss comes in either wearing sunglasses or not. Can you infer whether it is sunny out based on whether you see the glasses over a sequence of days? – e.g. what is the probability it is sunny out today if you have seen your boss wear glasses three days in a row? The state (sun/cloudy) is unobserved; only the observations (glasses / no glasses) are observed. Transitions: sun -> sun 0.8, sun -> cloudy 0.2, cloudy -> cloudy 0.7, cloudy -> sun 0.3. Emissions: P(glasses | sun) = 0.7, P(no glasses | sun) = 0.3, P(glasses | cloudy) = 0.4, P(no glasses | cloudy) = 0.6.

  35-37. HMM Filtering. Given a prior distribution, B(x_1), and a series of observations, e_1, ..., e_t, calculate the posterior distribution over the current state, the “beliefs”: B(x_t) = P(x_t | e_{1:t}). Two steps, applied at each timestep, as sketched in the code below:
  – Process update: B'(x_{t+1}) = sum_{x_t} P(x_{t+1} | x_t) B(x_t)
  – Observation update: B(x_{t+1}) ∝ P(e_{t+1} | x_{t+1}) B'(x_{t+1})
  The Kalman filter is perhaps the most famous instance of this idea.
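A minimal sketch of these two updates in Python. The dictionary representation and the function names are my own; the logic follows the equations above:

```python
# Sketch of HMM filtering with dictionaries.
# trans[prev][nxt] = P(x_{t+1} = nxt | x_t = prev)
# emit[state][obs] = P(e_t = obs | x_t = state)

def process_update(belief, trans):
    # B'(x_{t+1}) = sum_{x_t} P(x_{t+1} | x_t) B(x_t)
    return {s: sum(trans[prev][s] * belief[prev] for prev in belief)
            for s in belief}

def observation_update(belief, emit, obs):
    # B(x_t) is proportional to P(e_t | x_t) B'(x_t); normalize at the end.
    unnorm = {s: emit[s][obs] * belief[s] for s in belief}
    z = sum(unnorm.values())
    return {s: p / z for s, p in unnorm.items()}

def hmm_filter(prior, trans, emit, observations):
    """Alternate process and observation updates over a sequence of observations."""
    belief = dict(prior)
    for obs in observations:
        belief = process_update(belief, trans)
        belief = observation_update(belief, emit, obs)
    return belief
```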

  38. Process update: B'(x_{t+1}) = sum_{x_t} P(x_{t+1} | x_t) B(x_t). This is just forward simulation of the Markov model.

  39. Process update: example. (figures: ghost-tracking beliefs at T = 1, T = 2, T = 5.) At T = 1 we are completely certain about the ghost's position; at T = 2 we are a little less certain on the next time step; by T = 5 we have almost completely lost track of the ghost. If we only do the process update, then we typically lose information over time – when might this not be true?

  40. Observation update: B(x_t) = η P(e_t | x_t) B'(x_t), where η = 1 / sum_{x_t} P(e_t | x_t) B'(x_t) is a normalization factor.

  41. Observation update. (figures: belief before observation vs. after observation.) Observations enable the system to gain information – a single observation may not determine system state exactly – but, the more observations, the better.

  42-47. Robot localization example. (figure sequence: the belief over the robot's position, shown on a probability scale from 0 to 1, updated over six successive slides.)

  48. Weather HMM example. (diagram: transitions sun -> sun 0.8, sun -> cloudy 0.2, cloudy -> cloudy 0.7, cloudy -> sun 0.3; emissions P(glasses | sun) = 0.7, P(no glasses | sun) = 0.3, P(glasses | cloudy) = 0.4, P(no glasses | cloudy) = 0.6.)

  49-57. Weather HMM example: filtering walkthrough (reproduced in code below).
  Transition model:
  X_{t-1}   X_t      P(X_t | X_{t-1})
  sun       sun      0.8
  sun       cloudy   0.2
  cloudy    sun      0.3
  cloudy    cloudy   0.7
  Emission model (probability of glasses given the weather):
  X_t      P(g_t | X_t)
  sun      0.7
  cloudy   0.4
  Starting from the prior B(w_1): sun 0.5, cloudy 0.5, and observing glasses on each of the following two days:
  – Process update: sun 0.55, cloudy 0.45
  – Observation update (glasses): sun 0.68, cloudy 0.32
  – Process update: sun 0.64, cloudy 0.36
  – Observation update (glasses): sun 0.76, cloudy 0.24
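Running the filter sketched after slide 37 on this model reproduces the values above; this snippet reuses the process_update and observation_update helpers defined there:

```python
# Weather HMM from the slides, run through the filter sketch above.
trans = {"sun": {"sun": 0.8, "cloudy": 0.2},
         "cloudy": {"sun": 0.3, "cloudy": 0.7}}
emit = {"sun": {"glasses": 0.7, "no_glasses": 0.3},
        "cloudy": {"glasses": 0.4, "no_glasses": 0.6}}
prior = {"sun": 0.5, "cloudy": 0.5}

b = process_update(prior, trans)
print(b)  # {'sun': 0.55, 'cloudy': 0.45}
b = observation_update(b, emit, "glasses")
print(b)  # sun ~0.68, cloudy ~0.32
b = process_update(b, trans)
print(b)  # sun ~0.64, cloudy ~0.36
b = observation_update(b, emit, "glasses")
print(b)  # sun ~0.76, cloudy ~0.24

# Equivalently: hmm_filter(prior, trans, emit, ["glasses", "glasses"])
```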
