  1. Hidden Markov Models

  2. Markov Model (Finite State Machine with Probabilities): modeling a sequence of weather observations. A minimal sampling sketch follows this item.
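
A minimal sketch of such a weather Markov chain, assuming made-up transition probabilities (the slide gives no numeric values):

```python
import random

# Assumed two-state weather chain; these probabilities are illustrative.
transition = {
    "Sunny": {"Sunny": 0.8, "Rainy": 0.2},
    "Rainy": {"Sunny": 0.4, "Rainy": 0.6},
}

def sample_weather(start, length):
    """Sample a weather sequence by repeatedly drawing the next state."""
    seq = [start]
    for _ in range(length - 1):
        states, weights = zip(*transition[seq[-1]].items())
        seq.append(random.choices(states, weights=weights)[0])
    return seq

print(sample_weather("Sunny", 10))  # e.g. ['Sunny', 'Sunny', 'Rainy', ...]
```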

  3. Hidden Markov Models: assume the states in the machine are not observed; instead, we can observe some output emitted at certain states.

  4. Hidden Markov Models: assume the states in the machine are not observed; instead, we can observe some output emitted at certain states. (Diagram: hidden states Sunny and Rainy; observations Walk, Shop, Clean.)

  5. Generate a sequence from an HMM. (Diagram: hidden chain s(i−1) → s(i) → s(i+1) with transitions p(s(i) | s(i−1)) and p(s(i+1) | s(i)); each hidden state s(i) emits an observation x(i) with probability p(x(i) | s(i)).)

  6. Generate a sequence from an HMM (sampling sketch below):
     Hidden (temperature):            HHHHHHCCCCCCCHHHHHH
     Observed (number of ice creams): 3323332111111233332
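
The ice-cream example can be reproduced with a short sampler. The parameter values below are assumptions for illustration (the slides give no numbers): hidden states H/C for hot/cold days, observations 1-3 ice creams.

```python
import random

# Assumed parameters for the ice-cream HMM; the slides give no numbers.
initial = {"H": 0.5, "C": 0.5}                       # H = hot day, C = cold day
transition = {"H": {"H": 0.8, "C": 0.2},
              "C": {"H": 0.3, "C": 0.7}}
emission = {"H": {1: 0.1, 2: 0.3, 3: 0.6},           # ice creams eaten per day
            "C": {1: 0.6, 2: 0.3, 3: 0.1}}

def draw(dist):
    """Draw one key from a {value: probability} dict."""
    keys, weights = zip(*dist.items())
    return random.choices(keys, weights=weights)[0]

def generate(T):
    """Sample hidden and observed sequences of length T from the HMM."""
    s = draw(initial)
    hidden, observed = [], []
    for _ in range(T):
        hidden.append(s)
        observed.append(draw(emission[s]))   # x(i) ~ p(x(i) | s(i))
        s = draw(transition[s])              # s(i+1) ~ p(s(i+1) | s(i))
    return hidden, observed

h, o = generate(19)
print("".join(h))               # e.g. HHHHHHCCCCCCCHHHHHH
print("".join(map(str, o)))     # e.g. 3323332111111233332
```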

  7. Hidden Markov Models: applications include speech recognition and action recognition.

  8. Motif Finding Problem: find frequent motifs of length L in a sequence dataset, e.g. ATCGCGCGGCGCGGAATCGDTATCGCGCGCC CAGGTAAGT GCGCGCG CAGGTAAGG TATTATGCGAGACGATGTGCTATT GTAGGCTGATGTGGGGGG AAGGTAAGT CGAGGAGTGCATG CTAGGGAAACCGCGCGCGCGCGAT AAGGTGAGT GGGAAAG. Assumption: the motifs are very similar to each other but look very different from the rest of the sequences.

  9. Motif: a first approximation. Assumption 1: lengths of motifs are fixed to L. Assumption 2: states at different positions of the sequence are independently distributed.
     $p_i(A) = \frac{N_i(A)}{N_i(A) + N_i(T) + N_i(G) + N_i(C)}, \qquad p(x) = \prod_{i=1}^{L} p_i(x(i))$
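
A minimal sketch of this position-independent model, using the four motif-like substrings that appear in the slide-8 dataset; the function names are my own:

```python
from collections import Counter

# The four motif-like substrings from the slide-8 dataset; L = 9.
motifs = ["CAGGTAAGT", "CAGGTAAGG", "AAGGTAAGT", "AAGGTGAGT"]
L = len(motifs[0])

def position_probs(motifs):
    """p_i(a) = N_i(a) / (N_i(A) + N_i(T) + N_i(G) + N_i(C))."""
    probs = []
    for i in range(L):
        counts = Counter(m[i] for m in motifs)
        total = sum(counts.values())
        probs.append({a: counts[a] / total for a in "ACGT"})
    return probs

def motif_prob(x, probs):
    """p(x) = prod_{i=1}^{L} p_i(x(i)), assuming positions are independent."""
    p = 1.0
    for i, a in enumerate(x):
        p *= probs[i].get(a, 0.0)
    return p

probs = position_probs(motifs)
print(motif_prob("CAGGTAAGT", probs))
```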

  10. Motif: (Hidden) Markov models. Assumption 1: lengths of motifs are fixed to L. Assumption 2: future letters depend only on the present letter.
      $p_i(A \mid G) = \frac{N_{i-1,i}(G, A)}{N_{i-1}(G)}, \qquad p(x) = p_1(x(1)) \prod_{i=2}^{L} p_i(x(i) \mid x(i-1))$
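
The first-order variant can be sketched the same way; again the motif instances are from the slide-8 dataset and the helper names are illustrative:

```python
from collections import Counter

motifs = ["CAGGTAAGT", "CAGGTAAGG", "AAGGTAAGT", "AAGGTGAGT"]
L = len(motifs[0])

# Counts for the first position and for each adjacent pair of positions.
first = Counter(m[0] for m in motifs)
pairs = [Counter((m[i - 1], m[i]) for m in motifs) for i in range(1, L)]

def markov_prob(x):
    """p(x) = p_1(x(1)) * prod_{i=2}^{L} p_i(x(i) | x(i-1))."""
    p = first[x[0]] / len(motifs)
    for i in range(1, L):
        # p_i(a | b) = N_{i-1,i}(b, a) / N_{i-1}(b)
        pair = pairs[i - 1][(x[i - 1], x[i])]
        prev = sum(c for (b, _), c in pairs[i - 1].items() if b == x[i - 1])
        p *= pair / prev if prev else 0.0
    return p

print(markov_prob("CAGGTAAGT"))
```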

  11. Motif Finding Problem: we don't know the exact locations of the motifs in the sequence dataset (same example data as slide 8). Assumption: the motifs are very similar to each other but look very different from the rest of the sequences.

  12. Hidden state space. (Diagram: states start, null, and end.)

  13. Hidden Markov Model (HMM). (Diagram: the state space above, annotated with transition probabilities 0.99, 0.9, 0.95, 0.08, 0.05, 0.02, 0.01.)

  14. How to build HMMs?

  15. Computational problems in HMMs: computing the probability of an observation sequence, and decoding the most probable hidden states (covered in the following slides).

  16. Hidden Markov Models

  17. Hidden Markov Model. (Diagram: hidden chain q(i−1) → q(i) → q(i+1); each hidden state q(i) emits an observation o(i).)

  18. Conditional Probability of Observations: given the hidden state sequence, the observations are conditionally independent, so $p(o \mid q) = \prod_{i} p(o(i) \mid q(i))$.

  19. Joint and marginal probabilities. Joint: $p(o, q) = p(q(1)) \prod_{i=2}^{T} p(q(i) \mid q(i-1)) \prod_{i=1}^{T} p(o(i) \mid q(i))$. Marginal: $p(o) = \sum_{q} p(o, q)$.

  20. How to compute the probability of observations: summing $p(o, q)$ over all $N^T$ hidden state sequences is exponential in the sequence length $T$ (brute-force sketch below); the forward algorithm computes the same marginal efficiently.
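
A sketch of the brute-force marginal, reusing the assumed ice-cream parameters from the earlier sampler; it is feasible only for tiny T:

```python
from itertools import product

# Same assumed ice-cream HMM parameters as in the earlier sketch.
states = ["H", "C"]
initial = {"H": 0.5, "C": 0.5}
transition = {"H": {"H": 0.8, "C": 0.2}, "C": {"H": 0.3, "C": 0.7}}
emission = {"H": {1: 0.1, 2: 0.3, 3: 0.6}, "C": {1: 0.6, 2: 0.3, 3: 0.1}}

def brute_force_likelihood(obs):
    """p(o) = sum over all N^T hidden sequences q of p(o, q)."""
    total = 0.0
    for q in product(states, repeat=len(obs)):
        p = initial[q[0]] * emission[q[0]][obs[0]]
        for t in range(1, len(obs)):
            p *= transition[q[t - 1]][q[t]] * emission[q[t]][obs[t]]
        total += p
    return total

print(brute_force_likelihood([3, 1, 3]))   # feasible only for tiny T
```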

  21. Forward algorithm: define $\alpha_t(j) = p(o(1), \dots, o(t), q(t) = j)$, the probability of the first $t$ observations with the chain ending in state $j$.

  22. Forward algorithm: initialization $\alpha_1(j) = p(q(1) = j)\, p(o(1) \mid j)$; recursion $\alpha_t(j) = \big[ \sum_i \alpha_{t-1}(i)\, p(q(t) = j \mid q(t-1) = i) \big]\, p(o(t) \mid j)$.

  23. Forward algorithm: termination $p(o) = \sum_j \alpha_T(j)$, giving the marginal in $O(N^2 T)$ time for $N$ states and $T$ observations.
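
A sketch of the forward recursion above, again under the assumed ice-cream parameters (not values from the slides):

```python
# Same assumed ice-cream HMM parameters as in the earlier sketches.
states = ["H", "C"]
initial = {"H": 0.5, "C": 0.5}
transition = {"H": {"H": 0.8, "C": 0.2}, "C": {"H": 0.3, "C": 0.7}}
emission = {"H": {1: 0.1, 2: 0.3, 3: 0.6}, "C": {1: 0.6, 2: 0.3, 3: 0.1}}

def forward(obs):
    """alpha[t][j] = p(o(1..t), q(t) = j); O(N^2 T) instead of O(N^T)."""
    # Initialization: alpha_1(j) = p(q(1) = j) p(o(1) | j)
    alpha = [{j: initial[j] * emission[j][obs[0]] for j in states}]
    # Recursion: alpha_t(j) = [sum_i alpha_{t-1}(i) a_{ij}] p(o(t) | j)
    for t in range(1, len(obs)):
        alpha.append({j: sum(alpha[t - 1][i] * transition[i][j] for i in states)
                         * emission[j][obs[t]]
                      for j in states})
    # Termination: p(o) = sum_j alpha_T(j)
    return sum(alpha[-1].values())

print(forward([3, 1, 3]))   # agrees with the brute-force sum above
```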

  24. Decoding: finding the most probable states. Similar to the forward algorithm, we can define the following value: $\delta_t(j) = \max_{q(1), \dots, q(t-1)} p(q(1), \dots, q(t-1), q(t) = j, o(1), \dots, o(t))$.

  25. Viterbi algorithm: recursion $\delta_t(j) = \max_i \delta_{t-1}(i)\, p(q(t) = j \mid q(t-1) = i)\, p(o(t) \mid j)$, keeping a backpointer to the maximizing $i$; backtracking from $\arg\max_j \delta_T(j)$ recovers the most probable state sequence.
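
A matching Viterbi sketch under the same assumed ice-cream parameters:

```python
# Same assumed ice-cream HMM parameters as in the earlier sketches.
states = ["H", "C"]
initial = {"H": 0.5, "C": 0.5}
transition = {"H": {"H": 0.8, "C": 0.2}, "C": {"H": 0.3, "C": 0.7}}
emission = {"H": {1: 0.1, 2: 0.3, 3: 0.6}, "C": {1: 0.6, 2: 0.3, 3: 0.1}}

def viterbi(obs):
    """delta[t][j] = max over q(1..t-1) of p(q(1..t-1), q(t) = j, o(1..t))."""
    delta = [{j: initial[j] * emission[j][obs[0]] for j in states}]
    back = [{}]
    for t in range(1, len(obs)):
        delta.append({})
        back.append({})
        for j in states:
            # Best predecessor i, stored as a backpointer.
            i = max(states, key=lambda s: delta[t - 1][s] * transition[s][j])
            back[t][j] = i
            delta[t][j] = delta[t - 1][i] * transition[i][j] * emission[j][obs[t]]
    state = max(states, key=lambda j: delta[-1][j])   # best final state
    path = [state]
    for t in range(len(obs) - 1, 0, -1):              # follow backpointers
        state = back[t][state]
        path.append(state)
    return path[::-1]

print(viterbi([3, 1, 3]))   # most probable hidden states, e.g. ['H', 'C', 'H']
```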
