SLIDE 1
Hidden Markov Models
Markov Model (Finite State Machine with Probabilities)
Modeling a sequence of weather observations
SLIDE 2
SLIDE 3
Hidden Markov Models
Assume the states of the machine are not observed; we can only observe some output emitted at certain states.
SLIDE 4
Hidden Markov Models
Assume the states of the machine are not observed; we can only observe some output emitted at certain states.
Hidden states: Sunny, Rainy. Observations: Walk, Shop, Clean.
SLIDE 5
Generate a sequence from an HMM
[Graphical model: a chain of hidden states s(i−1) → s(i) → s(i+1) with transition probabilities p(s(i) | s(i−1)) and p(s(i+1) | s(i)); each hidden state s(i) emits an observation x(i) with emission probability p(x(i) | s(i)). Top row hidden, bottom row observed.]
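In equations, the diagram encodes the standard HMM factorization of the joint distribution over hidden states and observations, written in the slide's notation:

p(s(1), \ldots, s(T), x(1), \ldots, x(T)) = p(s(1))\, p(x(1) \mid s(1)) \prod_{i=2}^{T} p(s(i) \mid s(i-1))\, p(x(i) \mid s(i))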
SLIDE 6
Generate a sequence from an HMM
Hidden (temperature):        H H H H H H C C C C C C C H H H H H H
Observed (ice creams eaten): 3 3 2 3 3 3 2 1 1 1 1 1 1 2 3 3 3 3 2
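A minimal sketch of how such a sequence can be generated (ancestral sampling): draw the first hidden state, then alternately emit an observation and transition to the next state. All parameter values below are illustrative assumptions, not the lecture's values.

```python
import random

# Ancestral sampling from an HMM. H = hot day, C = cold day; the
# observation is the number of ice creams eaten that day (1, 2, or 3).
# All probabilities are illustrative assumptions.
start = {"H": 0.8, "C": 0.2}
trans = {"H": {"H": 0.7, "C": 0.3}, "C": {"H": 0.4, "C": 0.6}}
emit = {"H": {1: 0.2, 2: 0.4, 3: 0.4}, "C": {1: 0.5, 2: 0.4, 3: 0.1}}

def draw(dist):
    """Sample a key from a {value: probability} dict."""
    r, acc = random.random(), 0.0
    for value, p in dist.items():
        acc += p
        if r < acc:
            return value
    return value  # guard against floating-point rounding

def sample_hmm(length):
    hidden, observed = [], []
    s = draw(start)
    for _ in range(length):
        hidden.append(s)
        observed.append(draw(emit[s]))   # emit from the current state
        s = draw(trans[s])               # transition to the next state
    return "".join(hidden), "".join(map(str, observed))

print(sample_hmm(19))  # e.g. ('HHHHCCCC...', '33211123...')
```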
SLIDE 7
Hidden Markov Models: Applications
Speech recognition
Action recognition
SLIDE 8
Motif Finding
Problem: find frequent motifs of length L in a sequence dataset.
Assumption: the motifs are very similar to each other but look very different from the rest of the sequences.
ATCGCGCGGCGCGGAATCGDTATCGCGCGCCCAGGTAAGT
GCGCGCGCAGGTAAGGTATTATGCGAGACGATGTGCTATT
GTAGGCTGATGTGGGGGGAAGGTAAGTCGAGGAGTGCATG
CTAGGGAAACCGCGCGCGCGCGATAAGGTGAGTGGGAAAG
SLIDE 9
Motif: a first approximation
Assumption 1: the motif length is fixed to L.
Assumption 2: letters at different positions of the sequence are independently distributed.

p(x) = \prod_{i=1}^{L} p_i(x(i)), \qquad
p_i(A) = \frac{N_i(A)}{N_i(A) + N_i(T) + N_i(G) + N_i(C)}
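A sketch of how these position-specific frequencies p_i could be estimated from a set of aligned motif instances; `position_frequencies` and the toy data are hypothetical, not from the lecture.

```python
from collections import Counter

def position_frequencies(motifs):
    """Estimate p_i(letter) for each position i of aligned motif
    instances of equal length L (the independence model above):
    p_i(A) = N_i(A) / (N_i(A) + N_i(T) + N_i(G) + N_i(C))."""
    L = len(motifs[0])
    profile = []
    for i in range(L):
        counts = Counter(m[i] for m in motifs)   # N_i(A), N_i(T), ...
        total = sum(counts.values())
        profile.append({b: counts[b] / total for b in "ATGC"})
    return profile

# Toy aligned motifs (illustrative data):
print(position_frequencies(["GCGC", "GCGG", "GAGC"]))
```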
SLIDE 10
Motif: (Hidden) Markov models
Assumption 1: the motif length is fixed to L.
Assumption 2: each letter depends only on the immediately preceding letter (the Markov property).

p(x) = p_1(x(1)) \prod_{i=2}^{L} p_i(x(i) \mid x(i-1)), \qquad
p_i(A \mid G) = \frac{N_{i-1,i}(G, A)}{N_{i-1}(G)}
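Correspondingly, the first-order transition probabilities p_i(a | b) could be estimated from counts of adjacent letter pairs; again a sketch with a hypothetical helper and toy data.

```python
from collections import Counter

def transition_probs(motifs, i):
    """Estimate p_i(a | b) = N_{i-1,i}(b, a) / N_{i-1}(b): how often
    letter a at position i follows letter b at position i-1 (0-indexed)."""
    pair_counts = Counter((m[i - 1], m[i]) for m in motifs)
    prev_counts = Counter(m[i - 1] for m in motifs)
    return {(b, a): n / prev_counts[b] for (b, a), n in pair_counts.items()}

# p_2(letter | previous letter) for toy motifs, e.g. p_2(C | G) = 2/3:
print(transition_probs(["GCGC", "GCGG", "GAGC"], 1))
```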
SLIDE 11
Motif Finding
Problem: we don't know the exact locations of the motifs in the sequence dataset.
Assumption: the motifs are very similar to each other but look very different from the rest of the sequences.
ATCGCGCGGCGCGGAATCGDTATCGCGCGCCCAGGTAAGT
GCGCGCGCAGGTAAGGTATTATGCGAGACGATGTGCTATT
GTAGGCTGATGTGGGGGGAAGGTAAGTCGAGGAGTGCATG
CTAGGGAAACCGCGCGCGCGCGATAAGGTGAGTGGGAAAG
SLIDE 12
Hidden state space
[Diagram: hidden state space with start, null (background), and end states.]
SLIDE 13
Hidden Markov Model (HMM)
[Diagram: the start/null/end state space annotated with transition probabilities; the labels 0.9, 0.08, 0.95, 0.05, 0.01, 0.99, and 0.02 appear on its edges.]
SLIDE 14
How to build HMMs?
SLIDE 15
Computational problems in HMMs
SLIDE 16
Hidden Markov Models
SLIDE 17
Hidden Markov Model
[Graphical model: a chain of hidden states q(i−1) → q(i) → q(i+1), each emitting one observation per position. Top row hidden, bottom row observed.]
SLIDE 18
Conditional Probability of Observations
Example: see the worked calculation below.
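In standard HMM notation (symbols assumed here: emission probabilities b_j(o), hidden states q_t, model parameters λ), the observations are conditionally independent given the hidden-state sequence, so:

P(O \mid Q, \lambda) = \prod_{t=1}^{T} P(o_t \mid q_t, \lambda) = \prod_{t=1}^{T} b_{q_t}(o_t)

As a worked case with the illustrative ice-cream parameters above: P(3, 1, 3 \mid H, H, H) = 0.4 \times 0.2 \times 0.4 = 0.032.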
SLIDE 19
Joint and marginal probabilities
Joint: P(O, Q | λ). Marginal: P(O | λ).
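Written out in the same assumed notation (initial distribution π, transition probabilities a_{ij}, emissions b_j(o)):

P(O, Q \mid \lambda) = P(Q \mid \lambda)\, P(O \mid Q, \lambda) = \pi_{q_1} b_{q_1}(o_1) \prod_{t=2}^{T} a_{q_{t-1} q_t}\, b_{q_t}(o_t)

P(O \mid \lambda) = \sum_{Q} P(O, Q \mid \lambda)

The marginal sums over all N^T possible state sequences, which is why the forward algorithm on the following slides is needed instead of brute-force enumeration.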
SLIDE 20
How to compute the probability of observations?
SLIDE 21
Forward algorithm
SLIDE 22
Forward algorithm
SLIDE 23
Forward algorithm
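A sketch of the standard forward recursion, in the assumed notation \alpha_t(j) = P(o_1 \ldots o_t, q_t = j \mid \lambda):

\alpha_1(j) = \pi_j\, b_j(o_1), \qquad
\alpha_{t+1}(j) = \Big[ \sum_{i=1}^{N} \alpha_t(i)\, a_{ij} \Big] b_j(o_{t+1}), \qquad
P(O \mid \lambda) = \sum_{j=1}^{N} \alpha_T(j)

A minimal Python translation, reusing the illustrative ice-cream parameters (assumptions, not lecture values); it runs in O(N²T) time rather than the O(N^T) of enumerating all paths:

```python
# Illustrative ice-cream HMM (assumed parameters, as in the sampling sketch).
states = ["H", "C"]
start = {"H": 0.8, "C": 0.2}
trans = {"H": {"H": 0.7, "C": 0.3}, "C": {"H": 0.4, "C": 0.6}}
emit = {"H": {1: 0.2, 2: 0.4, 3: 0.4}, "C": {1: 0.5, 2: 0.4, 3: 0.1}}

def forward(obs):
    """Forward algorithm: returns P(obs | model).
    alpha[j] = P(o_1..o_t, q_t = j) at the current step t."""
    alpha = {j: start[j] * emit[j][obs[0]] for j in states}
    for o in obs[1:]:
        alpha = {j: sum(alpha[i] * trans[i][j] for i in states) * emit[j][o]
                 for j in states}
    return sum(alpha.values())

print(forward([3, 1, 3]))  # ≈ 0.0263 with these assumed parameters
```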
SLIDE 24
Decoding: finding the most probable states
Similar to the forward algorithm, we can define the following value:
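In the assumed notation, that value is the Viterbi score δ_t(j), the probability of the best state path that ends in state j at time t:

\delta_t(j) = \max_{q_1, \ldots, q_{t-1}} P(q_1 \cdots q_{t-1},\, q_t = j,\, o_1 \cdots o_t \mid \lambda),
\qquad
\delta_{t+1}(j) = \Big[ \max_i \delta_t(i)\, a_{ij} \Big] b_j(o_{t+1})

Replacing the forward algorithm's sum with a max, and keeping back-pointers, gives a minimal Viterbi decoder (a sketch under the same assumptions):

```python
def viterbi(obs, states, start, trans, emit):
    """Most probable hidden-state sequence for obs (Viterbi decoding).
    delta[j] = probability of the best path ending in state j so far."""
    delta = {j: start[j] * emit[j][obs[0]] for j in states}
    back = []                               # one back-pointer table per step
    for o in obs[1:]:
        prev, delta, ptr = delta, {}, {}
        for j in states:
            best = max(states, key=lambda i: prev[i] * trans[i][j])
            delta[j] = prev[best] * trans[best][j] * emit[j][o]
            ptr[j] = best
        back.append(ptr)
    # Trace back from the best final state.
    path = [max(states, key=lambda j: delta[j])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

# With the illustrative ice-cream parameters above:
# viterbi([3, 1, 3], states, start, trans, emit) -> ['H', 'H', 'H']
```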
SLIDE 25
SLIDE 26