Hidden Markov Models - PowerPoint PPT Presentation



SLIDE 1

Hidden Markov Models

SLIDE 2

Markov Model (Finite State Machine with Probabilities)

Modeling a sequence of weather observations

SLIDE 3

Hidden Markov Models

Assume that the states of the machine are not observed, and that we can instead observe an output emitted at certain states.

SLIDE 4

Hidden Markov Models

Assume that the states of the machine are not observed, and that we can instead observe an output emitted at certain states.

Hidden states: Sunny, Rainy
Observations: Walk, Shop, Clean

SLIDE 5

Figure: HMM as a chain. Hidden states s(i − 1) → s(i) → s(i + 1) are linked by transition probabilities p(s(i) | s(i − 1)) and p(s(i + 1) | s(i)); each hidden state s(i) emits an observation x(i) with emission probability p(x(i) | s(i)). The s-chain is hidden, the x's are observed.

Generate a sequence from an HMM

SLIDE 6

Hidden (temperature, H = hot, C = cold): HHHHHHCCCCCCCHHHHHH
Observed (number of ice creams):         3323332111111233332

Generate a sequence from an HMM
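A hidden/observed pair like the one above can be produced by ancestral sampling: draw a start state, then alternately emit an observation and transition. A minimal sketch; the slides give only the state and observation alphabets, so every probability below is made up for illustration.

```python
import random

random.seed(0)  # reproducible sampling

# Hypothetical HMM parameters (all numbers are assumptions)
states = ["H", "C"]                            # hot / cold (hidden)
start = {"H": 0.5, "C": 0.5}
trans = {"H": {"H": 0.8, "C": 0.2},
         "C": {"H": 0.3, "C": 0.7}}
emit = {"H": {"1": 0.1, "2": 0.3, "3": 0.6},   # ice creams (observed)
        "C": {"1": 0.6, "2": 0.3, "3": 0.1}}

def sample(dist):
    """Draw one outcome from a {outcome: probability} dict."""
    r, acc = random.random(), 0.0
    for outcome, p in dist.items():
        acc += p
        if r < acc:
            return outcome
    return outcome  # guard against floating-point rounding

def generate(n):
    """Ancestral sampling: start state, then emit/transition n times."""
    s, hidden, observed = sample(start), [], []
    for _ in range(n):
        hidden.append(s)
        observed.append(sample(emit[s]))
        s = sample(trans[s])
    return "".join(hidden), "".join(observed)

hidden, observed = generate(19)  # same length as the slide's example
```

Running this yields one hidden string over {H, C} and one observed string over {1, 2, 3}, analogous to the pair on the slide.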

SLIDE 7

Speech recognition
Action recognition

Hidden Markov Models: Applications

SLIDE 8

Motif Finding

Problem: find frequent motifs of length L in a sequence dataset.
Assumption: the motifs are very similar to each other but look very different from the rest of the sequences.

ATCGCGCGGCGCGGAATCGDTATCGCGCGCCCAGGTAAGT
GCGCGCGCAGGTAAGGTATTATGCGAGACGATGTGCTATT
GTAGGCTGATGTGGGGGGAAGGTAAGTCGAGGAGTGCATG
CTAGGGAAACCGCGCGCGCGCGATAAGGTGAGTGGGAAAG

SLIDE 9

Motif: a first approximation

Assumption 1: the lengths of motifs are fixed to L.
Assumption 2: letters at different positions in the sequence are independently distributed:

p(x) = ∏_{i=1}^{L} p_i(x(i)),  where  p_i(A) = N_i(A) / (N_i(A) + N_i(T) + N_i(G) + N_i(C))
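Under these two assumptions the model is just a position frequency table. A minimal sketch; the aligned motif instances below are hypothetical, since the slides' dataset is unaligned.

```python
# Hypothetical aligned motif instances of fixed length L; the counts
# implement p_i(A) = N_i(A) / (N_i(A) + N_i(T) + N_i(G) + N_i(C)).
motifs = ["AGGTAAGT", "AGGTAAGG", "AAGTAAGT", "AGGTGAGT"]
L = len(motifs[0])
alphabet = "ATGC"

# p[i][a] = p_i(a): per-position letter probabilities
p = []
for i in range(L):
    counts = {a: sum(m[i] == a for m in motifs) for a in alphabet}
    total = sum(counts.values())
    p.append({a: counts[a] / total for a in alphabet})

def prob(x):
    """p(x) = product over i of p_i(x(i)), positions independent."""
    result = 1.0
    for i, a in enumerate(x):
        result *= p[i][a]
    return result
```

Each column of `p` sums to one, and a string scores highly exactly when its letters match the frequent letter at each position.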

SLIDE 10

Motif: (Hidden) Markov models

Assumption 1: the lengths of motifs are fixed to L.
Assumption 2: future letters depend only on the present letter:

p(x) = p_1(x(1)) ∏_{i=2}^{L} p_i(x(i) | x(i − 1)),  where  p_i(A | G) = N_{i−1,i}(G, A) / N_{i−1}(G)
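The first-order model replaces independent positions with transitions estimated from adjacent-position pair counts. A sketch on the same kind of hypothetical aligned instances:

```python
# Hypothetical aligned motif instances; the counts implement
# p_i(A | G) = N_{i-1,i}(G, A) / N_{i-1}(G) from the slide.
motifs = ["AGGTAAGT", "AGGTAAGG", "AAGTAAGT", "AGGTGAGT"]
L = len(motifs[0])

def p1(a):
    """p_1(a): marginal probability of letter a at the first position."""
    return sum(m[0] == a for m in motifs) / len(motifs)

def ptrans(i, a, b):
    """p_i(a | b): letter a at position i given letter b at i - 1."""
    n_prev = sum(m[i - 1] == b for m in motifs)                # N_{i-1}(b)
    n_pair = sum(m[i - 1] == b and m[i] == a for m in motifs)  # N_{i-1,i}(b, a)
    return n_pair / n_prev if n_prev else 0.0

def prob(x):
    """p(x) = p_1(x(1)) * product_{i=2}^{L} p_i(x(i) | x(i-1))."""
    result = p1(x[0])
    for i in range(1, L):
        result *= ptrans(i, x[i], x[i - 1])
    return result
```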

SLIDE 11

Motif Finding

Problem: we do not know the exact locations of the motifs in the sequence dataset.
Assumption: the motifs are very similar to each other but look very different from the rest of the sequences.

ATCGCGCGGCGCGGAATCGDTATCGCGCGCCCAGGTAAGT
GCGCGCGCAGGTAAGGTATTATGCGAGACGATGTGCTATT
GTAGGCTGATGTGGGGGGAAGGTAAGTCGAGGAGTGCATG
CTAGGGAAACCGCGCGCGCGCGATAAGGTGAGTGGGAAAG

SLIDE 12

Hidden state space

Figure: hidden state space with states null, start, and end.

SLIDE 13

Hidden Markov Model (HMM)

Figure: HMM state diagram over the states null, start, and end; the slide labels the edges with the transition probabilities 0.9, 0.08, 0.95, 0.05, 0.01, 0.99, and 0.02.

SLIDE 14

How to build HMMs?

SLIDE 15

Computational problems in HMMs

SLIDE 16

Hidden Markov Models

SLIDE 17

Hidden Markov Model

Figure: HMM chain with hidden states q(i − 1) → q(i) → q(i + 1), each emitting one observation per step; the q-chain is hidden, the outputs are observed.

SLIDE 18

Conditional Probability of Observations

Example:
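Given a known hidden state sequence, the conditional probability of the observations factorizes over positions as a product of emission probabilities. A minimal sketch with made-up emission probabilities for a hot/cold ice-cream model (the slide's own worked numbers are not shown here):

```python
# Hypothetical emission probabilities p(obs | state) for hot/cold days
emit = {"H": {1: 0.1, 2: 0.3, 3: 0.6},
        "C": {1: 0.6, 2: 0.3, 3: 0.1}}

def cond_prob(obs, states):
    """p(obs | states) = product over i of p(obs[i] | states[i])."""
    result = 1.0
    for o, q in zip(obs, states):
        result *= emit[q][o]
    return result

# e.g. p(3, 1, 3 | H, C, H) = 0.6 * 0.6 * 0.6 = 0.216
```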

SLIDE 19

Joint and marginal probabilities

Joint: Marginal:
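In the chain notation of slide 5, the joint probability of hidden states and observations, and the marginal over observations, are:

```latex
% Joint: chain rule with the Markov and emission assumptions
p(x, s) = p(s(1))\, p(x(1) \mid s(1)) \prod_{i=2}^{T} p(s(i) \mid s(i-1))\, p(x(i) \mid s(i))

% Marginal: sum the joint over all hidden state sequences
p(x) = \sum_{s(1), \dots, s(T)} p(x, s)
```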

SLIDE 20

How to compute the probability of observations?

SLIDE 21

Forward algorithm

SLIDE 22

Forward algorithm

SLIDE 23

Forward algorithm
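The forward algorithm computes the marginal p(x) by dynamic programming over alpha_t(s) = p(x(1..t), s(t) = s), summing over predecessor states at each step. A minimal sketch on a hypothetical two-state hot/cold ice-cream HMM (all numbers are made up; the slides' parameters are not given):

```python
# Hypothetical two-state HMM: hot/cold days, ice creams observed
states = ["H", "C"]
start = {"H": 0.5, "C": 0.5}
trans = {"H": {"H": 0.8, "C": 0.2},
         "C": {"H": 0.3, "C": 0.7}}
emit = {"H": {1: 0.1, 2: 0.3, 3: 0.6},
        "C": {1: 0.6, 2: 0.3, 3: 0.1}}

def forward(obs):
    """p(obs): sum of alpha_T(s) over final states s."""
    # alpha[s] = p(x(1..t), s(t) = s), initialized at t = 1
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    for x in obs[1:]:
        # recursion: sum over predecessor states r, then emit x
        alpha = {s: emit[s][x] * sum(alpha[r] * trans[r][s] for r in states)
                 for s in states}
    return sum(alpha.values())
```

The recursion costs O(T·|S|²) instead of summing over all |S|^T state paths.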

SLIDE 24

Decoding: finding the most probable states

Similar to the forward algorithm, we can define the following value:
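In the same notation, the standard decoding quantity mirrors the forward value with the sum over predecessors replaced by a max:

```latex
% Probability of the best state path ending in state s at step t
\delta_t(s) = \max_{s(1), \dots, s(t-1)} p\big(x(1..t),\, s(1..t-1),\, s(t) = s\big)

% Recursion: as in the forward algorithm, but with max instead of sum
\delta_t(s) = \max_{r} \; \delta_{t-1}(r)\, p(s \mid r)\, p(x(t) \mid s)
```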

SLIDE 25

SLIDE 26

Viterbi algorithm
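A minimal sketch of Viterbi decoding on the same hypothetical two-state hot/cold parameters (all numbers are assumptions): run the forward recursion with max in place of sum, keep backpointers, and trace back the most probable hidden path.

```python
# Hypothetical two-state HMM: hot/cold days, ice creams observed
states = ["H", "C"]
start = {"H": 0.5, "C": 0.5}
trans = {"H": {"H": 0.8, "C": 0.2},
         "C": {"H": 0.3, "C": 0.7}}
emit = {"H": {1: 0.1, 2: 0.3, 3: 0.6},
        "C": {1: 0.6, 2: 0.3, 3: 0.1}}

def viterbi(obs):
    """Most probable hidden state sequence for obs (Viterbi decoding)."""
    # delta[s] = probability of the best path ending in s at step t
    delta = {s: start[s] * emit[s][obs[0]] for s in states}
    backptrs = []
    for x in obs[1:]:
        new_delta, ptr = {}, {}
        for s in states:
            best = max(states, key=lambda r: delta[r] * trans[r][s])
            ptr[s] = best                      # remember best predecessor
            new_delta[s] = delta[best] * trans[best][s] * emit[s][x]
        backptrs.append(ptr)
        delta = new_delta
    last = max(states, key=lambda s: delta[s])  # best final state
    path = [last]
    for ptr in reversed(backptrs):              # trace backpointers
        path.append(ptr[path[-1]])
    return "".join(reversed(path))
```

With these parameters, a run of 3s decodes to hot days and a run of 1s to cold days, matching the intuition of the ice-cream example.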