http://cvsp.cs.ntua.gr/courses/patrec
HMM & DTW
Forward Algorithm
Backward Algorithm
Probability Functions for Local State Estimation
q_t* = argmax_{1 ≤ i ≤ N} γ_t(i)
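The local state estimate combines the forward and backward passes: γ_t(i) = α_t(i)β_t(i) / Pr(O | λ), then pick the most probable state per frame. A minimal NumPy sketch, assuming a discrete-emission HMM (function name and layout are illustrative, not from the slides):

```python
import numpy as np

def forward_backward_gamma(A, B, pi, obs):
    """Forward-backward pass for a discrete-emission HMM.

    A  : (N, N) transitions, A[i, j] = Pr(S_j at t+1 | S_i at t)
    B  : (N, M) emissions, B[j, k] = Pr(v_k | S_j)
    pi : (N,) initial state probabilities
    obs: sequence of observation symbol indices
    """
    N, T = A.shape[0], len(obs)
    alpha = np.zeros((T, N))
    beta = np.zeros((T, N))

    # Forward recursion: alpha_t(j) = [sum_i alpha_{t-1}(i) a_ij] b_j(O_t)
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

    # Backward recursion: beta_t(i) = sum_j a_ij b_j(O_{t+1}) beta_{t+1}(j)
    beta[T - 1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

    likelihood = alpha[T - 1].sum()       # Pr(O | lambda)
    gamma = alpha * beta / likelihood     # gamma_t(i) = Pr(q_t = S_i | O, lambda)
    return gamma, likelihood
```

The local estimate is then `gamma.argmax(axis=1)`; note this maximizes per-frame posteriors and need not yield a valid path, which is why the Viterbi algorithm is introduced next.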
HMM Topologies: Ergodic, Left-Right Serial, Left-Right Parallel
HMM: State Estimation, Viterbi Algorithm
Viterbi Score
P* = Pr(O | Q*, λ)
Example of State Estimation via Viterbi Algorithm
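The slide's worked example is not recoverable here, but the Viterbi recursion itself can be sketched. A log-domain NumPy version, assuming a discrete-emission HMM with no zero-probability entries (names are illustrative):

```python
import numpy as np

def viterbi(A, B, pi, obs):
    """Viterbi decoding: returns Q* and the score Pr(O, Q* | lambda)."""
    N, T = A.shape[0], len(obs)
    logA, logB, logpi = np.log(A), np.log(B), np.log(pi)

    delta = np.zeros((T, N))      # delta_t(j): best log-score ending in S_j at t
    psi = np.zeros((T, N), int)   # backpointers

    delta[0] = logpi + logB[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + logA   # entry (i, j): come from S_i to S_j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + logB[:, obs[t]]

    # Backtrack from the best final state
    q = np.zeros(T, int)
    q[T - 1] = delta[T - 1].argmax()
    for t in range(T - 2, -1, -1):
        q[t] = psi[t + 1, q[t + 1]]
    return q, np.exp(delta[T - 1].max())
```

Working in the log domain avoids the numerical underflow that the product of many small probabilities would otherwise cause for long observation sequences.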
Probability Functions for HMM Parameter Estimation - I
Probability Functions for HMM Parameter Estimation - II
Reestimation of HMM Parameters
HMM Continuous Densities
HMM Parameter Estimation for Continuous Densities
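For continuous densities, each state's emission probability is a Gaussian mixture, b_j(o) = Σ_m c_jm N(o; μ_jm, Σ_jm). A sketch of evaluating one such density, assuming diagonal covariances (all names and shapes are illustrative, not the lecture's notation):

```python
import numpy as np

def gmm_emission(o, c, mu, var):
    """Emission likelihood b_j(o) for a diagonal-covariance Gaussian mixture.

    o  : (D,) observation vector
    c  : (M,) mixture weights for state j (must sum to 1)
    mu : (M, D) component means
    var: (M, D) component variances (diagonal covariances)
    """
    diff = o - mu                                   # (M, D) broadcast
    expo = -0.5 * np.sum(diff**2 / var, axis=1)     # Mahalanobis exponents
    norm = np.prod(2 * np.pi * var, axis=1) ** -0.5 # Gaussian normalizers
    return float(np.sum(c * norm * np.exp(expo)))
```

In EM reestimation, these densities replace the discrete b_j(k) in the forward-backward probabilities, and the mixture weights, means, and variances are updated from the same posterior counts.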
LPC Processor for Speech Recognition
Probability Distributions of Cepstral Coefs of /zero/
Dynamic Time Warping (DTW)
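DTW aligns two sequences of different lengths by dynamic programming over a cumulative-distance grid. A minimal sketch of the classic recursion (the step pattern and local distance are common defaults, assumed here rather than taken from the slides):

```python
import numpy as np

def dtw(x, y, dist=lambda a, b: abs(a - b)):
    """Dynamic time warping distance between sequences x and y."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)   # cumulative distance grid
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(x[i - 1], y[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]
```

For example, `dtw([1, 2, 3], [1, 2, 2, 3])` is 0, since warping absorbs the repeated frame, which is exactly the time-variability problem DTW was introduced to solve in speech recognition.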
HMM: λ = (A, B, π)
- A = State Transition Probability Matrix
- B = Observations Probability Distributions
- π = Initial State Probability
HMM (Hidden Markov Models)
- t = 1, 2, 3, …: Discrete Time
- O = (O_1, O_2, …, O_T): Observation Sequence
- T = Length of Observation Sequence
- N = Number of States
- M = # of Observation Symbols / Mixtures
- States: S_1, S_2, …, S_N
- A = [a_ij], a_ij = Pr(S_j at t+1 | S_i at t)
- B = {b_j(k)}, b_j(k) = Pr(v_k at t | S_j at t)
- π = {π_i}, π_i = Pr(q_1 = S_i)
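The definitions above can be written out concretely. A toy 3-state model with a left-right (serial) topology; the numbers are illustrative only, not from the course:

```python
import numpy as np

# Toy discrete HMM lambda = (A, B, pi) with N = 3 states, M = 2 symbols.
A = np.array([[0.6, 0.3, 0.1],    # a_ij = Pr(S_j at t+1 | S_i at t)
              [0.0, 0.7, 0.3],
              [0.0, 0.0, 1.0]])   # left-right serial: no backward transitions
B = np.array([[0.8, 0.2],         # b_j(k) = Pr(v_k at t | S_j at t)
              [0.3, 0.7],
              [0.5, 0.5]])
pi = np.array([1.0, 0.0, 0.0])    # pi_i = Pr(q_1 = S_i): always start in S_1

# Stochastic constraints: every row of A and B, and pi itself, sums to 1.
assert np.allclose(A.sum(axis=1), 1.0)
assert np.allclose(B.sum(axis=1), 1.0)
assert np.isclose(pi.sum(), 1.0)
```

An ergodic topology would instead make every entry of A strictly positive, and a left-right parallel topology would allow several disjoint forward paths.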
Problems to Be Solved in HMM
- Problem 1: Classification – Scoring (Forward-Backward Algorithm)
  Given an observed sequence O = (O_1, O_2, …, O_T) and a model λ = (π, A, B), compute the likelihood Pr(O | λ)
- Problem 2: State Estimation (Viterbi Algorithm)
  Given an observed sequence O = (O_1, O_2, …, O_T), estimate an optimum state sequence Q* = (q_1, q_2, …, q_T) and compute the score Pr(O, Q* | λ)
- Problem 3: Training (EM Algorithm)
  Given an observed sequence O = (O_1, O_2, …, O_T), adjust the model parameters λ = (π, A, B) to maximize the likelihood Pr(O | λ)