HMM & DTW - PowerPoint PPT Presentation



SLIDE 1

http://cvsp.cs.ntua.gr/courses/patrec

Pattern Recognition & Speech Recognition

Petros Maragos

HMM

DTW

SLIDE 2
SLIDE 3
SLIDE 4
SLIDE 5
SLIDE 6
SLIDE 7

Forward Algorithm
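
A minimal NumPy sketch of the forward recursion (initialisation, induction, termination); the `forward` name and the 2-state, 2-symbol model numbers are illustrative assumptions, not values from the slides:

```python
import numpy as np

def forward(A, B, pi, O):
    """alpha[t, i] = Pr(O_1 ... O_t, q_t = S_i | lambda)."""
    T, N = len(O), len(pi)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, O[0]]                      # initialisation
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, O[t]]  # induction over time
    return alpha, float(alpha[-1].sum())            # termination: Pr(O | lambda)

# Toy 2-state, 2-symbol model (illustrative numbers, not from the slides)
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.6, 0.4])
alpha, likelihood = forward(A, B, pi, [0, 1, 0])
```

Filling the α trellis left to right makes computing Pr(O | λ) an O(N²T) operation, instead of the O(T · Nᵀ) cost of summing over every state path explicitly.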

SLIDE 8

Backward Algorithm
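
The backward recursion can be sketched the same way: β_T(i) = 1 at initialisation, then induction backwards in time. The model numbers below are an illustrative assumption, not from the slides:

```python
import numpy as np

def backward(A, B, pi, O):
    """beta[t, i] = Pr(O_{t+1} ... O_T | q_t = S_i, lambda)."""
    T, N = len(O), len(pi)
    beta = np.ones((T, N))                               # initialisation: beta_T(i) = 1
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, O[t + 1]] * beta[t + 1])     # induction, backwards in t
    return beta, float((pi * B[:, O[0]] * beta[0]).sum())  # also yields Pr(O | lambda)

# Toy 2-state, 2-symbol model (illustrative numbers, not from the slides)
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.6, 0.4])
beta, likelihood = backward(A, B, pi, [0, 1, 0])
```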

SLIDE 9

Probability Functions for Local State Estimation

q̂_t = argmax_{1 ≤ i ≤ N} γ_t(i)
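
Combining a forward and a backward pass gives γ_t(i) = Pr(q_t = S_i | O, λ) = α_t(i) β_t(i) / Pr(O | λ), and the locally most likely state at each t is its argmax. A sketch under the same toy-model assumptions (the model numbers are illustrative, not from the slides):

```python
import numpy as np

def local_state_estimates(A, B, pi, O):
    """gamma[t, i] = alpha_t(i) * beta_t(i) / Pr(O | lambda);
    returns gamma and the locally optimal state argmax_i gamma_t(i) per t."""
    T, N = len(O), len(pi)
    alpha = np.zeros((T, N))
    beta = np.ones((T, N))
    alpha[0] = pi * B[:, O[0]]
    for t in range(1, T):                               # forward pass
        alpha[t] = (alpha[t - 1] @ A) * B[:, O[t]]
    for t in range(T - 2, -1, -1):                      # backward pass
        beta[t] = A @ (B[:, O[t + 1]] * beta[t + 1])
    gamma = alpha * beta / alpha[-1].sum()
    return gamma, gamma.argmax(axis=1)

# Toy 2-state, 2-symbol model (illustrative numbers, not from the slides)
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.6, 0.4])
gamma, q_hat = local_state_estimates(A, B, pi, [0, 0, 1, 1])
```

Note that maximising each γ_t(i) independently ignores transition constraints; the Viterbi algorithm on the next slides optimises the state sequence jointly.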

SLIDE 10

Ergodic, Left-Right Serial, Left-Right Parallel

SLIDE 11

HMM: State Estimation, Viterbi Algorithm

Viterbi Score

P* = Pr(O, Q* | λ)
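
A sketch of the Viterbi recursion with backtracking, in Rabiner's δ/ψ notation; the `viterbi` name and model numbers are illustrative assumptions, not from the slides:

```python
import numpy as np

def viterbi(A, B, pi, O):
    """delta[t, j] = max over state paths of Pr(q_1 .. q_t = S_j, O_1 .. O_t | lambda);
    psi stores argmax predecessors so Q* can be recovered by backtracking."""
    T, N = len(O), len(pi)
    delta = np.zeros((T, N))
    psi = np.zeros((T, N), dtype=int)
    delta[0] = pi * B[:, O[0]]
    for t in range(1, T):
        trans = delta[t - 1][:, None] * A        # (i, j): best score ending in i, then i -> j
        psi[t] = trans.argmax(axis=0)
        delta[t] = trans.max(axis=0) * B[:, O[t]]
    path = [int(delta[-1].argmax())]             # backtrack from the best final state
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1], float(delta[-1].max())    # Q* and the Viterbi score P*

# Toy 2-state, 2-symbol model (illustrative numbers, not from the slides)
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.6, 0.4])
path, score = viterbi(A, B, pi, [0, 1, 0])
```

The recursion is the forward algorithm with the sum over predecessor states replaced by a max, plus bookkeeping for the argmax.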

SLIDE 12

Example of State Estimation via Viterbi Algorithm

SLIDE 13

Probability Functions for HMM Parameter Estimation - I

SLIDE 14

Probability Functions for HMM Parameter Estimation - II

SLIDE 15

Reestimation of HMM Parameters
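
One Baum-Welch (EM) reestimation step can be sketched as below: γ and ξ are computed from a forward and a backward pass, then the standard update formulas produce (π′, A′, B′). The `reestimate` name, model numbers, and observation sequence are illustrative assumptions:

```python
import numpy as np

def reestimate(A, B, pi, O):
    """One Baum-Welch reestimation step for a discrete-observation HMM, using
    gamma[t, i] = Pr(q_t = S_i | O, lambda) and
    xi[t, i, j] = Pr(q_t = S_i, q_{t+1} = S_j | O, lambda)."""
    T, N = len(O), len(pi)
    M = B.shape[1]
    alpha = np.zeros((T, N))
    beta = np.ones((T, N))
    alpha[0] = pi * B[:, O[0]]
    for t in range(1, T):                                 # forward pass
        alpha[t] = (alpha[t - 1] @ A) * B[:, O[t]]
    for t in range(T - 2, -1, -1):                        # backward pass
        beta[t] = A @ (B[:, O[t + 1]] * beta[t + 1])
    P = alpha[-1].sum()                                   # Pr(O | lambda)
    gamma = alpha * beta / P
    xi = np.array([alpha[t][:, None] * A * B[:, O[t + 1]] * beta[t + 1] / P
                   for t in range(T - 1)])
    pi_new = gamma[0]                                     # expected state at t = 1
    A_new = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    B_new = np.zeros((N, M))
    obs = np.array(O)
    for k in range(M):                                    # expected emissions of symbol v_k
        B_new[:, k] = gamma[obs == k].sum(axis=0) / gamma.sum(axis=0)
    return A_new, B_new, pi_new

# Toy 2-state, 2-symbol model (illustrative numbers, not from the slides)
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.6, 0.4])
O = [0, 1, 0, 0, 1]
A_new, B_new, pi_new = reestimate(A, B, pi, O)
```

Iterating this step never decreases Pr(O | λ), which is the EM guarantee behind Problem 3 (training).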

SLIDE 16

HMM Continuous Densities
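
With continuous densities, the discrete matrix B is replaced by per-state density functions b_j(o). The sketch below assumes a single Gaussian per state for simplicity (the slides' M counts mixture components, so the general case is a weighted sum of such Gaussians); all numbers are illustrative:

```python
import numpy as np

def gaussian_forward(A, means, variances, pi, O):
    """Forward algorithm where b_j(o) = N(o; mu_j, sigma_j^2) replaces B[j, k]."""
    O = np.asarray(O, dtype=float)
    T, N = len(O), len(pi)
    # b[t, j]: Gaussian density of observation O_t under state S_j
    b = (np.exp(-(O[:, None] - means) ** 2 / (2 * variances))
         / np.sqrt(2 * np.pi * variances))
    alpha = np.zeros((T, N))
    alpha[0] = pi * b[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * b[t]
    return float(alpha[-1].sum())                 # Pr(O | lambda), now a density value

# Illustrative 2-state model: state 0 emits near 0.0, state 1 near 2.0
A = np.array([[0.7, 0.3], [0.4, 0.6]])
pi = np.array([0.6, 0.4])
means = np.array([0.0, 2.0])
variances = np.array([0.5, 0.5])
likelihood = gaussian_forward(A, means, variances, pi, [0.1, 1.9, 0.2])
```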

SLIDE 17

HMM Parameter Estimation for Continuous Densities

SLIDE 18

LPC Processor for Speech Recognition

SLIDE 19
SLIDE 20
SLIDE 21

Probability Distributions of Cepstral Coefficients of /zero/

SLIDE 22

Dynamic Time Warping (DTW)
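
A minimal DTW sketch over 1-D sequences, using the common (i-1, j), (i, j-1), (i-1, j-1) step pattern and an absolute-difference local cost; these choices are assumptions, as the slides may use other local constraints or distances:

```python
import numpy as np

def dtw(x, y):
    """Dynamic Time Warping: D[i, j] is the minimum cumulative distance
    aligning x[:i] with y[:j]; D[n, m] is the total warping cost."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])     # local distance d(x_i, y_j)
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])
```

Time-stretched copies of a sequence align at zero cost, e.g. `dtw([1, 2, 3], [1, 2, 2, 3])` returns `0.0`; this invariance to local speaking-rate variation is what motivates DTW for template-based speech recognition.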

SLIDE 23
SLIDE 24
SLIDE 25
SLIDE 26
SLIDE 27
SLIDE 28

HMM: λ = (A, B, π)

  • A = [a_ij]: State Transition Probability Matrix, a_ij = Pr(S_j at t+1 | S_i at t)
  • B = {b_j(k)}: Observation Probability Distributions, b_j(k) = Pr(v_k at t | S_j at t)
  • π = {π_i}: Initial State Probabilities, π_i = Pr(q_1 = S_i)

HMM (Hidden Markov Models)

  • t = 1, 2, 3, …: Discrete Time
  • O = (O_1, O_2, …, O_T): Observation Sequence
  • T = Length of Observation Sequence
  • N = Number of States
  • M = # of Observation Symbols / Mixtures
  • States: S_1, S_2, …, S_N
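
The triple λ = (A, B, π) maps naturally onto a small container with stochasticity checks; a sketch in which the class name, shapes, and model numbers are my assumptions:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class HMM:
    """Container for the model lambda = (A, B, pi)."""
    A: np.ndarray    # (N, N): a_ij = Pr(S_j at t+1 | S_i at t)
    B: np.ndarray    # (N, M): b_j(k) = Pr(v_k at t | S_j at t)
    pi: np.ndarray   # (N,):   pi_i = Pr(q_1 = S_i)

    def validate(self) -> None:
        # every row of A, every row of B, and pi must be a probability distribution
        assert np.allclose(self.A.sum(axis=1), 1.0)
        assert np.allclose(self.B.sum(axis=1), 1.0)
        assert np.isclose(self.pi.sum(), 1.0)

# Toy N = 2, M = 2 model (illustrative numbers, not from the slides)
model = HMM(A=np.array([[0.7, 0.3], [0.4, 0.6]]),
            B=np.array([[0.9, 0.1], [0.2, 0.8]]),
            pi=np.array([0.6, 0.4]))
model.validate()
```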

SLIDE 29

Problems to Be Solved in HMM

  • Problem 1: Classification – Scoring (Forward-Backward Algorithm)

Given an observed sequence O = (O_1, O_2, …, O_T) and a model λ = (π, A, B), compute the likelihood Pr(O | λ)

  • Problem 2: State Estimation (Viterbi Algorithm)

Given an observed sequence O = (O_1, O_2, …, O_T), estimate an optimum state sequence Q* = (q_1*, q_2*, …, q_T*) and compute the score Pr(O, Q* | λ)

  • Problem 3: Training (EM Algorithm)

Given an observed sequence O = (O_1, O_2, …, O_T), adjust the model parameters λ = (π, A, B) to maximize the likelihood Pr(O | λ)