Using Dependency Grammar Features in Whole Sentence Maximum Entropy Language Model for Speech Recognition
Teemu Ruokolainen, Tanel Alumäe, Marcus Dobrinkat
October 8th, 2010
◮ Example sentences scored with a trigram LM:
◮ Log probability given by trigram LM = -19.39
◮ Log probability = -21.26
◮ Log probability = -19.92
◮ Log probability = -18.82
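As a rough illustration of where such numbers come from, here is a minimal trigram scoring sketch. The probability table is made up for the example and is not the model behind the slides:

```python
import math

# Toy illustration of how a trigram LM scores a whole sentence:
#   log P(s) = sum_i log P(w_i | w_{i-2}, w_{i-1})
# These probabilities are invented; a real model is trained on a
# large corpus and smoothed.
trigram_prob = {
    ("<s>", "<s>", "i"): 0.05,
    ("<s>", "i", "will"): 0.10,
    ("i", "will", "buy"): 0.02,
    ("will", "buy", "votes"): 0.001,
    ("buy", "votes", "</s>"): 0.20,
}

def trigram_logprob(words, probs, floor=1e-8):
    """Sum the log10-probability of each word given its two predecessors."""
    padded = ["<s>", "<s>"] + [w.lower() for w in words] + ["</s>"]
    logp = 0.0
    for i in range(2, len(padded)):
        p = probs.get(tuple(padded[i - 2:i]) + (padded[i],), floor)
        logp += math.log10(p)
    return logp

print(trigram_logprob("I will buy votes".split(), trigram_prob))
```

The point of the example on the slides is that such local trigram scores can rank a broken word order close to, or above, the correct sentence, which motivates whole-sentence features.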
◮ Dependency Grammar Features
◮ Whole Sentence Maximum Entropy Language Model (WSME)
◮ Experiments in a large-vocabulary speech recognition task
◮ Dependency labels: NEG, OBJ, SUBS, V-CH, DAT
◮ A feature either is or is not present in a sentence (binary)
◮ Example dependency tree for "I will buy votes": "votes" is the OBJ of "buy", with SUBS and V-CH links for "I" and "will"
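A minimal sketch of how a parse like this could be turned into binary features. The (label, head, dependent) triples and the feature-name format are assumptions for illustration, not the authors' exact representation:

```python
# Sketch of turning a dependency parse into binary WSME features.
def dependency_features(parse):
    """Map labeled dependency links to binary (present/absent) features."""
    feats = set()
    for label, head, dep in parse:
        feats.add(f"dep:{label}")               # label occurs somewhere
        feats.add(f"dep:{label}:{head}:{dep}")  # lexicalized link
    return feats

# "I will buy votes": OBJ(buy -> votes), plus SUBS and V-CH links
parse = [("OBJ", "buy", "votes"), ("SUBS", "buy", "I"), ("V-CH", "buy", "will")]
print(sorted(dependency_features(parse)))
```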
◮ For a uniform background model, the Maximum Entropy solution has the form P(s) = exp(Σi λi fi(s)) / Z
◮ The normalization term Z sums over all possible sentences, so it is estimated with Markov Chain Monte Carlo sampling methods
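A minimal sketch tying these two bullets together, assuming binary features and sentences already drawn from the background model; the sampler itself is omitted, and all names and weights are illustrative:

```python
import math

# Sketch of the WSME model with a uniform background:
#   P(s) = exp(sum_i lambda_i * f_i(s)) / Z
# Z cannot be summed over all possible sentences, so it is estimated
# from sampled sentences; `samples` stands in for the sampler output.

def unnormalized_logscore(features, lambdas):
    """log of exp(sum_i lambda_i f_i(s)) for binary features of s."""
    return sum(lambdas.get(f, 0.0) for f in features)

def estimate_log_z(sampled_features, lambdas):
    """Monte Carlo estimate: mean of exp(score) over sampled sentences."""
    vals = [math.exp(unnormalized_logscore(f, lambdas)) for f in sampled_features]
    return math.log(sum(vals) / len(vals))

lambdas = {"dep:OBJ": 0.7, "dep:NEG": -0.3}                  # toy weights
samples = [{"dep:OBJ"}, {"dep:NEG"}, {"dep:OBJ", "dep:V-CH"}]
log_z = estimate_log_z(samples, lambdas)

sent = {"dep:OBJ", "dep:V-CH"}
# log of P(s)/P0(s): how much the WSME features rescore the sentence
# relative to the background model.
print(unnormalized_logscore(sent, lambdas) - log_z)
```

In practice the samples come from an MCMC or importance-sampling procedure over a background model rather than a fixed list, but the normalizer estimate has this Monte Carlo form.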
◮ LM training data: English newswire articles on typical daily news topics: sports, ...
◮ 1M sentences (20M words), a small subset of Gigaword
◮ Evaluation data: dictated English financial newswire articles
◮ 329 sentences (11K words)
◮ Experiments on bigram and trigram features