

  1. Induction of Treebank-Aligned Lexical Resources
     LREC 2008
     Tejaswini Deoskar, Mats Rooth
     Department of Linguistics, Cornell University

  2. Overview
     • Goal: induction of probabilistic treebank-aligned lexical resources.
     • Treebank-aligned lexicon: a systematic correspondence between features of a probabilistic lexicon and structural annotation in a treebank.
     • Features:
       ♦ complex subcategorization frames for verbs or nouns
       ♦ attachment preferences of adverbs

  3. Overview
     • Treebank PCFG and lexicon.
       ♦ Unlexicalised treebank PCFG: clear division between grammar and lexicon.
       ♦ Good performance (Klein and Manning, 2003).
     • Large-scale lexicon: unsupervised acquisition from unlabeled data.

  4. Why another Treebank PCFG?
     • PCFGs built from treebanks are reduced representations.
       ♦ Example: "Exports which played a key role in fueling growth over the last two years seem to have stalled."
     • More expressive formalisms can represent these phenomena (LFG, HPSG, TAG, CCG, Minimalist grammars).
     • A sophisticated PCFG that captures the same phenomena as more expressive formalisms:
       ♦ linguistic-theory neutral;
       ♦ focus on commonly observed phenomena.

  5. Treebank Transformation Framework
     • Treebank transformation: Johnson (1999), Klein and Manning (2003), etc.
     • Training of a PCFG on the transformed treebank.
     • Methodology for transformation based on the addition of linguistically motivated features and feature-constraint solving.
     • Database of Penn Treebank trees annotated with linguistic features, as a resource.
     • Components usable for transforming existing PTB-style treebanks and building accurate PCFGs from them.

  6. Feature Constraint Framework
     • A bare-bones CFG is extracted from the Penn Treebank.
     • A feature-constraint grammar is built by adding constraints to the CF rules (YAP; Schmid, 2000).
     • Each treebank tree is converted into a trivial context-free shared forest.
     • The constraints in the shared forest are solved by the YAP constraint solver.

  7. Adding Constraints
     Features on auxiliary verbs: the bare CFG rule is annotated with successively richer feature constraints.

     VP → VB ADVP VP

     VP { Vform = base; } → VB { Val = aux; } ADVP { } VP { }

     VP { Vform = base; Slash = sl; } → VB { Val = aux; Vsel = vf; } ADVP { } VP { Slash = sl; Vform = vf; }

     VP { Vform = base; Slash = sl; } → VB { Val = aux; Vsel = vf; Prep = -; Prtcl = -; Sbj = -; } ADVP { } VP { Slash = sl; Vform = vf; }

     (A toy sketch of how the shared variables propagate follows below.)
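
To make the shared variables concrete, here is a minimal sketch in plain Python. It is not the YAP constraint language or solver used in this work; the names Var and resolve, the dictionary encoding, and the value "participle" are invented for illustration.

    # Toy illustration: the shared variables sl and vf tie the mother VP, the
    # auxiliary VB, and the embedded VP together, so the auxiliary records
    # which verb form it selects.

    class Var:
        """A feature variable shared between categories of one rule."""
        def __init__(self, name):
            self.name = name

    def resolve(feats, bindings):
        """Replace feature variables with their values under the given bindings."""
        return {attr: (bindings.get(val.name, val) if isinstance(val, Var) else val)
                for attr, val in feats.items()}

    # VP{Vform=base; Slash=sl} ->
    #   VB{Val=aux; Vsel=vf; Prep=-; Prtcl=-; Sbj=-}  ADVP{}  VP{Slash=sl; Vform=vf}
    sl, vf = Var("sl"), Var("vf")
    mother = {"Vform": "base", "Slash": sl}
    vb     = {"Val": "aux", "Vsel": vf, "Prep": "-", "Prtcl": "-", "Sbj": "-"}
    emb_vp = {"Slash": sl, "Vform": vf}

    # Once the embedded VP is analysed (invented values: participial form, no gap),
    # the same bindings instantiate the auxiliary's Vsel and the mother's Slash.
    bindings = {"vf": "participle", "sl": "-"}
    print(resolve(vb, bindings))      # Vsel becomes 'participle'
    print(resolve(mother, bindings))  # Slash becomes '-'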

  8. Relative Clause
     "... that has been seen."

  9. Verbal Subcategorization Features

     VP → VBD +EI-NP+ S

     VP { Vform = ns; } → VBD { Val = ns; } +EI-NP+ S { }

     VP { Vform = ns; } → VBD { Val = ns; Sbj = x; Vsel = vf; } +EI-NP+ S { Sbj = x; Vform = vf; }

     VP { Vform = ns; Slash = sl; } → VBD { Val = ns; Sbj = x; Vsel = vf; Prep = -; Prtcl = -; } +EI-NP+ S { Sbj = x; Vform = vf; Slash = sl; }

  10. Verbal Subcategorization
      Structural information is projected onto lexical items: verbs, adverbs, nouns. (A sketch of this projection is given below.)
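
As an illustration of this projection, the following sketch reads the sisters of a verb off a VP subtree and records them as a subcategorization frame. The nested-tuple tree format and the helper subcat_frame are hypothetical, not the transformation code used in the paper.

    # Hypothetical sketch: project structural context onto the verb by recording
    # the categories of its sisters inside the VP as a subcategorization frame.
    from collections import Counter

    def subcat_frame(vp):
        """Return (verb, tag, frame) for a VP subtree given as (label, children)."""
        label, children = vp
        head = next(child for child in children if child[0].startswith("VB"))
        frame = tuple(child[0] for child in children if child is not head)
        verb = head[1][0]          # the terminal word under the VB* node
        return verb, head[0], frame

    # A simplified subtree for "tried to ...": (VP (VBD tried) (S ...))
    vp = ("VP", [("VBD", ["tried"]),
                 ("S", [("VP", [("TO", ["to"]), ("VP", ["..."])])])])

    lexicon = Counter()
    lexicon[subcat_frame(vp)] += 1
    print(lexicon)                 # Counter({('tried', 'VBD', ('S',)): 1})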

  11. A Feature-Structure Treebank Tree
      Example: "The product-design project he heads is scrapped."

  12. Treebank PCFG
      • Frequencies are collected from the feature-annotated treebank database.
      • The result is a rule frequency table and a frequency lexicon that can be used by a probabilistic parser.

  13. Treebank grammar and lexicon
      Rule frequencies:

      ROOT → S.fin.-.-.root                               29092.0
      S.fin.-.-.- → NP-SBJ.nvd.base.-.-.- VP.fin.-.-      14134.0
      NP-SBJ.nvd.base.-.-.- → PRP                         13057.0
      PP.nvd.of.np → IN.of NP.nvd.base.-.-.-.-            13050.0

      Lexical frequencies:

      tried         VBD.s.e.to.- 32.0   VBN.s.e.to.- 11.0   VBN.n.-.-.- 5.0
                    VBD.z.-.-.- 1.0     VBD.n.-.-.- 1.0     VBD.s.e.g.- 1.0   VBN.z.-.-.- 1.0
      admired       VBD.n.-.- 1.0
      admit         VB.z.-.- 1.0        VB.n.-.- 1.0        VB.b.-.- 3.0
      admonishing   VBP.z.-.- 1.0       VBP.p.-.- 1.0       VBP.b.-.- 2.0     VBG.s.-.to 1.0
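
A minimal sketch of how such counts become a probabilistic grammar, assuming plain relative-frequency estimation conditioned on the left-hand-side symbol. The rule tuples below are simplified stand-ins for the feature-bearing symbols above, not the paper's actual data format.

    # Sketch: relative-frequency estimation, P(A -> beta) = count(A -> beta) / count(A).
    # Lexical entries (word and tag.frame) can be estimated the same way.
    from collections import defaultdict

    rule_counts = {
        ("ROOT", ("S.fin.-.-.root",)): 29092.0,
        ("S.fin.-.-.-", ("NP-SBJ.nvd.base.-.-.-", "VP.fin.-.-")): 14134.0,
        ("NP-SBJ.nvd.base.-.-.-", ("PRP",)): 13057.0,
        ("PP.nvd.of.np", ("IN.of", "NP.nvd.base.-.-.-.-")): 13050.0,
    }

    lhs_totals = defaultdict(float)
    for (lhs, rhs), count in rule_counts.items():
        lhs_totals[lhs] += count

    for (lhs, rhs), count in rule_counts.items():
        prob = count / lhs_totals[lhs]      # 1.0 here, since each LHS occurs once
        print(f"{prob:.3f}  {lhs} -> {' '.join(rhs)}")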

  14. Treebank PCFG
      • PCFG of variable granularity, based on attributes incorporated into the PCFG symbols.

      PTB Sec 23           No Prepositions   Prep. on verbs   Prep. on nouns
      Labeled Recall       86.5              86.11            85.98
      Labeled Precision    86.7              86.50            86.3
      Labeled F-score      86.6              86.31            86.14

      Number of features on all categories: 19 (some structural features, mostly linguistic features).

  15. Scarcity of Lexical Data
      In the training sections of the Penn Treebank (∼45,000 sentences):
      • Total verb types: ∼7,450; verb tokens: ∼125,000.
      • ∼2,830 verb types occur with frequency 1: 38% of all types, 2.37% of all tokens.

      admired       VBD.n.-.- 1.0
      admit         VB.z.-.- 1.0        VB.n.-.- 1.0      VB.b.-.- 3.0
      admonishing   VBP.z.-.- 1.0       VBP.p.-.- 1.0     VBP.b.-.- 2.0    VBG.s.-.to 1.0
      adopted       VBN.aux.e.fin 2.0   VBD.n.-.- 15.0    VBD.np.-.- 1.0   VBN.n.-.- 16.0

  16. Unsupervised Estimation
      • Inside-outside estimation over an unlabeled corpus (the standard re-estimation step is sketched below).
      • The treebank PCFG is the starting model.
      • Focus on learning lexical parameters.
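
For reference, the standard EM re-estimation step that inside-outside training performs, stated in outline (it is not spelled out on the slides), is:

    P_{t+1}(A \to \beta) \;=\; \frac{\sum_{s} E_{P_t}\!\left[ \#(A \to \beta) \mid s \right]}{\sum_{s} E_{P_t}\!\left[ \#(A) \mid s \right]}

where s ranges over the sentences of the unlabeled corpus, the expected rule counts are computed from inside and outside probabilities of each parse forest, and the treebank PCFG supplies the initial model P_0. With the focus on lexical parameters, it is mainly the lexical entries whose probabilities are re-estimated in this way.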
