Lecture 17: Expressive Grammars (Julia Hockenmaier, CS498JH: Introduction to NLP, Fall 2012)


  1. CS498JH: Introduction to NLP (Fall 2012), http://cs.illinois.edu/class/cs498jh. Lecture 17: Expressive grammars. Julia Hockenmaier, juliahmr@illinois.edu, 3324 Siebel Center. Office Hours: Wednesday, 12:15-1:15pm.

  2. Why grammar? A grammar mediates between a surface string ("Mary saw John") and its meaning representation: parsing maps the string to a meaning, generation maps a meaning back to a string. Meaning representations include the logical form saw(Mary, John), the predicate-argument structure (PRED saw, AGENT Mary, PATIENT John), and the dependency graph with saw heading Mary and John.

  3. Grammar formalisms. Formalisms provide a language in which linguistic theories can be expressed and implemented. Formalisms define elementary objects (trees, strings, feature structures) and recursive operations which generate complex objects from simple objects. Formalisms may impose constraints (e.g. on the kinds of dependencies they can capture).

  4. How do grammar formalisms differ? Formalisms define different representations:
- Tree-Adjoining Grammar (TAG): fragments of phrase-structure trees.
- Lexical-Functional Grammar (LFG): annotated phrase-structure trees (c-structure) linked to feature structures (f-structure).
- Combinatory Categorial Grammar (CCG): syntactic categories paired with meaning representations.
- Head-Driven Phrase Structure Grammar (HPSG): complex feature structures (attribute-value matrices).

  5. The dependencies so far:
- Arguments: verbs take arguments (subject, object, complements, ...); heads subcategorize for their arguments.
- Adjuncts/Modifiers: adjectives modify nouns, adverbs modify VPs or adjectives, PPs modify NPs or VPs; modifiers subcategorize for the head.
Typically, these are local dependencies: they can be expressed within individual CFG rules, e.g. VP → Adv Verb NP.
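Local dependencies of this kind can be stated directly as CFG rules. A minimal sketch, assuming NLTK is installed (the toy grammar and lexical items are my own choices; the VP rule mirrors the one above):

```python
import nltk

# Subcategorization is stated locally, inside individual rules:
# the verb's two arguments and the adverb modifier appear in one rule.
grammar = nltk.CFG.fromstring("""
    S   -> NP VP
    VP  -> Adv V NP
    NP  -> 'Mary' | 'John'
    Adv -> 'quickly'
    V   -> 'saw'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("Mary quickly saw John".split()):
    tree.pretty_print()
```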

  6. Context-free grammars. CFGs capture only nested dependencies: the dependency graph is a tree, and the dependencies do not cross.

  7. Beyond CFGs: Nonprojective dependencies. The dependencies form a tree with crossing branches. They arise in the following constructions:
- (Non-local) scrambling (in free word order languages): Die Pizza hat Klaus versprochen zu bringen ('Klaus promised to bring the pizza')
- Extraposition: The guy is coming who is wearing a hat
- Topicalization: Cheeseburgers, I thought he likes
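Projectivity can be checked directly: a dependency tree is projective iff no two of its arcs cross. A minimal self-contained sketch (my own helper, not from the lecture):

```python
def is_projective(heads):
    """heads[d-1] is the head of token d (tokens are 1-based, head 0 is the root)."""
    arcs = [(min(h, d), max(h, d)) for d, h in enumerate(heads, start=1)]
    for l1, r1 in arcs:
        for l2, r2 in arcs:
            # Two arcs cross iff one starts strictly inside the other
            # arc's span and ends strictly outside it.
            if l1 < l2 < r1 < r2:
                return False
    return True

# Token 3 depends on token 1 and token 2 depends on token 4:
# arcs (1,3) and (2,4) cross, so this tree is nonprojective.
print(is_projective([0, 4, 1, 2]))  # -> False
print(is_projective([2, 0, 2]))     # -> True (a simple projective tree)
```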

  8. Beyond CFGs: Nonlocal dependencies. The dependencies form a DAG (a node may have multiple incoming edges). They arise in the following constructions:
- Control (He has promised me to go), raising (He seems to go)
- Wh-movement (the man who you saw yesterday is here again)
- Non-constituent coordination (right-node raising, gapping, argument-cluster coordination)

  9. Non-local dependencies. [Figure: example sentences with their non-local dependency arcs.]

  10. Long-range dependencies. Bounded long-range dependencies: limited distance between the head and the argument. Unbounded long-range dependencies: arbitrary distance (within the same sentence) between the head and the argument. Unbounded long-range dependencies cannot (in general) be represented with CFGs. Chomsky's solution: add null elements (and coindexation).

  11. Unbounded nonlocal dependencies. Wh-questions and relative clauses contain unbounded nonlocal dependencies, where the missing NP may be arbitrarily deeply embedded:
'the sushi that [you told me [John saw [Mary eat]]]'
'what [did you tell me [John saw [Mary eat]]]?'
Linguists call this phenomenon wh-extraction (wh-movement).

  12. Non-local dependencies in wh-extraction. The phrase-structure tree for 'the sushi that you told me John saw Mary eat':
[NP [NP the sushi] [SBAR [IN that] [S [NP you] [VP [V told] [NP me] [S [NP John] [VP [V saw] [S [NP Mary] [VP [V eat]]]]]]]]]
The object of 'eat' is missing from the tree; it is understood to be 'the sushi'.

  13. The trace analysis of wh-extraction. The same tree with a null element filling the gap:
[NP [NP the sushi] [SBAR [IN that] [S [NP you] [VP [V told] [NP me] [S [NP John] [VP [V saw] [S [NP Mary] [VP [V eat] [NP *T*]]]]]]]]]
The trace *T* occupies the object position of 'eat' and is coindexed with 'the sushi'.

  14. Slash categories for wh-extraction. Because only one element can be extracted, we can use slash categories. This is still a CFG: the set of nonterminals is finite (Generalized Phrase Structure Grammar (GPSG), Gazdar et al. 1985):
[NP [NP the sushi] [SBAR [IN that] [S/NP [NP you] [VP/NP [V told] [NP me] [S/NP [NP John] [VP/NP [V saw] [S/NP [NP Mary] [VP/NP [V eat]]]]]]]]]
Each category of the form X/NP records that an NP is missing somewhere inside X.
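The slashed nonterminals can be generated mechanically from a base grammar. A minimal sketch (my own toy construction in the spirit of GPSG, not Gazdar et al.'s formulation): each base rule spawns slashed variants that pass a single NP gap down to exactly one daughter, and an empty NP/NP rule discharges the gap, so the nonterminal set stays finite.

```python
base_rules = [
    ("S",  ["NP", "VP"]),
    ("VP", ["V", "NP"]),
    ("VP", ["V", "NP", "S"]),   # e.g. 'told me [S ...]'
]
nonterminals = {"S", "NP", "VP"}

def add_slash_rules(rules, gap="NP"):
    slashed = []
    for lhs, rhs in rules:
        for i, sym in enumerate(rhs):
            if sym in nonterminals:
                # Pass the gap down to daughter i only.
                new_rhs = rhs[:i] + [sym + "/" + gap] + rhs[i + 1:]
                slashed.append((lhs + "/" + gap, new_rhs))
    slashed.append((gap + "/" + gap, []))   # the gap itself is empty
    return rules + slashed

for lhs, rhs in add_slash_rules(base_rules):
    print(lhs, "->", " ".join(rhs) if rhs else "ε")
```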

  15. German: center embedding
...daß ich [Hans schwimmen] sah
...that I Hans swim saw
'...that I saw [Hans swim]'
...daß ich [Maria [Hans schwimmen] helfen] sah
...that I Maria Hans swim help saw
'...that I saw [Mary help [Hans swim]]'
...daß ich [Anna [Maria [Hans schwimmen] helfen] lassen] sah
...that I Anna Maria Hans swim help let saw
'...that I saw [Anna let [Mary help [Hans swim]]]'

  16. Dutch: cross-serial dependencies
...dat ik Hans zag zwemmen
...that I Hans saw swim
'...that I saw [Hans swim]'
...dat ik Maria Hans zag helpen zwemmen
...that I Maria Hans saw help swim
'...that I saw [Mary help [Hans swim]]'
...dat ik Anna Maria Hans zag laten helpen zwemmen
...that I Anna Maria Hans saw let help swim
'...that I saw [Anna let [Mary help [Hans swim]]]'
Such cross-serial dependencies require mildly context-sensitive grammars.

  17. Two mildly context-sensitive formalisms: TAG and CCG.

  18. The Chomsky Hierarchy: Recursively enumerable ⊃ Context-sensitive ⊃ Mildly context-sensitive ⊃ Context-free ⊃ Regular.

  19. Mildly context-sensitive grammars
- Contain all context-free grammars/languages.
- Can be parsed in polynomial time (TAG/CCG: O(n^6)).
- (Strong generative capacity) Capture certain kinds of dependencies: nested (like CFGs) and cross-serial (like the Dutch example), but not the MIX language. MIX: the set of strings w ∈ {a, b, c}* that contain equal numbers of a's, b's and c's.
- Have the constant growth property: the length of strings grows in a linear way. The power-of-2 language {a^(2^n)} does not have the constant growth property.
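To make the two example languages concrete, a minimal self-contained sketch (my own illustration) of membership tests for MIX and the power-of-2 language:

```python
from collections import Counter

def in_mix(w):
    """w is in MIX iff it uses only a, b, c and the counts are all equal."""
    c = Counter(w)
    return set(w) <= {"a", "b", "c"} and c["a"] == c["b"] == c["c"]

def in_powers_of_two(w):
    """w is in {a^(2^n)} iff w = a^k with k a positive power of two."""
    k = len(w)
    return set(w) <= {"a"} and k > 0 and (k & (k - 1)) == 0

print(in_mix("abccba"))          # True: two of each symbol
print(in_powers_of_two("a" * 8)) # True; lengths 1, 2, 4, 8, ... grow
                                 # exponentially, violating constant growth
```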

  20. TAG and CCG are lexicalized formalisms. The lexicon:
- pairs words with elementary objects,
- specifies all language-specific information (e.g. subcategorization information).
The grammatical operations:
- are universal,
- define (and impose constraints on) recursion.

  21. Tree-Adjoining Grammar

  22. (Lexicalized) Tree-Adjoining Grammar. TAG is a tree-rewriting formalism: TAG defines operations (substitution, adjunction) on trees. The elementary objects in TAG are trees (not strings). TAG is lexicalized: each elementary tree is anchored to a lexical item (word). "Extended domain of locality": the elementary tree contains all arguments of the anchor. TAG requires a linguistic theory which specifies the shape of these elementary trees. TAG is mildly context-sensitive: it can capture Dutch cross-serial dependencies but is still efficiently parseable. (A.K. Joshi and Y. Schabes (1996). Tree-Adjoining Grammars. In G. Rozenberg and A. Salomaa, eds., Handbook of Formal Languages.)

  23. Extended domain of locality. We want to capture all arguments of a word in a single elementary object. We also want to retain certain syntactic structures (e.g. VPs). Our elementary objects are tree fragments, e.g. for 'eats': [S NP [VP [VBZ eats] NP]]

  24. TAG substitution (arguments). An elementary tree whose root is labelled X can be substituted at a substitution node X↓ on the frontier of another tree. Here, α2 (rooted in X) and α3 (rooted in Y) are substituted at the X↓ and Y↓ nodes of α1, yielding the derived tree. The derivation tree records α1 with daughters α2 and α3.

  25. TAG adjunction. An auxiliary tree β1 has a root labelled X and a distinguished foot node X* on its frontier. Adjunction splices β1 into an internal node labelled X of α1: the subtree rooted at that node is excised, β1 is put in its place, and the excised subtree is reattached at the foot node X*. The derivation tree records β1 adjoined into α1.
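Both operations are easy to state over explicit tree structures. A minimal self-contained sketch (my own encoding, not a standard TAG toolkit): trees are (label, children) pairs, children are subtrees or lexical strings, 'X↓' marks a substitution node and 'X*' a foot node.

```python
import copy

def substitute(tree, site, arg):
    """Plug the tree arg into the leftmost substitution node site+'↓'."""
    done = [False]
    def walk(t):
        label, children = t
        new = []
        for c in children:
            if isinstance(c, tuple) and c[0] == site + "↓" and not done[0]:
                done[0] = True
                new.append(copy.deepcopy(arg))   # fill the argument slot
            elif isinstance(c, tuple):
                new.append(walk(c))
            else:
                new.append(c)                    # lexical anchor, keep as-is
        return (label, new)
    return walk(tree)

def _replace_foot(aux, site, subtree):
    label, children = aux
    if label == site + "*":                      # foot node: reattach here
        return subtree
    return (label, [_replace_foot(c, site, subtree) if isinstance(c, tuple) else c
                    for c in children])

def adjoin(tree, site, aux):
    """Splice the auxiliary tree aux in at a node labelled site."""
    label, children = tree
    if label == site:
        # Excise this subtree and hang it under aux's foot node.
        return _replace_foot(copy.deepcopy(aux), site, (label, children))
    return (label, [adjoin(c, site, aux) if isinstance(c, tuple) else c
                    for c in children])

def leaves(t):
    """The yield (frontier string) of a derived tree."""
    label, children = t
    return [w for c in children
            for w in (leaves(c) if isinstance(c, tuple) else [c])]
```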

  26. The effect of adjunction.
- No adjunction: TSG (tree substitution grammar). TSG is context-free.
- Sister adjunction: TIG (tree insertion grammar). TIG is also context-free, but has a linguistically more adequate treatment of modifiers.
- Wrapping adjunction: TAG (tree-adjoining grammar). TAG is mildly context-sensitive.

  27. A small TAG lexicon:
α1 (eats): [S NP↓ [VP [VBZ eats] NP↓]]
α2 (John): [NP John]
α3 (tapas): [NP tapas]
β1 (always): [VP [RB always] VP*]

  28. A TAG derivation: substitute α2 (John) and α3 (tapas) at the two NP↓ nodes of α1 (eats). Derivation tree so far: α1 with daughters α2 and α3.

  29. A TAG derivation: adjoin β1 (always) at the VP node of the derived tree. Derivation tree: α1 with daughters α2, β1 and α3.

  30. A TAG derivation: the derived tree is [S [NP John] [VP [RB always] [VP [VBZ eats] [NP tapas]]]], yielding 'John always eats tapas'.
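Using the substitute/adjoin sketch above, the derivation on slides 27-30 can be replayed directly (the tree encodings transcribe the lexicon of slide 27):

```python
alpha1 = ("S", [("NP↓", []), ("VP", [("VBZ", ["eats"]), ("NP↓", [])])])
alpha2 = ("NP", ["John"])
alpha3 = ("NP", ["tapas"])
beta1  = ("VP", [("RB", ["always"]), ("VP*", [])])

t = substitute(alpha1, "NP", alpha2)   # subject: John
t = substitute(t, "NP", alpha3)        # object: tapas
t = adjoin(t, "VP", beta1)             # adjoin 'always' at the VP node
print(" ".join(leaves(t)))             # -> John always eats tapas
```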

  31. a^n b^n: cross-serial dependencies. Elementary trees: an initial tree [S a b] and an auxiliary tree [S a S* b]. Deriving aabb: adjoin the auxiliary tree at the S node of the initial tree; each further adjunction wraps another a...b pair around the existing string.
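With the same sketch, repeated adjunction generates a^n b^n (the exact tree shapes here are my reconstruction of the slide):

```python
alpha = ("S", ["a", "b"])              # initial tree: yields ab
beta  = ("S", ["a", ("S*", []), "b"])  # auxiliary tree: wraps a...b

t = adjoin(alpha, "S", beta)
print("".join(leaves(t)))              # -> aabb
t = adjoin(t, "S", beta)
print("".join(leaves(t)))              # -> aaabbb; each adjunction adds one a...b pair
```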

  32. Combinatory Categorial Grammar

  33. CCG: the machinery.
- Categories: specify subcategorization lists of words/constituents.
- Combinatory rules: specify how constituents can combine.
- The lexicon: specifies which categories a word can have.
- Derivations: spell out the process of combining constituents.
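As a preview of how these pieces fit together, a minimal sketch (a toy string encoding of my own; real CCG categories need a proper category parser): a small lexicon plus forward and backward application.

```python
def strip_parens(cat):
    return cat[1:-1] if cat.startswith("(") and cat.endswith(")") else cat

def forward_apply(fn, arg):
    """X/Y applied to Y yields X (toy version: flat string matching)."""
    if fn.endswith("/" + arg):
        return strip_parens(fn[: -len("/" + arg)])
    return None

def backward_apply(arg, fn):
    """Y combined with X\\Y yields X."""
    if fn.endswith("\\" + arg):
        return strip_parens(fn[: -len("\\" + arg)])
    return None

# 'saw' subcategorizes first for an object NP, then for a subject NP.
lexicon = {"Mary": "NP", "John": "NP", "saw": r"(S\NP)/NP"}

vp = forward_apply(lexicon["saw"], lexicon["John"])  # -> S\NP
s  = backward_apply(lexicon["Mary"], vp)             # -> S
print(s)
```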
