ANLP: Lecture 13
Shay Cohen
14 October 2019

Last class: recursive descent parsing and CYK; Chomsky normal form grammars.
This class: English syntax; agreement phenomena and the way to model them with CFGs.


Recap: Syntax

Two reasons to care about syntactic structure (parse tree):
◮ As a guide to the semantic interpretation of the sentence
◮ As a way to prove whether a sentence is grammatical or not

But having a grammar isn't enough. We also need a parsing algorithm to compute the parse tree for a given input string and grammar.

Parsing algorithms

Goal: compute the structure(s) for an input string given a grammar.
◮ As usual, ambiguity is a huge problem.
◮ For correctness: need to find the right structure to get the right meaning.
◮ For efficiency: searching all possible structures can be very slow; we want to use parsing for large-scale language tasks (e.g., parsing was used to create Google's "infoboxes").

Global and local ambiguity

◮ We've already seen examples of global ambiguity: multiple analyses for a full sentence, like "I saw the man with the telescope".
◮ But local ambiguity is also a big problem: multiple analyses for parts of a sentence.
  ◮ "the dog bit the child": the first three words could form an NP (but don't).
  ◮ Building useless partial structures wastes time.
  ◮ Avoiding useless computation is a major issue in parsing.
◮ Syntactic ambiguity is rampant; humans usually don't even notice, because we are good at using context/semantics to disambiguate.

Parser properties

All parsers have two fundamental properties:
◮ Directionality: the sequence in which the structures are constructed.
  ◮ top-down: start with the root category (S), choose expansions, build down to words.
  ◮ bottom-up: build subtrees over words, build up to S.
  ◮ Mixed strategies are also possible (e.g., left-corner parsers).
◮ Search strategy: the order in which the search space of possible analyses is explored.

Example: search space for top-down parser

◮ Start with the S node.
◮ Choose one of many possible expansions.
◮ Each of which has children with many possible expansions...
◮ etc.

[Figure: the top of the search space, with the root S branching into alternative expansions such as S → aux NP VP, S → NP VP, S → NP, each branching further.]

Search strategies

◮ depth-first search: explore one branch of the search space at a time, as far as possible. If this branch is a dead end, the parser needs to backtrack.
◮ breadth-first search: expand all possible branches in parallel (or simulated parallel). Requires storing many incomplete parses in memory at once.
◮ best-first search: score each partial parse and pursue the highest-scoring options first. (We will get back to this when discussing statistical parsing.)
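The slides give no code, but the contrast between depth-first and breadth-first search boils down to a single data-structure choice: whether the agenda of partial analyses is used as a stack (LIFO) or a queue (FIFO). Below is a minimal sketch of a top-down recogniser making that choice explicit; the toy grammar, the function name, and the boolean return value are illustrative assumptions, not from the lecture.

```python
from collections import deque

# Toy grammar, assumed for illustration: nonterminal -> list of right-hand sides.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"], ["V"]],
    "Det": [["the"]],
    "N":   [["dog"], ["child"]],
    "V":   [["bit"]],
}

def recognise(words, strategy="dfs"):
    # Each agenda item is (outstanding subgoals, input position).
    agenda = deque([(["S"], 0)])
    while agenda:
        # The only difference between the two strategies is which end
        # of the agenda the next partial analysis is taken from.
        subgoals, i = agenda.pop() if strategy == "dfs" else agenda.popleft()
        if not subgoals:
            if i == len(words):           # all subgoals done, all input consumed
                return True
            continue
        first, rest = subgoals[0], subgoals[1:]
        if first in GRAMMAR:              # nonterminal: add every expansion
            for rhs in GRAMMAR[first]:
                agenda.append((rhs + rest, i))
        elif i < len(words) and words[i] == first:
            agenda.append((rest, i + 1))  # terminal matched: consume one word
        # otherwise: dead end, this partial analysis is simply dropped

    return False

print(recognise("the dog bit the child".split(), "dfs"))  # True
print(recognise("the dog bit".split(), "bfs"))            # True
```

Note that depth-first search with left-recursive rules would loop forever; this comes up again under "Left Recursion" below.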

Recursive Descent Parsing

◮ A recursive descent parser treats a grammar as a specification of how to break down a top-level goal (find S) into subgoals (find NP VP).
◮ It is a top-down, depth-first parser:
  ◮ Blindly expand nonterminals until reaching a terminal (word).
  ◮ If multiple options are available, choose one, but store the current state as a backtrack point (on a stack, to ensure depth-first order).
  ◮ If the terminal matches the next input word, continue; else, backtrack.

RD Parsing algorithm

Start with subgoal = S, then repeat until input and subgoals are empty:
◮ If the first subgoal in the list is a non-terminal A, pick an expansion A → B C from the grammar and replace A in the subgoal list with B C.
◮ If the first subgoal in the list is a terminal w:
  ◮ If the input is empty, backtrack.
  ◮ If the next input word is different from w, backtrack.
  ◮ If the next input word is w, match! I.e., consume the input word w and the subgoal w and move on to the next subgoal.
If we run out of backtrack points but not input, no parse is possible.

Recursive descent parsing pseudocode

In the background: a CFG G and a sentence x_1 · · · x_n.

Function RecursiveDescent(t, v, i), where
◮ t is a partially constructed tree
◮ v is a node in t
◮ i is a sentence position

◮ Let N be the nonterminal in v.
◮ For each rule with LHS N:
  ◮ If the rule is a lexical rule N → w, check whether x_i = w; if so, call RecursiveDescent(t, u, i + 1), where u is the lowest node above v that still has an unexpanded nonterminal.
  ◮ If the rule is a grammatical rule N → A_1 · · · A_n, let t′ be t with v expanded using the rule. For each j ∈ {1, . . . , n}, call RecursiveDescent(t′, u_j, i), where u_j is the node for nonterminal A_j in t′.

Start with: RecursiveDescent(S, topnode, 1).

Quick quiz: this algorithm has a bug. Where? What do we need to add?

Recursive descent example

Consider a very simple example:
◮ The grammar contains only these rules:
    S → NP VP     VP → V      NN → bit    V → bit
    NP → DT NN    DT → the    NN → dog    V → dog
◮ The input sequence is "the dog bit".
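As a concrete version of the subgoal-list algorithm, here is a minimal recursive descent recogniser for the toy grammar above. The slides give only pseudocode, so the function name, the return value, and the exact success test are my assumptions; the success test also encodes one plausible answer to the quiz: the parser must check that the input is fully consumed when the subgoals run out.

```python
# Grammatical and lexical rules from the example slide.
GRAMMAR = [("S", ["NP", "VP"]), ("VP", ["V"]), ("NP", ["DT", "NN"])]
LEXICON = [("DT", "the"), ("NN", "dog"), ("NN", "bit"),
           ("V", "dog"), ("V", "bit")]

def rd_parse(subgoals, words):
    """Can `subgoals` be expanded to derive exactly `words`?"""
    if not subgoals:
        # The quiz fix (one plausible reading): succeed only if the input
        # is also empty; otherwise "the dog" would be accepted as a
        # prefix of "the dog bit".
        return not words
    first, rest = subgoals[0], subgoals[1:]
    for lhs, rhs in GRAMMAR:               # try each grammatical expansion;
        if lhs == first and rd_parse(rhs + rest, words):
            return True                    # a failed call falls through to
    for lhs, word in LEXICON:              # the next rule = backtracking
        if lhs == first and words and words[0] == word:
            if rd_parse(rest, words[1:]):
                return True
    return False

print(rd_parse(["S"], "the dog bit".split()))  # True
print(rd_parse(["S"], "the dog".split()))      # False (no verb)
```

Backtracking here is implicit in the recursion: when one rule's recursive call returns False, the loop simply tries the next rule. Like the pseudocode on the slide, this sketch loops forever on left-recursive rules (see Left Recursion below).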

Recursive Descent Parsing: worked example

[Slide sequence: tree diagrams tracing recursive descent on the input "the dog saw a man in the park". The parser expands S → NP VP, tries alternative expansions of NP and VP (e.g., NP → Det N PP vs. Det N), matches words left to right, backtracks through failed lexical choices (N → park, N → man, N → dog, ...), builds the NP "the dog", then expands the VP over "saw a man in the park", backtracking whenever an expansion fails to match the input, until the full tree is found.]

Left Recursion

Can recursive descent parsing handle left recursion? No: expanding a left-recursive rule re-creates the same subgoal in first position without consuming any input, so the parser loops forever (a sketch below illustrates this).

Grammars for natural human languages should be revealing, and left-recursive rules are needed in English:

NP → DET N
NP → NPR
DET → NP 's

These rules generate NPs with possessive modifiers such as:
◮ John's sister
◮ John's mother's sister
◮ John's mother's uncle's sister
◮ John's mother's uncle's sister's niece

Shift-Reduce Parsing

A shift-reduce parser tries to find sequences of words and phrases that correspond to the right-hand side of a grammar production and replaces them with the left-hand side:
◮ Directionality = bottom-up: starts with the words of the input and tries to build trees from the words up.
◮ Search strategy = breadth-first: starts with the words, then applies rules with matching right-hand sides, and so on until the whole sentence is reduced to an S.
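Before looking at the shift-reduce algorithm in detail, here is the promised sketch of the left-recursion problem. It expands only leftmost symbols, the way a top-down depth-first parser would, with an artificial depth cap standing in for what would otherwise be an infinite loop (the cap and function name are illustrative, not from the slides).

```python
# The left-recursive possessive rules from the slide.
GRAMMAR = {
    "NP":  [["DET", "N"], ["NPR"]],
    "DET": [["NP", "'s"]],
}

def expand_leftmost(symbol, depth=0):
    # A top-down parser must expand the leftmost symbol first. With
    # NP -> DET N and DET -> NP 's, the leftmost symbol cycles
    # NP -> DET -> NP -> DET -> ... before any word is consumed.
    print("  " * depth + symbol)
    if symbol in GRAMMAR and depth < 6:   # artificial cap; really infinite
        expand_leftmost(GRAMMAR[symbol][0][0], depth + 1)

expand_leftmost("NP")   # NP, DET, NP, DET, ... forever without the cap
```

Standard remedies are to transform the grammar to remove left recursion, or to parse bottom-up, which is exactly what shift-reduce parsing does.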

Algorithm Sketch: Shift-Reduce Parsing

Until the words in the sentence are substituted with S:
◮ Scan through the input until we recognise something that corresponds to the RHS of one of the production rules (shift)
◮ Apply a production rule in reverse; i.e., replace the RHS of the rule which appears in the sentential form with the LHS of the rule (reduce)

A shift-reduce parser is implemented using a stack:
1. start with an empty stack
2. a shift action pushes the current input symbol onto the stack
3. a reduce action replaces n items with a single item

Shift-Reduce Parsing: worked example (a code sketch follows the trace)

Stack                                            Remaining Text
(empty)                                          my dog saw a man in the park with a statue
[Det my]                                         dog saw a man in the park with a statue
[Det my] [N dog]                                 saw a man in the park with a statue

Stack (inner tree structure abbreviated)         Remaining Text
[NP [Det my] [N dog]]                            saw a man in the park with a statue
[NP my dog] [V saw] [NP a man]                   in the park with a statue
[NP my dog] [V saw] [NP a man] [PP in the park]  with a statue
[NP my dog] [V saw] [NP [NP a man] [PP in the park]]  with a statue
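The trace makes the control problem visible: after "a man" is reduced to an NP, the parser must choose between reducing (V NP → VP) and shifting "in", and only some choices lead to a complete parse. Below is a minimal backtracking sketch that explores both options; the rule set and lexicon are assumptions chosen to match the trace, and none of the code is from the lecture.

```python
# Assumed rules and lexicon, chosen to match the trace above.
RULES = [("NP", ("Det", "N")), ("NP", ("NP", "PP")), ("PP", ("P", "NP")),
         ("VP", ("V", "NP")),  ("VP", ("VP", "PP")), ("S", ("NP", "VP"))]
LEXICON = {"my": "Det", "a": "Det", "the": "Det",
           "dog": "N", "man": "N", "park": "N", "statue": "N",
           "saw": "V", "in": "P", "with": "P"}

def shift_reduce(stack, buffer):
    """Backtracking shift-reduce recogniser: try every reduce, then a shift."""
    if stack == ["S"] and not buffer:
        return True
    for lhs, rhs in RULES:                  # reduce: does the stack top = RHS?
        n = len(rhs)
        if len(stack) >= n and tuple(stack[-n:]) == rhs:
            if shift_reduce(stack[:-n] + [lhs], buffer):
                return True                 # wrong reduce? fall through and retry
    if buffer:                              # shift the next word's category
        return shift_reduce(stack + [LEXICON[buffer[0]]], buffer[1:])
    return False

print(shift_reduce([], "my dog saw a man in the park with a statue".split()))
# True: a parse exists (several, in fact, due to PP-attachment ambiguity)
```

Real shift-reduce parsers replace this blind backtracking with a smarter control policy for resolving shift/reduce conflicts; the backtracking version is just the simplest way to make the bottom-up search complete.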
