Probabilistic Context Free Grammars (CMSC 473/673, UMBC)

Probabilistic Context Free Grammars. CMSC 473/673 UMBC. Outline: Recap: MT word alignment; Structure in Language: Constituency; (Probabilistic) Context Free Grammars: Definitions, High-level tasks (Generating and Parsing), Some uses for PCFGs; CKY Algorithm: Parsing with a (P)CFG


  1. Constituents Help Form Grammars. Constituent: spans of words that act (syntactically) as a group, an "X phrase" (e.g., noun phrase). Baltimore is a great place to be. This house is a great place to be. This red house is a great place to be. This red house on the hill is a great place to be. This red house near the hill is a great place to be. This red house atop the hill is a great place to be. The hill is a great place to be. Rules: S → NP VP; NP → NP PP; NP → Det Noun; PP → P NP; NP → Noun; AdjP → Adj Noun; NP → Det AdjP; VP → V NP

  3. Constituents Help Form Grammars. Constituent: spans of words that act (syntactically) as a group, an "X phrase" (e.g., noun phrase). Baltimore is a great place to be. This house is a great place to be. This red house is a great place to be. This red house on the hill is a great place to be. This red house near the hill is a great place to be. This red house atop the hill is a great place to be. The hill is a great place to be. Rules: S → NP VP; PP → P NP; NP → Det Noun; AdjP → Adj Noun; NP → Noun; VP → V NP; NP → Det AdjP; Noun → Baltimore; NP → NP PP

  4. Outline: Recap: MT word alignment; Structure in Language: Constituency; (Probabilistic) Context Free Grammars: Definitions, High-level tasks (Generating and Parsing), Some uses for PCFGs; CKY Algorithm: Parsing with a (P)CFG

  5. Context Free Grammar. Rules: S → NP VP; PP → P NP; NP → Det Noun; AdjP → Adj Noun; NP → Noun; VP → V NP; NP → Det AdjP; Noun → Baltimore; NP → NP PP. A set of rewrite rules, comprised of terminals and non-terminals. Terminals: the words in the language (the lexicon), e.g., Baltimore. Non-terminals: symbols that can trigger rewrite rules, e.g., S, NP, Noun. (Sometimes) Pre-terminals: symbols that can only trigger lexical rewrites, e.g., Noun

  6. Context Free Grammar. Rules: S → NP VP; PP → P NP; NP → Det Noun; AdjP → Adj Noun; NP → Noun; VP → V NP; NP → Det AdjP; Noun → Baltimore; NP → NP PP. A set of rewrite rules, comprised of terminals and non-terminals. Terminals: the words in the language (the lexicon), e.g., Baltimore. Non-terminals: symbols that can trigger rewrite rules, e.g., S, NP, Noun. (Sometimes) Pre-terminals: symbols that can only trigger lexical rewrites, e.g., Noun. (Applications: learn more in CMSC 331, 431. Theory: learn more in CMSC 451.)

  7. How Do We Robustly Handle Ambiguities?

  8. How Do We Robustly Handle Ambiguities? Add probabilities (to what?)

  9. Probabilistic Context Free Grammar. Rules: S → NP VP; PP → P NP; NP → Det Noun; AdjP → Adj Noun; NP → Noun; VP → V NP; NP → Det AdjP; Noun → Baltimore; NP → NP PP; ... A set of weighted (probabilistic) rewrite rules, comprised of terminals and non-terminals. Terminals: the words in the language (the lexicon), e.g., Baltimore. Non-terminals: symbols that can trigger rewrite rules, e.g., S, NP, Noun. (Sometimes) Pre-terminals: symbols that can only trigger lexical rewrites, e.g., Noun

  10. Probabilistic Context Free Grammar. Rules: S → NP VP; PP → P NP; NP → Det Noun; AdjP → Adj Noun; NP → Noun; VP → V NP; NP → Det AdjP; Noun → Baltimore; NP → NP PP; ... A set of weighted (probabilistic) rewrite rules, comprised of terminals and non-terminals. Terminals: the words in the language (the lexicon), e.g., Baltimore. Non-terminals: symbols that can trigger rewrite rules, e.g., S, NP, Noun. (Sometimes) Pre-terminals: symbols that can only trigger lexical rewrites, e.g., Noun. Q: What are the distributions? What must sum to 1?

  11. Probabilistic Context Free Grammar. Weighted rules: 1.0 S → NP VP; 1.0 PP → P NP; .4 NP → Det Noun; .34 AdjP → Adj Noun; .3 NP → Noun; .26 VP → V NP; .2 NP → Det AdjP; .0003 Noun → Baltimore; .1 NP → NP PP; ... A set of weighted (probabilistic) rewrite rules, comprised of terminals and non-terminals. Terminals: the words in the language (the lexicon), e.g., Baltimore. Non-terminals: symbols that can trigger rewrite rules, e.g., S, NP, Noun. (Sometimes) Pre-terminals: symbols that can only trigger lexical rewrites, e.g., Noun. Q: What are the distributions? What must sum to 1? A: P(X → Y Z | X), i.e., the probabilities of all rules sharing the same left-hand side X form one distribution and must sum to 1.
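To make the answer concrete: if we store rule probabilities keyed by their left-hand side, we can check that each conditional distribution P(X → right-hand side | X) sums to 1. A minimal sketch in Python, using the slide's numbers where given (the rest are illustrative assumptions):

```python
from collections import defaultdict

# Rule probabilities keyed by (left-hand side, right-hand side),
# loosely following the numbers on the slide (illustrative values).
rules = {
    ("S",  ("NP", "VP")):    1.0,
    ("PP", ("P", "NP")):     1.0,
    ("NP", ("Det", "Noun")): 0.4,
    ("NP", ("Noun",)):       0.3,
    ("NP", ("Det", "AdjP")): 0.2,
    ("NP", ("NP", "PP")):    0.1,
}

def check_normalization(rules):
    """For every left-hand side X, the probabilities of all X -> ... rules
    should sum to 1, since each is a conditional distribution P(rule | X)."""
    totals = defaultdict(float)
    for (lhs, _rhs), p in rules.items():
        totals[lhs] += p
    return {lhs: abs(total - 1.0) < 1e-9 for lhs, total in totals.items()}

print(check_normalization(rules))   # {'S': True, 'PP': True, 'NP': True}
```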

  12. Probabilistic Context Free Grammar. p(tree) = the product of the probabilities of the individual rules used in the derivation. Example tree: [S [NP [Noun Baltimore]] [VP [Verb is] [NP a great city]]].

  13. Probabilistic Context Free Grammar. p(tree) = p(S → NP VP) * ... (the product of the probabilities of the individual rules used in the derivation).

  14. Probabilistic Context Free Grammar. p(tree) = p(S → NP VP) * p(NP → Noun) * p(Noun → Baltimore) * ... (the product of the probabilities of the individual rules used in the derivation).

  15. Probabilistic Context Free Grammar. p(tree) = p(S → NP VP) * p(NP → Noun) * p(Noun → Baltimore) * p(VP → Verb NP) * p(Verb → is) * p(NP → a great city): the product of the probabilities of the individual rules used in the derivation.

  16. Log Probabilistic Context Free Grammar. log p(tree) = lp(S → NP VP) + lp(NP → Noun) + lp(Noun → Baltimore) + lp(VP → Verb NP) + lp(Verb → is) + lp(NP → a great city): the sum of the log probabilities of the individual rules used in the derivation.
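A minimal sketch (not from the slides) that scores the example tree both ways: as a product of rule probabilities and as a sum of log probabilities. The tree is represented as nested tuples, and apart from Noun → Baltimore the probabilities are made-up illustrative values:

```python
import math

# The parse tree from the slide, as nested tuples: (label, child, ...).
tree = ("S",
        ("NP", ("Noun", "Baltimore")),
        ("VP", ("Verb", "is"),
               ("NP", "a great city")))      # kept flat, as drawn on the slide

# Rule probabilities; apart from Noun -> Baltimore these are made-up values.
rule_prob = {
    ("S", ("NP", "VP")):       1.0,
    ("NP", ("Noun",)):         0.3,
    ("Noun", ("Baltimore",)):  0.0003,
    ("VP", ("Verb", "NP")):    0.26,
    ("Verb", ("is",)):         0.01,
    ("NP", ("a great city",)): 0.001,
}

def rules_used(node):
    """Yield (lhs, rhs) for every rewrite used in the derivation."""
    if isinstance(node, str):                 # terminal leaf: no rule of its own
        return
    lhs, *children = node
    rhs = tuple(c[0] if isinstance(c, tuple) else c for c in children)
    yield (lhs, rhs)
    for child in children:
        yield from rules_used(child)

p  = math.prod(rule_prob[r] for r in rules_used(tree))       # product of rule probabilities
lp = sum(math.log(rule_prob[r]) for r in rules_used(tree))   # sum of log probabilities
print(p, math.exp(lp))                                       # the two agree up to rounding
```

Working in log space avoids numerical underflow when a derivation uses many low-probability rules, which is why the log form on this slide is the one used in practice.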

  17. Estimating PCFGs Attempt 1: • Get access to a treebank (corpus of syntactically annotated sentences), e.g., the English Penn Treebank • Count productions • Smooth these counts • This gets ~75 F1
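As a rough sketch of this recipe (treebank reading, smoothing, and evaluation omitted), counting productions and normalizing per left-hand side might look like the following; the two tiny hand-built trees standing in for a treebank are hypothetical:

```python
from collections import Counter, defaultdict

# Two tiny hand-built trees standing in for a treebank (nested tuples:
# first element is the label, strings are words).
treebank = [
    ("S", ("NP", ("Noun", "Baltimore")),
          ("VP", ("Verb", "is"),
                 ("NP", ("Det", "a"), ("AdjP", ("Adj", "great"), ("Noun", "city"))))),
    ("S", ("NP", ("Det", "The"), ("Noun", "hill")),
          ("VP", ("Verb", "is"),
                 ("NP", ("Det", "a"), ("AdjP", ("Adj", "great"), ("Noun", "place"))))),
]

def productions(node):
    """Yield every (lhs, rhs) production used in a tree."""
    if isinstance(node, str):
        return
    lhs, *children = node
    yield (lhs, tuple(c[0] if isinstance(c, tuple) else c for c in children))
    for child in children:
        yield from productions(child)

# Count productions over the whole treebank, then normalize per left-hand side
# (the relative-frequency estimate); smoothing of the counts is omitted here.
counts = Counter(p for tree in treebank for p in productions(tree))
lhs_totals = defaultdict(int)
for (lhs, _rhs), c in counts.items():
    lhs_totals[lhs] += c
pcfg = {rule: c / lhs_totals[rule[0]] for rule, c in counts.items()}

print(pcfg[("NP", ("Det", "AdjP"))])   # 0.5: 2 of the 4 NP expansions use Det AdjP
```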

  18. Probabilistic Context Free Grammar (PCFG) Tasks. Find the most likely parse (for an observed sequence). Calculate the (log) likelihood of an observed sequence w_1, ..., w_N. Learn the grammar parameters.

  19. Outline: Recap: MT word alignment; Structure in Language: Constituency; (Probabilistic) Context Free Grammars: Definitions, High-level tasks (Generating and Parsing), Some uses for PCFGs; CKY Algorithm: Parsing with a (P)CFG

  20. Context Free Grammar 1. Generate: Iteratively create a string (or tree derivation) using the rewrite rules 2. Parse: Assign a tree (if possible) to an input string
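A minimal sketch of task 1, generating, under the assumption that rules are chosen uniformly at random (a PCFG would weight the choice); the depth cap is a hypothetical safeguard against unbounded recursion through rules like NP → NP PP:

```python
import random

# The slide's grammar, grouped by left-hand side; symbols with no entry
# here (e.g. Det, V, P, Adj, or actual words) are treated as terminals.
grammar = {
    "S":    [["NP", "VP"]],
    "NP":   [["Det", "Noun"], ["Noun"], ["Det", "AdjP"], ["NP", "PP"]],
    "PP":   [["P", "NP"]],
    "AdjP": [["Adj", "Noun"]],
    "VP":   [["V", "NP"]],
    "Noun": [["Baltimore"]],
}

def generate(symbol="S", depth=0, max_depth=10):
    """Rewrite `symbol` top-down, picking a rule uniformly at random."""
    if symbol not in grammar or depth > max_depth:
        return [symbol]                      # terminal, or the depth cap was hit
    rhs = random.choice(grammar[symbol])     # a PCFG would weight this choice
    out = []
    for child in rhs:
        out.extend(generate(child, depth + 1, max_depth))
    return out

print(" ".join(generate()))   # e.g. "Baltimore V Det Baltimore" with this tiny lexicon
```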

  21. Generate from a Context Free Grammar. Grammar: S → NP VP; PP → P NP; NP → Det Noun; AdjP → Adj Noun; NP → Noun; VP → V NP; NP → Det AdjP; Noun → Baltimore; NP → NP PP. Derivation so far: S

  22. Generate from a Context Free Grammar (grammar as above). Derivation so far: S ⇒ NP VP

  23. Generate from a Context Free Grammar (grammar as above). Derivation so far: S ⇒ NP VP ⇒ Noun VP

  24. Generate from a Context Free Grammar (grammar as above). Derivation so far: S ⇒ NP VP ⇒ Noun VP ⇒ Baltimore VP

  25. Generate from a Context Free Grammar (grammar as above). Derivation so far: S ⇒ NP VP ⇒ Noun VP ⇒ Baltimore VP ⇒ Baltimore V NP

  26. Generate from a Context Free Grammar (grammar as above). Derivation so far: S ⇒ NP VP ⇒ Noun VP ⇒ Baltimore VP ⇒ Baltimore V NP ⇒ ... ⇒ Baltimore is a great city

  28. Assign Structure (Parse) with a Context Free Grammar (grammar as above). Input string: Baltimore is a great city. Output: a parse tree over the string, rooted at S.

  29. Assign Structure (Parse) with a Context Free Grammar (grammar as above). Bracket notation: [S [NP [Noun Baltimore]] [VP [Verb is] [NP a great city]]]

  30. Assign Structure (Parse) with a Context Free Grammar (grammar as above). S-expression: (S (NP (Noun Baltimore)) (VP (V is) (NP a great city)))
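Both notations can be produced from the same in-memory tree. This small helper is a sketch, not from the slides; the nested-tuple tree representation is an assumption:

```python
tree = ("S",
        ("NP", ("Noun", "Baltimore")),
        ("VP", ("Verb", "is"), ("NP", "a great city")))

def brackets(node):
    """Bracket notation: [S [NP ...] ...]"""
    if isinstance(node, str):
        return node
    label, *children = node
    return "[" + label + " " + " ".join(brackets(c) for c in children) + "]"

def sexpr(node):
    """S-expression notation: (S (NP ...) ...)"""
    if isinstance(node, str):
        return node
    label, *children = node
    return "(" + label + " " + " ".join(sexpr(c) for c in children) + ")"

print(brackets(tree))   # [S [NP [Noun Baltimore]] [VP [Verb is] [NP a great city]]]
print(sexpr(tree))      # (S (NP (Noun Baltimore)) (VP (Verb is) (NP a great city)))
```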

  31. Some CFG Terminology: Derivation/Parse Tree. The structure just assigned, rooted at S with the words of the sentence at its leaves, is called a derivation or parse tree (grammar as above).

  32. Some CFG Terminology: Start Symbol. S is the start symbol, the root of every derivation (grammar as above).

  33. Some CFG Terminology: Rewrite Choices. Alternative right-hand sides are shown with "|" (vertical bar): S → NP VP; NP → Det Noun | Noun | Det AdjP | NP PP; PP → P NP; AdjP → Adj Noun; VP → V NP; Noun → Baltimore | ...

  34. Some CFG Terminology: Chomsky Normal Form (CNF). Restricted to binary and unary rules only; no ternary rules (or above). Binary rules can only involve non-terminals: non-terminal → non-terminal non-terminal (X → Y Z). Unary rules can only involve terminals: non-terminal → terminal (X → a).
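A small sketch of the CNF condition as a check over a rule list; the (lhs, rhs) tuple representation is an assumption, not something from the slides:

```python
def is_cnf(rules, nonterminals):
    """True iff every rule is X -> Y Z (Y, Z non-terminals) or X -> a (a terminal)."""
    for lhs, rhs in rules:
        if len(rhs) == 2 and all(sym in nonterminals for sym in rhs):
            continue                         # binary rule over non-terminals
        if len(rhs) == 1 and rhs[0] not in nonterminals:
            continue                         # unary rule rewriting to a terminal
        return False                         # anything else violates CNF
    return True

nonterminals = {"S", "NP", "VP", "PP", "N", "V", "P", "Det"}
rules = [("S", ("NP", "VP")), ("NP", ("Papa",)), ("NP", ("Det", "N")),
         ("PP", ("P", "NP")), ("VP", ("V", "NP")), ("Det", ("the",))]
print(is_cnf(rules, nonterminals))                                   # True
print(is_cnf(rules + [("VP", ("V", "NP", "PP"))], nonterminals))     # False: ternary rule
```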

  35. Outline: Recap: MT word alignment; Structure in Language: Constituency; (Probabilistic) Context Free Grammars: Definitions, High-level tasks (Generating and Parsing), Some uses for PCFGs; CKY Algorithm: Parsing with a (P)CFG

  36. What are some benefits to CFGs? Why should you care about syntax?

  37. Some Uses of CFGs. Clearly disambiguate certain ambiguities. Morphological derivations. Identify "grammatical" sentences. ...

  38. Clearly Show Ambiguity I ate the meal with friends

  40. Clearly Show Ambiguity. I ate the meal with friends ("salt" shown as an alternative to "friends").

  41. Clearly Show Ambiguity. I ate the meal with friends. (One parse tree shown, built from S, NP, VP, NP, and PP.)

  42. Clearly Show Ambiguity. I ate the meal with friends. (Two parse trees shown: one attaches the PP "with friends" to the NP "the meal", the other attaches it to the VP.)

  43. Clearly Show Ambiguity. I ate the meal with friends. PP attachment, a common source of parser errors (even still today): the same PP can attach to the NP or to the VP.

  44. Clearly Show Ambiguity... But Not Necessarily All Ambiguity. I ate the meal with a fork. I ate the meal with gusto. I ate the meal with friends. (All three can receive the same parse, even though the PP means something different in each.)

  45. Other Attachment Ambiguity We invited the students, Chris and Pat.

  46. Coordination Ambiguity. "old men and women": old [men and women], or [old men] and women?

  47. Grammars Aren't Just for Syntax. Morphological derivation of "overgeneralization" as a tree: general (A) + -ize → generalize (V); generalize (V) + -tion → generalization (N); over- + generalization → overgeneralization (N). Each step can be written as a rewrite rule over morphemes.

  48. Clearly Show Grammaticality (?). The old man the boats.

  49. Clearly Show Grammaticality (?). The old man the boats. (The parse: [S [NP The old] [VP man [NP the boats]]], with "man" as the verb.)

  50. Clearly Show Grammaticality (?). The old man the boats. Idea: define grammatical sentences as those that can be parsed by a grammar.

  51. Clearly Show Grammaticality (?). The old man the boats. Idea: define grammatical sentences as those that can be parsed by a grammar. Issue 1: Which grammar?

  52. Clearly Show Grammaticality (?). The old man the boats. Idea: define grammatical sentences as those that can be parsed by a grammar. Issue 1: Which grammar? Issue 2: Discourse demands flexibility. Q: What do you see? A: [I see] The old man [and] the boats.

  53. Outline: Recap: MT word alignment; Structure in Language: Constituency; (Probabilistic) Context Free Grammars: Definitions, High-level tasks (Generating and Parsing), Some uses for PCFGs; CKY Algorithm: Parsing with a (P)CFG

  54. Parsing with a CFG. Top-down backtracking (brute force). CKY Algorithm: dynamic programming, bottom-up. Earley's Algorithm: dynamic programming, top-down (not covered due to time).

  55. CKY Precondition. The grammar must be in Chomsky Normal Form (CNF): non-terminal → non-terminal non-terminal, or non-terminal → terminal.

  56. The example grammar (entire grammar; assume uniform weights; example from Jason Eisner): S → NP VP; NP → Papa; NP → Det N; N → caviar; NP → NP PP; N → spoon; VP → V NP; V → spoon; VP → VP PP; V → ate; PP → P NP; P → with; Det → the; Det → a

  57. "Papa ate the caviar with a spoon", with word boundaries numbered 0 through 7: 0 Papa 1 ate 2 the 3 caviar 4 with 5 a 6 spoon 7. (Grammar as above; assume uniform weights; example from Jason Eisner.)

  58. "Papa ate the caviar with a spoon" (boundaries 0-7; grammar as above). Goal: (S, 0, 7), an S spanning the whole sentence.

  59. Check 1: What are the non-terminals? (Grammar as above.)

  60. Check 1: What are the non-terminals? S, NP, VP, PP, N, V, P, Det. Check 2: What are the terminals?

  61. Check 2: What are the terminals? Papa, ate, the, caviar, with, a, spoon. Check 3: What are the pre-terminals?

  62. Check 3: What are the pre-terminals? N, V, P, Det. Check 4: Is this grammar in CNF?

  63. Check 4: Is this grammar in CNF? Yes.
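The four checks can also be computed mechanically from the rule list; the following sketch (the (lhs, rhs) representation is assumed, not from the slides) recovers the non-terminals, terminals, pre-terminals, and the CNF property for the Eisner grammar:

```python
# The example grammar from the slides, written as (lhs, rhs) pairs.
rules = [
    ("S", ("NP", "VP")),  ("NP", ("Papa",)),
    ("NP", ("Det", "N")), ("N", ("caviar",)),
    ("NP", ("NP", "PP")), ("N", ("spoon",)),
    ("VP", ("V", "NP")),  ("V", ("spoon",)),
    ("VP", ("VP", "PP")), ("V", ("ate",)),
    ("PP", ("P", "NP")),  ("P", ("with",)),
    ("Det", ("the",)),    ("Det", ("a",)),
]

nonterminals = {lhs for lhs, _ in rules}                          # Check 1
terminals = {s for _, rhs in rules for s in rhs} - nonterminals   # Check 2
preterminals = {                                                  # Check 3
    x for x in nonterminals
    if all(len(rhs) == 1 and rhs[0] in terminals
           for lhs, rhs in rules if lhs == x)
}
in_cnf = all((len(rhs) == 2 and set(rhs) <= nonterminals) or      # Check 4
             (len(rhs) == 1 and rhs[0] in terminals)
             for _, rhs in rules)

print(sorted(nonterminals))   # ['Det', 'N', 'NP', 'P', 'PP', 'S', 'V', 'VP']
print(sorted(terminals))      # ['Papa', 'a', 'ate', 'caviar', 'spoon', 'the', 'with']
print(sorted(preterminals))   # ['Det', 'N', 'P', 'V']
print(in_cnf)                 # True
```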

  64. "Papa ate the caviar with a spoon" (boundaries 0-7; grammar as above, uniform weights). First: let's find all NPs.

  65. First: find all NPs: (NP, 0, 1) Papa; (NP, 2, 4) the caviar; (NP, 5, 7) a spoon; (NP, 2, 7) the caviar with a spoon.

  66. Second: find all VPs: (VP, 1, 7) ate the caviar with a spoon; (VP, 1, 4) ate the caviar.

  67. Third: find all Ss: (S, 0, 7) Papa ate the caviar with a spoon; (S, 0, 4) Papa ate the caviar.

  71. Putting it together: (NP, 0, 1) and (VP, 1, 7) combine via S → NP VP to give (S, 0, 7).

  72. The constituents can be recorded in a chart: rows indexed by span start (0-6), columns by span end (1-7).

  73. Chart so far: NP in cell (start 0, end 1).

  74. Chart so far: NP in (start 0, end 1); VP in (start 1, end 7).

  75. Chart so far: NP in (start 0, end 1); VP in (start 1, end 7); S in (start 0, end 7).

  76. CKY Recognizer. Input: a string of N words and a grammar in CNF. Output: True (with parse) / False. Data structure: an N*N table T; rows indicate span start (0 to N-1), columns indicate span end (1 to N); T[i][j] lists the constituents spanning i to j.

  77. CKY Recognizer. Input: a string of N words and a grammar in CNF. Output: True (with parse) / False. Data structure: an N*N table T; rows indicate span start (0 to N-1), columns indicate span end (1 to N); T[i][j] lists the constituents spanning i to j. For Viterbi in HMMs we build the table left-to-right; for CKY over trees we build (1) smallest-to-largest spans and (2) left-to-right.

  78. CKY Recognizer T = Cell[N][N+1]
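Filling in that table gives a complete recognizer. The sketch below follows the slides' conventions (span start as row, span end as column, smallest spans first, left to right), but the function and variable names are illustrative, not the course's reference implementation; it is run on the example sentence with the Eisner grammar:

```python
def cky_recognize(words, lexical, binary, start="S"):
    """CKY recognizer. table[i][j] holds the non-terminals spanning words i..j.
    `lexical` maps a word to the non-terminals X with a rule X -> word;
    `binary` maps a pair (Y, Z) to the non-terminals X with a rule X -> Y Z."""
    n = len(words)
    table = [[set() for _ in range(n + 1)] for _ in range(n)]
    for i, word in enumerate(words):              # width-1 spans from lexical rules
        table[i][i + 1] = set(lexical.get(word, ()))
    for width in range(2, n + 1):                 # 1. build smallest-to-largest ...
        for i in range(0, n - width + 1):         # 2. ... and left-to-right
            j = i + width
            for k in range(i + 1, j):             # every split point i < k < j
                for y in table[i][k]:
                    for z in table[k][j]:
                        table[i][j] |= binary.get((y, z), set())
    return start in table[0][n], table

# Jason Eisner's example grammar, split into its lexical and binary rules (CNF).
lexical = {"Papa": {"NP"}, "caviar": {"N"}, "spoon": {"N", "V"},
           "ate": {"V"}, "with": {"P"}, "the": {"Det"}, "a": {"Det"}}
binary = {("NP", "VP"): {"S"}, ("Det", "N"): {"NP"}, ("NP", "PP"): {"NP"},
          ("V", "NP"): {"VP"}, ("VP", "PP"): {"VP"}, ("P", "NP"): {"PP"}}

ok, table = cky_recognize("Papa ate the caviar with a spoon".split(), lexical, binary)
print(ok)            # True: (S, 0, 7) is in the chart
print(table[2][4])   # {'NP'}  ("the caviar")
print(table[1][7])   # {'VP'}  ("ate the caviar with a spoon")
```

Extending this recognizer to a probabilistic parser would store, per cell and label, the best rule probability and backpointers instead of a bare set.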
