
CS344: Introduction to Artificial Intelligence
Pushpak Bhattacharyya, CSE Dept., IIT Bombay
Lecture 18-19-20: Natural Language Processing (ambiguities and parsing)

Importance of NLP: text-based computation needs NLP (Linguistics + Computation).


  1. Discourse: processing a sequence of sentences. Mother to John: "John, go to school. It is open today. Should you bunk? Father will be very angry." Ambiguity of "open"; bunk what? Why will the father be angry? Answering this requires a complex chain of reasoning and application of world knowledge (the father will not be angry if somebody else's son bunks the school). Ambiguity of "father": father as parent or father as headmaster.

  2. Complexity of connected text: "John was returning from school dejected; today was the math test. He couldn't control the class. Teacher shouldn't have made him responsible. After all, he is just a janitor." (Resolving who "he" refers to forces the reader to revise the interpretation as each sentence arrives.)

  3. ML-NLP

  4. NLP as an ML task. English: "France beat Brazil by 1 goal to 0 in the quarter-final of the world cup football tournament." Hindi (transliterated): "phraans ne braazil ko vishwa kap phutbal spardhaa ke kwaartaar phaainal me 1-0 gol ke baraabarii se haraayaa."

  5. Categories of the words in the sentence "France beat Brazil by 1 goal to 0 in the quarter final of the world cup football tournament". Content words: France, beat, Brazil, 1, goal, 0, quarter final, world cup, football, tournament. Function words: by, to, in, the, of.

  6. Further classification 1/2 (content words). Nouns split into proper nouns (Brazil, France) and common nouns (goal, quarter final, world cup, football, tournament); beat is a verb; 1 and 0 are numerals.

  7. Further classification 2/2 (function words). by, to, in, of are prepositions; the is a determiner.

  8. Why all this? There is a fundamental and ubiquitous information need: who did what, to whom, by what, when, where, in what manner.

  9. Semantic roles. France: agent; Brazil: patient/theme; by 1 goal to 0: manner; quarter finals: time; world cup football: modifier.

  10. Semantic role labeling: a classification task. "France beat Brazil by 1 goal to 0 in the quarter-final of the world cup football tournament." Brazil: agent or object? Agent: Brazil, France, quarter final, or world cup? Given an entity, what role does it play? Given a role, which entity plays it?

  11. A lower level of classification: part-of-speech (POS) tag labeling. "France beat Brazil by 1 goal to 0 in the quarter-final of the world cup football tournament." beat: verb or noun (heart beat, e.g.)? final: noun or adjective?

  12. Uncertainty in classification: ambiguity. "Visiting aunts can be a nuisance." Visiting: adjective or gerund (POS tag ambiguity). Role of aunts: agent of visit (the aunts are visitors) or object of visit (the aunts are being visited). The goal is to minimize the uncertainty of classification using cues from the sentence.

  13. What cues? Position with respect to the verb: France to the left of beat and Brazil to the right marks the agent and object roles (English). Case marking: France ne (Hindi), ne (Marathi) marks the agent role; Brazil ko (Hindi), laa (Marathi) marks the object role. Morphology: haraayaa (Hindi), haravlaa (Marathi): the verb POS tag is indicated by the distinctive suffixes.

  14. Cues are like attribute-value pairs prompting machine learning from NL data. Constituent parts of such an ML task: goal (classification or clustering); features/attributes (word position, morphology, word label, etc.); values of features; training data (corpus, annotated or unannotated); test data (test corpus); accuracy of decision (precision, recall, F-value, MAP, etc.); test of significance (from sample space to generality). A feature-extraction sketch follows below.
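The following is a minimal Python sketch (not from the lecture) of how such cues become attribute-value pairs for a role classifier; the feature names and the role_features helper are illustrative, not a standard API:

    def role_features(tokens, verb_index, candidate_index):
        """Attribute-value pairs for a candidate argument of a verb."""
        word = tokens[candidate_index]
        return {
            "position": "left" if candidate_index < verb_index else "right",
            "distance_to_verb": abs(candidate_index - verb_index),
            "word": word.lower(),
            "suffix": word[-3:],            # crude stand-in for morphology
            "is_capitalized": word[0].isupper(),
        }

    tokens = "France beat Brazil by 1 goal to 0".split()
    print(role_features(tokens, verb_index=1, candidate_index=0))
    # {'position': 'left', 'distance_to_verb': 1, 'word': 'france', 'suffix': 'nce', 'is_capitalized': True}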

  15. What is the output of an ML-NLP system (1/2)? Option 1: a set of rules, e.g., "If the word to the left of the verb is a noun and has the animacy feature, then it is the likely agent of the action denoted by the verb." The child broke the toy (child is the agent). The window broke (window is not the agent; it is inanimate).

  16. What is the output of an ML-NLP system (2/2)? Option 2: a set of probability values, e.g., P(agent | word is to the left of the verb and has animacy) > P(object | word is to the left of the verb and has animacy) > P(instrument | word is to the left of the verb and has animacy), etc. A sketch of such estimation follows below.
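A hedged sketch of how such probabilities could be estimated by relative frequency from a role-annotated corpus; the toy data below is invented for illustration:

    # Invented (role, feature-tuple) pairs standing in for an annotated corpus.
    annotated = [
        ("agent", ("left_of_verb", "animate")),
        ("agent", ("left_of_verb", "animate")),
        ("object", ("left_of_verb", "animate")),
        ("instrument", ("left_of_verb", "inanimate")),
    ]

    def p_role_given_features(role, features):
        """Relative-frequency estimate of P(role | features)."""
        matching = [r for r, f in annotated if f == features]
        return matching.count(role) / len(matching) if matching else 0.0

    f = ("left_of_verb", "animate")
    for role in ("agent", "object", "instrument"):
        print(role, p_role_given_features(role, f))
    # agent 0.666..., object 0.333..., instrument 0.0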

  17. How is this different from classical NLP? The burden is on the data as opposed to the human. Classical NLP: a linguist writes rules, which the computer applies to text. Statistical NLP: rules/probabilities are learned by the computer from a text data corpus.

  18. Classification appears as sequence labeling

  19. A set of sequence labeling tasks, from smaller to larger units. Words: part-of-speech tagging, named entity tagging, sense marking. Phrases: chunking. Sentences: parsing. Paragraphs: co-reference annotation.

  20. Example of word labeling: POS Tagging <s> Come September, and the IIT campus is abuzz with new and returning students. </s> <s> Come_VB September_NNP ,_, and_CC the_DT IIT_NNP campus_NN is_VBZ abuzz_JJ with_IN new_JJ and_CC returning_VBG students_NNS ._. </s>
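For comparison, here is a small sketch of automatic POS tagging with NLTK's default English tagger (assuming NLTK and its tagger data are installed); the exact tags depend on the tagger model and may differ slightly from the slide:

    import nltk

    nltk.download("punkt", quiet=True)                        # tokenizer model
    nltk.download("averaged_perceptron_tagger", quiet=True)   # default tagger

    sentence = ("Come September, and the IIT campus is abuzz "
                "with new and returning students.")
    print(nltk.pos_tag(nltk.word_tokenize(sentence)))
    # e.g. [('Come', 'VB'), ('September', 'NNP'), (',', ','), ('and', 'CC'), ...]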

  21. Example of word labeling: Named Entity Tagging <month_name> September </month_name> <org_name> IIT </org_name>

  22. Example of word labeling: sense marking. come: synset {arrive, get, come}, WN synset no. 01947900; abuzz: synset {abuzz, buzzing, droning}, WN synset no. 01859419.
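The synsets and offsets above can be looked up programmatically through NLTK's WordNet interface (a sketch; offsets vary across WordNet versions, so they may not match the slide's numbers exactly):

    import nltk
    from nltk.corpus import wordnet as wn

    nltk.download("wordnet", quiet=True)

    for lemma in ("come", "abuzz"):
        for syn in wn.synsets(lemma):
            # synset name, member lemmas, and numeric WordNet offset
            print(lemma, syn.name(), syn.lemma_names(), f"{syn.offset():08d}")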

  23. Example of phrase labeling: chunking. The sentence "Come July, and the IIT campus is abuzz with new and returning students." is grouped into phrase-level chunks, e.g. [Come] [July], and [the IIT campus] [is abuzz] [with new and returning students].

  24. Example of sentence labeling: parsing. [S1 [S [S [VP [VB Come] [NP [NNP July]]]] [, ,] [CC and] [S [NP [DT the] [NNP IIT] [NN campus]] [VP [AUX is] [ADJP [JJ abuzz] [PP [IN with] [NP [ADJP [JJ new] [CC and] [VBG returning]] [NNS students]]]]]] [. .]]]

  25. Parsing of Sentences

  26. Are sentences flat linear structures? Why trees? Is there a principle in branching? When should a constituent give rise to children? What is the hierarchy-building principle?

  27. Structure dependency: a case study. Interrogative inversion: (1) John will solve the problem. / Will John solve the problem? Declarative / Interrogative: (2) a. Susan must leave. / Must Susan leave? b. Harry can swim. / Can Harry swim? c. Mary has read the book. / Has Mary read the book? d. Bill is sleeping. / Is Bill sleeping? ... This section, "Structure dependency: a case study", is adapted from a talk given by Howard Lasnik (2003) at Delhi University.

  28. Interrogative inversion, structure-independent (1st attempt). (3) Interrogative inversion process: beginning with a declarative, invert the first and second words to construct an interrogative. Declarative / Interrogative: (4) a. The woman must leave. / *Woman the must leave? b. A sailor can swim. / *Sailor a can swim? c. No boy has read the book. / *Boy no has read the book? d. My friend is sleeping. / *Friend my is sleeping?

  29. Interrogative inversion: correct pairings. Compare the incorrect pairings in (4) with the correct pairings in (5). Declarative / Interrogative: (5) a. The woman must leave. / Must the woman leave? b. A sailor can swim. / Can a sailor swim? c. No boy has read the book. / Has no boy read the book? d. My friend is sleeping. / Is my friend sleeping?

  30. Interrogative inversion, structure-independent (2nd attempt). (6) Interrogative inversion process: beginning with a declarative, move the auxiliary verb to the front to construct an interrogative. Declarative / Interrogative: (7) a. Bill could be sleeping. / *Be Bill could sleeping? / Could Bill be sleeping? b. Mary has been reading. / *Been Mary has reading? / Has Mary been reading? c. Susan should have left. / *Have Susan should left? / Should Susan have left?

  31. Structure-independent (3rd attempt). (8) Interrogative inversion process: beginning with a declarative, move the first auxiliary verb to the front to construct an interrogative. Declarative / Interrogative: (9) a. The man who is here can swim. / *Is the man who here can swim? b. The boy who will play has left. / *Will the boy who play has left?

  32. Structure-dependent correct pairings. For the above examples, fronting the second auxiliary verb gives the correct form. Declarative / Interrogative: (10) a. The man who is here can swim. / Can the man who is here swim? b. The boy who will play has left. / Has the boy who will play left?

  33. Natural transformations are structure-dependent. (11) Does the child acquiring English learn these properties? (12) We are not dealing with a peculiarity of English. No known human language has a transformational process that would produce pairings like those in (4), (7) and (9), repeated below: (4) a. The woman must leave. / *Woman the must leave? (7) a. Bill could be sleeping. / *Be Bill could sleeping? (9) a. The man who is here can swim. / *Is the man who here can swim?

  34. Deeper trees are needed for capturing sentence structure. A flat structure won't do: [NP The [AP big] book [PP of poems] [PP with the blue cover]], as in "[The big book of poems with the blue cover] is on the table."

  35. Other languages. English: [NP The [AP big] book [PP of poems] [PP with the blue cover]]. Hindi: [NP [AP badii] [PP niil jilda vaalii] [PP kavita kii] kitaab], i.e. [niil jilda vaalii kavita kii kitaab] ("the book of poems with the blue cover").

  36. Other languages, contd. English: [NP The [AP big] book [PP of poems] [PP with the blue cover]]. Bengali: [NP [AP motaa] [PP niil malaat deovaa] [PP kavitar] bai ti], i.e. [niil malaat deovaa kavitar bai ti] ("the book of poems with the blue cover").

  37. The PPs are at the same level: the structure is flat with respect to the head word "book", with no distinction in terms of dominance or c-command. [NP The [AP big] book [PP of poems] [PP with the blue cover]], as in "[The big book of poems with the blue cover] is on the table."

  38. The "constituency test of replacement" runs into problems. One-replacement: "I bought the big [book of poems with the blue cover], not the small [one]"; here one-replacement targets book of poems with the blue cover. Another one-replacement: "I bought the big [book of poems] with the blue cover, not the small [one] with the red cover"; here one-replacement targets book of poems. The flat NP has no single constituent corresponding to both targets.

  39. A more deeply embedded structure: [NP The [N'1 [AP big] [N'2 [N'3 [N book] [PP of poems]] [PP with the blue cover]]]].

  40. To target N'1: "I want [NP this [N' big book of poems with the red cover]] and not [NP that [N' one]]".

  41. Bar-level projections add intermediate structures: NP → (D) N'; N' → (AP) N' | N' (PP) | N (PP), where () indicates optionality. A toy grammar in this style is sketched below.
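A minimal sketch of these bar-level rules as an NLTK context-free grammar, with the optionality expanded into explicit alternatives and a toy lexicon for the running example (the nonterminal spelling Nbar is mine):

    import nltk

    grammar = nltk.CFG.fromstring("""
    NP   -> D Nbar | Nbar
    Nbar -> AP Nbar | Nbar PP | N PP | N
    PP   -> P NP
    AP   -> A
    D -> 'the'
    A -> 'big' | 'blue'
    N -> 'book' | 'poems' | 'cover'
    P -> 'of' | 'with'
    """)

    # A chart parser copes with the left-recursive rule Nbar -> Nbar PP.
    parser = nltk.ChartParser(grammar)
    for tree in parser.parse("the big book of poems with the blue cover".split()):
        tree.pretty_print()
        break  # the first analysis is enough for illustration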

  42. The new rules produce this tree: [NP The [N'1 [AP big] [N'2 [N'3 [N book] [PP of poems]] [PP with the blue cover]]]].

  43. As opposed to this flat tree: [NP The [AP big] book [PP of poems] [PP with the blue cover]].

  44. V-bar. What is the element for verbs corresponding to one-replacement for nouns? do so / did so replacement.

  46. "I [eat beans with a fork]": the flat VP is [VP [V eat] [NP beans] [PP with a fork]]. There is no constituent that groups together V and NP and excludes the PP.

  47. Need for intermediate constituents: "I [eat beans] with a fork, but Ram [does so] with a spoon." Rules: VP → V'; V' → V' (PP); V' → V (NP). Tree: [VP [V'1 [V'2 [V eat] [NP beans]] [PP with a fork]]]; does so targets V'2 = [eat beans].

  48. How to target V'1: "I [eat beans with a fork], and Ram [does so] too." With the same rules (VP → V'; V' → V' (PP); V' → V (NP)), does so here targets V'1 = [eat beans with a fork].

  49. Parsing Algorithms

  50. A simplified grammar: S → NP VP; NP → DT N | N; VP → V ADV | V.

  51. A segment of English grammar: S' → (C) S; S → {NP/S'} VP; VP → (AP+) (VAUX) V (AP+) ({NP/S'}) (AP+) (PP+) (AP+); NP → (D) (AP+) N (PP+); PP → P NP; AP → (AP) A.

  52. Example sentence: "People laugh", with positions 1 (before People), 2 (between People and laugh), 3 (after laugh). Lexicon: People - N, V; laugh - N, V. This indicates that both noun and verb are possible for each word.

  53. Top-down parsing. The number after each state is the position of the input pointer.
State | Backup state | Action
1. ((S) 1) | - | -
2. ((NP VP) 1) | - | -
3a. ((DT N VP) 1) | ((N VP) 1) | -
3b. ((N VP) 1) | - | -
4. ((VP) 2) | - | consume "People"
5a. ((V ADV) 2) | ((V) 2) | -
6. ((ADV) 3) | ((V) 2) | consume "laugh"
5b. ((V) 2) | - | -
6. ((.) 3) | - | consume "laugh"
Termination condition: all input consumed and no symbols remaining. Note: input symbols can be pushed back.
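The trace above can be reproduced with NLTK's RecursiveDescentParser, which performs exactly this kind of top-down, backtracking search (a sketch; the grammar is the one from slide 50 plus a toy lexicon):

    import nltk

    grammar = nltk.CFG.fromstring("""
    S  -> NP VP
    NP -> DT N | N
    VP -> V ADV | V
    DT -> 'the'
    N  -> 'people' | 'laugh'
    V  -> 'people' | 'laugh'
    ADV -> 'loudly'
    """)

    # Top-down with backtracking: it tries NP -> DT N first, fails on
    # 'people', then falls back to the stored alternative NP -> N,
    # mirroring states 3a/3b in the trace above.
    parser = nltk.RecursiveDescentParser(grammar)
    for tree in parser.parse("people laugh".split()):
        print(tree)
    # (S (NP (N people)) (VP (V laugh)))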

  54. Discussion of top-down parsing. This kind of searching is goal-driven. It gives importance to textual precedence (rule precedence). It has no regard for the data a priori, so useless expansions are made.

  55. Bottom-up parsing. Some conventions: N12 represents an N spanning positions 1 to 2. In S1? → NP12 ° VP2?, the work to the left of the dot (°) is done, the work on the RHS after the dot remains, and "?" means the end position is unknown.

  56. Bottom-up parsing (pictorial representation). For "People laugh" with positions 1, 2, 3: N12, V12 (People); N23, V23 (laugh); NP12 → N12 °; NP23 → N23 °; VP12 → V12 °; VP23 → V23 °; S1? → NP12 ° VP2?; finally S → NP12 VP23 °.

  57. Problem with top-down parsing: left recursion. Suppose you have the rule A → A B. Then the expansion proceeds as ((A) K) → ((A B) K) → ((A B B) K) → ..., without ever consuming input.

  58. Combining top-down and bottom-up strategies

  59. Top-down bottom-up chart parsing. It combines the advantages of top-down and bottom-up parsing, but does not work in the case of left recursion. Example: "People laugh". Lexicon: People - noun, verb; laugh - noun, verb. Grammar: S → NP VP; NP → DT N | N; VP → V ADV | V.

  60. Transitive closure for "People laugh" (positions 1, 2, 3). At position 1: S → ° NP VP; NP → ° N; NP → ° DT N. After "People": NP → N °; S → NP ° VP; VP → ° V ADV; VP → ° V. After "laugh": VP → V °; S → NP VP ° (success).

  61. Arcs in parsing. Each arc represents a chart entry which records completed work (to the left of the dot °) and expected work (to the right of the dot).

  62. Example: "People laugh loudly" (positions 1, 2, 3, 4). At position 1: S → ° NP VP; NP → ° N; NP → ° DT N. After "People": NP → N °; S → NP ° VP; VP → ° V ADV; VP → ° V. After "laugh": VP → V ° ADV; VP → V °; S → NP VP °. After "loudly": VP → V ADV °; S → NP VP °.
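NLTK's chart parser records exactly these dotted arcs; a sketch that prints the chart's edges for the example sentence (same toy grammar as before):

    import nltk

    grammar = nltk.CFG.fromstring("""
    S  -> NP VP
    NP -> DT N | N
    VP -> V ADV | V
    DT -> 'the'
    N  -> 'people' | 'laugh'
    V  -> 'people' | 'laugh'
    ADV -> 'loudly'
    """)

    # chart_parse returns the full chart; each edge shows completed work to
    # the left of the dot (*) and expected work to the right, as on the slide.
    chart = nltk.ChartParser(grammar).chart_parse("people laugh loudly".split())
    for edge in chart.edges():
        print(edge)   # e.g. [0:1] NP -> N *    or    [1:2] VP -> V * ADV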

  63. Advantage of the combination of bottom-up and top-down parsing over either alone. In top-down bottom-up parsing: (1) as in top-down parsing, productions are brought in, but unlike top-down parsing, rules are not necessarily expanded; (2) unlike bottom-up parsing, uncontrolled lexical options (parts of speech) are not even considered.
