  1. Using Universal Linguistic Knowledge to Guide Grammar Induction [Naseem et al., 2010]. Juri Alexander Opitz, June 30, 2016

  2. “By a generative grammar I mean simply a system of rules that in some explicit and well-defined way assigns structural descriptions to sentences. Obviously, every speaker of a language has mastered and internalized a generative grammar (...) This is not to say that he is aware of the rules of the grammar or even that he can become aware of them.” Noam Chomsky in Aspects of the Theory of Syntax (1965).

  3. Overview: Introduction, The Model, Experiments, Conclusions, Outlook

  4. Introduction

  5. What Naseem et al. seek to accomplish: guide (dependency) grammar induction with known linguistic universals.

  6. What is Grammar Induction? ◮ Automatic learning of a formal grammar: 1. receive observations, 2. construct a model which “explains” the observations.

  7. Why do we need Grammar Induction in NLP? ◮ Observations: spoken/written natural language. ◮ Model: any kind of model which explains how the observations arose (by incorporating underlying deeper structures).

  8. Example: Practical Use. ◮ Observations: texts (+ trees in the supervised case). ◮ Model: a parser. ◮ Goal: parse new texts.

  9. Why Grammar Induction for LRLs? Successful parsers rely on manually annotated training material, which is: ◮ very costly (especially in this case: a human needs to annotate data with trees) ◮ typically constructed anew for each language.

  10. Why Grammar Induction for LRLs? Hence we need unsupervised grammar induction for LRLs.

  11. Common Problem with Unsupervised Learning: models usually perform much worse than their supervised counterparts: they have no teacher and must learn on their own :-(

  12. A Possible Cure. Principal idea of the paper: exploit universal knowledge to guide the learning process.

  13. Linguistic Universals

  14. Linguistic Universals - Example Parse
      Sentence: Nim Chimsky eats a ripe banana.
      Tags:     Noun Noun Verb Article Adjective Noun

  15. Linguistic Universals - Example Parse
      Sentence: Nim Chimsky eats a ripe banana.
      Tags:     Noun Noun Verb Article Adjective Noun
      Dependency tree: root → eats; eats → Chimsky; Chimsky → Nim; eats → banana; banana → a; banana → ripe
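The parse above can be written as a head-index list, a common encoding for dependency trees. This is a minimal sketch (the tokens and tags follow the example; the encoding itself is illustrative, not the paper's data format):

```python
# A dependency parse as a head-index list: heads[i] is the index of the
# head (parent) of token i; head 0 denotes the artificial root node.
tokens = ["<root>", "Nim", "Chimsky", "eats", "a", "ripe", "banana"]
tags   = ["ROOT", "Noun", "Noun", "Verb", "Article", "Adjective", "Noun"]
# root--eats, eats--Chimsky, Chimsky--Nim, eats--banana, banana--a, banana--ripe
heads  = [None, 2, 3, 0, 6, 6, 3]

def arcs(heads, tags):
    """Return (head_tag, dependent_tag) pairs for every dependency arc."""
    return [(tags[h], tags[i]) for i, h in enumerate(heads) if h is not None]

print(arcs(heads, tags))
```

Listing the arcs as tag pairs makes the connection to universals visible: arcs like (Verb, Noun) or (Noun, Article) are exactly the patterns the universal rules describe.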

  16. Grammar Induction & Low-Resource Languages (LRLs). Idea: with linguistic universals we can guide grammar induction when we have little or no annotated data.

  17. The Model, “explaining what we observe”.

  18. Model: Naseem et al. use a generative Bayesian model to describe grammar generation when we observe words x_1, x_2, ..., x_n and corresponding coarse symbols, i.e. PoS tags s_1, s_2, ..., s_n.

  19. Simplified Model: Naseem et al. use hidden, refined symbols z_1, z_2, ..., z_n. For simplicity, we drop this here, i.e. z_1, z_2, ..., z_n = s_1, s_2, ..., s_n.

  20. Simplified Model: 2 Facets. 1. Generative process for model parameters. 2. Generative process for parses.

  21. Simplified Model: 2 Facets. 1. For each coarse symbol s: ◮ draw a word generation multinomial; ◮ for each possible context value c, also draw a child symbol generation multinomial. 2. For each tree node i generated in context c by parent symbol s′: ◮ draw coarse symbol s_i from the child symbol generation multinomial of the parent; ◮ draw word x_i from the word generation multinomial.

  22. More formally: 1. For each coarse symbol s: ◮ draw Φ_s ∼ Dir(Φ_0); ◮ for each possible context value c, draw θ_{sc} ∼ Dir(θ_0). 2. For each tree node i generated in context c by parent symbol s′: ◮ draw coarse symbol s_i ∼ Mult(θ_{s′c}); ◮ draw word x_i ∼ Mult(Φ_{s_i}).
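The two-step process above can be sketched in a few lines of Python. Everything concrete here is an illustrative assumption: the symbol set, the vocabulary, and the uniform hyperparameters Φ_0 = θ_0 = 1 are toy choices, and the Dirichlet draw uses the standard Gamma-normalization trick rather than any library routine:

```python
import random

random.seed(0)

def draw_dirichlet(alpha):
    """Draw a multinomial parameter vector from Dir(alpha) by
    normalizing independent Gamma(alpha_i, 1) samples."""
    g = [random.gammavariate(a, 1.0) for a in alpha]
    total = sum(g)
    return [x / total for x in g]

def draw_mult(probs, outcomes):
    """Draw one outcome from a multinomial (categorical) distribution."""
    return random.choices(outcomes, weights=probs, k=1)[0]

# Toy coarse symbols, per-symbol vocabularies, and context values.
symbols = ["Noun", "Verb", "Article"]
vocab = {"Noun": ["banana", "Chimsky"], "Verb": ["eats"], "Article": ["a"]}
contexts = ["left", "right"]

# 1. For each coarse symbol s: draw phi_s ~ Dir(phi_0) (word generation)
#    and, per context c, theta_sc ~ Dir(theta_0) (child symbol generation).
phi = {s: draw_dirichlet([1.0] * len(vocab[s])) for s in symbols}
theta = {(s, c): draw_dirichlet([1.0] * len(symbols))
         for s in symbols for c in contexts}

# 2. Generate one child node in context c under parent symbol s':
#    draw its coarse symbol, then its word.
s_i = draw_mult(theta[("Verb", "right")], symbols)
x_i = draw_mult(phi[s_i], vocab[s_i])
print(s_i, x_i)
```

Repeating step 2 recursively from the root symbol would spell out a whole dependency tree, which is the sense in which this process "generates" parses.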

  23. The Dirichlet Distribution... ...is a distribution over multinomial distributions...

  24. 2 Parameters: K. K: how many discrete events we have (e.g. the number of words in the vocabulary).

  25. 2 Parameters: Vector α. A K-dimensional “concentration parameter” vector; all α_i must be > 0 (e.g. counts of each word in a text).
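The role of α can be illustrated by sampling: the mean of Dir(α) is α normalized, so for α = (6, 2, 2) samples concentrate around the multinomial (0.6, 0.2, 0.2). A toy sketch (the sample count and seed are arbitrary choices):

```python
import random

random.seed(1)

def draw_dirichlet(alpha):
    """Dir(alpha) sample via normalized Gamma(alpha_i, 1) draws."""
    g = [random.gammavariate(a, 1.0) for a in alpha]
    s = sum(g)
    return [x / s for x in g]

# The mean of Dir(alpha) is alpha / sum(alpha): for alpha = (6, 2, 2),
# probability mass concentrates near the multinomial (0.6, 0.2, 0.2).
alpha = (6, 2, 2)
samples = [draw_dirichlet(alpha) for _ in range(5000)]
mean = [sum(s[k] for s in samples) / len(samples) for k in range(3)]
print([round(m, 2) for m in mean])  # close to [0.6, 0.2, 0.2]
```

Scaling all α_i up keeps the same mean but shrinks the variance, which is what "concentration parameter" refers to.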

  26. Example for K=3

  27. Example for K=3. α = (6, 2, 2), (3, 7, 5), (6, 2, 6), (2, 3, 4), clockwise from top left.

  28. Model: Plate Outline

  29. Inference with Constraints. Idea: constrain the posterior to satisfy the rules in expectation during inference. ◮ What? We require that a certain percentage of dependencies, in the model's expectations, instantiate linguistic universals. ◮ Why? This biases the model inference towards a linguistically more plausible setting. ◮ Advantage: we require only a certain percentage of linguistic universals to hold → the percentage can be tuned for every language.
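The constraint idea can be sketched on hard arc counts; note that the real constraint is imposed on posterior expectations during inference, and the rule set below is illustrative, not the paper's exact list of universals:

```python
# Illustrative universal dependency rules (head PoS -> dependent PoS);
# the paper uses a hand-specified set of such universals.
UNIVERSALS = {("ROOT", "Verb"), ("Verb", "Noun"), ("Noun", "Article"),
              ("Noun", "Adjective")}

def universal_fraction(arc_list):
    """Fraction of dependency arcs that match some universal rule."""
    hits = sum(1 for arc in arc_list if arc in UNIVERSALS)
    return hits / len(arc_list)

def satisfies_constraint(arc_list, threshold=0.8):
    """Constraint sketch: at least `threshold` of the arcs must be
    instances of a universal rule (the paper applies this in
    expectation over the posterior, not to hard counts)."""
    return universal_fraction(arc_list) >= threshold

sample_arcs = [("ROOT", "Verb"), ("Verb", "Noun"), ("Verb", "Noun"),
               ("Noun", "Noun"), ("Noun", "Article"), ("Noun", "Adjective")]
print(universal_fraction(sample_arcs))          # 5 of 6 arcs match
print(satisfies_constraint(sample_arcs, 0.8))   # True
```

The threshold is exactly the tunable percentage mentioned above: raising it forces more arcs to follow the universals, lowering it leaves the model more freedom.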

  30. Inference with Constraints. Method outline: ◮ maximize a lower bound on the likelihood of the observations (equivalent to minimizing the divergence between the true posterior distribution over model parameters and an approximating distribution!); ◮ implement the constraints in a constrained optimization problem: a certain % of universals must hold!

  31. Experiments

  32. Experimental Setup. Languages: English, Danish, Portuguese, Slovene, Spanish, and Swedish.

  33. Experiments: Setup. Languages: English, Danish, Portuguese, Slovene, Spanish, and Swedish. ◮ English data: dependency modification of the Penn Treebank [Taylor et al., 2003], sentence length < 20. ◮ Other data: CoNLL-X 2006 shared task [Buchholz and Marsi, 2006], sentence length < 10. ◮ Each data set provides manually annotated PoS tags.

  34. Experiments: Setup. Metric: Dependency Accuracy. ◮ Percentage of words having the correct head.
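The metric is straightforward to compute from head-index lists. A minimal sketch (the head indices below are made up for illustration):

```python
def dependency_accuracy(gold_heads, pred_heads):
    """Percentage of words whose predicted head matches the gold head."""
    assert len(gold_heads) == len(pred_heads)
    correct = sum(1 for g, p in zip(gold_heads, pred_heads) if g == p)
    return 100.0 * correct / len(gold_heads)

# Gold vs. predicted head indices for a 5-word sentence (0 = root).
gold = [2, 0, 2, 5, 3]
pred = [2, 0, 2, 3, 3]
print(dependency_accuracy(gold, pred))  # 80.0
```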

  35. Experiments: Results. DMV, PGI: baselines. No-split: this model without refined subsymbols. HDP-DEP: this model.

  36. Experiments: Ablations. What happens when we exclude certain universal rules?

  37. Experiments: Ablations

  38. Experiments: Constraint Thresholds. What happens when we increase/decrease the percentage of dependencies which must be in accordance with the universals?

  39. Experiments: Constraint Thresholds

  40. Experiments: Constraint Thresholds

  41. Conclusions
