A Latent Variable Model of Synchronous Parsing for Syntactic and Semantic Dependencies




1. A Latent Variable Model of Synchronous Parsing for Syntactic and Semantic Dependencies. James Henderson (1), Paola Merlo (2), Gabriele Musillo (1,2), Ivan Titov (3). (1) Dept. of Computer Science, University of Geneva; (2) Dept. of Linguistics, University of Geneva; (3) Dept. of Computer Science, University of Illinois at Urbana-Champaign. CoNLL 2008.

2. Outline: 1. A Latent Variable Model of Synchronous Parsing; 2. Probability Model; 3. Machine Learning Method; 4. Evaluation.

3. Motivation for synchronous parsing. Syntax and semantics are separate structures, with different generalisations. Example: in "John broke the vase." the subject is the semantic A0 argument and the object is A1, whereas in "The vase broke." the subject is A1. Syntax and semantics are highly correlated, and therefore should be learned jointly. Synchronous parsing provides a single joint model of two separate structures.

4. Motivation for latent variables. The correlations between syntax and semantics are partly lexical, and independence assumptions are hard to specify a priori. The dataset is new, and there was little time for feature engineering. Latent variables provide a powerful mechanism for discovering correlations both within and between the structures.

5. Outline: 1. A Latent Variable Model of Synchronous Parsing; 2. Probability Model; 3. Machine Learning Method; 4. Evaluation.

6. Outline: 1. A Latent Variable Model of Synchronous Parsing; 2. Probability Model; 3. Machine Learning Method; 4. Evaluation.

7. The probability model. A generative, history-based model of the joint probability of synchronous syntactic and semantic derivations, synchronised at each word.

8. Syntactic and semantic dependencies: example. [Figure: syntactic and semantic dependency graphs for the sentence "ROOT Hope seems doomed to failure", whose joint probability is $P(T_d, T_s)$.]

9. Syntactic and semantic derivations. Define two separate derivations, one for the syntactic structure and one for the semantic structure: $P(T_d, T_s) = P(D^d_1, \ldots, D^d_{m_d}, D^s_1, \ldots, D^s_{m_s})$. The derivation actions are those of an incremental shift-reduce style parser, similar to MALT [Nivre et al., 2006]. Semantic derivations are less constrained, because semantic structures are less constrained than syntactic ones. The model assumes each dependency structure is individually planar ("projective").
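
To make the derivation actions concrete, here is a minimal Python sketch of a MALT-style shift-reduce step for projective dependency parsing. The action inventory, names, and state layout are illustrative assumptions for this sketch; the paper defines its own derivation actions for the syntactic and semantic structures.

```python
# Minimal sketch of a MALT-style shift-reduce parser step (arc-eager flavour).
# Action names and the state layout are illustrative, not the paper's exact
# derivation actions.
from dataclasses import dataclass, field

@dataclass
class ParserState:
    stack: list = field(default_factory=list)   # partially processed word indices
    queue: list = field(default_factory=list)   # word indices not yet shifted
    arcs: set = field(default_factory=set)      # (head, dependent, label) triples

def apply_action(state, action, label=None):
    """Apply one derivation action to the parser state, in place."""
    if action == "shift":          # move the next input word onto the stack
        state.stack.append(state.queue.pop(0))
    elif action == "left-arc":     # stack top becomes a dependent of the queue front
        dependent = state.stack.pop()
        state.arcs.add((state.queue[0], dependent, label))
    elif action == "right-arc":    # queue front becomes a dependent of the stack top
        dependent = state.queue.pop(0)
        state.arcs.add((state.stack[-1], dependent, label))
        state.stack.append(dependent)
    elif action == "reduce":       # pop a word whose head has already been found
        state.stack.pop()
    else:
        raise ValueError(f"unknown action: {action}")
```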

10. Synchronisation granularity. Use an intermediate synchronisation granularity, between full predications and individual actions: the chunk $C_t$ groups the actions for word $t$, $C_t = D^d_{b^d_t}, \ldots, D^d_{e^d_t}, \mathit{shift}^d_t, D^s_{b^s_t}, \ldots, D^s_{e^s_t}, \mathit{shift}^s_t$, so that $P(D^d_1, \ldots, D^d_{m_d}, D^s_1, \ldots, D^s_{m_s}) = P(C_1, \ldots, C_n)$. Synchronisation at each word prediction results in one shared input queue while allowing two separate stacks.
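
As a rough illustration of this setup (one shared input queue, two separate stacks, per-word chunks of actions), the following sketch defines a possible synchronous configuration; all field and action names are assumptions made for the example, not the paper's interface.

```python
# Illustrative synchronous configuration: the syntactic and semantic
# derivations share one input queue but keep separate stacks.
from dataclasses import dataclass, field

@dataclass
class SynchronousState:
    queue: list = field(default_factory=list)       # shared queue of unshifted words
    syn_stack: list = field(default_factory=list)   # stack of the syntactic derivation
    sem_stack: list = field(default_factory=list)   # stack of the semantic derivation
    syn_arcs: set = field(default_factory=set)      # syntactic (head, dep, label) arcs
    sem_arcs: set = field(default_factory=set)      # semantic (pred, arg, role) arcs

def chunk_for_word(syn_actions, sem_actions):
    """Group one word's actions into a chunk C_t: the syntactic actions,
    a syntactic shift, the semantic actions, then a semantic shift."""
    return list(syn_actions) + ["shift_d"] + list(sem_actions) + ["shift_s"]
```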

11. Synchronous parsing example. ROOT Hope: $P(C_1)$.

12. Synchronous parsing example. ROOT Hope seems: $P(C_1)\,P(C_2 \mid C_1)$.

13. Synchronous parsing example. ROOT Hope seems doomed: $P(C_1)\,P(C_2 \mid C_1)\,P(C_3 \mid C_1, C_2)$.

14. Synchronous parsing example. ROOT Hope seems doomed to: $P(C_1)\,P(C_2 \mid C_1)\,P(C_3 \mid C_1, C_2)\,P(C_4 \mid C_1, C_2, C_3)$.

15. Synchronous parsing example. ROOT Hope seems doomed to failure: $P(C_1)\,P(C_2 \mid C_1)\,P(C_3 \mid C_1, C_2)\,P(C_4 \mid C_1, C_2, C_3)\,P(C_5 \mid C_1, C_2, C_3, C_4)$.
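
The decomposition above is the ordinary chain rule over chunks. Below is a small sketch of how such a product would be accumulated in log space; `chunk_log_prob` is a placeholder for whatever model scores the next chunk given the derivation history.

```python
def joint_log_prob(chunks, chunk_log_prob):
    """Chain rule over chunks: log P(C_1, ..., C_n) = sum_t log P(C_t | C_1..C_{t-1}).
    `chunk_log_prob(history, chunk)` is a placeholder scoring function."""
    total = 0.0
    history = []
    for chunk in chunks:
        total += chunk_log_prob(history, chunk)
        history.append(chunk)
    return total

# For "Hope seems doomed to failure" there are five chunks, one per word, giving
# P(C1) * P(C2|C1) * P(C3|C1,C2) * P(C4|C1..C3) * P(C5|C1..C4).
```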

16-30. Derivation example (step-by-step animation). [Figure sequence: the synchronous derivation for "ROOT Hope seems doomed to failure" is built up one parser action at a time, extending the structure word by word; the individual frames differ only in which words and dependencies have been added so far.]

31. Projectivisation. The synchronous model allows crossing links between syntax and semantics, but each structure must itself be projective. Use the HEAD method [Nivre et al., 2006] to projectivise the syntax, and use the syntactic dependencies to projectivise the semantic dependencies.

32. Projectivising semantic dependencies. [Figure: semantic dependencies labelled A, B, and C over words w1-w5; after projectivisation the crossing arc A has been re-attached along the syntactic structure and relabelled A/C, leaving arcs B and A/C.]
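
A simplified sketch of lift-based projectivisation in the spirit of the HEAD method follows: while an arc is non-projective, the dependent is re-attached one step up and its label is composed with the label of the arc it was lifted over (as in the A/C relabelling above), so the transformation can be undone after parsing. This is only an illustration of the general idea for a single structure; the paper projectivises the semantic graph by lifting along the syntactic dependencies, and the exact labelling scheme is not reproduced here.

```python
def is_projective(head, dep, heads):
    """An arc (head, dep) is projective if every word strictly between them is a
    descendant of `head`. `heads[i]` is the head of word i; index 0 is the
    artificial root, with heads[0] == 0."""
    lo, hi = min(head, dep), max(head, dep)
    for k in range(lo + 1, hi):
        node = k
        while node not in (0, head):   # walk up the head chain
            node = heads[node]
        if node != head:
            return False
    return True

def projectivise(heads, labels):
    """Repeatedly lift non-projective dependents to their head's head,
    composing labels (e.g. A becomes A/C) so the lift can be reversed."""
    changed = True
    while changed:
        changed = False
        for dep in range(1, len(heads)):
            head = heads[dep]
            if head == 0 or is_projective(head, dep, heads):
                continue
            heads[dep] = heads[head]                      # lift one step up
            labels[dep] = labels[dep] + "/" + labels[head]
            changed = True
    return heads, labels
```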

33. Outline: 1. A Latent Variable Model of Synchronous Parsing; 2. Probability Model; 3. Machine Learning Method; 4. Evaluation.

34. The machine learning method. Synchronous derivations are modeled with an Incremental Sigmoid Belief Network (ISBN). ISBNs are Dynamic Bayesian Networks for modeling structures, with vectors of latent variables annotating derivation states and representing features of the derivation history. We use the neural network approximation of ISBNs [Titov and Henderson, ACL 2007] ("Simple Synchrony Networks").
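
As a rough sketch of what such a feed-forward approximation looks like, the code below computes a latent state vector with a sigmoid over contributions from selected earlier latent vectors (one weight matrix per connection type) plus explicit history features, and a softmax over the next decision. The shapes, wiring, and names are assumptions for illustration, not the exact architecture of the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def next_latent_state(prev_states, feature_vec, W_states, W_feat, bias):
    """Latent vector for the current derivation state: a sigmoid of weighted
    contributions from selected earlier latent states (one weight matrix per
    connection type) plus explicit features of the derivation history."""
    total = bias + W_feat @ feature_vec
    for W, s in zip(W_states, prev_states):
        total += W @ s
    return sigmoid(total)

def decision_distribution(state_vec, W_out, b_out):
    """Softmax distribution over the next derivation decision."""
    scores = W_out @ state_vec + b_out
    scores = scores - scores.max()        # numerical stability
    probs = np.exp(scores)
    return probs / probs.sum()
```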

35. Statistical dependencies in the ISBN. Connections between latent states reflect locality in the syntactic or semantic structure, thereby specifying the domain of locality for conditioning decisions. Explicit conditioning features of the history are also specified. [Figure: latent state vectors S_{t-c}, ..., S_{t-1}, S_t, with edges from earlier latent states and from decisions D_{t-c}, ..., D_{t-1} into later states and into the current decision D_t.]
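
To illustrate how structural locality could determine which earlier latent states feed into the current one, here is a small sketch that picks the immediately preceding state plus the most recent states associated with the current stack top and with its syntactic head. The particular connection types and the assumed record fields (`top_of_stack`, `head`) are hypothetical; the paper defines its own set of connections.

```python
def connected_state_indices(top_of_stack, head, t):
    """Indices of earlier derivation states whose latent vectors feed into
    state t. `top_of_stack[j]` is the word on top of the stack at state j,
    and `head[w]` is the syntactic head of word w (hypothetical records).
    Connection types here: the previous state, the last state for the current
    stack top, and the last state for that word's syntactic head."""
    links = [t - 1] if t > 0 else []
    current_top = top_of_stack[t]
    targets = [current_top]
    if current_top is not None and current_top in head:
        targets.append(head[current_top])
    for target in targets:
        for j in range(t - 1, -1, -1):     # most recent state for that word
            if top_of_stack[j] == target:
                links.append(j)
                break
    return links
```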
