
Joint Learning of Syntactic and Semantic Dependencies - Xavier Lluís (presentation slides)



1. Joint Learning of Syntactic and Semantic Dependencies. Xavier Lluís and Lluís Màrquez, TALP Research Center, Technical University of Catalonia. Barcelona, December 9, 2008. Outline: Introduction, Difficulties, Joint model, Results and discussion, Future work.

2. Introduction. Joint parsing is the simultaneous processing of the syntactic and semantic structure.

3. Syntactic and semantic dependencies. Syntactic and semantic parsing: syntax. A sample sentence.

4. Syntactic and semantic dependencies. Syntactic and semantic parsing: syntax. Syntactic dependencies.

5. Syntactic and semantic dependencies. Syntactic and semantic parsing: semantics. Predicate "completed".

6. Syntactic and semantic dependencies. Syntactic and semantic parsing: semantics. Semantic dependencies for "completed".

7. Syntactic and semantic dependencies. Syntactic and semantic parsing: semantics. Predicate "acquisition".

8. Syntactic and semantic dependencies. Syntactic and semantic parsing: semantics. Semantic dependencies for "acquisition".

9. Syntactic and semantic dependencies. Syntactic and semantic parsing: semantics. Predicate "announced".

10. Syntactic and semantic dependencies. Syntactic and semantic parsing: semantics. Semantic dependencies for "announced".

11. Syntactic and semantic dependencies. Syntactic and semantic parsing: semantics. Semantic dependencies for all predicates.

12. Syntactic and semantic dependencies. Mainstream approach: the pipeline. (1) Syntactic parsing: a parser (Eisner, shift-reduce). (2) Semantic role labeling: a simpler (non-structured) classifier.
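To make the pipeline concrete, a minimal sketch (the names parse_syntax and label_argument, and the sentence attributes, are illustrative placeholders, not the authors' system):

```python
# Minimal sketch of the two-stage pipeline (illustrative only, not the authors' system).
# parse_syntax and label_argument stand in for the parser and the SRL classifier;
# the sentence is assumed to expose .tokens and .predicates.

def pipeline(sentence, parse_syntax, label_argument):
    # Stage 1: syntactic parsing (e.g. an Eisner or shift-reduce parser) produces
    # (head, modifier, syntactic label) arcs for the whole sentence.
    tree = parse_syntax(sentence)

    # Stage 2: a simpler, non-structured classifier assigns a semantic role (or none)
    # to each candidate (predicate, token) pair, using features drawn from the parse.
    roles = []
    for predicate in sentence.predicates:
        for token in sentence.tokens:
            role = label_argument(predicate, token, tree)   # e.g. "A0", "A1", or None
            if role is not None:
                roles.append((predicate, token, role))
    return tree, roles
```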

13. Syntactic and semantic dependencies. Pipeline strategy: drawbacks of the pipeline approach. (1) Propagation or amplification of errors. (2) Assumes an order of increasing difficulty. (3) Dependencies between layers are hard to capture.

14. The joint approach. Design a joint model: (1) overcome the pipeline approach; (2) build from scratch a simple and feasible system.

15. Design a joint model. A joint approach: extend a syntactic parsing model to jointly parse semantics. (Pipeline recap: (1) syntactic parsing, a parser such as Eisner or shift-reduce; (2) semantic role labeling, a simpler non-structured classifier.)

16. Design a joint model. A joint approach: extend the Eisner algorithm to jointly parse semantics. The Eisner algorithm is an O(n³), CKY-based, bottom-up parser.

17-23. Design a joint model. The Eisner algorithm: bottom-up dependency parsing (figure built up step by step across slides 17-23).

24. Design a joint model. The Eisner algorithm: score of a dependency. A dependency d = ⟨h, m, l⟩ of a sentence x is scored by score(d, x) = φ(⟨h, m, l⟩, x) · w, where φ is a feature extraction function and w is a weight vector.
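As a concrete illustration of this arc score, a minimal sketch assuming φ returns a sparse feature dictionary and w is a dictionary of learned weights:

```python
# Sketch of score(d, x) = phi(<h, m, l>, x) · w, assuming phi returns a sparse
# feature dict (feature name -> value) and w is a dict of learned feature weights.

def score_dependency(h, m, label, sentence, phi, w):
    features = phi(h, m, label, sentence)   # e.g. {"head_pos=VBD|label=SBJ": 1.0, ...}
    return sum(w.get(name, 0.0) * value for name, value in features.items())
```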

25. Design a joint model. The Eisner algorithm: best tree. We are interested in the best-scoring tree among all trees Y(x): best_tree(x) = argmax_{y ∈ Y(x)} score_tree(y, x). The Eisner algorithm is an exact search algorithm that computes the best first-order factorized tree.

26. Design a joint model. The Eisner algorithm: score of a tree. A syntactic tree y for a sentence x is scored by score_tree(y, x) = Σ_{⟨h,m,l⟩ ∈ y} score(⟨h, m, l⟩, x). Arc factorization: the first-order factorization is the sum of independent scores for each dependency of the tree.
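A self-contained sketch of the standard first-order Eisner recurrence that maximizes this arc-factored score (syntax only; the joint extension with semantic labels described next is not implemented here, and score_arc is assumed to return the best label score for a head-modifier pair):

```python
# Standard first-order (arc-factored) Eisner recurrence, O(n^3), syntax only.
# score_arc(h, m) is assumed to return the best score of attaching modifier m to
# head h (e.g. the maximum over labels of the per-dependency score above);
# word 0 is an artificial root. Backpointers for recovering the tree are omitted.

def eisner_best_score(n, score_arc):
    NEG = float("-inf")
    # complete[s][t][d] / incomplete[s][t][d]: best score of a span s..t,
    # d = 1 means the span is headed at s, d = 0 means it is headed at t.
    complete = [[[NEG, NEG] for _ in range(n)] for _ in range(n)]
    incomplete = [[[NEG, NEG] for _ in range(n)] for _ in range(n)]
    for s in range(n):
        complete[s][s][0] = complete[s][s][1] = 0.0

    for length in range(1, n):
        for s in range(n - length):
            t = s + length
            # Incomplete spans: add one arc between s and t on top of two smaller
            # complete spans that meet at positions r and r + 1.
            best = max(complete[s][r][1] + complete[r + 1][t][0] for r in range(s, t))
            incomplete[s][t][0] = best + score_arc(t, s)   # arc t -> s
            incomplete[s][t][1] = best + score_arc(s, t)   # arc s -> t
            # Complete spans: an incomplete span absorbs a complete one.
            complete[s][t][0] = max(complete[s][r][0] + incomplete[r][t][0]
                                    for r in range(s, t))
            complete[s][t][1] = max(incomplete[s][r][1] + complete[r][t][1]
                                    for r in range(s + 1, t + 1))

    return complete[0][n - 1][1]   # best tree rooted at position 0

# With uniform arc scores, any projective tree over 3 words plus the root has 3 arcs:
print(eisner_best_score(4, lambda h, m: 1.0))   # 3.0
```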

27. Design a joint model. Extension of the Eisner algorithm: from a joint parsing point of view, the syntactic and semantic labels are predicted simultaneously.

28. Syntactic and semantic overlap. Extension of the Eisner algorithm, an example: the complete syntactic and semantic structure.

29. Syntactic and semantic overlap. Extension of the Eisner algorithm, an example: overlapping syntactic and semantic dependencies.

30. Syntactic and semantic overlap. Extension of the Eisner algorithm, an example: overlapping syntactic and semantic dependencies.

31. Syntactic and semantic overlap. Extension of the Eisner algorithm, an example: non-overlapping semantic dependencies.

32. Syntactic and semantic overlap. Are syntax and semantics overlapping? 36.4% of argument-predicate relations do not exactly overlap with modifier-head syntactic relations. Proposed solution: attach the semantic label to the syntactic dependency.
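As an illustration of what "overlap" means here, a small sketch that tests whether a predicate-argument pair coincides with some head-modifier syntactic arc (purely illustrative, with token indices as placeholders):

```python
# Illustrative overlap test: a semantic dependency (predicate, argument) overlaps
# when some syntactic arc links exactly the same pair of tokens (indices are placeholders).

def overlaps(predicate, argument, syntactic_arcs):
    return any(h == predicate and m == argument for (h, m, _label) in syntactic_arcs)

arcs = [(2, 1, "SBJ"), (2, 4, "OBJ")]       # (head index, modifier index, label)
print(overlaps(2, 1, arcs))                 # True: the argument at 1 coincides with a syntactic arc
print(overlaps(2, 7, arcs))                 # False: no syntactic arc links tokens 2 and 7
```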

33. Syntactic and semantic overlap. Difficulties, non-overlapping semantics: any given syntactic dependency.

34. Syntactic and semantic overlap. Difficulties, non-overlapping semantics: the related semantic dependencies.

35. Syntactic and semantic overlap. Difficulties, non-overlapping semantics: the overlapping A0 dependency.

36. Syntactic and semantic overlap. Difficulties, non-overlapping semantics: the overlapping A0 dependency will be jointly annotated.

37. Syntactic and semantic overlap. Difficulties, non-overlapping semantics: the non-overlapping A0 dependency.

38. Syntactic and semantic overlap. Difficulties, non-overlapping semantics: the non-overlapping A0 dependency will also be jointly annotated.

39. Syntactic and semantic overlap. Difficulties, non-overlapping semantics: solution. An extended dependency is d = ⟨h, m, l_syn, l_sem_p1, ..., l_sem_pq⟩, where h is the head, m the modifier, l_syn the syntactic label, and l_sem_pi one semantic label for each predicate p_i of the sentence.
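Such an extended dependency can be written down directly as a small record; a minimal sketch with illustrative field names:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass(frozen=True)
class ExtendedDependency:
    """d = <h, m, l_syn, l_sem_p1, ..., l_sem_pq>: one syntactic label plus one
    (possibly empty) semantic label slot per predicate of the sentence."""
    head: int                              # h: index of the head token
    modifier: int                          # m: index of the modifier token
    syn_label: str                         # l_syn, e.g. "SBJ"
    sem_labels: Tuple[Optional[str], ...]  # one slot per predicate, e.g. ("A0", None, "A0")

# Illustrative instance matching the SBJ arc of the example on the next slide,
# for a sentence with three predicates:
dep = ExtendedDependency(head=2, modifier=1, syn_label="SBJ", sem_labels=("A0", None, "A0"))
```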

40. Syntactic and semantic overlap. Proposed solution: a dependency carries its syntactic label together with one semantic label slot per predicate. Labels from the example figure: OBJ with (A1, A1, _); OBJ with (_, _, Su); SBJ with (A0, _, A0); AMOD with (_, AM-TMP, _); NMOD with (_, _, _); NMOD with (_, _, _).

41. Unavailable features. Proposed solution, unavailable features: a dependency with semantic labels.

42. Unavailable features. Proposed solution, unavailable features: the first A0 is an overlapping semantic dependency.
