Macro Grammars and Holistic Triggering for Efficient Semantic Parsing



  1. Macro Grammars and Holistic Triggering for Efficient Semantic Parsing. Yuchen Zhang, Panupong Pasupat, and Percy Liang. EMNLP 2017.

  2. In what city did Piotr's last 1st place finish occur?

  3. How long did it take this competitor to finish the 4x400 meter relay at Universiade in 2005?
     Where was the competition held immediately before the one in Turkey?
     How many times has this competitor placed 5th or better in competition?
     In what city did Piotr's last 1st place finish occur?

  4. Semantic Parsing: parse utterances into executable logical forms.
     “Who ranked right after Turkey?”
     Rank | Nation | Gold | Silver | Bronze
       1  | France |  3   |   1    |   1
       2  | Turkey |  2   |   0    |   1
       3  | Sweden |  2   |   0    |   0

  5. Semantic Parsing: parse utterances into executable logical forms.
     “Who ranked right after Turkey?”  →  NationOf.NextOf.HasNation.Turkey
     Rank | Nation | Gold | Silver | Bronze
       1  | France |  3   |   1    |   1
       2  | Turkey |  2   |   0    |   1
       3  | Sweden |  2   |   0    |   0


  9. Semantic Parsing: parse utterances into executable logical forms.
     “Who ranked right after Turkey?”  →  NationOf.NextOf.HasNation.Turkey
     Rank | Nation | Gold | Silver | Bronze
       1  | France |  3   |   1    |   1
       2  | Turkey |  2   |   0    |   1
       3  | Sweden |  2   |   0    |   0
     Executing the logical form on the table yields the denotation.
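The execution step can be sketched as a toy interpreter over this table. This is an illustration, not the paper's executor; the row-dict encoding and the three helper functions are assumptions:

```python
# Toy executor for the example (an illustration, not the paper's system):
# the table is a list of row dicts, and each operator in
# NationOf.NextOf.HasNation.Turkey becomes a small function.
rows = [
    {"Rank": 1, "Nation": "France", "Gold": 3, "Silver": 1, "Bronze": 1},
    {"Rank": 2, "Nation": "Turkey", "Gold": 2, "Silver": 0, "Bronze": 1},
    {"Rank": 3, "Nation": "Sweden", "Gold": 2, "Silver": 0, "Bronze": 0},
]

def has_nation(value):          # HasNation.v : rows whose Nation equals v
    return [r for r in rows if r["Nation"] == value]

def next_of(selected):          # NextOf : the row directly after each selected row
    return [rows[i + 1] for i, r in enumerate(rows) if r in selected and i + 1 < len(rows)]

def nation_of(selected):        # NationOf : project the Nation column
    return [r["Nation"] for r in selected]

# NationOf.NextOf.HasNation.Turkey, evaluated inside-out:
print(nation_of(next_of(has_nation("Turkey"))))  # ['Sweden']
```

Reading the logical form right to left mirrors the inside-out function calls: select Turkey's row, step to the next row, project its Nation.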

  10. Floating Parser: given an utterance, the parser composes logical forms using a grammar.
      ▸ Terminal rules generate terminal tokens: TokenSpan → Ent
        builds Ent[Turkey] from “Who ranked right after Turkey?”

  11. Floating Parser: given an utterance, the parser composes logical forms using a grammar.
      ▸ Terminal rules generate terminal tokens: ∅ → Rel
        builds Rel[Nation]; partial derivations so far: Rel[Nation], Ent[Turkey]

  12. Floating Parser: given an utterance, the parser composes logical forms using a grammar.
      ▸ Compositional rules combine parts: Ent[z1] → Set[z1]
        builds Set[Turkey]; partial derivations so far: Set[Turkey], Rel[Nation], Ent[Turkey]

  13. Floating Parser: given an utterance, the parser composes logical forms using a grammar.
      ▸ Compositional rules combine parts: Rel[z1] + Set[z2] → Set[Has-z1.z2]
        builds Set[HasNation.Turkey] from Rel[Nation] and Set[Turkey]

  14. The full derivation for “Who ranked right after Turkey?”:
      Ent[Turkey] → Set[Turkey] → Set[HasNation.Turkey] (with Rel[Nation])
      → Set[NextOf.HasNation.Turkey] → Set[NationOf.NextOf.HasNation.Turkey] (with Rel[Nation])
      → Root[NationOf.NextOf.HasNation.Turkey]
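The derivation above can be mimicked with a toy floating parser: cells are indexed by category rather than utterance span, so logical forms "float" free of the tokens. The rules mirror the slides; the fixed-depth loop is an assumption standing in for beam search:

```python
# Toy floating parser (a sketch, not the paper's implementation). Cells are
# (category, logical form) pairs; rules fire on categories alone, so the
# derivation is not tied to utterance spans.
terminals = [("Ent", "Turkey"), ("Rel", "Nation")]   # from terminal rules

def compose(cells):
    """One round of the slides' compositional rules."""
    new = []
    for cat, z in cells:
        if cat == "Ent":
            new.append(("Set", z))                   # Ent[z1] -> Set[z1]
        if cat == "Set":
            new.append(("Set", "NextOf." + z))       # Set[z1] -> Set[NextOf.z1]
    for c1, z1 in cells:
        for c2, z2 in cells:
            if c1 == "Rel" and c2 == "Set":
                new.append(("Set", f"Has{z1}.{z2}")) # Rel[z1]+Set[z2] -> Set[Has-z1.z2]
                new.append(("Set", f"{z1}Of.{z2}"))  # Rel[z1]+Set[z2] -> Set[z1-Of.z2]
    return new

cells = list(terminals)
for _ in range(4):                                   # fixed depth stands in for beam search
    cells += [c for c in compose(cells) if c not in cells]

print(("Set", "NationOf.NextOf.HasNation.Turkey") in cells)  # True
```

Even this toy version hints at the speed problem the next slides raise: every round multiplies the number of Set cells, most of them useless.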

  15. Training a Semantic Parser. Setup: each training example has an utterance, a table, and the target denotation.
      ▸ The logical form is latent
      “Who ranked right after Turkey?”  →  Sweden
      Rank | Nation | Gold | Silver | Bronze
        1  | France |  3   |   1    |   1
        2  | Turkey |  2   |   0    |   1
        3  | Sweden |  2   |   0    |   0


  17. Training a Semantic Parser. Given a training example:
      1. Generate a bunch of logical forms (beam search)
      2. Featurize the logical forms and score them
      3. Execute the logical forms to identify the ones that are consistent with the target denotation
      4. Gradient update toward consistent logical forms
      Candidates for “Who ranked right after Turkey?”:
      NationOf.NextOf.HasNation.Turkey
      NationOf.HasNext.HasNation.Turkey
      count(HasNation.Turkey)
      Rank | Nation | Gold | Silver | Bronze
        1  | France |  3   |   1    |   1
        2  | Turkey |  2   |   0    |   1
        3  | Sweden |  2   |   0    |   0

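The four steps can be sketched as one gradient step of a log-linear model over candidate logical forms. The candidate list, feature function, and executor below are placeholders (assumptions), not the paper's components:

```python
# One training step toward consistent logical forms (a sketch; features,
# candidates, and the executor are placeholders for the real components).
import math

def train_step(weights, candidates, features, execute, target, lr=0.1):
    # Step 2: score each candidate under a log-linear model.
    scores = {lf: sum(weights.get(f, 0.0) * v for f, v in features(lf).items())
              for lf in candidates}
    z = sum(math.exp(s) for s in scores.values())
    p = {lf: math.exp(s) / z for lf, s in scores.items()}
    # Step 3: execute and keep the candidates matching the target denotation.
    consistent = [lf for lf in candidates if execute(lf) == target]
    if not consistent:
        return weights          # nothing to learn from this example
    pc = sum(p[lf] for lf in consistent)
    # Step 4: gradient of log P(consistent): raise consistent forms, lower the rest.
    for lf in candidates:
        q = p[lf] / pc if lf in consistent else 0.0
        for f, v in features(lf).items():
            weights[f] = weights.get(f, 0.0) + lr * (q - p[lf]) * v
    return weights

# Toy usage: two candidates, one executing to the target denotation "Sweden".
feats = lambda lf: {lf: 1.0}                     # one indicator feature per form
denotations = {"NationOf.NextOf.HasNation.Turkey": "Sweden",
               "count(HasNation.Turkey)": "1"}
w = train_step({}, list(denotations), feats, denotations.get, "Sweden")
print(w)  # the consistent form's weight rises, the other's falls
```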

  19. Main Problem: Speed. Depending on the generality of the grammar, the number of generated partial logical forms can grow exponentially:
      count(NextOf.HasNation.Turkey)
      sum(IndexOf.HasNation.Turkey)
      argmax(NextOf.HasNation.Turkey, Index)
      ▸ Many partial logical forms are also useless


  21. Main Problem: Speed. Depending on the generality of the grammar, the number of generated partial logical forms can grow exponentially.
      ▸ To reach 40% accuracy, each example:
        ▹ Generates ~13,700 partial logical forms
        ▹ Takes ~1.1 seconds (2.6 GHz machine)
        ▹ 3 epochs on 14K examples → 12 hours
      Our contribution: 11x speedup

  22. Main Ideas
      Idea 1: Macros
      ▸ Good logical forms share common patterns (“macros”)
      ▸ Restrict the generation to such macros
      Idea 2: Holistic Triggering
      ▸ There are still too many macros
      ▸ Only use macros from logical forms with similar utterances

  23. Idea 1: Macros. Good logical forms usually share useful patterns (“macros”):
      NationOf.NextOf.HasNation.Turkey

  24. Idea 1: Macros. Good logical forms usually share useful patterns (“macros”):
      NationOf.NextOf.HasNation.Turkey
      {REL1}Of.NextOf.Has{REL1}.{ENT2}  ~  “What {REL1} comes after {ENT2}”

  25. Idea 1: Macros. Good logical forms usually share useful patterns (“macros”):
      NationOf.NextOf.HasNation.Turkey
      {REL1}Of.NextOf.Has{REL1}.{ENT2}  ~  “What {REL1} comes after {ENT2}”
      ▸ When we find a consistent logical form in one example, we want to cache and reuse its macro in other examples
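Macro extraction can be sketched at the string level: replace the concrete relations and entities with numbered placeholders. The real extraction abstracts the derivation tree, not the string; this function only illustrates the idea:

```python
# Macro extraction, string-level sketch (the paper abstracts the derivation
# tree; substituting names in the logical-form string only illustrates it).
def extract_macro(logical_form, relations, entities):
    macro, i = logical_form, 0
    for name in relations:      # every occurrence of a relation shares one slot
        i += 1
        macro = macro.replace(name, "{REL%d}" % i)
    for name in entities:
        i += 1
        macro = macro.replace(name, "{ENT%d}" % i)
    return macro

print(extract_macro("NationOf.NextOf.HasNation.Turkey",
                    relations=["Nation"], entities=["Turkey"]))
# {REL1}Of.NextOf.Has{REL1}.{ENT2}
```

Note that both occurrences of Nation map to the same slot {REL1}, which is exactly what makes the macro reusable on "Who ranked right after Sweden?".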

  26. Training Algorithm. Given a training example:
      ▸ Try applying macros found in previous examples to generate logical forms
      ▸ If a consistent logical form is found:
        ▹ Do gradient update as usual
      ▸ Otherwise:
        ▹ Fall back to the full compositional search

  27. Macro Grammar. We encode the macros as grammar rules (“macro rules”) so that we can use the same beam search algorithm to generate logical forms from macros:
      NationOf.NextOf.HasNation.Turkey
      {REL1}Of.NextOf.Has{REL1}.{ENT2}

  28. Macro Grammar. We encode the macros as grammar rules (“macro rules”) so that we can use the same beam search algorithm to generate logical forms from macros:
      NationOf.NextOf.HasNation.Turkey
      {REL1}Of.NextOf.Has{REL1}.{ENT2}
      Rel[z1] + Ent[z2] → Root[z1-Of.NextOf.Has-z1.z2]
      (Rel and Ent are built by terminal rules)
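Written as a function, the macro rule makes the speedup concrete: one application builds the whole logical form that previously took four compositional steps. A one-line sketch:

```python
# The macro rule from the slide as a function:
# Rel[z1] + Ent[z2] -> Root[z1-Of.NextOf.Has-z1.z2].
def macro_rule(z1, z2):
    return f"{z1}Of.NextOf.Has{z1}.{z2}"

print(macro_rule("Nation", "Turkey"))  # NationOf.NextOf.HasNation.Turkey
print(macro_rule("Nation", "Sweden"))  # reuse on a new entity in one step
```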


  30. Training Algorithm Revised. Maintain a list R of macro rules. Given a training example:
      ▸ Apply beam search on R + terminal rules
      ▸ If a consistent logical form is found:
        ▹ Do gradient update as usual
      ▸ Otherwise:
        ▹ Fall back to beam search on the base grammar
        ▹ If a consistent logical form is found, extract its macro and augment R
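The revised loop above can be sketched as control flow. Every helper here (search_with, is_consistent, update, extract_macro) is an assumed stand-in for the corresponding piece of the real system:

```python
# Revised training loop, sketched. macro_rules is the list R from the slide;
# the helper functions are placeholders for the paper's actual components.
def train_example(example, macro_rules, base_grammar,
                  search_with, is_consistent, update, extract_macro):
    # Fast path: beam search over macro rules (+ terminal rules).
    candidates = search_with(macro_rules, example)
    consistent = [lf for lf in candidates if is_consistent(lf, example)]
    if not consistent:
        # Slow path: full compositional search on the base grammar.
        candidates = search_with(base_grammar, example)
        consistent = [lf for lf in candidates if is_consistent(lf, example)]
        # Newly found consistent forms contribute new macros to R.
        macro_rules = macro_rules + [extract_macro(lf) for lf in consistent]
    if consistent:
        update(consistent, candidates)   # gradient update as usual
    return macro_rules
```

A toy run: with R = ["a"] and an example whose answer only the base grammar ["a", "b"] can reach, the fast path fails, the slow path finds "b", and its macro is added to R.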

  31. Decomposed Macro Rules. Some macros share parts:
      max(RankOf.HasGold.>.2)  →  max({REL1}Of.Has{REL2}.>.{ENT3})
      NationOf.argmin(HasSilver.>.2, Index)  →  {REL1}Of.argmin(Has{REL2}.>.{ENT3}, Index)
      If we need to try both macros, it would be nice to featurize the shared part only once.

  32. Recall the full derivation for “Who ranked right after Turkey?”:
      Ent[Turkey] → Set[Turkey] → Set[HasNation.Turkey] (with Rel[Nation])
      → Set[NextOf.HasNation.Turkey] → Set[NationOf.NextOf.HasNation.Turkey] (with Rel[Nation])
      → Root[NationOf.NextOf.HasNation.Turkey]

  33. The same derivation with the entity and relations abstracted into placeholders:
      Ent[{ENT2}] → Set[z1] → Set[Has-z1.z2] (with Rel[{REL1}])
      → Set[NextOf.z1] → Set[z1-Of.z2] (with Rel[{REL1}]) → Root[z1]


  35. Decomposing the abstracted derivation into named sub-rules, starting from the bottom:
      Ent[{ENT2}] → Set[z1] → Set[Has-z1.z2] (with Rel[{REL1}]) → Set[NextOf.z1] → Set[z1-Of.z2] → Root[z1]
      New rule: Ent[z1] → M1[z1]

  36. With the bottom of the tree replaced by the category M1:
      M1 → Set[Has-z1.z2] (with Rel[{REL1}]) → Set[NextOf.z1] → Set[z1-Of.z2] → Root[z1]
      Rules so far: Ent[z1] → M1[z1]

  37. The middle of the tree becomes a second rule:
      M1 → Set[Has-z1.z2] (with Rel[{REL1}]) → Set[NextOf.z1] → Set[z1-Of.z2] → Root[z1]
      Rules so far:
      Ent[z1] → M1[z1]
      Rel[z1] + M1[z2] → M2[z1-Of.NextOf.Has-z1.z2]

  38. The tree is now just M2 under Root[z1].
      Rules so far:
      Ent[z1] → M1[z1]
      Rel[z1] + M1[z2] → M2[z1-Of.NextOf.Has-z1.z2]

  39. The final decomposed macro rules:
      Ent[z1] → M1[z1]
      Rel[z1] + M1[z2] → M2[z1-Of.NextOf.Has-z1.z2]
      M2[z1] → Root[z1]
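The payoff of decomposition can be sketched with the two macros from the Decomposed Macro Rules slide above: factoring their shared comparison filter into one sub-rule (a function here) means the shared part exists, and can be featurized, only once. This is an illustration; the function names are assumptions:

```python
# Decomposed macro rules, sketched: the two macros from slide 31 share the
# filter Has{REL2}.>.{ENT3}; naming it as one sub-rule avoids building and
# featurizing the shared part twice.
def shared_filter(rel2, ent3):                # shared sub-rule: Has{REL2}.>.{ENT3}
    return f"Has{rel2}.>.{ent3}"

def macro_max(rel1, rel2, ent3):              # max({REL1}Of.Has{REL2}.>.{ENT3})
    return f"max({rel1}Of.{shared_filter(rel2, ent3)})"

def macro_argmin(rel1, rel2, ent3):           # {REL1}Of.argmin(Has{REL2}.>.{ENT3}, Index)
    return f"{rel1}Of.argmin({shared_filter(rel2, ent3)}, Index)"

print(macro_max("Rank", "Gold", "2"))         # max(RankOf.HasGold.>.2)
print(macro_argmin("Nation", "Silver", "2"))  # NationOf.argmin(HasSilver.>.2, Index)
```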
