
1. Question Generation with Minimal Recursion Semantics
Xuchen Yao¹ and Yi Zhang²
¹ European Masters in Language and Communication Technologies, University of Groningen & Saarland University
² Saarland University, German Research Center for Artificial Intelligence
18 June 2010, QG/QGSTEC 2010

2. Outline
• Introduction
  • Template/Syntax/Semantics-based Approaches
  • Why Semantics-based?
• Background
  • MRS/ERG/PET/LKB
• System Architecture
  • Overview
  • MRS Transformation for Simple Sentences
  • MRS Decomposition for Complex Sentences
  • Language Independence and Domain Adaptability
• Evaluation

3. Approaches
• Template-based (Mostow and Chen, 2009)
  • What did <character> <verb>?
• Syntax-based (Wyse and Piwek, 2009; Heilman and Smith, 2009)
  • John plays football. (S NP (VP (V NP)))
  • John plays what? (S NP (VP (V WHNP)))
  • John does play what? (S NP (VP (Aux-V V WHNP)))
  • Does John play what? (S Aux-V NP (VP (V WHNP)))
  • What does John play? (S WHNP Aux-V NP (VP (V)))
• Semantics-based
  • play(John, football)
  • play(John, what)
  • play(who, football)
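
The semantics-based idea on this slide can be sketched as plain predicate-argument substitution. This is a toy illustration of the concept, not MrsQG's implementation; the subject/object heuristic is an assumption for the example.

```python
def make_questions(pred, args):
    """Turn pred(args...) into question forms by replacing one argument
    at a time with a wh-placeholder, as on the slide:
    play(John, football) -> play(who, football), play(John, what)."""
    questions = []
    for i in range(len(args)):
        wh = "who" if i == 0 else "what"  # crude: first argument is the agent
        q_args = list(args)
        q_args[i] = wh
        questions.append(f"{pred}({', '.join(q_args)})")
    return questions

print(make_questions("play", ["John", "football"]))
# -> ['play(who, football)', 'play(John, what)']
```

Because the substitution happens in the semantic representation rather than in the parse tree, no auxiliary insertion or wh-movement rules are needed; surface realization is delegated to the grammar's generator.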

6. Why Semantics-based?
• Something different from template- and syntax-based approaches.
• More intuitive?
• More language-independent (universal)?
• Makes use of the generation function of the English Resource Grammar:
  • Sag, I. A. & Flickinger, D. Generating Questions with Deep Reversible Grammars. In Proceedings of the First Workshop on the Question Generation Shared Task and Evaluation Challenge, 2008.
• Deeper is better?

9. DELPH-IN (MRS/ERG/PET/LKB)
Deep Linguistic Processing with HPSG: http://www.delph-in.net/
Round trip: "John likes Mary." → parsing with PET (using the English Resource Grammar) → MRS → generation with LKB → "John likes Mary."
Minimal Recursion Semantics for "John likes Mary." (informally, like(John, Mary)):
  INDEX: e2
  RELS: <
    [ PROPER_Q_REL<0:4>    LBL: h3   ARG0: x6   RSTR: h5   BODY: h4 ]
    [ NAMED_REL<0:4>       LBL: h7   ARG0: x6 (PERS: 3, NUM: SG)   CARG: "John" ]
    [ _like_v_1_rel<5:10>  LBL: h8   ARG0: e2 [ e SF: PROP, TENSE: PRES ]   ARG1: x6   ARG2: x9 ]
    [ PROPER_Q_REL<11:17>  LBL: h10  ARG0: x9   RSTR: h12  BODY: h11 ]
    [ NAMED_REL<11:17>     LBL: h13  ARG0: x9 (PERS: 3, NUM: SG)   CARG: "Mary" ]
  >
  HCONS: < h5 qeq h7, h12 qeq h13 >
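
The MRS above can be mirrored in plain data to show how its pieces fit together: a list of elementary predications plus handle constraints, with the verb's ARG1/ARG2 variables resolving to the predications that carry the names. The `EP` class is an illustration for this sketch, not DELPH-IN tooling; predicate and feature names follow the slide.

```python
from dataclasses import dataclass

@dataclass
class EP:
    """Elementary predication: one relation in the MRS RELS list."""
    label: str
    pred: str
    args: dict

# The MRS for "John likes Mary." as shown on the slide.
mrs = {
    "INDEX": "e2",
    "RELS": [
        EP("h3",  "proper_q_rel",  {"ARG0": "x6", "RSTR": "h5", "BODY": "h4"}),
        EP("h7",  "named_rel",     {"ARG0": "x6", "CARG": "John"}),
        EP("h8",  "_like_v_1_rel", {"ARG0": "e2", "ARG1": "x6", "ARG2": "x9"}),
        EP("h10", "proper_q_rel",  {"ARG0": "x9", "RSTR": "h12", "BODY": "h11"}),
        EP("h13", "named_rel",     {"ARG0": "x9", "CARG": "Mary"}),
    ],
    "HCONS": [("h5", "qeq", "h7"), ("h12", "qeq", "h13")],
}

# Resolve the verb's arguments to the named-entity predications.
by_var = {ep.args["ARG0"]: ep for ep in mrs["RELS"] if ep.pred == "named_rel"}
verb = next(ep for ep in mrs["RELS"] if ep.pred == "_like_v_1_rel")
print(by_var[verb.args["ARG1"]].args["CARG"], "likes",
      by_var[verb.args["ARG2"]].args["CARG"])
# -> John likes Mary
```

The flat RELS list plus qeq constraints is exactly what makes MRS "minimal recursion": scope is underspecified rather than nested, which is what lets later slides rewrite individual predications in place.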

10. Details
• MRS (theory): Minimal Recursion Semantics, a meta-level language for describing semantic structures in some underlying object language.
• ERG (grammar): English Resource Grammar, a general-purpose, broad-coverage grammar implementation in the HPSG framework.
• LKB (tool): Linguistic Knowledge Builder, a grammar development environment for grammars in typed feature structures and unification-based formalisms.
• PET (tool): a platform for experimentation with efficient HPSG processing techniques; a two-stage parsing model combining HPSG rules with PCFG models, balancing precise linguistic interpretation against robust probabilistic coverage.

13. MrsQG (Task B): http://code.google.com/p/mrsqg/

14. Term Extraction
Pipeline (numbered stages from the architecture diagram):
1. Term extraction: Stanford Named Entity Recognizer, a regular-expression NE tagger, and an ontology NE tagger (entity types include NEperson, NEfirstName, NEusPresident, NElocation, NEprovince, NEstate, NEcapital, NEday, NEmonth, NEyear).
2. FSC construction.
3. Parsing with PET.
4. MRS decomposition: apposition, coordination, subclause, subordinate, and why decomposers.
5. MRS transformation.
6. Generation with LKB.
7. Output selection.
8. Output to console/XML.
Example: "Jackson was born on August 29, 1958 in Gary, Indiana." is tagged with NEperson, NEdate, NElocation.
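
Stage 1 lists a regular-expression NE tagger alongside the Stanford NER. A minimal sketch of such a tagger is below; the patterns are illustrative assumptions, not the ones MrsQG actually ships.

```python
import re

# Illustrative patterns only; a real tagger would cover many more types.
PATTERNS = {
    "NEdate": re.compile(
        r"\b(January|February|March|April|May|June|July|August|"
        r"September|October|November|December)\s+\d{1,2},\s*\d{4}\b"),
    "NEyear": re.compile(r"\b(19|20)\d{2}\b"),
}

def tag_terms(text):
    """Return (entity_type, matched_span) pairs found by the regex patterns."""
    terms = []
    for label, pat in PATTERNS.items():
        for m in pat.finditer(text):
            terms.append((label, m.group(0)))
    return terms

print(tag_terms("Jackson was born on August 29, 1958 in Gary, Indiana."))
```

The extracted terms are what the later transformation stage replaces with wh-predications, so recall here directly bounds how many question types the system can produce for a sentence.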

15. Term Extraction (continued)
Same pipeline; the tagged entities map to wh-words during transformation:
• NEperson → who
• NEdate → when / which day
• NElocation → where / which location

17. MRS Transformation
(Architecture diagram repeated, highlighting stage 5, MRS transformation, between MRS decomposition and generation with LKB.)

18. WHO
MRS rewrite (declarative → wh):
  proper_q(x)       → which_q(x)
  named(x, "John")  → person(x)
  proper_q(y), named(y, "Mary"), like(e, x, y) stay unchanged
Figure: "John likes Mary." → "Who likes Mary?"
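
The figure's rewrite can be sketched as a substitution over a flat relation list. The tuple encoding is a simplification invented for this sketch; real MRS predications also carry labels and handle constraints, as shown earlier.

```python
def to_who_question(rels, target_var):
    """Replace the quantifier and named predications over target_var
    with which_q/person, leaving everything else intact."""
    out = []
    for pred, var, *rest in rels:
        if var == target_var and pred == "proper_q":
            out.append(("which_q", var))
        elif var == target_var and pred == "named":
            out.append(("person", var))
        else:
            out.append((pred, var, *rest))
    return out

# "John likes Mary." as a flat relation list.
rels = [("proper_q", "x"), ("named", "x", "John"),
        ("proper_q", "y"), ("named", "y", "Mary"),
        ("like", "e", "x", "y")]
print(to_who_question(rels, "x"))
# -> [('which_q', 'x'), ('person', 'x'), ('proper_q', 'y'),
#     ('named', 'y', 'Mary'), ('like', 'e', 'x', 'y')]
```

Note that the verb relation like(e, x, y) is untouched: the wh-semantics is introduced purely by swapping the predications over x, and LKB's generator produces the surface question.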

19. WHERE
MRS rewrite (declarative → wh):
  proper_q(x)           → which_q(x)
  named(x, "Broadway")  → place(x)
  on_p(x)               → loc_nonsp(x)
  proper_q(y), named(y, "Mary"), sing(e, y) stay unchanged
Figure: "Mary sings on Broadway." → "Where does Mary sing?"

20. WHEN
MRS rewrite (declarative → wh):
  def_implicit_q(x)       → which_q(x)
  numbered_hour(x, "10")  → time(x)
  at_p_temp(x)            → loc_nonsp(x)
  proper_q(y), named(y, "Mary"), sing(e, y) stay unchanged
Figure: "Mary sings at 10." → "When does Mary sing?"
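
The where/when cases share one extra step beyond the who case: the specific prepositional relation (on_p, at_p_temp) is generalized to loc_nonsp. A sketch of that shared pattern, again with the simplified tuple encoding assumed for illustration:

```python
def to_locative_question(rels, target_var, wh):
    """Rewrite the predications over target_var for a where/when question,
    generalizing the preposition to loc_nonsp as in the figures."""
    wh_sort = {"where": "place", "when": "time"}[wh]
    prep_preds = {"on_p", "at_p_temp"}  # the two prepositions from the slides
    out = []
    for r in rels:
        pred, var = r[0], r[1]
        if var == target_var and pred in ("proper_q", "def_implicit_q"):
            out.append(("which_q", var))
        elif var == target_var and pred in ("named", "numbered_hour"):
            out.append((wh_sort, var))
        elif pred in prep_preds:
            out.append(("loc_nonsp",) + r[1:])
        else:
            out.append(r)
    return out

# "Mary sings on Broadway." as a flat relation list.
rels = [("proper_q", "x"), ("named", "x", "Broadway"),
        ("proper_q", "y"), ("named", "y", "Mary"),
        ("sing", "e", "y"), ("on_p", "x")]
print(to_locative_question(rels, "x", "where"))
# -> [('which_q', 'x'), ('place', 'x'), ('proper_q', 'y'),
#     ('named', 'y', 'Mary'), ('sing', 'e', 'y'), ('loc_nonsp', 'x')]
```

Replacing the preposition with the underspecified loc_nonsp is what lets the generator drop the preposition entirely in the output ("Where does Mary sing?" rather than "On what does Mary sing?").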
