

Evaluating Answers to Reading Comprehension Questions in Context: Results for German and the Role of Information Structure
Detmar Meurers, Ramon Ziai, Niels Ott, Janina Kopp


  1. Evaluating Answers to Reading Comprehension Questions in Context: Results for German and the Role of Information Structure
     Detmar Meurers, Ramon Ziai, Niels Ott & Janina Kopp
     Universität Tübingen, SFB 833, Projekt A4
     EMNLP TextInfer Workshop 2011, July 30, 2011

  2. Overview
     ◮ Introduction
     ◮ Our Corpus
     ◮ CoMiC Approach
     ◮ Experiment
     ◮ Information Structure
     ◮ Conclusion

  3. Long-term research questions
     ◮ What linguistic representations can be used robustly and efficiently in automatic meaning comparison?
     ◮ What is the role of context, and how can we utilize knowledge about it in comparing meaning automatically?
       ◮ Context here means the questions and reading texts in reading comprehension tasks.

  4. Aims of this talk
     ◮ Present the first content assessment approach for German
     ◮ Explore the impact of
       ◮ question types and
       ◮ ways of encoding information in the text
     ◮ Discuss the importance of explicit language-based context
       ◮ here: the information structure of answers given questions and text

  5. Connection to RTE and Textual Inference
     ◮ What is content assessment?
       ◮ The task of determining whether a response actually answers a given question about a specific text.
     ◮ Two possible perspectives in connection with RTE:
       1. Decide whether the reading text T supports the student answer SA, i.e., whether SA is entailed by T.
       2. Decide whether the student answer SA is a paraphrase of the target answer TA ⇒ bi-directional entailment
     In this talk, we focus on the second perspective.
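The second perspective casts content assessment as bi-directional entailment between student answer (SA) and target answer (TA). As a minimal illustration of that framing only, not of the actual system, one can stand in for a real entailment engine with a crude bag-of-words coverage check; the `entails` function and its threshold are illustrative assumptions:

```python
# Toy sketch of the bi-directional entailment perspective (perspective 2).
# Bag-of-words coverage is a hypothetical stand-in for a real entailment engine.

def entails(premise: str, hypothesis: str, threshold: float = 0.75) -> bool:
    """Crude lexical proxy: the premise 'entails' the hypothesis
    if most hypothesis tokens are covered by the premise."""
    p = set(premise.lower().split())
    h = set(hypothesis.lower().split())
    if not h:
        return False
    return len(h & p) / len(h) >= threshold

def is_paraphrase(student_answer: str, target_answer: str) -> bool:
    # SA is a paraphrase of TA iff entailment holds in both directions.
    return (entails(student_answer, target_answer)
            and entails(target_answer, student_answer))
```

A one-directional check (perspective 1) would call `entails(T, SA)` with the full reading text as premise instead.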

  6. Example from our corpus
     T:  (Reading comprehension text)
     Q:  Was sind die Kritikpunkte, die Leute über Hamburg äußern?
         'What are the objections people have about Hamburg?'
     TA: Der Gestank von Fisch und Schiffsdiesel an den Kais.
         'The stink of fish and fuel on the quays.'
     SA: Der Geruch zon Fish und Schiffsdiesel beim Hafen.
         'The smell of err fish err and fuel at the port.'
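The CoMiC approach named in the outline aligns units of the student answer to units of the target answer before classifying the answer as adequate or inadequate. A minimal sketch of such an alignment on the example pair, assuming plain case-insensitive token identity (the real system aligns on several linguistic levels, which this simplification does not capture):

```python
# Simplified token alignment in the spirit of an alignment-based
# content assessment system; identity matching is an assumption.

def align_tokens(student_answer: str, target_answer: str):
    """Greedily align each target-answer token to an unused,
    identical (case-insensitive) student-answer token.
    Returns the aligned pairs and the fraction of TA tokens covered."""
    sa = student_answer.lower().split()
    ta = target_answer.lower().split()
    used, pairs = set(), []
    for t in ta:
        for i, s in enumerate(sa):
            if i not in used and s == t:
                used.add(i)
                pairs.append((t, s))
                break
    return pairs, (len(pairs) / len(ta) if ta else 0.0)

ta = "Der Gestank von Fisch und Schiffsdiesel an den Kais"
sa = "Der Geruch zon Fish und Schiffsdiesel beim Hafen"
pairs, coverage = align_tokens(sa, ta)
```

On this pair only "Der", "und", and "Schiffsdiesel" align; the misspellings (zon, Fish) and the paraphrases (Geruch/Gestank, beim Hafen/an den Kais) would require more abstract levels of comparison than surface identity.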

  7. Data source: CREG — Corpus of Reading Comprehension Exercises in German
     ◮ Consists of
       ◮ reading texts,
       ◮ reading comprehension questions,
       ◮ target answers formulated by teachers,
       ◮ student answers to the questions.
     ◮ Is being collected in two large German programs in the US:
       ◮ The Ohio State University (Prof. Kathryn Corl)
       ◮ Kansas University (Prof. Nina Vyatkina)
     ◮ Two research assistants independently rate each student answer with respect to meaning.
       ◮ Did the student provide a meaningful answer to the question?
       ◮ Binary categories: adequate/inadequate
       ◮ Annotators also identify the target answer for each student answer.

  8. Data sets used
     ◮ From the corpus in development, we took a snapshot
       ◮ with full agreement in the binary ratings,
       ◮ and with half of the answers rated as inadequate (random baseline = 50%).
     ◮ This resulted in one data set for each of the two sites,
     ◮ with no overlap in exercise material.

                          KU data set    OSU data set
     Target Answers            136             87
     Questions                 117             60
     Student Answers           610            422
     # of Students             141            175
     SAs per question            5.21           7.03
     avg. Token #                9.71          15.00
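The snapshot criteria (full annotator agreement, then a 50/50 adequate/inadequate split yielding a 50% random baseline) amount to a filter-and-balance step. A hedged sketch under assumed record fields, since the slides only state the criteria, not the data format:

```python
# Illustrative snapshot selection; the "rating1"/"rating2" field names
# are hypothetical, as only the selection criteria are given.
import random

def balanced_snapshot(answers, seed=0):
    """Keep answers both annotators agree on, then downsample the
    larger class so adequate and inadequate are equally frequent."""
    agreed = [a for a in answers if a["rating1"] == a["rating2"]]
    adequate = [a for a in agreed if a["rating1"] == "adequate"]
    inadequate = [a for a in agreed if a["rating1"] == "inadequate"]
    n = min(len(adequate), len(inadequate))
    rng = random.Random(seed)  # fixed seed for a reproducible snapshot
    return rng.sample(adequate, n) + rng.sample(inadequate, n)
```

Because the resulting set is balanced, any classifier accuracy on it is directly comparable to the 50% chance baseline.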

