Temporal Argument Mining for Writing Assistance - Diane Litman - PowerPoint PPT Presentation




SLIDE 1

Temporal Argument Mining for Writing Assistance

Diane Litman

Professor, Computer Science Department; Co-Director, Intelligent Systems Program; Senior Scientist, Learning Research & Development Center; University of Pittsburgh, Pittsburgh, PA USA (joint work with Fan Zhang, PhD student)

SLIDE 2

Roles for NLP Argument Mining in Education

Learning Argumentation

Automatic Essay Grading

SLIDE 3

Roles for NLP Argument Mining in Education

Teaching Using Argumentation

Socratic-Method Dialogue Systems

SLIDE 4

Roles for NLP Argument Mining in Education

Processing Language

Peer Feedback

SLIDE 5

Today’s Talk: Learning Argumentation

  • Temporal Argument Mining of Student Writing
  • Algorithms and Applications

– Revision Extraction
– Annotation / Classification
– A Writing Revision Analysis System

  • Open Questions
SLIDE 6

Why teach argumentative writing?

  • Studies show students:

– lack competence in argument writing (Oostdam et al., 1994; Oostdam & Emmelot, 1991).
– do not integrate their arguments into a high-level structure or coherent position (Keith, Weiner, & Lesgold, 1991).
– Even if compose-aloud protocols show students mentally connect position statement & supporting details, the connections are not evident in writing (Durst, 1987).

Slide modified from Kevin D. Ashley & Ilya Goldin, 2011

SLIDE 7

Research Question

  • Can temporal argument mining be used to better teach, assess, and understand argumentative writing?

  • Approach: Technology design and evaluation

– System enhancements that improve student learning
– Argument analytics for teachers
– Experimental platforms to test research predictions

SLIDE 8

Temporal Argument Mining (Revision Analysis via Sentence Alignment)

Draft 1:
1) In the circle, I would place Bill Clinton because he had an affair with his aide.

Draft 2:
1) In the third circle of Hell, sinners have uncontrollable lust.
2) The carnal sinners in this level are punished by a howling, endless wind.
3) Bill Clinton would be in this level because he had an affair with his aide.

SLIDE 9

Temporal Argument Mining (Revision Analysis via Sentence Alignment)

Draft 1:
1) In the circle, I would place Bill Clinton because he had an affair with his aide.

Draft 2:
1) In the third circle of Hell, sinners have uncontrollable lust.
2) The carnal sinners in this level are punished by a howling, endless wind.
3) Bill Clinton would be in this level because he had an affair with his aide.

R1: Align: null -> 1, Op: Add, Purpose: ?
R2: Align: 1 -> 3, Op: Modify, Purpose: ?
…

SLIDE 10

Temporal Argument Mining (Revision Analysis via Sentence Alignment)

Draft 1:
1) In the circle, I would place Bill Clinton because he had an affair with his aide.

Draft 2:
1) In the third circle of Hell, sinners have uncontrollable lust.
2) The carnal sinners in this level are punished by a howling, endless wind.
3) Bill Clinton would be in this level because he had an affair with his aide.

R1: Align: null -> 1, Op: Add, Purpose: Argumentative
R2: Align: 1 -> 3, Op: Modify, Purpose: Surface
…

SLIDE 11

Temporal Argument Mining

  • How are arguments changed during revision?

– Analysis across versions of a text, rather than analyzing the argument structure of a single text

  • Subtasks

– Segmentation: sentences

  • Revision extraction via alignment [Zhang & Litman, 2014]

– Segment classification: argumentative purpose

  • Wikipedia features [Zhang & Litman, 2015]
  • Contextual methods [Zhang & Litman, 2016]
SLIDE 12

Revision Extraction

[Zhang & Litman, 2014]

  • Treat alignment as classification

– Construct sentence pairs using the Cartesian product across drafts
– Compute sentence similarity
– Logistic regression determines whether a pair is aligned or not

  • Global alignment [Needleman & Wunsch, 1970]

– Sentences are more likely to be aligned if the sentences before them are aligned
– Starting from the first pair, find the path that maximizes likelihood

  • s(i, j) = max{ s(i−1, j−1) + sim(i, j), s(i−1, j) + insert_cost, s(i, j−1) + delete_cost }
  • TF*IDF similarity yields the best results

– 90-94% within and across several corpora
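The alignment step above can be sketched in code. This is a hypothetical simplification, not the authors' implementation: TF*IDF cosine similarity supplies sim(i, j), and a single illustrative gap penalty stands in for the insert/delete costs in the recurrence.

```python
# Sketch of revision extraction via global sentence alignment,
# in the spirit of Needleman & Wunsch (1970). All names and the
# gap penalty are illustrative choices, not from the paper.
import math
from collections import Counter

def tfidf_vectors(sentences):
    """TF*IDF vectors, treating each sentence as a 'document'."""
    docs = [Counter(s.lower().split()) for s in sentences]
    n = len(docs)
    df = Counter(w for d in docs for w in d)
    return [{w: tf * math.log((1 + n) / (1 + df[w]))
             for w, tf in d.items()} for d in docs]

def cosine(u, v):
    dot = sum(x * v.get(w, 0.0) for w, x in u.items())
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def align(draft1, draft2, gap=-0.1):
    """Return 1-based aligned pairs; (i, None) = deleted, (None, j) = added."""
    vecs = tfidf_vectors(draft1 + draft2)
    u, v = vecs[:len(draft1)], vecs[len(draft1):]
    m, n = len(u), len(v)
    # s[i][j]: best score aligning the first i sentences of draft 1
    # with the first j sentences of draft 2
    s = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        s[i][0] = s[i - 1][0] + gap
    for j in range(1, n + 1):
        s[0][j] = s[0][j - 1] + gap
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            s[i][j] = max(s[i - 1][j - 1] + cosine(u[i - 1], v[j - 1]),
                          s[i - 1][j] + gap,   # draft-1 sentence i skipped
                          s[i][j - 1] + gap)   # draft-2 sentence j skipped
    # Trace back from the last cell to recover the alignment path
    pairs, i, j = [], m, n
    while i or j:
        if i and j and s[i][j] == s[i - 1][j - 1] + cosine(u[i - 1], v[j - 1]):
            pairs.append((i, j)); i, j = i - 1, j - 1
        elif i and s[i][j] == s[i - 1][j] + gap:
            pairs.append((i, None)); i -= 1
        else:
            pairs.append((None, j)); j -= 1
    return pairs[::-1]
```

On the Bill Clinton example from the earlier slides, this sketch aligns draft 1's single sentence to draft 2's third sentence (a Modify) and leaves the first two draft-2 sentences unmatched (Adds).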


SLIDE 13

Revision Purpose Annotation [Zhang & Litman, 2015]

  • 2 binary (5 fine-grained) categories

– Argumentative

  • Claim
  • Warrant
  • Evidence
  • General content

– Surface

  • Kappa = .7

– 2 high school corpora (>1000 revisions each)


SLIDE 14

Revision Purpose Classification

[Zhang & Litman, 2015]

  • Each sentence pair is an instance
  • Features based on Wikipedia revisions [Adler et al., 2011; Javanmardi et al., 2011; Bronner & Monz, 2012; Daxenberger & Gurevych, 2013]

– Location

  • Sentence (first/last in paragraph, exact index)
  • Paragraph (first/last in essay, exact index)

– Textual

  • Keywords: “because”, “however”, “for example”, …
  • Named entity
  • Sentence difference (Levenshtein distance, …)
  • Revision operation (Add/Delete/Modify)

– Language

  • Out-of-vocabulary words
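To make the feature groups concrete, a hypothetical per-pair extractor follows; the function, feature names, and keyword list are invented for illustration and are not the paper's exact feature set.

```python
# Illustrative sketch of location, textual, and operation features
# for one aligned sentence pair (old-draft sentence, new-draft sentence).
def levenshtein(a, b):
    """Token-level edit distance between two token lists."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1,              # delete
                           cur[j - 1] + 1,           # insert
                           prev[j - 1] + (x != y)))  # substitute
        prev = cur
    return prev[-1]

KEYWORDS = ("because", "however", "for example")

def pair_features(old, new, sent_idx, par_len, operation):
    """old/new may be None for Add/Delete revisions."""
    text = (new or old or "").lower()
    return {
        # Location: where the sentence sits in its paragraph
        "first_in_paragraph": sent_idx == 0,
        "last_in_paragraph": sent_idx == par_len - 1,
        "sentence_index": sent_idx,
        # Textual: discourse keywords and surface difference
        **{"kw_" + k: k in text for k in KEYWORDS},
        "edit_distance": levenshtein((old or "").split(), (new or "").split()),
        # Operation label from the alignment step
        "op": operation,
    }
```

Such feature dictionaries could then be vectorized and fed to the SVM classifier evaluated on the next slide.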


SLIDE 15

Experimental Evaluations

  • Surface vs. argumentative

– Intrinsic (SVM, 10-fold): results significantly better than unigram baseline
– Extrinsic: predicted versus actual labels yield the same correlations with writing improvement

  • Fine-grained

– Intrinsic results mostly outperform unigram baselines
– Feature groups have different impacts


SLIDE 16

Enhancing Classification with Context

[Zhang & Litman, 2016]

  • Contextual features

– Original features, but for adjacent sentences
– Changes in cohesion (lexical) & coherence (semantic)

  • Sequence modeling
  • Results: Fine-grained labels

– Cohesion significantly improves results for one corpus (SVM, 10-fold)
– Sequence modeling yields the best results for both corpora
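A minimal sketch of the first contextual-feature bullet, with invented names: each revision instance is augmented with the base features of its neighbors in essay order, so a classifier can use local revision context.

```python
# Hypothetical contextual-feature construction: copy each neighbor's
# base features into the current instance under a prefixed key.
def add_context(instances, window=1):
    """instances: per-revision feature dicts, in essay order."""
    out = []
    for i, feats in enumerate(instances):
        ctx = dict(feats)
        for off in range(1, window + 1):
            for side, j in (("prev", i - off), ("next", i + off)):
                # Out-of-range neighbors contribute nothing
                neighbor = instances[j] if 0 <= j < len(instances) else {}
                ctx.update({f"{side}{off}_{k}": v
                            for k, v in neighbor.items()})
        out.append(ctx)
    return out
```

Sequence modeling (the second bullet) would instead label the whole ordered list jointly, e.g. with a chain model over revision purposes.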


SLIDE 17

Application

[Zhang, Hwa, Litman, & Hashemi, 2016]

  • ArgRewrite: A Web-based Revision Assistant for Argumentative Writings

– www.cs.pitt.edu/~zhangfan/argrewrite

SLIDE 18

Revision Overview Interface

SLIDE 19

Revision Detail Interface

SLIDE 20

Open Question 1: Relation to Argument Mining?

  • Current Approach
  • 1. Revision extraction by comparing drafts
  • 2. (Temporal) Argument mining on each revision
  • Alternative Approach?
  • 1. Argument mining on each draft
  • 2. Revision extraction by comparing mined arguments
  • Also, pipelining vs. joint modeling?
  • From syntax to semantics?
SLIDE 21

Open Question 2: Relation to Revision Analysis?

  • Current Approach
  • 1. Argumentative annotation scheme
  • Alternative Approach?
  • 1. Something to also cover Wikipedia
  • 2. Prediction of revision quality (e.g., is the paper getting better, is the argument getting stronger?)
  • 3. Relation to peer reviews/teacher feedback between drafts, if available

SLIDE 22

Open Question 3: Relation to Discourse Analysis?

  • Current Approach
  • 1. PDTB features
  • 2. Sentence as the ADU
  • Alternative Approach?
  • 1. RST features
  • 2. “Clause” as the ADU
  • 3. Other ways of using NLP discourse parsers
SLIDE 23

Summary

  • NLP-supported temporal argument mining for teaching and assessing writing

– Feature / Algorithm Development

  • Noisy and diverse data
  • Meaningful features
  • Real-time performance

– Experimental Evaluations

  • Temporal Argument Mining (high school & university corpora)
  • Revision Assistant (lab user study)
  • Even non-structural and application-dependent argument mining can support useful applications!

SLIDE 24

Thank You!

  • Questions?
  • Further Information

– http://www.cs.pitt.edu/~litman