Natural Language Processing for Analyzing Disaster Recovery Trends Expressed in Large Text Corpora


SLIDE 1

Natural Language Processing for Analyzing Disaster Recovery Trends Expressed in Large Text Corpora

Lucy H. Lin, Scott B. Miles, Noah A. Smith University of Washington 19 October 2018

SLIDE 2

Introduction

  • Problem: empirical data describing disaster recovery is scarce ⇒ constrains pre-/post-event recovery planning
  • Available data source: text corpora (e.g., news articles)
  • Manual analysis of large text corpora is slow… ⇒ bring in natural language processing

SLIDE 3

Introduction

Proposition query: “Dealing with authorities is causing stress and anxiety.”

Matched sentences (from the corpus):

  • “Unfamiliar bureaucratic systems are causing the majority of the stress.”
  • “Those in charge of recovery are making moves to appease the growing anger among homeowners.”

Frequency across time:

[Chart: aggregated frequency of matched sentences per year, 2011–2015]

SLIDE 4

Outline

  • 1. Introduction
  • 2. Case study: 2010-2011 Canterbury earthquake disaster
  • 3. NLP method for semantic matching
  • 4. User evaluation
  • 5. Qualitative/quantitative output
  • 6. Conclusions
SLIDE 5

Outline

  • 1. Introduction
  • 2. Case study: 2010-2011 Canterbury earthquake disaster
  • 3. NLP method for semantic matching
  • 4. User evaluation
  • 5. Qualitative/quantitative output
  • 6. Conclusions
SLIDE 6

2010–2011 Canterbury earthquake disaster: timeline

September 2010:

  • Epicenter: 35km west of Christchurch
  • Moderate damage

February 2011:

  • Epicenter: 10km southeast of Christchurch
  • Extremely high ground acceleration
  • 185 deaths, thousands of felt aftershocks
SLIDE 7

2010–2011 Canterbury earthquake disaster: impacts

Damages:

  • Estimated $40 billion
  • Housing: 100k houses in need of repairs
  • Water, utilities, road infrastructure: extensive damage

Recovery groups:

  • Government: CERA, SCIRT (sunset after 5 years)
  • Community: Regenerate Christchurch

Recovery still ongoing: public development projects, residential rezoning

SLIDE 8

2010–2011 Canterbury earthquake disaster: text data

Corpus: 982 NZ news articles published after the earthquakes (2010–2015)

  • stuff.co.nz, nzherald.co.nz

Proposition queries: 20 queries, covering

  • Community wellbeing
  • Infrastructure
  • Decision-making

e.g.: “The council should have consulted residents before making decisions.”

SLIDE 9

Outline

  • 1. Introduction
  • 2. Case study: 2010-2011 Canterbury earthquake disaster
  • 3. NLP method for semantic matching
  • 4. User evaluation
  • 5. Qualitative/quantitative output
  • 6. Conclusions
SLIDE 10

Semantic matching

Goal: find sentences with similar meaning to the query.

  • Needs to be more powerful than word/phrase-level matching.
  • Related to information retrieval, but we want all matches.
SLIDE 11

Semantic matching: method overview

corpus of sentences + proposition query → fast filter → likely matches → syntax-based model → matched sentences

SLIDE 12

Semantic matching: method overview

corpus of sentences + proposition query → fast filter → likely matches → syntax-based model → matched sentences

SLIDE 13

Semantic matching: fast filter

Goal: quickly filter out unlikely matches.

Word-vector-based comparison between two sentences: average the word vectors of the corpus sentence (“Unfamiliar bureaucratic … stress”) and of the query sentence (“Dealing with … anxiety”), then take the cosine similarity of the two averaged vectors (a minimal sketch follows below).
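A minimal sketch of such a filter, assuming `word_vectors` is a dict-like mapping from token to NumPy vector (e.g., pre-trained word2vec) and a hypothetical similarity threshold; this is illustrative, not the authors' exact code:

```python
import numpy as np

def sentence_vector(tokens, word_vectors, dim=300):
    """Average the vectors of the tokens that have an entry in word_vectors."""
    vecs = [word_vectors[t] for t in tokens if t in word_vectors]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

def cosine(u, v):
    denom = np.linalg.norm(u) * np.linalg.norm(v)
    return float(np.dot(u, v) / denom) if denom else 0.0

def fast_filter(query_tokens, corpus_sentences, word_vectors, threshold=0.5):
    """Keep corpus sentences (token lists) whose averaged vector is close to the query's."""
    q = sentence_vector(query_tokens, word_vectors)
    return [s for s in corpus_sentences
            if cosine(q, sentence_vector(s, word_vectors)) >= threshold]
```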

SLIDE 14

Semantic matching: method overview

corpus of sentences + proposition query → fast filter → likely matches → syntax-based model → matched sentences

SLIDE 15

Semantic matching: syntax-based model

Finer-grained matching: take word order/syntax into account. Intuition: transformation between sentences is indicative of their relationship.

SLIDE 16

Semantic matching: syntax-based model

Candidate: “unfamiliar bureaucratic systems are causing stress”
Query: “dealing with authorities is causing stress”

Edit sequence (transforming the candidate into the query): +delete(unfamiliar) +delete(bureaucratic) +relabel(systems) +relabel(are) +insert(authorities) +insert(with)

SLIDE 17

Semantic matching: syntax-based model

Candidate: “unfamiliar bureaucratic systems are causing stress”
Query: “dealing with authorities is causing stress”

After the deletes: “systems are causing stress”

Edit sequence: +delete(unfamiliar) +delete(bureaucratic) +relabel(systems) +relabel(are) +insert(authorities) +insert(with)

SLIDE 18

Semantic matching: syntax-based model

Candidate: “unfamiliar bureaucratic systems are causing stress”
Query: “dealing with authorities is causing stress”

After relabel(systems): “dealing are causing stress”

Edit sequence: +delete(unfamiliar) +delete(bureaucratic) +relabel(systems) +relabel(are) +insert(authorities) +insert(with)

SLIDE 19

Semantic matching: syntax-based model

Candidate: “unfamiliar bureaucratic systems are causing stress”
Query: “dealing with authorities is causing stress”

After relabel(are): “dealing is causing stress”

Edit sequence: +delete(unfamiliar) +delete(bureaucratic) +relabel(systems) +relabel(are) +insert(authorities) +insert(with)

SLIDE 20

Semantic matching: syntax-based model

Candidate: “unfamiliar bureaucratic systems are causing stress”
Query: “dealing with authorities is causing stress”

After the inserts: “dealing with authorities is causing stress” (the transformed candidate now matches the query)

Edit sequence: +delete(unfamiliar) +delete(bureaucratic) +relabel(systems) +relabel(are) +insert(authorities) +insert(with)

SLIDE 21

Semantic matching: method overview

corpus of sentences + proposition query → fast filter → likely matches → syntax-based model → matched sentences
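Putting the two stages together, a hypothetical composition could look like the sketch below; it reuses the `fast_filter` sketch from Slide 13, and `syntax_match_score` is a placeholder name for the syntax-based model, not a function from the authors' code:

```python
# Hypothetical end-to-end pipeline (helper names are placeholders).
def find_matches(query_tokens, corpus_sentences, word_vectors, threshold=0.5):
    """Return corpus sentences judged to express the proposition query."""
    # Stage 1: cheap word-vector filter keeps only likely matches.
    candidates = fast_filter(query_tokens, corpus_sentences, word_vectors)
    # Stage 2: the finer-grained syntax-based model makes the final call.
    return [s for s in candidates
            if syntax_match_score(query_tokens, s) >= threshold]
```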

SLIDE 22

Outline

  • 1. Introduction
  • 2. Case study: 2010-2011 Canterbury earthquake disaster
  • 3. NLP method for semantic matching
  • 4. User evaluation
  • 5. Qualitative/quantitative output
  • 6. Conclusions
SLIDE 23

User evaluation

Questions:

  • How good are the sentences matched by our method?
  • Do potential users think this kind of tool will be helpful?

User study: 20 emergency managers

SLIDE 24

User evaluation: output quality

Rated output from 20 proposition queries:

  • Different method variants
  • Different parts of the method:
      • Not selected by filter
      • Selected by filter, but not part of final output
      • Top-scoring output from filter
      • Method output (from syntax-based model)
  • 1–5 scale (Krippendorff’s α = 0.784; an agreement sketch follows below)
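For reference, inter-rater agreement on an interval (1–5) scale can be computed with NLTK as shown below; the ratings in the snippet are made-up placeholders, not the study's data, and this is just one possible tooling choice:

```python
from nltk.metrics.agreement import AnnotationTask
from nltk.metrics.distance import interval_distance

# (rater, item, rating) triples; values here are illustrative only.
ratings = [
    ("r1", "sent1", 5), ("r2", "sent1", 4),
    ("r1", "sent2", 1), ("r2", "sent2", 2),
]
task = AnnotationTask(data=ratings, distance=interval_distance)
print(round(task.alpha(), 3))  # Krippendorff's alpha for interval-scale data
```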
SLIDE 25

User evaluation: example

Query: “There is a shortage of construction workers.”

Candidate sentence: “The quarterly report for Canterbury included analysis on Greater Christchurch Value of Work projections.” (rated 1: completely unrelated to the query)

SLIDE 26

User evaluation: example

Query: “There is a shortage of construction workers.”

Candidate sentence: “The construction sector’s workload was expected to peak in December.” (rated 3: related to, but does not adequately express, the query)

SLIDE 27

User evaluation: example

Query: “There is a shortage of construction workers.”

Candidate sentence: “Greater Christchurch’s labour supply for the rebuild was tight and was likely to remain that way.” (rated 5: expresses the query in its entirety)

SLIDE 28

User evaluation: results

[Bar chart: average human rating (1–5 scale), best-performing system]
Not selected by filter: 1.06
Selected by filter (unmatched): 2.03

SLIDE 29

User evaluation: results

[Bar chart: average human rating (1–5 scale), best-performing system]
Not selected by filter: 1.06
Selected by filter (unmatched): 2.03
Highest-scoring by filter: 3.1

SLIDE 30

User evaluation: results

[Bar chart: average human rating (1–5 scale), best-performing system]
Not selected by filter: 1.06
Selected by filter (unmatched): 2.03
Highest-scoring by filter: 3.1
Matched by method: 3.22

SLIDE 31

User evaluation: is this interesting?

Other feedback:

  • 17/20 respondents interested in measuring ideas in news/other text corpora

SLIDE 32

User evaluation: round two

Follow-up study:

  • Participant-supplied queries (18)
  • 7 return participants
  • Replicated findings of first user study
SLIDE 33

Outline

  • 1. Introduction
  • 2. Case study: 2010-2011 Canterbury earthquake disaster
  • 3. NLP method for semantic matching
  • 4. User evaluation
  • 5. Qualitative/quantitative output
  • 6. Conclusions
SLIDE 34

Recovery trends: example #1

[Chart: frequency of matches per year, 2010–2015]
Query: “The power system was fully restored quickly.”
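A trend like this could be produced by counting matched sentences per publication year and plotting; the sketch below assumes each match carries its source article's date (field names are hypothetical, not the authors' data model):

```python
from collections import Counter
import matplotlib.pyplot as plt

def plot_trend(matches, query_text):
    """matches: list of dicts such as {"sentence": str, "date": datetime.date}."""
    counts = Counter(m["date"].year for m in matches)
    years = sorted(counts)
    plt.plot(years, [counts[y] for y in years], marker="o")
    plt.xlabel("Year")
    plt.ylabel("Frequency")
    plt.title(query_text)
    plt.show()
```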

SLIDE 35

Recovery trends: example #1

Query: “The power system was fully restored quickly.”
Matched sentence: “Orion Energy CEO Roger Sutton says most of the west of Christchurch now has fully restored power.”

SLIDE 36

Recovery trends: example #1

Query: “The power system was fully restored quickly.”
Matched sentence: “He had no water but power had been restored in his area.”

SLIDE 37

Recovery trends: example #1

Query: “The power system was fully restored quickly.”
Matched sentence: “TV3 reports that power has now been restored to 60 per cent of Christchurch.”
SLIDE 38

Recovery trends: example #1

Query: “The power system was fully restored quickly.”
Matched sentence: “It had been unable to access the electricity network to restore power and the situation could remain for the next few days.”

SLIDE 39

Recovery trends: example #2

[Chart: frequency of matches per year, 2010–2015]
Query: “Dealing with authorities is causing stress and anxiety.”

SLIDE 40

Recovery trends: example #2

Query: “Dealing with authorities is causing stress and anxiety.”
Matched sentence (rated 5): “The initial trauma may be over but […] Christchurch residents will endure at least six years of ‘man-made’ stressors as the region battles bureaucracy.”

SLIDE 41

Recovery trends: example #2

Query: “Dealing with authorities is causing stress and anxiety.”
Matched sentence (rated 3): “Add to this the growing frustration among the new, youthful leaders of the community who emerged in the wake of the quakes.”

SLIDE 42

Caveats

  • Expected topics generally expressed in the output, but not necessarily relationships/quantities
      • Except some domain-specific entities, e.g., CERA & SCIRT
  • Measurement plots best explored jointly with text output (i.e., quantitative & qualitative)
  • Small sample size (25 sentences per query)
  • Reliance on sentences as the unit of matching
SLIDE 43

Outline

  • 1. Introduction
  • 2. Case study: 2010-2011 Canterbury earthquake disaster
  • 3. NLP method for semantic matching
  • 4. User evaluation
  • 5. Qualitative/quantitative output
  • 6. Conclusions
SLIDE 44

Conclusion

New NLP method to measure propositions in text corpora

  • Potential applications: long-term recovery planning, exploratory research
  • User study showed participant interest in the method
  • Future work: richer models, further user engagement
SLIDE 45

Thanks!

Contact: lucylin@cs.washington.edu
Website: homes.cs.washington.edu/~lucylin/research/semantic_matching.html
Funding: National Science Foundation (grant #1541025, graduate fellowship)

SLIDE 46

(more slides)

SLIDE 47

In relation to other NLP problems...

  • Dynamics of language across a corpus (e.g., Blei & Lafferty, 2006)
  • Paraphrase (Dolan et al., 2004), entailment (Dagan et al., 2006), semantic similarity (Agirre et al., 2012)
  • Information retrieval, passage retrieval for QA (Tellex et al., 2003)

SLIDE 48

Fast filter details

Pre-trained word vectors:

  • word2vec (Mikolov et al., 2013), pre-trained on Google News (loading sketched below)
  • paraphrastic word vectors (Wieting et al., 2015), based off the PPDB
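One possible way to load the Google News word2vec vectors, using gensim; the slides do not specify the authors' exact tooling, so treat this as an assumption:

```python
from gensim.models import KeyedVectors

# Assumes the pre-trained Google News binary file has been downloaded locally.
word_vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True)
print(word_vectors["earthquake"].shape)  # (300,)
```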

SLIDE 49

Tree edit classifier details

Original model (Heilman and Smith, 2010):

  • Extract 39 integer features from the tree edit sequence: sequence length, counts of edit types
  • Logistic regression (LR) → m(s_p, s) (a feature-extraction sketch follows below)
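A sketch of this setup; the features below (sequence length plus per-type counts) are illustrative stand-ins for the 39 features of Heilman and Smith (2010), and scikit-learn is an assumed implementation choice:

```python
from collections import Counter
from sklearn.linear_model import LogisticRegression

EDIT_TYPES = ["insert", "delete", "relabel"]

def edit_features(edit_sequence):
    """edit_sequence: list of (op_type, word) tuples from the tree edit model."""
    counts = Counter(op for op, _ in edit_sequence)
    return [len(edit_sequence)] + [counts[t] for t in EDIT_TYPES]

# Given feature vectors X and binary match labels y for training pairs:
#     clf = LogisticRegression().fit(X, y)
#     m = clf.predict_proba([edit_features(seq)])[0, 1]  # match probability m(s_p, s)
```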
SLIDE 50

Tree edit classifier details

New variation: feed the tree edit sequence into an LSTM. Each operation in the sequence is vectorized as:

  • One-hot encoding of the operation type
  • Word vector ∆ between the sentences pre- and post-operation:
      • insert → word embedding of new word
      • relabel → difference between word embeddings
      • delete → negated word embedding of deleted word

(A minimal sketch follows below.)
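A minimal PyTorch sketch of this variant; dimensions, layer sizes, and class names are assumptions for illustration, not the authors' configuration:

```python
import torch
import torch.nn as nn

OP_TYPES = ["insert", "delete", "relabel"]
WORD_DIM = 300  # assumed word-embedding dimensionality

def encode_op(op_type, delta_vector):
    """One-hot operation type concatenated with the word-vector delta
    (insert: new word's vector; relabel: vector difference; delete: negated vector)."""
    one_hot = torch.zeros(len(OP_TYPES))
    one_hot[OP_TYPES.index(op_type)] = 1.0
    return torch.cat([one_hot, delta_vector])

class EditSequenceMatcher(nn.Module):
    def __init__(self, hidden_dim=128):
        super().__init__()
        self.lstm = nn.LSTM(len(OP_TYPES) + WORD_DIM, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, 1)

    def forward(self, op_vectors):
        # op_vectors: (batch, sequence_length, len(OP_TYPES) + WORD_DIM)
        _, (h_n, _) = self.lstm(op_vectors)
        return torch.sigmoid(self.out(h_n[-1]))  # match probability per pair
```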
SLIDE 51

Tree edit classifier details

Training: SNLI corpus (Bowman et al., 2015)

  • 570k pairs of sentences
  • Labels: entailment, contradiction, neutral
  • e.g., “A soccer game with multiple males playing.” entails “Some men are playing a sport.”

Mapping to our problem:

  • s → premise, s_p → hypothesis
  • match → entailment; non-match → contradiction/neutral

(A possible label-mapping sketch follows below.)
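One way to derive binary match/non-match labels from SNLI, shown with the Hugging Face `datasets` library as an assumed convenience (in that release, label ids are 0 = entailment, 1 = neutral, 2 = contradiction):

```python
from datasets import load_dataset

snli = load_dataset("snli", split="train")

def to_match_label(example):
    # match ⇔ entailment; contradiction/neutral become non-match.
    return {"match": 1 if example["label"] == 0 else 0}

binary = snli.filter(lambda ex: ex["label"] != -1).map(to_match_label)
print(binary[0]["premise"], "→", binary[0]["hypothesis"], binary[0]["match"])
```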

SLIDE 52

Disaster recovery queries

  • Residents are frustrated by the slow pace of recovery.
  • The repair programme is on schedule to be completed.
  • Money for repairs is running out.
  • The council should have consulted residents before making decisions.
  • Mental health rates have been rising.
  • Dealing with authorities is causing stress and anxiety.
  • Most eligible property owners have accepted insurance offers.
  • Confidence in Cera has been trending downwards.
  • Water quality declined after the earthquakes.
  • The power system was fully restored quickly.

SLIDE 53

Disaster recovery queries

  • Cera missed several recovery milestones.
  • Prices levelled off as more homes were fixed or rebuilt.
  • People are suffering because they’ve lost the intimacy of their relationships.
  • Coordination between rebuild groups has been problematic.
  • Few people said insurance companies had done a good job.
  • Having the art gallery back makes the city feel more whole.
  • Scirt has spent less money than predicted.

SLIDE 54

Disaster recovery queries

  • Traffic congestion was severe due to road repairs.
  • Some of the businesses forced out by the earthquake are returning.
  • Some of the burden on mental health services is caused by lack of housing.