NaturalLI: Natural Logic Inference for Common Sense Reasoning


SLIDE 1

NaturalLI: Natural Logic Inference for Common Sense Reasoning

Angeli & Manning (2014) (MacCartney 2007)

SLIDE 2
  • Introduction: Motivation examples
  • Natural Logic:

○ Lexical Relations ○ Monotonicity and Polarity ○ Proof by alignment

  • Inference as Search
  • Results
  • Discussion
SLIDE 3

Natural Language Inference (NLI): recognizing textual entailment

Does premise P justify an inference to hypothesis H?

P: Every firm polled saw costs grow more than expected, even after adjusting for inflation.

H: Every big company in the poll reported cost increases.

YES. What if we change the quantifiers to Some?

SLIDE 4

Does premise P justify an inference to hypothesis H?

P: The cat ate a mouse
H: No carnivores eat animals

NO. Natural Language Inference is necessary for the ultimate goal of full natural language understanding.

(It also enables semantic search, question answering, …)

SLIDE 5

Proposed approaches:

  • NLP on text: operates on the surface form of the text, but we need more logical subtlety.
  • First-order logic: theorem proving, but it is intractable ("unnatural language!").
  • Natural Logic: an intermediate representation.

SLIDE 6

What is Natural Logic?

If I mutate a sentence in this specified way, do I preserve its truth?

A logic whose vehicle of inference is natural language (Lakoff, 1970). Instantaneous semantic parsing!

It characterizes valid patterns of inference in terms of surface forms, which enables precise reasoning while avoiding the difficulties of full semantic interpretation.

  • Influenced by traditional logic: Aristotle's syllogisms and syllogistic reasoning.
  • Monotonicity calculus (Sánchez Valencia, 1986-91).
  • MacCartney's Natural Logic: extends the monotonicity calculus to account for negation and exclusion.

SLIDE 7

Basic entailment lexical relations

  • couch ≡ sofa (equivalence)
  • crow ⊏ bird (forward entailment)
  • utahn ⊏ american (forward entailment)
  • human ^ nonhuman (exhaustive exclusion)
  • cat | dog (non-exhaustive exclusion)
  • animal ‿ nonhuman (exhaustive non-exclusion)
  • cat # friendly (independence)
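As a toy illustration of these seven basic relations (a minimal sketch, not NaturalLI's code; the lexicon contains only the slide's own example pairs):

```python
# MacCartney's seven basic lexical entailment relations, as symbols,
# plus a toy lexicon mapping word pairs to their relation.
RELATIONS = {
    "equivalence": "≡", "forward entailment": "⊏", "reverse entailment": "⊐",
    "negation": "^", "alternation": "|", "cover": "‿", "independence": "#",
}

LEXICON = {
    ("couch", "sofa"): "≡",
    ("crow", "bird"): "⊏",
    ("utahn", "american"): "⊏",
    ("human", "nonhuman"): "^",
    ("cat", "dog"): "|",
    ("animal", "nonhuman"): "‿",
    ("cat", "friendly"): "#",
}

def relation(x, y):
    # Pairs not in the toy lexicon conservatively default to independence.
    return LEXICON.get((x, y), "#")
```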

SLIDE 8

Relations are defined for all semantic types:

  • Adjectives: tiny ⊏ small
  • Verbs: dance ⊏ move
  • Temporal expressions: this morning ⊏ today
  • Locatives: in Beijing ⊏ in China
  • Quantifiers: everyone ⊏ someone, all ⊏ most ⊏ some

SLIDE 9

Small example: apple ⊏ fruit, therefore eat apple ⊏ eat fruit.

SLIDE 10

Entailment and semantic composition

How do the entailments of a compound expression depend on the entailments of its parts?

  • Typically, semantic composition preserves entailment relations:

eat apple ⊏ eat fruit, big bird | big fish

  • But many semantic functions behave differently:

tango ⊏ dance, but refuse to tango ⊐ refuse to dance; european | african, but not european ‿ not african; cats ⊏ animals, and some cats ⊏ some animals.

SLIDE 11

Polarity

Hypernymy forms a partial order. Polarity is the direction in which a lexical item can move along that order while preserving truth.

SLIDE 12

Polarity

Quantifiers determine the polarity of words.
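The idea that quantifiers determine polarity can be sketched with standard monotonicity signatures from the monotonicity calculus (a minimal illustration; the function and table names are mine, not the paper's):

```python
# Monotonicity signatures of common quantifiers: the direction in which
# the first argument (restrictor) and second argument (body) may move
# while preserving truth. "down" = may be replaced by a more specific
# term; "up" = by a more general one; "none" = non-monotone.
SIGNATURES = {
    "all":  ("down", "up"),
    "some": ("up", "up"),
    "no":   ("down", "down"),
    "most": ("none", "up"),
}

def polarity(quantifier, argument):
    """Polarity of a word sitting in argument slot 1 or 2 of the quantifier."""
    restrictor, body = SIGNATURES[quantifier]
    return restrictor if argument == 1 else body
```

For example, in "All cats like milk" the restrictor "cats" has downward polarity: it may safely be specialized ("All tabby cats like milk"), but not generalized.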


SLIDE 19

Projecting relations induced by lexical mutations

Projection function: for two sentences differing by a single lexical mutation, maps the lexical relation of the mutation to the relation between the full sentences (e.g., in a downward context ⊏ projects to ⊐). Join table: composes two projected relations into one.

SLIDE 20

Projection examples

  • cat | dog  ⇒  no cat ‿ no dog
  • animal ‿ nonhuman  ⇒  failed to be animal | failed to be nonhuman
  • cat ⊏ animal  ⇒  no cats eat mice ⊐ no animals eat mice
  • Join: fish | human with human ^ nonhuman gives fish ⊏ nonhuman
  • Join: feline ⊐ cat with cat | dog gives feline # dog
  • Join: cat ⊏ feline with feline | dog gives cat | dog
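A few entries of the join table can be written down directly (an illustrative sketch: only the combinations used in the examples above are filled in, and every unlisted combination conservatively defaults to independence rather than reproducing MacCartney's full 7x7 table):

```python
# Partial join table: composing relation r1 (between x and y) with
# relation r2 (between y and z) yields the relation between x and z.
JOIN = {
    ("⊏", "⊏"): "⊏",   # cat ⊏ feline, feline ⊏ animal  =>  cat ⊏ animal
    ("⊏", "|"): "|",   # cat ⊏ feline, feline | dog      =>  cat | dog
    ("|", "^"): "⊏",   # fish | human, human ^ nonhuman  =>  fish ⊏ nonhuman
    ("⊐", "|"): "#",   # feline ⊐ cat, cat | dog         =>  indeterminate
}

def join(r1, r2):
    # Combinations not listed are treated as indeterminate ("#").
    return JOIN.get((r1, r2), "#")
```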

SLIDE 21

Proof by alignment

1. Find a sequence of edits connecting P and H: insertions, deletions, substitutions.
2. Determine the lexical entailment relation for each edit:

  • Substitutions: the relation depends on the meaning of the substituends: cat | dog.
  • Deletions: ⊏ by default: dark chocolate ⊏ chocolate.
  • But some deletions are special: not ill ^ ill, refuse to go | go.
  • Insertions: symmetric to deletions: ⊐ by default.

3. Project each lexical relation up to find the entailment relation across each edit.
4. Join the entailment relations across the sequence of edits.
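The four steps above can be combined into a toy pipeline (a simplified sketch: the projection rule and join table here are tiny subsets of the real ones, and the names are illustrative):

```python
def project(rel, mono):
    # Upward contexts preserve the relation; downward contexts swap
    # forward and reverse entailment. Anything else defaults to "#".
    swap = {"⊏": "⊐", "⊐": "⊏", "≡": "≡"}
    return rel if mono == "up" else swap.get(rel, "#")

# Join with equivalence is the identity; two other illustrative entries.
JOIN = {("≡", r): r for r in "≡⊏⊐^|‿#"}
JOIN.update({("⊏", "⊏"): "⊏", ("|", "^"): "⊏"})

def entailment(edits):
    """edits: list of (lexical_relation, context_monotonicity) pairs,
    one per edit in the alignment, joined left to right."""
    state = "≡"
    for rel, mono in edits:
        state = JOIN.get((state, project(rel, mono)), "#")
    return state
```

For instance, a single ⊏ substitution in an upward context yields entailment, while the same substitution in a downward context yields reverse entailment.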

SLIDE 22

Example:

P: Stimpy is a cat
H: Stimpy is not a poodle

(The slide walks through a table of edits i, each mutation, its lexical relation r, and the projected relation s, transforming "Stimpy is a cat" into "Stimpy is not a poodle".)

SLIDE 23

A more complex example

SLIDE 24

Common Sense Reasoning with Natural Logic

Task: given an utterance and a large knowledge base of supporting facts, we want to know whether the utterance is true or false.

SLIDE 25

Common Sense Reasoning for NLP

SLIDE 26

Common Sense Reasoning for Vision

SLIDE 27

Start with a (large) Knowledge Base >> Infer new facts

SLIDE 28
SLIDE 29

Infer new facts, on demand from a query

SLIDE 30

Using text as the meaning representation

SLIDE 31

Without aligning to any particular premise

SLIDE 32

Natural Logic inference is search

SLIDE 33

Example search as graph search
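The search itself can be sketched as uniform-cost search over sentence mutations (an illustrative toy: `neighbors` stands in for NaturalLI's mutation edges and is an assumption of this sketch, not the system's API):

```python
import heapq

def search(start, neighbors, knowledge_base, max_cost=10.0):
    """Uniform-cost search from the query sentence over mutation edges.
    Returns the cheapest reachable knowledge-base fact and its cost,
    or None if nothing is reachable within max_cost."""
    frontier = [(0.0, start)]
    seen = set()
    while frontier:
        cost, sent = heapq.heappop(frontier)
        if sent in knowledge_base:
            return sent, cost            # a supporting premise was found
        if sent in seen or cost > max_cost:
            continue
        seen.add(sent)
        for nxt, step in neighbors(sent):
            heapq.heappush(frontier, (cost + step, nxt))
    return None
```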

SLIDE 39

Edges of the graph

SLIDE 40

Edge templates

SLIDE 41

“Soft” Natural Logic

Likely (but not certain) inferences

  • Each edge has a cost ≥ 0.

Detail: the cost varies among edge instances of a template:

  • WordNet substitutions: nearest-neighbor distance.
  • In most other cases the distance is 1.
  • Call this edge distance f.
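Under these assumptions (a learned per-template weight `theta[t]` and a per-instance distance `f`; the names are mine, not the paper's), the cost of an inference path is just a weighted sum:

```python
def edge_cost(theta, template, f=1.0):
    # Cost of one edge: the template's weight scaled by the instance
    # distance f (1 by default, a nearest-neighbor distance for WordNet).
    return theta[template] * f

def path_cost(theta, edges):
    # Total cost of an inference path: sum of its edge costs.
    return sum(edge_cost(theta, t, f) for t, f in edges)
```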
SLIDE 42

Experiments

  • Knowledge base: 270 million unique lemmatized premises (Ollie extractions over Wikipedia: short canonical utterances).
  • Evaluation set: semi-curated collection of common-sense (true) facts.
  • Negatives: collected via Mechanical Turk.
  • Size: 1378 train, 1080 test.
SLIDE 43

Results

SLIDE 44

References

Some of the material for these slides was also extracted from the following links:

  • Modeling Semantic Containment and Exclusion in Natural Language Inference. Bill MacCartney, 2008: https://slideplayer.com/slide/5095504/
  • NaturalLI. G. Angeli, 2014: https://cs.stanford.edu/~angeli/talks/2014-emnlp-naturalli.pdf

SLIDE 45
SLIDE 46

Equations

  • Edge-cost features include the normalized frequency of the word in the Google N-gram corpus and neural-network embeddings (Huang et al.).
  • The log-likelihood of the data D is expressed as a function of the edge costs.
  • The objective function is the negative log-likelihood with L2 regularization.
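A hedged reconstruction of this training setup in generic notation (the symbols are illustrative, not necessarily the paper's; validity is modeled as a logistic function of the path cost):

```latex
% Probability that an inference with path cost c(x;\theta) is valid:
p_\theta(\text{valid} \mid x) = \frac{1}{1 + e^{\,c(x;\theta)}}

% Objective: negative log-likelihood of the data D with L2 regularization:
\min_\theta \; -\sum_{(x,y)\in D} \log p_\theta(y \mid x) \;+\; \lambda \,\lVert\theta\rVert_2^2
```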