

SLIDE 1

Lambdas, Vectors, and Dynamic Logic

SLIDE 2

Develop: a vector semantics and a dynamic logic for lambda calculus models of language

SLIDE 3

Mehrnoosh Sadrzadeh Queen Mary University of London

Joint work with Reinhard Muskens (Tilburg) Supported by Royal Society Int. Exchange Award

SLIDE 4

Why put lambdas and vectors together?

SLIDE 5

Why put lambdas and vectors together? Because we want to develop a compositional distributional semantics for natural language.

SLIDE 6

Why put lambdas and vectors together? Because we want to develop a compositional distributional semantics for natural language.

What is that?

SLIDE 7

Distributional Semantics

Words that occur in similar contexts have similar meanings. “oculist and eye-doctor . . . occur in almost the same environments” “If A and B have almost identical environments. . . we say that they are synonyms.” Harris (1954) “You shall know a word by the company it keeps!” Firth (1957)

SLIDE 8

Imagine you had never seen the word tesguino, but were given the following text: A bottle of tesguino is on the table. Everybody likes tesguino. Tesguino makes you drunk. We make tesguino out of corn. The meaning of a word is thus related to the distribution of words around it.

Speech and Language Processing. Daniel Jurafsky & James H. Martin.

SLIDE 9

Imagine you had never seen the word tesguino, but were given the following text: A bottle of tesguino is on the table. Everybody likes tesguino. Tesguino makes you drunk. We make tesguino out of corn. Tesguino: “an alcoholic drink made of corn”.

SLIDE 10

Co-Occurrence Matrix

Context snippets:
• … sugar, a sliced lemon, a tablespoonful of apricot preserve or jam, a pinch each of …
• … their enjoyment. Cautiously she sampled her first pineapple and another fruit whose taste she likened …
• … well suited to programming on the digital computer. In finding the optimal R-stage policy from …
• … for the purpose of gathering data and information necessary for the study authorized in the …

Figure 15.4: Co-occurrence vectors for four words, computed from the Brown corpus:

              aardvark  ...  computer  data  pinch  result  sugar
apricot          0      ...     0       0      1      0       1
pineapple        0      ...     0       0      1      0       1
digital          0      ...     2       1      0      1       0
information      0      ...     1       6      0      4       0

[| digital |] := (0, …, 2, 1, 0, 1, 0)

Algorithm for Word Meaning Acquisition

SLIDE 11

Co-Occurrence Matrix

Same context snippets and counts as above (Figure 15.4, Brown corpus); the vector for digital, normalised:

[| digital |] := (0, …, 0.25, 0.12, 0, 0.15, 0)

Algorithm for Word Meaning Acquisition

Normalised
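The counting-and-normalising step can be sketched in a few lines. This is a toy illustration with a made-up corpus and a window size of my own choosing, not the Brown-corpus setup of the figure:

```python
from collections import defaultdict

def cooccurrence(sentences, window=2):
    """Count how often each word co-occurs with words within +/-window."""
    counts = defaultdict(lambda: defaultdict(int))
    for sent in sentences:
        toks = sent.lower().split()
        for i, w in enumerate(toks):
            for j in range(max(0, i - window), min(len(toks), i + window + 1)):
                if i != j:
                    counts[w][toks[j]] += 1
    return counts

def normalise(row):
    """Turn a row of raw counts into a probability distribution."""
    total = sum(row.values())
    return {w: c / total for w, c in row.items()}

corpus = ["we make tesguino out of corn", "everybody likes tesguino"]
raw = cooccurrence(corpus)
print(dict(raw["tesguino"]))       # raw co-occurrence counts
print(normalise(raw["tesguino"]))  # normalised row, sums to 1
```

Real systems use much larger windows or syntactic contexts, and weightings such as PPMI instead of plain normalisation.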

SLIDE 12

Word-Context Matrix

Same context snippets and counts as above (Figure 15.4, Brown corpus).

Intuition: a word is represented by embedding it into a vector space, rather than treating it as an “atom”:

Q: What’s the meaning of life? A: LIFE

SLIDE 13

Automatic Meaning Acquisition

Curran, 2006

Introduction: launch, implementation, advent, addition, arrival, creation, inclusion
Evaluation: assessment, examination, appraisal, review, audit, analysis, consultation, test, verification
Methods: technique, procedure, means, approach, tool, concept, practice, formula

SLIDE 14

Semantic Similarity

[Plot: digital, apricot, and information as points in a two-dimensional co-occurrence space; Dimension 1: ‘large’, Dimension 2: ‘data’.]
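Similarity in such a space is standardly measured by the cosine of the angle between vectors. A quick check on the count vectors from the co-occurrence matrix above (dimensions: computer, data, pinch, result, sugar):

```python
import math

def cosine(u, v):
    """Cosine similarity of two count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

digital     = [2, 1, 0, 1, 0]
information = [1, 6, 0, 4, 0]
apricot     = [0, 0, 1, 0, 1]

print(cosine(digital, information))  # similar contexts: clearly above 0
print(cosine(digital, apricot))      # no shared contexts: exactly 0.0
```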

SLIDE 15

Evaluations

WordSim-353: noun pairs (cup, coffee)
SimLex-999: adjective, noun, and verb pairs (cup, drink)
TOEFL: 80 questions, e.g. “levied” is closest in meaning to: imposed, believed, requested, correlated
SCWS: 2003 words in sentences
Analogy: a is to b as c is to d, e.g. Athens is to Greece as Oslo is to Norway

SLIDE 16

Applications

Drawing inferences about the meanings of words. Named entity recognition, parsing, semantic role labelling, summarisation, essay marking, …

SLIDE 17

Extending distributional semantics from words to sentences Compositional Distributional Semantics

SLIDE 18

Vectors for Sentences

[Plot: word vectors such as digital = [1,1] and information = [6,4] live in a space of context dimensions (result, data); in which space, on which bases (base 1, base 2), do sentence vectors such as “dogs bark” and “mice squeak” live?]

SLIDE 19

In childhood we spent the summers in the countryside. Khaterey daram sharbat-e albalooye anjast ast. My mother preferred the jams, however. My worst memory is being forced to take a nap in the afternoons.

A direct distributional approach…

SLIDE 20

In childhood we spent the summers in the countryside. Khaterey daram sharbat-e albalooye anjast ast. My mother preferred the jams, however. My worst memory is being forced to take a nap in the afternoons. “A memory from then is their sour cherry drink.”

A direct distributional approach…

SLIDE 21

An approach that forgets about grammar:

  vec(vampires kill men) = vec(vampires) + vec(kill) + vec(men)
  vec(vampires kill men) = vec(vampires) ⊙ vec(kill) ⊙ vec(men)

Either way:

  vec(vampires kill men) = vec(men kill vampires)
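A two-line check that the additive model really is order-insensitive, with made-up toy vectors:

```python
# Additive composition is a bag-of-words model: summing word vectors
# discards word order, so both sentences below get the same vector.
vampires = [1.0, 0.0, 2.0]   # toy 3-d vectors, values made up
kill     = [0.5, 1.5, 0.0]
men      = [2.0, 1.0, 1.0]

def add(*vectors):
    """Componentwise sum of any number of equal-length vectors."""
    return [sum(components) for components in zip(*vectors)]

assert add(vampires, kill, men) == add(men, kill, vampires)
print(add(vampires, kill, men))  # [3.5, 2.5, 3.0]
```

The same holds for pointwise multiplication, since it is also commutative.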

SLIDE 22

So what should we do?

SLIDE 23

Formal Semantics

Syntax → Semantics (via a structure-preserving map)

SLIDE 24

Formal Semantics with Vectors

Syntax → Semantics (via a structure-preserving map)
Formal Grammar → Vectors

SLIDE 25

Compositional Distributional Semantics

Syntax → Semantics (via a structure-preserving map)
Pregroup Grammar → Vector Spaces (a strongly monoidal functor)

Coecke, Sadrzadeh, Clark (Lambek’s 90th Festschrift), 2010; Preller, Sadrzadeh (JoLLI), 2011

SLIDE 26

Compositional Distributional Semantics

Syntax → Semantics (via a structure-preserving map)
Lambek Calculus → Vector Spaces (a homomorphism)

Coecke, Grefenstette, Sadrzadeh (APAL), 2013

SLIDE 27

Compositional Distributional Semantics

Syntax → Semantics (via a structure-preserving map)
Lambek–Grishin Calculus → Vector Spaces (a homomorphism)

G. Wijnholds (MSc Thesis, ILLC), 2015

SLIDE 28

Syntax → Semantics (via a structure-preserving map)
CCG → Vector Spaces

Maillard, Clark, Grefenstette (Type Theory and NL, EACL workshop), 2014; Krishnamurthy and Mitchell (CVSC, ACL workshop), 2013; Baroni, Bernardi, Zamparelli (LiLT), 2014

SLIDE 29

Syntax → Semantics (via a structure-preserving map)
ACG → Vector Spaces (a homomorphism)

Muskens and Sadrzadeh, DSALT, LACL 2016; journal version to appear in JLM 2018

Royal Society International Exchange Award

SLIDE 30 (built up incrementally through SLIDE 42)

CCG Semantics: a structure-preserving map on types and on rules.

Types:
  A ↦ U, a vector space containing vectors V_i and tensors T_i, T_ij, T_ijk, T_ijkl, …
  X/Y ↦ X ⊗ Y*        X\Y ↦ Y* ⊗ X

Rules (function application becomes tensor contraction):
  X/Y Y ⟹ X   ↦   X ⊗ Y* ⊗ Y ⟹ X
  Y X\Y ⟹ X   ↦   Y ⊗ Y* ⊗ X ⟹ X

Contraction acts on the shared index:

  T_ij⋯k ⊗ T_kl⋯w = T_ij⋯l⋯w

In particular, a matrix T_ij contracted with a vector T_j yields a vector T_i.
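The contraction reading of the application rule can be checked with numpy's einsum. The dimensions and values below are arbitrary assumptions, purely for illustration:

```python
import numpy as np

# An adjective of type N/N maps to a matrix T_ij in N ⊗ N*; applying it to
# a noun vector T_j (the rule X/Y Y ⟹ X) contracts the shared index j.
rng = np.random.default_rng(0)
n = 4
adj  = rng.random((n, n))   # T_ij for a word of type N/N
noun = rng.random(n)        # T_j  for a word of type N

result = np.einsum('ij,j->i', adj, noun)   # contraction on j: a vector in N
assert result.shape == (n,)
assert np.allclose(result, adj @ noun)     # the same as matrix-vector product
```

For first-order tensors contraction is just matrix-vector multiplication; the einsum notation pays off for the higher-order tensors of verbs and adverbs.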

SLIDE 43 (built up incrementally through SLIDE 50)

Cats chase dogs noisily.
NP   (S\NP)/NP   NP   S\S

↦   N ⊗ (N* ⊗ S ⊗ N*) ⊗ N ⊗ (S* ⊗ S)

with cats ↦ T_i, chase ↦ T_ijk, dogs ↦ T_k, noisily ↦ T_jl. The contractions proceed as:

  T_i T_ijk T_k T_jl  ⟹  T_i T_ij T_jl  ⟹  T_j T_jl  ⟹  T_l ∈ S
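The derivation above, written out as index contractions in numpy. The dimensions of N and S are toy assumptions:

```python
import numpy as np

# cats ↦ T_i, chase ↦ T_ijk, dogs ↦ T_k, noisily ↦ T_jl; contracting i and k
# leaves T_j, and contracting j with the adverb yields T_l, a sentence vector.
rng = np.random.default_rng(1)
n, s = 3, 2                        # dim(N), dim(S): toy sizes
cats    = rng.random(n)            # T_i
chase   = rng.random((n, s, n))    # T_ijk
dogs    = rng.random(n)            # T_k
noisily = rng.random((s, s))       # T_jl

t_j  = np.einsum('i,ijk,k->j', cats, chase, dogs)   # T_i T_ijk T_k ⟹ T_j
sent = np.einsum('j,jl->l', t_j, noisily)           # T_j T_jl ⟹ T_l ∈ S
assert sent.shape == (s,)

# The stepwise result agrees with contracting everything in one shot.
one_shot = np.einsum('i,ijk,k,jl->l', cats, chase, dogs, noisily)
assert np.allclose(sent, one_shot)
```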

SLIDE 51

Applications

SLIDE 52

Building the Tensors

Ways to build a tensor for a verb:

1. Context vector of the verb: Verb ⊗ Verb ⊗ Verb (Kronecker).
2. Degree of correlation between subject and object: Σ_i (Sbj ⊗ Obj)_i (Relational).
3. Populating a cube tensor from a matrix by copying the object dimension (Frobenius).
4. Populating a cube tensor from a matrix by copying the subject dimension (Frobenius).
5. Baroni and Zamparelli, Polajnar et al.: linear and multi-step linear regression (least squares) for learning these from holistic vectors.

M_ij ↦ T_ijk
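Constructions 2 to 4 can be sketched in a few lines. The vectors are toy values, and reading "copying a dimension" as placing the matrix on a Frobenius diagonal is my interpretation:

```python
import numpy as np

# 2. "Relational": the verb matrix as a sum of subject ⊗ object outer
#    products over the verb's corpus occurrences.
subjects = [np.array([1.0, 0.0]), np.array([0.0, 2.0])]
objects  = [np.array([0.5, 0.5]), np.array([1.0, 0.0])]
verb = sum(np.outer(s, o) for s, o in zip(subjects, objects))

# 3./4. Lift the matrix M_ij to a cube by copying one dimension:
#    copy-object  T_ijk = M_ij * delta_jk
#    copy-subject T_ijk = M_ik * delta_ij
d = verb.shape[0]
copy_object  = np.einsum('ij,jk->ijk', verb, np.eye(d))
copy_subject = np.einsum('ik,ij->ijk', verb, np.eye(d))
assert copy_object.shape == copy_subject.shape == (d, d, d)
```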

SLIDE 53

Disambiguation

x y z triples for the ambiguous verb draw:

  old man draw ceremonial sword   ≈  old man pulled ceremonial sword
  annual report draw attention    ≈  annual report attracted attention
  (but not: annual report pulled attention)

Grefenstette, Sadrzadeh, EMNLP 2012; Journal of Computational Linguistics 2015

SLIDE 54

Disambiguation

sentence 1                           sentence 2
old man draw ceremonial sword        old man attracted ceremonial sword
annual report draw huge attention    annual report attracted huge attention

Model               ρ
Verb Baseline       0.20
Bigram Baseline     0.14
Trigram Baseline    0.16
Additive            0.10
Multiplicative
  AdjMult           0.20
  AdjNoun           0.05
  CategoricalAdj    0.20
Categorical
  AdjMult           0.14
  AdjNoun           0.16
  CategoricalAdj    0.19
Kronecker
  AdjMult           0.26
  AdjNoun           0.17
  CategoricalAdj    0.27
Upperbound          0.48

Grefenstette, Sadrzadeh, EMNLP 2012; Journal of Computational Linguistics 2015

SLIDE 55

Paraphrasing

x          y        z
project    present  problem
man        shut     door
programme  face     difficulty
gentleman  close    eye

Kartsaklis, Sadrzadeh, EMNLP 2013

SLIDE 56

Similarity

Sentence 1                  Sentence 2
man shut door               gentleman close eye
survey collect information  page provide datum
project present problem     programme face difficulty

Model               Ambig.       Disamb.
BL Verbs only       0.310   ≪   0.341
M1 Multiplicative   0.325   ≪   0.404
M2 Additive         0.368   ≪   0.410
T1 Relational       0.368   ≪   0.397
T2 Kronecker        0.404   <    0.412
T3 Copy-subject     0.310   ≪   0.337
T4 Copy-object      0.321   ≪   0.368
Human agreement     0.550

The difference between T2 and M2 is not statistically significant.

Kartsaklis, Sadrzadeh, EMNLP 2013

SLIDE 57

Similarity: the same dataset and models as the previous slide.

Kartsaklis, Sadrzadeh, ACL 2014

SLIDE 58

Entailment

Subject-verb-object pairs:

report describe result   ⊢  document explain process
report outline progress  ⊢  document describe change
value suit budget        ⊢  number meet standard
book present account     ⊢  work show evidence
woman marry man          ⊢  female join male
author retain house      ⊢  person hold property
report highlight lack    ⊢  document stress need

SLIDE 59

Entailment

Model      Inclusion  KL-div  αSkew  WeedsPrec  ClarkeDE  APinc  balAPinc  SAPinc  SBalAPinc
Verb         0.61      0.61    0.66    0.69       0.58     0.74    0.67     0.59     0.63
•            0.55      0.65    0.74    0.79       0.67     0.76    0.71     0.80     0.80
min          0.55      0.71    0.74    0.78       0.63     0.77    0.71     0.73     0.76
+            0.58      0.54    0.71    0.59       0.60     0.65    0.64     0.67     0.67
max          0.58      0.55    0.68    0.58       0.58     0.63    0.61     0.60     0.61
Least-Sqr     –         –       –       –          –        –       –        –        –
⊗rel         0.51      0.64    0.78    0.79       0.69     0.79    0.72     0.84     0.83
⊗proj        0.64      0.60    0.70    0.69       0.61     0.74    0.70     0.75     0.76
⊗CpSbj       0.57      0.65    0.73    0.77       0.63     0.73    0.68     0.79     0.78
⊗CpObj       0.54      0.62    0.73    0.72       0.64     0.76    0.71     0.81     0.79
⊗FrAdd       0.60      0.60    0.75    0.72       0.67     0.77    0.75     0.84     0.82
⊗FrMul       0.55      0.62    0.76    0.81       0.68     0.78    0.73     0.86     0.83

Balkır, Kartsaklis, Sadrzadeh: ISAIM 2016, COLING 2016, LACL 2016, Annals of Mathematics and AI

SLIDE 60

Lambdas and Dynamic Logic

SLIDE 61

Montague Semantics

constant c  type τ        H0(c)                       h0(τ)
woman       N             woman                       est
man         N             man                         est
tall        NN            tall                        (est)est
smokes      DS            smoke                       est
loves       DDS           love                        eest
knows       SDS           λpλxλw.∀w′(Kxww′ → pw′)     (st)est
every       N(DS)S        λP′λPλw.∀x(P′xw → Pxw)      (est)(est)st
a           N(DS)S        λP′λPλw.∃x(P′xw ∧ Pxw)      (est)(est)st

(Term homomorphisms act on terms; type homomorphisms act on types.)

SLIDE 62

A Vector Semantics

c       τ         H(c)                       h(τ)
woman   N         woman                      V
tall    NN        λv.(tall ×1 v)             VV
smokes  DS        λv.(smoke ×1 v)            VV
loves   DDS       λuv.(love ×2 u) ×1 v       VVV
knows   SDS       λuv.(know ×2 u) ×1 v       VVV
every   N(DS)S    λvZ.Z(every ×1 v)          V(VV)V
a       N(DS)S    λvZ.Z(a ×1 v)              V(VV)V

Abstract constants, typed with abstract types, and their terms. ×1: matrix multiplication with a vector; ×2: cube contraction with a vector.
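A concrete reading of ×1 and ×2 as contractions along the first and second argument place. Which index each operation contracts, and the dimensions, are illustrative assumptions:

```python
import numpy as np

# tall is a matrix (type VV), love a cube (VVV); ×1 contracts with the
# argument v, ×2 with u, mirroring λuv.(love ×2 u) ×1 v from the table.
rng = np.random.default_rng(2)
d = 3
tall = rng.random((d, d))       # an adjective: a matrix
love = rng.random((d, d, d))    # a transitive verb: a cube
u, v = rng.random(d), rng.random(d)

tall_v   = np.einsum('ij,j->i', tall, v)     # tall ×1 v: a vector
love_u   = np.einsum('ijk,j->ik', love, u)   # love ×2 u: a matrix
sentence = np.einsum('ik,k->i', love_u, v)   # (love ×2 u) ×1 v: a vector
assert tall_v.shape == (d,) and sentence.shape == (d,)
```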

SLIDE 63

Simpler Option (1): addition

c       τ         H(c)                  h(τ)
woman   N         woman                 V
tall    NN        λv.(tall + v)         VV
smokes  DS        λv.(smoke + v)        VV
loves   DDS       λuv.(love + u) + v    VVV
knows   SDS       λuv.(know + u) + v    VVV
every   N(DS)S    λvZ.Z(every + v)      V(VV)V
a       N(DS)S    λvZ.Z(a + v)          V(VV)V

SLIDE 64

Simpler Option (2): pointwise multiplication

c       τ         H(c)                  h(τ)
woman   N         woman                 V
tall    NN        λv.(tall ⊙ v)         VV
smokes  DS        λv.(smoke ⊙ v)        VV
loves   DDS       λuv.(love ⊙ u) ⊙ v    VVV
knows   SDS       λuv.(know ⊙ u) ⊙ v    VVV
every   N(DS)S    λvZ.Z(every ⊙ v)      V(VV)V
a       N(DS)S    λvZ.Z(a ⊙ v)          V(VV)V

SLIDE 65

Making things dynamic

Update function U : Contexts → Contexts

SLIDE 66

Making things dynamic

Update function U : Contexts → Contexts

Sentences ↦ U
Intransitive verbs ↦ VU
Transitive verbs ↦ VVU

SLIDE 67

Dynamic Vector Semantics

a       τ             H(a)                           ρ(τ)
Anna    (DS)S         λZ.Z(anna)                     (VU)U
woman   N             λZ.Z(woman)                    (VU)U
tall    NN            λQZ.Q(λvc.Zv F(tall, v, c))    ((VU)U)(VU)U
smokes  DS            λvc.G(smoke, v, c)             VU
loves   DDS           λuvc.I(love, u, v, c)          VVU
knows   SDS           λpvc.p J(know, v, c)           UVU
every   N(DS)S        λQ.Q                           ((VU)U)(VU)U
who     (DS)NN        λZ′QZ.Q(λvc.Zv(QZ′c))          (VU)((VU)U)(VU)U
and     (αS)(αS)(αS)  λR′λRλXλc.R′X(RXc)             (ρ(α)U)(ρ(α)U)(ρ(α)U)

Type map: N ↦ (VU)U, D ↦ V, S ↦ U

SLIDE 68

Examples of Contexts

Update function U : Contexts → Contexts

Contexts: co-occurrence matrices, entity-relation cubes

SLIDE 69

Co-occurrence Matrices

Updating an n × k co-occurrence matrix with a tuple of words:

(m_ij) + ⟨a, u, v, ⋯⟩ = (m′_ij)

SLIDE 70

Co-occurrence Matrices

(m_ij) + ⟨a, u, v, ⋯⟩ = (m′_ij), where m′_ij := m_ij + 1

SLIDE 71

Co-occurrence Matrices

(m_ij) + ⟨a, u, v, ⋯⟩ = (m′_ij)

Update by λvc.F(tall, v, c): m′_ij := m_ij + 1, for i the index of tall and j the index of its modifiee v.

SLIDE 72

Co-occurrence Matrices

Update by λvc.G(smoke, v, c) increases the cells of smoke and its subject: m′_ij := m_ij + 1, for i the index of smoke and j the index of its subject v.

SLIDE 73

Co-occurrence Matrices

Update by λuvλc.I(love, u, v, c) increases the cells of love, for i the index of love, j the index of its subject u, and k the index of its object v:

  m′_ij := m_ij + 1
  m′_jk := m_jk + 1
  m′_ik := m_ik + 1
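The updates F, G and I all just increment co-occurrence cells. A dictionary sketch, where the dict-of-dicts representation and function signatures are my own, not the paper's:

```python
# Contexts as dict-of-dict co-occurrence counts; each update bumps the
# cell for a word and its grammatical argument(s) by one.
def bump(c, i, j):
    c.setdefault(i, {})[j] = c.get(i, {}).get(j, 0) + 1

def F(adj, v, c):                 # an adjective and its modifiee
    bump(c, adj, v)

def G(verb, v, c):                # an intransitive verb and its subject
    bump(c, verb, v)

def I(verb, u, v, c):             # a transitive verb, subject u, object v
    bump(c, verb, u)              # m'_ij
    bump(c, u, v)                 # m'_jk
    bump(c, verb, v)              # m'_ik

ctx = {}
G("smoke", "anna", ctx)
I("love", "anna", "cat", ctx)
I("love", "anna", "cat", ctx)
print(ctx["love"]["anna"])  # 2: "anna loves cat" was seen twice
```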

SLIDE 74

Example

Before:
            man   cat   loves  fears  sleeps
  Anna      100   700   800    500    400
  woman     500   650   750    750    600
  tall      300   50    500    400    400
  smokes    400   50    600    600    200
  loves     350   250   ε      600    500
  knows     300   50    200    250    270

After updating by G, I, F, J:
            man   cat   loves  fears  sleeps
  Anna      100   700   800    500    400
  woman     500   650   750    750    600
  tall      650   50    500    400    400
  smokes    700   50    600    600    200
  loves     550   750   ε      600    500
  knows     600   250   450    510    700

SLIDE 75

Entity-Relation Cubes

c_ijk over (entity, relation, entity) triples:

c_ijk + (a, u, v) = c′_ijk, where c′_ijk := c_ijk + 1

SLIDE 76

Entity-Relation Cube Example

(Anna, is-a, smoker): 100, after update by G: 400
(Anna, is, tall): 50, after update by F: 280
(Anna, loves, cat): 200, after update by I: 350

SLIDE 77

Dynamic Logic

φ ::= p | ¬φ | φ ∧ ψ | ⋯

SLIDE 78

Dynamic Logic

φ ::= p | ¬φ | φ ∧ ψ | ⋯

||p||(c) := c + ||p||
||¬φ||(c) := c − ||φ||(c)
||φ ∧ ψ||(c) := ||ψ||(||φ||(c))

c: a context; || − ||: the update function
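The three clauses can be run directly once we pick a concrete context type. Here contexts are sets of atoms and + is union; this is one simple instance of the scheme, whereas the slides instead take matrices and cubes as contexts:

```python
# Formulas as nested tuples: ("atom", "p"), ("not", φ), ("and", φ, ψ).
# A formula denotes a context-update function.
def interp(phi):
    kind = phi[0]
    if kind == "atom":                        # ||p||(c) := c + ||p||
        return lambda c: c | {phi[1]}
    if kind == "not":                         # ||¬φ||(c) := c − ||φ||(c)
        f = interp(phi[1])
        return lambda c: c - f(c)
    if kind == "and":                         # ||φ∧ψ||(c) := ||ψ||(||φ||(c))
        f, g = interp(phi[1]), interp(phi[2])
        return lambda c: g(f(c))

print(interp(("and", ("atom", "p"), ("atom", "q")))(set()))  # {'p', 'q'}
```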

SLIDE 79

Dynamic Logic

||p||(c) := c + ||p||
||¬φ||(c) := c − ||φ||(c)
||φ ∧ ψ||(c) := ||ψ||(||φ||(c))

Heim, Karttunen: c and ||p|| are sets of valuations, and c + ||p|| = c ∩ ||p||

SLIDE 80

Dynamic Logic

||p||(c) := c + ||p||
||¬φ||(c) := c − ||φ||(c)
||φ ∧ ψ||(c) := ||ψ||(||φ||(c))

What is + for us? Contexts = matrices or cubes.

SLIDE 81

Dynamic Logic with Vector Semantics

||S||(c) := c +′ H(S)
c −′ H(S) := (c +′ H(S))′

S: a sentence; H(S): its term homomorphism image

SLIDE 82

Dynamic Logic with Vector Semantics

For co-occurrence matrices, the updates are binarised:

(m_ij) +′ ⟨a, u, v, ⋯⟩ = (m′_ij), where m′_ij := 1 if m_ij ≥ 1, and 0 if m_ij = 0
(m_ij) −′ ⟨a, u, v, ⋯⟩ = (m′_ij), where m′_ij := 0 if m_ij ≥ 1, and 1 if m_ij = 0
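A sketch of one reading of the binarised updates, assuming cells with at least one observed co-occurrence map to 1 and the negative update flips the bits:

```python
def binarise(matrix):
    # cells with at least one observed co-occurrence become 1, others 0
    return [[1 if cell >= 1 else 0 for cell in row] for row in matrix]

def complement(matrix):
    # the negative counterpart: flip every bit of the binarised matrix
    return [[1 - cell for cell in row] for row in binarise(matrix)]

m = [[100, 700, 0], [0, 650, 750]]
print(binarise(m))    # [[1, 1, 0], [0, 1, 1]]
print(complement(m))  # [[0, 0, 1], [1, 0, 0]]
```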

SLIDE 83

Dynamic Logic with Vector Semantics

Binary versions of the contexts.

            man   cat   loves  fears  sleeps
  Anna      100   700   ·      500    400
  woman     ·     650   750    750    600
  tall      300   50    500    400    400

SLIDE 84

Dynamic Logic with Vector Semantics

Binary versions of the contexts.

            man   cat   loves  fears  sleeps
  Anna      1     1     ·      1      1
  woman     ·     1     1      1      1
  tall      1     1     1      1      1

SLIDE 85

Dynamic Logic with Vector Semantics

Basically, we are working with the graph of the relation underlying the matrix:

(1 0; 1 1) ↦ {(1, 1), (2, 1), (2, 2)}

and then +′ = c ∩ ||p||

SLIDE 86

Dynamic Logic with Vector Semantics

This semantics is indeed “dynamic”.

Observation: the update obtained by a sequence of sentences is the same as the update obtained by their composition:

||S1, ⋯, Sn||(c) := ||S1|| ⋯ ||Sn||(c)

SLIDE 87

Dynamic Logic with Vector Semantics

This semantics is indeed “dynamic”.

Proposition: the context corresponding to a sequence of sentences is the empty context (the zero matrix or cube) updated by them:

c = ||S1, ⋯, Sn||(0)

SLIDE 88

Application

Admittance of a proposition by a context (Karttunen and Heim):

a context c admits a proposition φ  ⟺  ||φ||(c) = c

SLIDE 89

Application

Admittance of a proposition by a context (Karttunen and Heim): a context c admits a proposition φ ⟺ ||φ||(c) = c.

Definition: a corpus admits a sentence if the context corresponding to the corpus admits it.
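Admittance is then a fixpoint check. With set-based contexts and updates this is one line; an illustrative instance, not the matrix version:

```python
# c admits φ iff updating c by φ leaves c unchanged.
def admits(context, update):
    return update(context) == context

update_p = lambda c: c | {"p"}        # ||p||(c) := c ∪ {p}
print(admits({"p", "q"}, update_p))   # True: p is already in the context
print(admits(set(), update_p))        # False: the update changes the context
```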

SLIDE 90

Application

The count matrix is binarised, and updating by an admitted sentence returns the same binary matrix:

            man   cat   loves  fears  sleeps
  Anna      100   700   ·      500    400
  woman     ·     650   750    750    600
  tall      300   50    500    400    400

binarise and update:

            man   cat   loves  fears  sleeps
  Anna      1     1     ·      1      1
  woman     ·     1     1      1      1
  tall      1     1     1      1      1

(the binary matrix is unchanged by the update)

SLIDE 91

Example

‘Cats and dogs are animals that sleep. Cats chase cats and mice. Dogs chase all animals. Cats like mice, but mice fear cats, since cats eat mice. Cats smell mice and mice run from cats.’

It admits the following sentences:

Cats are animals. Dogs are animals. Cats chase cats. Cats chase mice. Dogs chase cats and dogs.

It does not admit the starred sentences:

(*) Cats like dogs. (*) Cats eat dogs. (*) Dogs run from cats. (*) Dogs like mice. (*) Mice fear dogs. (*) Dogs eat mice.

(*) Cats are not animals. (*) Dogs do not sleep.

SLIDE 92

Experimental Evaluation

fracas-013 (answer: yes)
P1: Both leading tenors are excellent.
P2: Leading tenors who are excellent are indispensable.
Q: Are both leading tenors indispensable?
H: Both leading tenors are indispensable.

Check whether the context P1, P2 admits the sentence H.

SLIDE 93

Experimental Evaluation

Zeichner dataset:
S1: Parents have a great influence on the career development of their children.
S2: Parents have a powerful influence on the career development of their children.

Check whether the context S1 admits the sentence S2.

SLIDE 94

Conclusions and Future Work

Developed a vector semantics for lambda calculus models of natural language.
Developed a dynamic logic for this vector semantics.
It does not matter where the lambdas come from: they can come from ACG, CCG, the Lambek Calculus, etc.
Future work: extend to reference resolution (dynamic modality); experiment with the notion of admittance.