Lambdas, Vectors, and Dynamic Logic (PowerPoint PPT Presentation)
Develop: a vector semantics and a dynamic logic for lambda calculus models of language
Mehrnoosh Sadrzadeh Queen Mary University of London
Joint work with Reinhard Muskens (Tilburg) Supported by Royal Society Int. Exchange Award
Why put lambdas and vectors together? Because we want to develop a compositional distributional semantics for natural language.
What is that?
Distributional Semantics
Words that occur in similar contexts have similar meanings. “oculist and eye-doctor . . . occur in almost the same environments” “If A and B have almost identical environments. . . we say that they are synonyms.” Harris (1954) “You shall know a word by the company it keeps!” Firth (1957)
Imagine you had never seen the word tesguino, but were given the following text: A bottle of tesguino is on the table. Everybody likes tesguino. Tesguino makes you drunk. We make tesguino out of corn. The meaning of a word is thus related to the distribution of the words around it.
Speech and Language Processing. Daniel Jurafsky & James H. Martin.
Imagine you had never seen the word tesguino, but were given the following text: A bottle of tesguino is on the table. Everybody likes tesguino. Tesguino makes you drunk. We make tesguino out of corn. Then you could infer: “an alcoholic drink made of corn”.
Co-Occurrence Matrix
Context excerpts from the Brown corpus:
…sugar, a sliced lemon, a tablespoonful of apricot preserve or jam, a pinch each of…
…their enjoyment. Cautiously she sampled her first pineapple and another fruit whose taste she likened…
…well suited to programming on the digital computer. In finding the optimal R-stage policy from…
…for the purpose of gathering data and information necessary for the study authorized in the…

              aardvark ... computer  data  pinch  result  sugar ...
apricot          0             0       0     1      0       1
pineapple        0             0       0     1      0       1
digital          0             2       1     0      1       0
information      0             1       6     0      4       0

Figure 15.4: Co-occurrence vectors for four words, computed from the Brown corpus.
[| digital |] := (0, …, 2, 1, 0, 1, 0)
Algorithm for Word Meaning Acquisition
Normalised: [| digital |] := (0, …, 0.25, 0.12, 0, 0.15, 0)
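The acquisition algorithm above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation; the toy corpus and the window size are invented for the example.

```python
from collections import defaultdict

def cooccurrence_vectors(corpus, window=2):
    """Count co-occurrences of each word with the words around it
    (within +/- `window` positions), then normalise each row so the
    counts become relative frequencies, as on the slide."""
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in corpus:
        tokens = sentence.lower().split()
        for i, word in enumerate(tokens):
            for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
                if j != i:
                    counts[word][tokens[j]] += 1
    vectors = {}
    for word, ctx in counts.items():
        total = sum(ctx.values())
        vectors[word] = {c: n / total for c, n in ctx.items()}
    return vectors

corpus = ["the digital computer stores data",
          "digital data and information processing"]
vecs = cooccurrence_vectors(corpus)
```

Each word's row sums to 1 after normalisation, so the entries can be read as relative frequencies of its contexts.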
Word-Context Matrix
Intuition: a word is represented by embedding it into a vector space, rather than treating it as an “atom”:
Q: What’s the meaning of life? A: LIFE
Automatic Meaning Acquisition
Curran, 2006
Introduction: launch, implementation, advent, addition, arrival, creation, inclusion
Evaluation: assessment, examination, appraisal, review, audit, analysis, consultation, test, verification
Methods: technique, procedure, means, approach, tool, concept, practice, formula
Stronghold
Semantic Similarity
[Figure: digital, apricot, and information plotted in two dimensions; Dimension 1: ‘large’, Dimension 2: ‘data’]
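Similarity between such vectors is standardly measured by the cosine of the angle between them. A minimal sketch, using the raw counts from Figure 15.4 (contexts: computer, data, pinch, result, sugar):

```python
import math

def cosine(u, v):
    """Cosine similarity: 1 for parallel vectors, 0 for orthogonal ones."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# counts over the contexts [computer, data, pinch, result, sugar]
apricot     = [0, 0, 1, 0, 1]
digital     = [2, 1, 0, 1, 0]
information = [1, 6, 0, 4, 0]

print(cosine(digital, information))  # high: shared computer/data contexts
print(cosine(digital, apricot))      # 0.0: no shared contexts
```

Cosine ignores vector length, so frequent and rare words are compared only by the shape of their context distributions.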
Evaluations
WordSim-353: noun pairs (cup, coffee)
SimLex-999: adjective, noun, and verb pairs (cup, drink)
TOEFL: 80 questions, e.g. “levied” is closest in meaning to: imposed, believed, requested, correlated
SCWS: 2003 words in sentences
Analogy: a is to b as c is to d, e.g. Athens is to Greece as Oslo is to Norway
Applications
Drawing inferences about meanings of words. Named entity recognition, parsing, semantic role labelling, summarisation, essay marking, …
Extending distributional semantics from words to sentences: Compositional Distributional Semantics
Vectors for Sentences
[Figure: word vectors in two dimensions, e.g. digital = [1, 1] and information = [6, 4], plotted over base 1 and base 2, with context words result and data]
Example sentences: dogs bark; mice squeak.
A direct distributional approach…
At childhood we spent the summers in countryside. Khaterey daram sharbat-e albalooye anjast ast. My mother preferred the jams, however. My worst memory is being forced to take a nap in the afternoons.
“A memory from then is their sour cherry drink”
An approach that forgets about grammar
vampires kill men = vampires + kill + men   (additive)
or: vampires ⊙ kill ⊙ men   (multiplicative)
Both operations are commutative, so:
vampires kill men = men kill vampires
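The order-insensitivity is easy to see in code: vector addition (and pointwise multiplication) is commutative, so both word orders get the same sentence vector. The toy vectors below are made up for illustration.

```python
vampires = [1.0, 0.2, 0.0]
kill     = [0.3, 0.9, 0.1]
men      = [0.0, 0.4, 1.0]

def add(*vectors):
    """Additive composition: sum the word vectors coordinate-wise."""
    return [sum(coords) for coords in zip(*vectors)]

svo = add(vampires, kill, men)  # "vampires kill men"
ovs = add(men, kill, vampires)  # "men kill vampires"
print(svo == ovs)  # True: word order is lost
```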
So what should we do?
Formal Semantics
Syntax → Semantics, via a structure-preserving map
Formal Semantics with Vectors
Formal Grammar → Vectors, via a structure-preserving map
Compositional Distributional Semantics
Pregroup Grammar → Vector Spaces, via a strongly monoidal functor
Coecke, Sadrzadeh, Clark (Lambek’s 90th Festschrift), 2010; Preller, Sadrzadeh (JoLLI), 2011
Compositional Distributional Semantics
Lambek Calculus → Vector Spaces, via a homomorphism
Coecke, Grefenstette, Sadrzadeh (APAL), 2013
Compositional Distributional Semantics
Lambek-Grishin → Vector Spaces, via a homomorphism
G. Wijnholds (MSc Thesis, ILLC), 2015
CCG → Vector Spaces, via a structure-preserving map
Maillard, Clark, Grefenstette (Type Theory and NL, EACL workshop), 2014; Krishnamurthy and Mitchell (CVSC, ACL workshop), 2013; Baroni, Bernardini, Zamparelli (LILT), 2014
ACG → Vector Spaces, via a homomorphism
Muskens and Sadrzadeh, DSALT, LACL 2016; journal version to appear in JLM 2018
Royal Society International Exchange Award
CCG Semantics

Types: A, X/Y, X\Y
Rules: X/Y Y ⇒ X and Y X\Y ⇒ X
Syntax → Semantics, via a structure-preserving map

Types:
A ↦ U, a vector space containing vectors V_i
X/Y ↦ X ⊗ Y*     X\Y ↦ Y* ⊗ X
so words denote tensors T_i, T_ij, T_ijk, T_ijkl, ··· in the corresponding tensor spaces

Rules:
X/Y Y ⇒ X ↦ X ⊗ Y* ⊗ Y ⇒ X
Y X\Y ⇒ X ↦ Y ⊗ Y* ⊗ X ⇒ X
realised by tensor contraction over the shared index:
T_ij···k T_kl···w = T_ij···l···w
e.g. a matrix applied to a vector: T_i T_ij = T_j
Example:
Cats chase dogs noisily.
NP ((S\NP)/NP) NP (S\S)
↦ N ⊗ (N* ⊗ S ⊗ N*) ⊗ N ⊗ (S* ⊗ S)

cats ↦ T_i, chase ↦ T_ijk, dogs ↦ T_k, noisily ↦ T_jl

Contracting the shared indices:
T_i T_ijk T_k T_jl ⇒ T_i T_ij T_jl ⇒ T_j T_jl ⇒ T_l ∈ S
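The contraction above is exactly what numpy's einsum computes. A sketch with assumed toy dimensions and random tensors (N is the noun space, S the sentence space):

```python
import numpy as np

rng = np.random.default_rng(0)
N, S = 4, 3  # assumed toy dimensions

cats    = rng.random(N)          # T_i   in N
chase   = rng.random((N, S, N))  # T_ijk in N* (x) S (x) N*
dogs    = rng.random(N)          # T_k   in N
noisily = rng.random((S, S))     # T_jl  in S* (x) S

# One-shot contraction over all shared indices: T_i T_ijk T_k T_jl => T_l in S
sentence = np.einsum('i,ijk,k,jl->l', cats, chase, dogs, noisily)

# The same derivation step by step, as on the slide:
t_ij = np.einsum('ijk,k->ij', chase, dogs)  # contract the object index k
t_j  = np.einsum('i,ij->j', cats, t_ij)     # contract the subject index i
t_l  = np.einsum('j,jl->l', t_j, noisily)   # contract the sentence index j
assert np.allclose(sentence, t_l)
```

Because tensor contraction is associative, the one-shot einsum and the step-by-step derivation agree.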
Applications
Building the Tensors

1. Context vector of the verb: the verb’s own distributional vector
2. Degree of correlation between subject and object (Relational): Σ_i (Sbj ⊗ Obj)_i
3. Populating a cube tensor from a matrix by copying the object dimension (Frobenius)
4. Populating a cube tensor from a matrix by copying the subject dimension: M_ij ↦ T_ijk
5. Linear and multi-step linear regression for learning the tensors from holistic vectors (Least Squares): Baroni and Zamparelli; Polajnar et al.
Kronecker: Verb ⊗ Verb
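The Relational method and the Frobenius copying can be sketched as follows. The vectors are random stand-ins for real context vectors, and the exact placement of the copied dimension is my reading of the slide, not a quote from the papers.

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 5

# Relational: sum the outer products of subject and object context
# vectors over the verb's occurrences in the corpus.
pairs = [(rng.random(dim), rng.random(dim)) for _ in range(3)]
verb_matrix = sum(np.outer(sbj, obj) for sbj, obj in pairs)

# Frobenius copy-subject: lift the matrix M_ij to a cube T_ijk by
# duplicating the subject dimension along the new index.
cube = np.zeros((dim, dim, dim))
for i in range(dim):
    cube[i, i, :] = verb_matrix[i, :]
```

The cube is zero off the copied diagonal, which is what makes the copy-subject and copy-object models cheap compared to a fully learned cube.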
Disambiguation
Grefenstette-Sadrzadeh, EMNLP 2012; Journal of Computational Linguistics 2015

The ambiguous verb draw, in (x, y, z) triples:
annual report draw attention → annual report attracted attention (not: annual report pulled attention)
old man draw ceremonial sword → old man pulled ceremonial sword
Model                ρ
Verb Baseline        0.20
Bigram Baseline      0.14
Trigram Baseline     0.16
Additive             0.10
Multiplicative
  AdjMult            0.20
  AdjNoun            0.05
  CategoricalAdj     0.20
Categorical
  AdjMult            0.14
  AdjNoun            0.16
  CategoricalAdj     0.19
Kronecker
  AdjMult            0.26
  AdjNoun            0.17
  CategoricalAdj     0.27
Upperbound           0.48
Disambiguation
Grefenstette-Sadrzadeh, EMNLP 2012; Journal of Computational Linguistics 2015

Sentence 1                          Sentence 2
old man draw ceremonial sword       old man attracted ceremonial sword
annual report draw huge attention   annual report attracted huge attention
Paraphrasing
Kartsaklis-Sadrzadeh, EMNLP 2013

Sentence 1                   Sentence 2
man shut door                gentleman close eye
survey collect information   page provide datum
project present problem      programme face difficulty

Model              Ambig.    Disamb.
BL Verbs only      0.310  ≪  0.341
M1 Multiplicative  0.325  ≪  0.404
M2 Additive        0.368  ≪  0.410
T1 Relational      0.368  ≪  0.397
T2 Kronecker       0.404  <  0.412
T3 Copy-subject    0.310  ≪  0.337
T4 Copy-object     0.321  ≪  0.368
Human agreement               0.550
(The difference between T2 and M2 is not statistically significant.)
Similarity
Kartsaklis-Sadrzadeh, EMNLP 2013
Similarity
Kartsaklis-Sadrzadeh, ACL 2014

Entailment
Subject-verb-object pairs:
report describe result ⊢ document explain process
report outline progress ⊢ document describe change
value suit budget ⊢ number meet standard
book present account ⊢ work show evidence
woman marry man ⊢ female join male
author retain house ⊢ person hold property
report highlight lack ⊢ document stress need
Model       Inclusion  KL-div  αSkew  WeedsPrec  ClarkeDE  APinc  balAPinc  SAPinc  SBalAPinc
Verb          0.61      0.61    0.66     0.69      0.58     0.74    0.67     0.59     0.63
              0.55      0.65    0.74     0.79      0.67     0.76    0.71     0.80     0.80
min           0.55      0.71    0.74     0.78      0.63     0.77    0.71     0.73     0.76
+             0.58      0.54    0.71     0.59      0.60     0.65    0.64     0.67     0.67
max           0.58      0.55    0.68     0.58      0.58     0.63    0.61     0.60     0.61
Least-Sqr       –         –       –        –         –        –       –        –        –
⊗rel          0.51      0.64    0.78     0.79      0.69     0.79    0.72     0.84     0.83
⊗proj         0.64      0.60    0.70     0.69      0.61     0.74    0.70     0.75     0.76
⊗CpSbj        0.57      0.65    0.73     0.77      0.63     0.73    0.68     0.79     0.78
⊗CpObj        0.54      0.62    0.73     0.72      0.64     0.76    0.71     0.81     0.79
⊗FrAdd        0.60      0.60    0.75     0.72      0.67     0.77    0.75     0.84     0.82
⊗FrMul        0.55      0.62    0.76     0.81      0.68     0.78    0.73     0.86     0.83
Balkir, Kartsaklis, Sadrzadeh: ISAIM 2016, COLING 2016, LACL 2016, Annals of Maths and AI
Lambdas and Dynamic Logic
Montague Semantics
constant c   type τ     H0(c)                           h0(τ)
woman        N          woman                           est
man          N          man                             est
tall         NN         tall                            (est)est
smokes       DS         smoke                           est
loves        DDS        love                            eest
knows        SDS        λpλxλw.∀w′(Kxww′ → pw′)         (st)est
every        N(DS)S     λP′λPλw.∀x(P′xw → Pxw)          (est)(est)st
a            N(DS)S     λP′λPλw.∃x(P′xw ∧ Pxw)          (est)(est)st
terms ↦ term homs; types ↦ type homs
A Vector Semantics
c        τ         H(c)                     h(τ)
woman    N         woman                    V
tall     NN        λv.(tall ×1 v)           V V
smokes   DS        λv.(smoke ×1 v)          V V
loves    DDS       λuv.(love ×2 u) ×1 v     V V V
knows    SDS       λuv.(know ×2 u) ×1 v     V V V
every    N(DS)S    λvZ.Z(every ×1 v)        V (V V) V
a        N(DS)S    λvZ.Z(a ×1 v)            V (V V) V

Some abstract constants typed with abstract types and their term images: woman is a vector, tall a matrix (×1: matrix multiplication), love a cube (×2: cube contraction).
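The operations in the H(c) column can be sketched with einsum: ×1 is matrix-vector multiplication, and ×2 contracts the cube with its second argument first. Dimensions, values, and the index convention are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
dim = 3

woman = rng.random(dim)              # a vector in V
tall  = rng.random((dim, dim))       # a matrix in V (x) V
love  = rng.random((dim, dim, dim))  # a cube in V (x) V (x) V

# lambda v.(tall x1 v): matrix multiplication
tall_woman = tall @ woman

# lambda u v.(love x2 u) x1 v: contract the cube with the object u
# first, then the resulting matrix with the subject v
def loves(u, v):
    love_u = np.einsum('ijk,j->ik', love, u)  # love x2 u
    return np.einsum('ik,i->k', love_u, v)    # ... x1 v

s = loves(rng.random(dim), woman)
```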
Simpler Option (1)
c        τ         H(c)                 h(τ)
woman    N         woman                V
tall     NN        λv.(tall + v)        V
smokes   DS        λv.(smoke + v)       V
loves    DDS       λuv.(love + u) + v   V
knows    SDS       λuv.(know + u) + v   V
every    N(DS)S    λvZ.Z(every + v)     V
a        N(DS)S    λvZ.Z(a + v)         V
Simpler Option (2)
c        τ         H(c)                 h(τ)
woman    N         woman                V
tall     NN        λv.(tall ⊙ v)        V
smokes   DS        λv.(smoke ⊙ v)       V
loves    DDS       λuv.(love ⊙ u) ⊙ v   V
knows    SDS       λuv.(know ⊙ u) ⊙ v   V
every    N(DS)S    λvZ.Z(every ⊙ v)     V
a        N(DS)S    λvZ.Z(a ⊙ v)         V
Making things dynamic

Update function: Contexts → Contexts
Intransitive verbs: VU    Transitive verbs: VVU    Sentences: U
Dynamic Vector Semantics
a       τ              H(a)                          ρ(τ)
Anna    (DS)S          λZ.Z(anna)                    (V U)U
woman   N              λZ.Z(woman)                   (V U)U
tall    NN             λQZ.Q(λvc.Zv F(tall, v, c))   ((V U)U)(V U)U
smokes  DS             λvc.G(smoke, v, c)            V U
loves   DDS            λuvc.I(love, u, v, c)         V V U
knows   SDS            λpvc.p J(know, v, c)          U V U
every   N(DS)S         λQ.Q                          ((V U)U)(V U)U
who     (DS)NN         λZ′QZ.Q(λvc.Zv(QZ′c))         (V U)((V U)U)(V U)U
and     (αS)(αS)(αS)   λR′λRλXλc.R′X(RXc)            (ρ(α)U)(ρ(α)U)(ρ(α)U)

where N ↦ (V U)U, D ↦ V, S ↦ U
Examples of Contexts

Update function: Contexts → Contexts
Contexts: Co-Occurrence Matrices or Entity-Relation Cubes
Co-occurrence Matrices

Updating a co-occurrence matrix M = (m_ij) by a tuple ⟨a, u, v, ···⟩ yields M′ = (m′_ij), where the affected entries are incremented: m′_ij := m_ij + 1.

λvc.F(tall, v, c) increases by 1 the entry m_ij, for i the index of tall and j the index of its modifiee v.
λvc.G(smoke, v, c) increases by 1 the entry m_ij, for i the index of smoke and j the index of its subject v.
λuvλc.I(love, u, v, c), for i the index of love, j the index of its subject u and k the index of its object v, increases three entries: m′_ij := m_ij + 1, m′_jk := m_jk + 1, m′_ik := m_ik + 1.
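The update functions F, G, I can be sketched over a co-occurrence matrix stored as a nested dictionary. The function names follow the slides; the data structure itself is an assumption.

```python
from collections import defaultdict

def new_context():
    """A co-occurrence matrix, indexed by word pairs."""
    return defaultdict(lambda: defaultdict(int))

def F(c, adj, noun):
    """F(tall, v, c): increment the (adjective, modifiee) entry."""
    c[adj][noun] += 1
    return c

def G(c, verb, subject):
    """G(smoke, v, c): increment the (verb, subject) entry."""
    c[verb][subject] += 1
    return c

def I(c, verb, subject, obj):
    """I(love, u, v, c): increment (verb, subject), (subject, object)
    and (verb, object)."""
    c[verb][subject] += 1
    c[subject][obj] += 1
    c[verb][obj] += 1
    return c

ctx = new_context()
I(ctx, 'loves', 'anna', 'cat')
G(ctx, 'smokes', 'anna')
F(ctx, 'tall', 'anna')
```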
Example

          man   cat   loves  fears  sleeps
Anna      100   700    800    500    400
woman     500   650    750    750    600
tall      300    50    500    400    400
smokes    400    50    600    600    200
loves     350   250      ε    600    500
knows     300    50    200    250    270

update by G, I, F, J ⇒

          man   cat   loves  fears  sleeps
Anna      100   700    800    500    400
woman     500   650    750    750    600
tall      650    50    500    400    400
smokes    700    50    600    600    200
loves     550   750      ε    600    500
knows     600   250    450    510    700
Entity-Relation Cubes

c_ijk (entity, relation, entity) + (a, u, v) = c′_ijk (entity, relation, entity), where c′_ijk := c_ijk + 1
Entity-Relation Cube Example

100 Anna is-a smoker  =(update by G)⇒  400 Anna is-a smoker
50 Anna is tall       =(update by F)⇒  280 Anna is tall
200 Anna loves cat    =(update by I)⇒  350 Anna loves cat
Dynamic Logic

φ ::= p | ¬φ | φ ∧ ψ | …

||p||(c) := c + ||p||
||¬φ||(c) := c − ||φ||(c)
||φ ∧ ψ||(c) := ||ψ||(||φ||(c))

c: a context; || · ||: the update function
Dynamic Logic

For Heim and Karttunen, c and ||p|| are sets of valuations, and c + ||p|| = c ∩ ||p||.
For us, contexts are matrices or cubes; what is + ?
Dynamic Logic with Vector Semantics

||S||(c) := c +′ H(S); the negative update c −′ H(S) is defined dually.
S: a sentence; H(S): its term homomorphism image
Dynamic Logic with Vector Semantics

For co-occurrence matrices M = (m_ij), the updates act on the entries indexed by ⟨a, u, v, ···⟩:
M +′ ⟨a, u, v, ···⟩ = M′, where m′_ij := 1, whether m_ij was 1 or 0
M −′ ⟨a, u, v, ···⟩ = M′, where m′_ij := 0, whether m_ij was 1 or 0
Dynamic Logic with Vector Semantics

Binary versions of the contexts:

          man  cat  loves  fears  sleeps
Anna      100  700         500    400
woman          650  750    750    600
tall      300   50  500    400    400

becomes

          man  cat  loves  fears  sleeps
Anna        1    1           1      1
woman            1    1      1      1
tall        1    1    1      1      1

Basically, we are working with the graph of the relation induced by the matrix:

(1 0)
(1 1)  ↦  {(1, 1), (2, 1), (2, 2)}
+′ plays the role of c ∩ ||p||
||S1, ···, Sn||(c) := ||S1|| ··· ||Sn||(c)
Dynamic Logic with Vector Semantics

This semantics is indeed “dynamic”.
Observation: the update obtained by a sequence of sentences is the same as the update obtained by their composition.
Dynamic Logic with Vector Semantics

This semantics is indeed “dynamic”.
Proposition: the context corresponding to a sequence of sentences is the empty context (the 0 matrix or cube) updated by them:
c = ||S1, ···, Sn||(0)
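The proposition can be illustrated with a toy update: interpreting a sequence of sentences one after another over the empty context gives the same result as composing the updates. The sentences here are stand-in sets of facts, not the matrix updates themselves.

```python
def update(context, facts):
    """A toy update: add the sentence's facts to the context."""
    return context | facts

def interpret(sentences, context=frozenset()):
    """||S1, ..., Sn||(c): update the context by each sentence in turn."""
    for facts in sentences:
        context = update(context, facts)
    return context

s1 = frozenset({('cats', 'chase', 'mice')})
s2 = frozenset({('mice', 'fear', 'cats')})

# c = ||S1, S2||(0): the empty context updated by the sequence
c = interpret([s1, s2])
print(c == s1 | s2)  # True
```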
Application

Admittance of a proposition by a context (Karttunen & Heim):
a context c admits a proposition φ  ⟺  ||φ||(c) = c

Definition: a corpus admits a sentence if the context corresponding to it admits it.
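Admittance is then a fixed-point check: updating the context must change nothing. A sketch over binary contexts encoded as sets of index pairs (the graph of the relation, as in the earlier slide); the facts are invented for the example.

```python
def update(context, facts):
    """+' over binary contexts: set the sentence's entries to 1."""
    return context | facts

def admits(context, facts):
    """A context admits a sentence iff updating with it is a no-op."""
    return update(context, facts) == context

corpus_context = {('cats', 'animal'), ('cats', 'mice'), ('dogs', 'animal')}
print(admits(corpus_context, {('cats', 'animal')}))  # True
print(admits(corpus_context, {('dogs', 'mice')}))    # False
```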
Application

          man  cat  loves  fears  sleeps
Anna      100  700         500    400
woman          650  750    750    600
tall      300   50  500    400    400

↦ (binary version)

          man  cat  loves  fears  sleeps
Anna        1    1           1      1
woman            1    1      1      1
tall        1    1    1      1      1

=(update)⇒ the same binary matrix, so the sentence is admitted
Example
‘Cats and dogs are animals that sleep. Cats chase cats and mice. Dogs chase all
- animals. Cats like mice, but mice fear cats, since cats eat mice. Cats smell mice and
mice run from cats.’ It admits the following sentences:
Cats are animals. Dogs are animals. Cats chase cats. Cats chase mice. Dogs chase cats and dogs.
(*) Cats like dogs. (*) Cats eat dogs. (*) Dogs run from cats. (*) Dogs like mice. (*) Mice fear dogs. (*) Dogs eat mice.
(*) Cats are not animals. (*) Dogs do not sleep.
Experimental Evaluation
FraCaS-013 (answer: yes)
P1: Both leading tenors are excellent.
P2: Leading tenors who are excellent are indispensable.
Q: Are both leading tenors indispensable?
H: Both leading tenors are indispensable.
Check whether the context P1, P2 admits the sentence H.
Experimental Evaluation

Zeichner dataset:
S1: Parents have a great influence on the career development of their children.
S2: Parents have a powerful influence on the career development of their children.
Check whether the context S1 admits the sentence S2.
Conclusions and Future Work
Developed a vector semantics for lambda calculus models of NL.
Developed a dynamic logic for this vector semantics.
It does not matter where the lambdas come from: they can come from ACG, CCG, Lambek Calculus, etc.
Future work: extend to reference resolution (dynamic modality); experiment with the notion of admittance.