Introduction Learning by Inference Rules Experiment Conclusion
An Inference-rules based Categorial Grammar Learner for Simulating Language Acquisition
Xuchen Yao, Jianqiang Ma, Sergio Duarte, Çağrı Çöltekin
University of Groningen
Outline
Introduction
  Combinatory Categorial Grammar
  Language Acquisition
Learning by Inference Rules
  Grammar Induction by Inference Rules
  The Learning Architecture
Experiment
  Learning an Artificial Grammar
  Learning Auxiliary Verb Fronting
  Learning Correct Word Order
Conclusion
Categorial Grammar
- Basic categories: S (sentence), NP (noun phrase), N (noun)
- Complex categories: NP/N, S\NP and (S\NP)\(S\NP)
- Slash operators: / and \

  Peter: NP   saw: (S\NP)/NP   a: NP/N   book: N
  a book               →  NP      (>)
  saw [a book]         →  S\NP    (>)
  Peter [saw a book]   →  S       (<)

Figure: Example derivation for the sentence "Peter saw a book" (shown on the slide alongside the corresponding phrase-structure tree with S, NP, VP, DT, N).
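The two application steps in the derivation above can be sketched in code. This is a minimal illustration (not the authors' implementation), with categories encoded as nested Python tuples (result, slash, argument):

```python
# Minimal sketch of CG function application (illustrative only).
# A basic category is a string; a complex category is a tuple
# (result, slash, argument), e.g. ("S", "\\", "NP") for S\NP.

S, NP, N = "S", "NP", "N"

def forward_apply(left, right):
    """A/B  B  ->  A   (the '>' rule); None if inapplicable."""
    if isinstance(left, tuple) and left[1] == "/" and left[2] == right:
        return left[0]
    return None

def backward_apply(left, right):
    """B  A\\B  ->  A   (the '<' rule); None if inapplicable."""
    if isinstance(right, tuple) and right[1] == "\\" and right[2] == left:
        return right[0]
    return None

# The derivation of "Peter saw a book":
saw = ((S, "\\", NP), "/", NP)        # (S\NP)/NP
a = (NP, "/", N)                      # NP/N
obj = forward_apply(a, N)             # a book        ->  NP
vp = forward_apply(saw, obj)          # saw [a book]  ->  S\NP
s = backward_apply(NP, vp)            # Peter [vp]    ->  S
assert s == S
```

The tuple encoding keeps the example short; a real system would use a proper category datatype.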
Different Operation Rules
- Function application rules (CG)
    Forward:   A/B  B  →  A          (>)
    Backward:  B  A\B  →  A          (<)
- Function composition rules (CCG)
    Forward:   A/B  B/C  →  A/C      (>B)
    Backward:  B\C  A\B  →  A\C      (<B)
- Type raising rules (CCG)
    Forward:   A  →  T/(T\A)         (>T)
    Backward:  A  →  T\(T/A)         (<T)
- Substitution rules (CCG)
    Forward:   (A/B)/C  B/C  →  A/C  (>S)
    Backward:  B\C  (A\B)\C  →  A\C  (<S)
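As a rough sketch of how two of the CCG-specific rules above operate (illustrative code, not the authors' implementation, using a hypothetical tuple encoding of categories as (result, slash, argument)):

```python
# Sketch of forward composition and forward type raising,
# with categories as (result, slash, argument) tuples.

def forward_compose(left, right):
    """A/B  B/C  ->  A/C   (the '>B' rule); None if inapplicable."""
    if (isinstance(left, tuple) and left[1] == "/"
            and isinstance(right, tuple) and right[1] == "/"
            and left[2] == right[0]):
        return (left[0], "/", right[2])
    return None

def forward_type_raise(cat, t):
    """A  ->  T/(T\\A)   (the '>T' rule)."""
    return (t, "/", (t, "\\", cat))

# Type-raise "Peter" (NP) and compose with "saw" ((S\NP)/NP)
# to form the non-standard constituent "Peter saw" = S/NP.
peter = forward_type_raise("NP", "S")           # S/(S\NP)
saw = (("S", "\\", "NP"), "/", "NP")
assert forward_compose(peter, saw) == ("S", "/", "NP")
```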
Nativist vs. Empiricist
- Auxiliary Verb Fronting
- Peter is awake.
- Is Peter awake?
- Peter who is sleepy is awake.
- Is Peter who is sleepy awake?
- *Is Peter who sleepy is awake?
- Word Order
- I should go.
- I have gone.
- I am going.
- I have been going.
- I should have gone.
- I should be going.
- I should have been going.
- *I have should been going.
Research Questions
- 1. Can we give a computational simulation of the acquisition of syntactic structures?
  - How do we derive the category of an unknown word in a sentence?
- 2. Can we adjudicate the Nativist-Empiricist debate from the perspective of CCG?
  - How important is experience? Or is innate ability more important?
Level 0/1 Inference Rules
- Level 0 inference rules
    B/A  X  →  B   ⇒  X = A      (if A ≠ S)
    X  B\A  →  B   ⇒  X = A      (if A ≠ S)
- Level 1 inference rules
    A  X  →  B   ⇒  X = B\A      (if A ≠ S)
    X  A  →  B   ⇒  X = B/A      (if A ≠ S)

  Peter: NP   works: X
  Peter works  →  S,  so  X = S\NP   (<)

Figure: Example of level 1 inference rules: "Peter works".
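A minimal sketch of how the level 0 and level 1 rules might be implemented (an assumed reconstruction, not the authors' code; categories are (result, slash, argument) tuples, and the side condition on A is omitted for brevity):

```python
# Assumed reconstruction of the level 0 and level 1 inference rules.

def infer_level0(functor, result):
    """B/A X -> B  or  X B\\A -> B  =>  X = A: the known word is a
    functor yielding the result; the unknown word is its argument."""
    if isinstance(functor, tuple) and functor[0] == result:
        return functor[2]
    return None

def infer_level1(known, side, result):
    """A X -> B  =>  X = B\\A  (known word on the left);
    X A -> B  =>  X = B/A  (known word on the right)."""
    return (result, "\\", known) if side == "left" else (result, "/", known)

# "Peter works": Peter = NP is known and the whole string is S,
# so the level 1 rule assigns works = S\NP.
works = infer_level1("NP", "left", "S")
assert works == ("S", "\\", "NP")
```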
Level 2 Inference Rules
- Level 2 side inference rules
    X  A  B  →  C   ⇒  X = (C/B)/A
    A  B  X  →  C   ⇒  X = (C\A)\B
- Level 2 middle inference rule
    A  X  B  →  C   ⇒  X = (C\A)/B

  Peter: NP   saw: X   a: NP/N   book: N
  a book                  →  NP    (>)
  Peter X [a book]  →  S,  so  X = (S\NP)/NP

Figure: Example of level 2 inference rules: "Peter saw a book".
Level 3 Inference Rules
- Level 3 side inference rules
    X  A  B  C  →  D   ⇒  X = ((D/C)/B)/A
    A  B  C  X  →  D   ⇒  X = ((D\A)\B)\C
- Level 3 middle inference rules
    A  X  B  C  →  D   ⇒  X = ((D\A)/C)/B
    A  B  X  C  →  D   ⇒  X = ((D\A)\B)/C
- Inference rules of up to level 3 can derive the categories of most common English words.
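The side and middle rules of levels 1-3 all follow one pattern: wrap the result category in a backward slash for each known category to the left of X, and in a forward slash for each known category to its right, so that the nearest neighbor on each side is consumed first. A hypothetical generic implementation (my reconstruction, not the authors' code):

```python
# Generic form of the level 1-3 inference rules. The unknown word X
# follows the known categories in `lefts` and precedes those in
# `rights`; the whole string reduces to `result`.

def infer(lefts, rights, result):
    cat = result
    for a in lefts:             # nearest left neighbor ends up outermost
        cat = (cat, "\\", a)
    for b in reversed(rights):  # nearest right neighbor ends up outermost
        cat = (cat, "/", b)
    return cat

# Level 1:  A X -> B      =>  X = B\A
assert infer(["NP"], [], "S") == ("S", "\\", "NP")
# Level 2:  A X B -> C    =>  X = (C\A)/B
assert infer(["NP"], ["NP"], "S") == (("S", "\\", "NP"), "/", "NP")
# Level 3:  X A B C -> D  =>  X = ((D/C)/B)/A
assert infer([], ["A", "B", "C"], "D") == \
    ((("D", "/", "C"), "/", "B"), "/", "A")
```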
The Learning Architecture
[Figure: flowchart of the learning process. The corpus feeds an Edge Generator and a Recursive Learner, which tries level 0, 1, 2, and 3 inference rules in turn (applying the level 0 and 1 rules recursively), applies the SCP principle and the right-combining rule, and passes candidate categories to an Output Selector, which splits them into a generation rule set and a test rule set; inputs no rule can handle are marked "cannot learn".]
Figure: Learning process using inference rules
Target Grammar
Peter := NP                 with := (N\N)/NP
Mary := NP                  with := (NP\NP)/NP
big := N/N                  with := ((S\NP)\(S\NP))/NP
colorless := N/N            sleep := S\NP
book := N                   a := NP/N
telescope := N              give := ((S\NP)/NP)/NP
the := NP/N                 saw := (S\NP)/NP
run := S\NP                 read := (S\NP)/NP
big := N/N                  furiously := (S\NP)\(S\NP)
Table: Target Grammar Rules
- Recursive & ambiguous
- Assume only NP and N are known to the learner
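The target lexicon can be written down as a simple Python mapping (an illustrative encoding, not the authors' format; categories kept as plain strings), which makes the three-way ambiguity of "with" explicit:

```python
# The target lexicon from the table above, as word -> list of categories.
LEXICON = {
    "Peter": ["NP"], "Mary": ["NP"],
    "book": ["N"], "telescope": ["N"],
    "a": ["NP/N"], "the": ["NP/N"],
    "big": ["N/N"], "colorless": ["N/N"],
    "sleep": ["S\\NP"], "run": ["S\\NP"],
    "saw": ["(S\\NP)/NP"], "read": ["(S\\NP)/NP"],
    "give": ["((S\\NP)/NP)/NP"],
    "furiously": ["(S\\NP)\\(S\\NP)"],
    # the source of ambiguity: three categories for "with"
    "with": ["(N\\N)/NP", "(NP\\NP)/NP", "((S\\NP)\\(S\\NP))/NP"],
}
assert len(LEXICON["with"]) == 3
```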
Result
(a) with := (NP\NP)/NP (attachment to "Mary"):
  Peter: NP   saw: (S\NP)/NP   Mary: NP   with: (NP\NP)/NP   a: NP/N   big: N/N   big: N/N   telescope: N
  big telescope                     →  N       (>)
  big [big telescope]               →  N       (>)
  a [big big telescope]             →  NP      (>)
  with [a big big telescope]        →  NP\NP   (>)
  Mary [with a big big telescope]   →  NP      (<)
  saw [Mary with a big big telescope]    →  S\NP   (>)
  Peter [saw Mary with a big big telescope]  →  S  (<)

(b) with := ((S\NP)\(S\NP))/NP (attachment to "saw"):
  Peter: NP   saw: (S\NP)/NP   Mary: NP   with: ((S\NP)\(S\NP))/NP   a: NP/N   big: N/N   big: N/N   telescope: N
  saw Mary                          →  S\NP            (>)
  big telescope                     →  N               (>)
  big [big telescope]               →  N               (>)
  a [big big telescope]             →  NP              (>)
  with [a big big telescope]        →  (S\NP)\(S\NP)   (>)
  [saw Mary] [with a big big telescope]      →  S\NP   (<)
  Peter [saw Mary with a big big telescope]  →  S      (<)
Figure: Two ambiguous parses of the sentence
Learning Auxiliary Verb Fronting 1
(a) Peter is sleepy:
  Peter: NP   is: (S\NP)/(Sadj\NP)   sleepy: Sadj\NP
  is sleepy           →  S\NP   (>)
  Peter [is sleepy]   →  S      (<)

(b) Is Peter awake:
  Is: (Sq/(Sadj\NP))/NP   Peter: NP   awake: Sadj\NP
  Is Peter            →  Sq/(Sadj\NP)   (>)
  [Is Peter] awake    →  Sq             (>)

(c) Peter who is sleepy is awake:
  Peter: NP   who: (NP\NP)/(S\NP)   is: (S\NP)/(Sadj\NP)   sleepy: Sadj\NP   is: (S\NP)/(Sadj\NP)   awake: Sadj\NP
  is sleepy                         →  S\NP    (>)
  is awake                          →  S\NP    (>)
  who [is sleepy]                   →  NP\NP   (>)
  Peter [who is sleepy]             →  NP      (<)
  [Peter who is sleepy] [is awake]  →  S       (<)
Figure: Learning Auxiliary Verb Fronting 1
Learning Auxiliary Verb Fronting 2
  Is: (Sq/(Sadj\NP))/NP   Peter: NP   who: (NP\NP)/(S\NP)   is: (S\NP)/(Sadj\NP)   sleepy: Sadj\NP   awake: Sadj\NP
  is sleepy                        →  S\NP            (>)
  who [is sleepy]                  →  NP\NP           (>)
  Peter [who is sleepy]            →  NP              (<)
  Is [Peter who is sleepy]         →  Sq/(Sadj\NP)    (>)
  [Is Peter who is sleepy] awake   →  Sq              (>)

Figure: Learning Auxiliary Verb Fronting 2
- Learned categories:
    is := (S\NP)/(Sadj\NP)
    Is := (Sq/(Sadj\NP))/NP
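A quick sanity check (illustrative code, not the authors' system) that the learned category for sentence-initial "Is" derives the simple question, using the tuple encoding (result, slash, argument):

```python
# Check that Is := (Sq/(Sadj\NP))/NP derives "Is Peter awake?".

def fapply(left, right):
    """Forward application: A/B  B  ->  A; None if inapplicable."""
    if isinstance(left, tuple) and left[1] == "/" and left[2] == right:
        return left[0]
    return None

awake = ("Sadj", "\\", "NP")            # Sadj\NP
Is = (("Sq", "/", awake), "/", "NP")    # (Sq/(Sadj\NP))/NP

step = fapply(Is, "NP")                 # Is Peter          ->  Sq/(Sadj\NP)
q = fapply(step, awake)                 # [Is Peter] awake  ->  Sq
assert q == "Sq"
```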
Learning Correct Word Order
- I should go.
- I have gone.
- I am going.
- I have been going.
- I should have gone.
- I should be going.
- I should have been going.
- *I have should been going.
should := (Ss\NP)/(S\NP)     have := (Sh\NP)/(S\NP)
should := (Ss\NP)/(Sh\NP)    have := (Sh\NP)/(Sb\NP)
should := (Ss\NP)/(Sb\NP)    be := (Sb\NP)/(S\NP)

  I: NP   should: (Ss\NP)/(Sh\NP)   have: (Sh\NP)/(Sb\NP)   been: (Sb\NP)/(S\NP)   going: S\NP
  been going                    →  Sb\NP   (>)
  have [been going]             →  Sh\NP   (>)
  should [have been going]      →  Ss\NP   (>)
  I [should have been going]    →  Ss      (<)
Figure: Learning Correct Word Order
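With the learned auxiliary categories, forward application alone admits the grammatical order and rejects the ungrammatical one. A sketch (hypothetical tuple encoding, not the authors' code):

```python
# "should have been going" succeeds; "*have should been going" fails.

def fapply(left, right):
    """Forward application: A/B  B  ->  A; None if inapplicable."""
    if isinstance(left, tuple) and left[1] == "/" and left[2] == right:
        return left[0]
    return None

NP = "NP"
going = ("S", "\\", NP)                             # S\NP
been = (("Sb", "\\", NP), "/", going)               # (Sb\NP)/(S\NP)
have = (("Sh", "\\", NP), "/", ("Sb", "\\", NP))    # (Sh\NP)/(Sb\NP)
should = (("Ss", "\\", NP), "/", ("Sh", "\\", NP))  # (Ss\NP)/(Sh\NP)

vp = fapply(should, fapply(have, fapply(been, going)))
assert vp == ("Ss", "\\", NP)     # I should have been going  ->  Ss

bad = fapply(have, fapply(should, fapply(been, going)))
assert bad is None                # *I have should been going: no parse
```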
Conclusion
- 1. Can we give a computational simulation of the acquisition of syntactic structures?
  - How do we derive the category of an unknown word in a sentence?
  - This paper presents a simple and intuitive method for doing so.
- 2. Can we adjudicate the Nativist-Empiricist debate from the perspective of CCG?
  - How important is experience? Or is innate ability more important?
  - Simple and intuitive logical rules can help account for these celebrated linguistic phenomena.
  - Logic offers a third way besides experience and innateness.