
Introduction Learning by Inference Rules Experiment Conclusion

An Inference-rules based Categorial Grammar Learner for Simulating Language Acquisition

Xuchen Yao, Jianqiang Ma, Sergio Duarte, Çağrı Çöltekin

University of Groningen

18 May 2009


Outline

Introduction Combinatory Categorial Grammar Language Acquisition Learning by Inference Rules Grammar Induction by Inference Rules The Learning Architecture Experiment Learning an Artificial Grammar Learning Auxiliary Verb Fronting Learning Correct Word Order Conclusion



Categorial Grammar

  • Basic categories: S (sentence), NP (noun phrase), N (noun)
  • Complex categories: NP/N, S\NP and (S\NP)\(S\NP)
  • Slash operators: / and \

Peter   saw         a      book
 NP   (S\NP)/NP   NP/N      N
                  ------------- >
                       NP
      ------------------------- >
                S\NP
------------------------------- <
                S

(The slide contrasts this CCG derivation with an equivalent phrase-structure tree over S, NP, VP, DT, and N.)

Figure: Example derivation for sentence Peter saw a book.



Different Operation Rules

  • Function application rules (CG)

Forward:  A/B B → A  (>)
Backward: B A\B → A  (<)

  • Function composition rules (CCG)

Forward:  A/B B/C → A/C  (>B)
Backward: B\C A\B → A\C  (<B)

  • Type raising rules (CCG)

Forward:  A → T/(T\A)  (>T)
Backward: A → T\(T/A)  (<T)

  • Substitution rules (CCG)

Forward:  (A/B)/C B/C → A/C  (>S)
Backward: B\C (A\B)\C → A\C  (<S)
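The application and composition rules above operate directly on category terms. As a minimal sketch, the following Python encoding (basic categories as strings, complex categories as (result, slash, argument) triples) is our own illustration, not code from the talk:

```python
# Minimal sketch of CG/CCG combination rules. Encoding is ours:
# a basic category is a string, a complex category is a
# (result, slash, argument) triple.

def cat(result, slash, arg):
    return (result, slash, arg)

def forward_apply(x, y):
    """Forward application: A/B B -> A  (>)"""
    if isinstance(x, tuple) and x[1] == "/" and x[2] == y:
        return x[0]
    return None

def backward_apply(x, y):
    """Backward application: B A\\B -> A  (<)"""
    if isinstance(y, tuple) and y[1] == "\\" and y[2] == x:
        return y[0]
    return None

def forward_compose(x, y):
    """Forward composition: A/B B/C -> A/C  (>B)"""
    if (isinstance(x, tuple) and x[1] == "/"
            and isinstance(y, tuple) and y[1] == "/" and x[2] == y[0]):
        return cat(x[0], "/", y[2])
    return None

# Replaying the 'Peter saw a book' derivation:
saw = cat(cat("S", "\\", "NP"), "/", "NP")   # (S\NP)/NP
a = cat("NP", "/", "N")                      # NP/N
np = forward_apply(a, "N")                   # a book        -> NP
vp = forward_apply(saw, np)                  # saw [a book]  -> S\NP
s = backward_apply("NP", vp)                 # Peter [...]   -> S
print(s)  # S
```

Each rule returns None when it does not apply, so a parser can simply try every rule on every adjacent pair.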


Outline

Introduction Combinatory Categorial Grammar Language Acquisition Learning by Inference Rules Grammar Induction by Inference Rules The Learning Architecture Experiment Learning an Artificial Grammar Learning Auxiliary Verb Fronting Learning Correct Word Order Conclusion


Nativist vs. Empiricist

  • Auxiliary Verb Fronting
  • Peter is awake.
  • Is Peter awake?
  • Peter who is sleepy is awake.
  • Is Peter who is sleepy awake?
  • *Is Peter who sleepy is awake?
  • Word Order
  • I should go.
  • I have gone.
  • I am going.
  • I have been going.
  • I should have gone.
  • I should be going.
  • I should have been going.
  • *I have should been going.

Research Questions

  • 1. Can we give a computational simulation of the acquisition of syntactic structures?
  • How do we derive the category of an unknown word in a sentence?
  • 2. Can we pass judgement on the Nativist-Empiricist debate from the perspective of CCG?
  • How important is experience, and how important is innate ability?


Outline

Introduction Combinatory Categorial Grammar Language Acquisition Learning by Inference Rules Grammar Induction by Inference Rules The Learning Architecture Experiment Learning an Artificial Grammar Learning Auxiliary Verb Fronting Learning Correct Word Order Conclusion


Level 0/1 Inference Rules

  • Level 0 inference rules

B/A X → B ⇒ X = A ifA = S X B\A → B ⇒ X = A ifA = S

  • Level 1 inference rules

A X → B ⇒ X = B\A ifA = S X A → B ⇒ X = B/A ifA = S Peter works NP X (S\NP)

<

S

Figure: Example of level 1 indefrence rules: Peter works.
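These two rule levels amount to solving a small equation for the unknown category X. A hedged sketch, using the same string/triple category encoding as before (all function names are ours):

```python
# Sketch of the level 0/1 inference rules: solve for the unknown
# category X given a known neighbour and the target result.
# Encoding: basic category = string, complex = (result, slash, argument).

def infer_level0(known, known_side, result):
    """Known functor beside unknown X:
    B/A X -> B  =>  X = A   (known functor on the left)
    X B\\A -> B  =>  X = A   (known functor on the right)
    """
    if isinstance(known, tuple) and known[0] == result:
        if known_side == "left" and known[1] == "/":
            return known[2]
        if known_side == "right" and known[1] == "\\":
            return known[2]
    return None  # rule does not apply

def infer_level1(known, known_side, result):
    """Known argument beside unknown X:
    A X -> B  =>  X = B\\A   (known argument on the left)
    X A -> B  =>  X = B/A   (known argument on the right)
    """
    if known_side == "left":
        return (result, "\\", known)
    return (result, "/", known)

# 'Peter works': Peter = NP is known and the whole string is S,
# so the unknown 'works' is assigned S\NP.
works = infer_level1("NP", "left", "S")
print(works)  # ('S', '\\', 'NP')
```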


Level 2 Inference Rules

  • Level 2 side inference rules

X A B → C ⇒ X = (C/B)/A
A B X → C ⇒ X = (C\A)\B

  • Level 2 middle inference rule

A X B → C ⇒ X = (C\A)/B

Peter   saw            a      book
 NP   X ⇒ (S\NP)/NP   NP/N     N
                      ------------ >
                           NP
      ---------------------------- >
                S\NP
---------------------------------- <
                S

Figure: Example of level 2 inference rules: Peter saw a book.


Level 3 Inference Rules

  • Level 3 side inference rules

X A B C → D ⇒ X = ((D/C)/B)/A
A B C X → D ⇒ X = ((D\A)\B)\C

  • Level 3 middle inference rules

A X B C → D ⇒ X = ((D\A)/C)/B
A B X C → D ⇒ X = ((D\A)\B)/C

  • Inference rules of up to level 3 can derive the categories of most common English words.
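All the side and middle rules up to level 3 follow one pattern: left neighbours become backslash arguments and right neighbours become slash arguments, with the nearest neighbours combined first. A sketch of that generalisation (the encoding and function name are ours):

```python
# Generalised inference: lefts + X + rights -> result  =>  solve for X.
# Covers the level 0-3 side and middle inference rules: left neighbours
# are peeled off as backslash arguments, right neighbours as slash
# arguments (nearest right neighbour becomes the outermost argument).

def infer(lefts, rights, result):
    x = result
    for k in lefts:              # e.g. lefts [A]:       x -> x\A
        x = (x, "\\", k)
    for k in reversed(rights):   # e.g. rights [B, C]:   x -> (x/C)/B
        x = (x, "/", k)
    return x

# Level 3 middle rule  A X B C -> D  =>  X = ((D\A)/C)/B:
print(infer(["A"], ["B", "C"], "D"))
# Level 3 side rule  X A B C -> D  =>  X = ((D/C)/B)/A:
print(infer([], ["A", "B", "C"], "D"))
```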


Outline

Introduction Combinatory Categorial Grammar Language Acquisition Learning by Inference Rules Grammar Induction by Inference Rules The Learning Architecture Experiment Learning an Artificial Grammar Learning Auxiliary Verb Fronting Learning Correct Word Order Conclusion


The Learning Architecture

The corpus feeds an edge generator and a recursive learner. The learner first tries the level 0 inference rules; whenever the current level cannot parse the sentence, it escalates to level 1, then level 2, then level 3, applying the level 0 and 1 inference rules recursively along the way. Sentences that no level can parse are marked "cannot learn". Candidate categories are filtered by the SCP principle and the right combining rule, and an output selector splits the surviving rules into a generation rule set and a test rule set.

Figure: Learning process using inference rules
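The escalation loop in the figure can be sketched as follows; `apply_level` and `can_parse` are hypothetical stand-ins for the per-level rule application and the parser, not code from the talk:

```python
# Sketch of the learner's escalation loop: try inference-rule levels
# 0..3 in order and fall back to the next level whenever the sentence
# still cannot be parsed. `apply_level` and `can_parse` are stubs.

def learn(sentence, lexicon, apply_level, can_parse, max_level=3):
    for level in range(max_level + 1):
        new_entries = apply_level(level, sentence, lexicon)
        if can_parse(sentence, {**lexicon, **new_entries}):
            return new_entries   # inferred categories for unknown words
    return None                  # "cannot learn"

# Toy stubs: pretend level 1 is the first level that categorises 'works'.
def apply_level(level, sentence, lexicon):
    return {"works": "S\\NP"} if level >= 1 else {}

def can_parse(sentence, lexicon):
    return all(word in lexicon for word in sentence.split())

print(learn("Peter works", {"Peter": "NP"}, apply_level, can_parse))
# {'works': 'S\\NP'}
```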


Outline

Introduction Combinatory Categorial Grammar Language Acquisition Learning by Inference Rules Grammar Induction by Inference Rules The Learning Architecture Experiment Learning an Artificial Grammar Learning Auxiliary Verb Fronting Learning Correct Word Order Conclusion


Target Grammar

Peter := NP                  saw := (S\NP)/NP
Mary := NP                   read := (S\NP)/NP
book := N                    give := ((S\NP)/NP)/NP
telescope := N               sleep := S\NP
a := NP/N                    run := S\NP
the := NP/N                  furiously := (S\NP)\(S\NP)
big := N/N                   with := (N\N)/NP
colorless := N/N             with := (NP\NP)/NP
                             with := ((S\NP)\(S\NP))/NP

Table: Target Grammar Rules

  • Recursive & ambiguous
  • Assume only NP and N are known to the learner

Result

Peter   saw        Mary   with          a     big   big   telescope
 NP   (S\NP)/NP     NP   (NP\NP)/NP   NP/N   N/N   N/N    N
                                                   -------------- >
                                                         N
                                             ------------------- >
                                                      N
                                      -------------------------- >
                                                  NP
                         --------------------------------------- >
                                        NP\NP
                   -------------------------------------------- <
                                     NP
      -------------------------------------------------------- >
                             S\NP
-------------------------------------------------------------- <
                              S

(a)

Peter   saw        Mary   with                  a     big   big   telescope
 NP   (S\NP)/NP     NP   ((S\NP)\(S\NP))/NP   NP/N   N/N   N/N    N
                                                           ------------- >
                                                                N
                                                     ------------------ >
                                                            N
                                              ------------------------- >
                                                         NP
                          --------------------------------------------- >
                                       (S\NP)\(S\NP)
      --------------- >
            S\NP
      -------------------------------------------------------------- <
                                  S\NP
-------------------------------------------------------------------- <
                                   S

(b)

Figure: Two ambiguous parses of the sentence



Outline

Introduction Combinatory Categorial Grammar Language Acquisition Learning by Inference Rules Grammar Induction by Inference Rules The Learning Architecture Experiment Learning an Artificial Grammar Learning Auxiliary Verb Fronting Learning Correct Word Order Conclusion


Learning Auxiliary Verb Fronting 1

Peter   is                 sleepy
 NP   (S\NP)/(Sadj\NP)   Sadj\NP
      -------------------------- >
                S\NP
-------------------------------- <
                S

(a)

Is                  Peter   awake
(Sq/(Sadj\NP))/NP    NP    Sadj\NP
------------------------ >
     Sq/(Sadj\NP)
---------------------------------- >
                Sq

(b)

Peter   who              is                 sleepy    is                 awake
 NP   (NP\NP)/(S\NP)   (S\NP)/(Sadj\NP)   Sadj\NP   (S\NP)/(Sadj\NP)   Sadj\NP
                       -------------------------- >
                                 S\NP
      ------------------------------------------- >
                       NP\NP
------------------------------------------------- <
                      NP
                                                    -------------------------- >
                                                              S\NP
----------------------------------------------------------------------------- <
                                      S

(c)

Figure: Learning Auxiliary Verb Fronting 1


Learning Auxiliary Verb Fronting 2

Is                  Peter   who              is                 sleepy    awake
(Sq/(Sadj\NP))/NP    NP   (NP\NP)/(S\NP)   (S\NP)/(Sadj\NP)   Sadj\NP   Sadj\NP
                                           -------------------------- >
                                                     S\NP
                          ------------------------------------------- >
                                           NP\NP
                    ------------------------------------------------- <
                                        NP
------------------------------------------------------------------- >
                           Sq/(Sadj\NP)
--------------------------------------------------------------------------- >
                                    Sq

Figure: Learning Auxiliary Verb Fronting 2

  • is := (S\NP)/(Sadj\NP)
  • Is := (Sq/(Sadj\NP))/NP


Outline

Introduction Combinatory Categorial Grammar Language Acquisition Learning by Inference Rules Grammar Induction by Inference Rules The Learning Architecture Experiment Learning an Artificial Grammar Learning Auxiliary Verb Fronting Learning Correct Word Order Conclusion


Learning Correct Word Order

  • I should go.
  • I have gone.
  • I am going.
  • I have been going.
  • I should have gone.
  • I should be going.
  • I should have been going.
  • *I have should been going.

should := (Ss\NP)/(S\NP)
should := (Ss\NP)/(Sh\NP)
should := (Ss\NP)/(Sb\NP)
have := (Sh\NP)/(S\NP)
have := (Sh\NP)/(Sb\NP)
be := (Sb\NP)/(S\NP)

I    should            have              been             going
NP   (Ss\NP)/(Sh\NP)   (Sh\NP)/(Sb\NP)   (Sb\NP)/(S\NP)   S\NP
                                         --------------------- >
                                                Sb\NP
                       --------------------------------------- >
                                      Sh\NP
     -------------------------------------------------------- >
                             Ss\NP
------------------------------------------------------------- <
                              Ss

Figure: Learning Correct Word Order
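With these featured categories (Ss, Sh, Sb distinguishing what each auxiliary selects), plain function application already separates the grammatical orders from *I have should been going. The tiny chart parser, encoding, and lexicon layout below are our own illustration of that point:

```python
# Sketch: a minimal CKY-style chart parser using only forward and
# backward application over the featured auxiliary categories.
# Encoding: basic category = string, complex = (result, slash, argument).

def combine(x, y):
    if isinstance(x, tuple) and x[1] == "/" and x[2] == y:   # A/B B -> A
        return x[0]
    if isinstance(y, tuple) and y[1] == "\\" and y[2] == x:  # B A\B -> A
        return y[0]
    return None

def parses_to(words, lexicon, goal):
    """True iff the word string derives the goal category."""
    n = len(words)
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(lexicon[w])
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for x in chart[i][k]:
                    for y in chart[k][j]:
                        z = combine(x, y)
                        if z is not None:
                            chart[i][j].add(z)
    return goal in chart[0][n]

vp = lambda s: (s, "\\", "NP")   # s\NP
lex = {
    "I": ["NP"],
    "should": [(vp("Ss"), "/", vp("Sh")), (vp("Ss"), "/", vp("Sb")),
               (vp("Ss"), "/", vp("S"))],
    "have": [(vp("Sh"), "/", vp("Sb")), (vp("Sh"), "/", vp("S"))],
    "been": [(vp("Sb"), "/", vp("S"))],
    "going": [vp("S")],
}
print(parses_to("I should have been going".split(), lex, "Ss"))  # True
print(parses_to("I have should been going".split(), lex, "Ss"))  # False
```

The starred order fails because "have" selects only Sb\NP or S\NP complements, while "should been going" can only yield Ss\NP.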


Conclusion

  • 1. Can we give a computational simulation of the acquisition of syntactic structures?
  • How do we derive the category of an unknown word in a sentence?
  • This paper presents a simple and intuitive method to achieve this.
  • 2. Can we pass judgement on the Nativist-Empiricist debate from the perspective of CCG?
  • How important is experience, and how important is innate ability?
  • Simple and intuitive logical rules can also help resolve celebrated linguistic phenomena.
  • Logic gives a third way besides experience and innateness.