SLIDE 1

Left-corner parsing

Laura Kassner

laura.kassner@gmx.de

Computational Linguistics II: Parsing

January 10th, 2007

SLIDE 2

Left-corner parsing

  • Basics
  • Building a left-corner recognizer...
  • ... and transforming it into a parser
  • Comparison to top-down and bottom-up

approaches

SLIDE 3

Left-corner parsing: Basics

What is left-corner parsing?

picture taken from Shravan Vasishth's HSP seminar slides

SLIDE 8

Left-corner parsing: Basics

  • bottom-up and top-down aspects
  • bottom-up: rule k0 -> k1 ... kn can only be applied if for every ki (1 <= i <= n), a complete partial structure has been recognized
  • left-corner: a structure dominated by k1 must have been recognized for a rule to be applied
    => k1 is the “left corner“ of the rule, the first symbol on the right-hand side
    => the rule is used to make assumptions about the category dominating k1 and about the following constituents

SLIDE 12

Left-corner parsing: Basics

What is a left-corner parse?

  • context-free grammar G = <N, T, S, R>
  • string w

=> a series of rule indices γ = i1 ... in which corresponds to a derivation of the string w in G

SLIDE 19

Left-corner parsing: Basics

Ordering rules:
1 – β is the tree structure implied by γ
2 – nodes in β are ordered the following way:
   a) if n directly dominates n1 ... nm (n DD n1 ... nm), all nodes of the subtree with root n1 are in front of n;
   b) n is in front of all other nodes it dominates;
   c) all nodes dominated by ni are in front of the nodes dominated by ni+1
3 – the order of rule applications described by γ does not violate these rules

SLIDE 20

Left-corner parsing: Basics

=> this ordering is an inorder tree traversal!

SLIDE 28

Left-corner parsing: Basics

An example:

  • grammar rules:
    1: S -> A S
    2: S -> B B
    3: A -> b A A
    4: A -> a
    5: B -> b
    6: B -> e
  • sentence: bbaaab

Order of nodes: 4 2 9 5 15 10 16 11 12 6 1 13 7 3 14 8
TD parse: 1 3 3 4 4 4 2 6 5
BU parse: 4 4 3 4 3 6 5 2 1
LC parse: 3 3 4 4 4 1 6 2 5
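The three parses above can be read off the parse tree by three traversals: top-down is a preorder walk, bottom-up a postorder walk, and left-corner announces a node's rule as soon as its leftmost child's subtree is complete. A minimal sketch; the nested-tuple tree encoding (rule index or leaf symbol, list of children) is an assumption of this sketch, not part of the slides:

```python
def td_order(tree):
    """Top-down: announce the rule, then visit all children (preorder)."""
    rule, children = tree
    if not children:                 # terminal leaf: no rule to announce
        return []
    out = [rule]
    for child in children:
        out += td_order(child)
    return out

def bu_order(tree):
    """Bottom-up: visit all children, then announce the rule (postorder)."""
    rule, children = tree
    if not children:
        return []
    out = []
    for child in children:
        out += bu_order(child)
    return out + [rule]

def lc_order(tree):
    """Left-corner: finish the leftmost child's subtree, announce the rule,
    then visit the remaining children (inorder-style)."""
    rule, children = tree
    if not children:
        return []
    out = lc_order(children[0])
    out.append(rule)
    for child in children[1:]:
        out += lc_order(child)
    return out

# parse tree for 'bbaaab': nodes are (rule_index, children), leaves (symbol, [])
tree = (1, [(3, [('b', []),
                 (3, [('b', []), (4, [('a', [])]), (4, [('a', [])])]),
                 (4, [('a', [])])]),
            (2, [(6, [('e', [])]),
                 (5, [('b', [])])])])
```

Applied to this tree, the three functions reproduce the TD, BU and LC rule sequences listed on the slide.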

SLIDE 29

Questions?

SLIDE 37

Building a left-corner recognizer

Data: CFG <N, T, S, R>; Lexicon L

Data structures: 3 stacks
1) SENTENCE to be processed
2) CATEGORIES to be recognized
3) CONSTITUENTS we are looking for

Stack operations: pop(STACK), push(element, STACK), first(STACK)

SLIDE 44

Building a left-corner recognizer

Procedures

REDUCE

Preconditions:
1) There is a rule k0 -> k1 ... kn in R, or k1 is a word of the lexical category k0
2) first(CATEGORIES) Є (N U T)

Input: SENTENCE with first = k1; CATEGORIES; CONSTITUENTS
Output: pop(SENTENCE); push(k2 ... kn t, CATEGORIES); push(k0, CONSTITUENTS)

SLIDE 45

Building a left-corner recognizer

Procedures

REDUCE
=> delete the first symbol from SENTENCE (= the left corner of the rule)
=> push the rest of the rule's right-hand side onto CATEGORIES, together with the signal symbol 't' for the end of the rule
=> CONSTITUENTS keeps in mind that we are looking for k0
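The REDUCE step can be sketched as a pure function on the three stacks (Python lists with index 0 as the top). The encoding, and treating a lexicon entry as a unary rule k0 -> k1, are assumptions of this sketch:

```python
def reduce_step(sentence, categories, constituents, lhs, rhs):
    """One REDUCE with rule lhs -> rhs, where rhs[0] is the left corner.
    A lexicon entry is treated as a unary rule, e.g. det -> der."""
    assert rhs[0] == sentence[0] and categories[0] != 't'
    return (sentence[1:],                        # pop the left corner
            list(rhs[1:]) + ['t'] + categories,  # rest of the rule + end marker 't'
            [lhs] + constituents)                # remember we are building lhs
```

For example, reducing 'der' with the (assumed) lexical entry det -> der turns ([der Meister], [S], [ ]) into ([Meister], [t S], [det]), exactly as in the trace slides later on.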

SLIDE 51

Building a left-corner recognizer

Procedures

MOVE

Preconditions:
1) first(CATEGORIES) = t
2) first(CONSTITUENTS) = A Є (N U T)

Input: SENTENCE; CATEGORIES; CONSTITUENTS
Output: push(first(CONSTITUENTS), SENTENCE); pop(CATEGORIES); pop(CONSTITUENTS)

SLIDE 52

Building a left-corner recognizer

Procedures

MOVE
=> the right-hand side of the rule whose left-hand side is A has been completely processed: A has been recognized
=> push A onto SENTENCE
=> remove the 't' from CATEGORIES
=> remove A from CONSTITUENTS
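The MOVE step can likewise be sketched as a pure function on the three stacks (Python lists with index 0 as the top; the encoding is an assumption of this sketch):

```python
def move_step(sentence, categories, constituents):
    """One MOVE: a 't' on top of CATEGORIES signals a completed rule."""
    assert categories[0] == 't' and constituents
    return ([constituents[0]] + sentence,  # the recognized constituent reenters the input
            categories[1:],                # drop the 't'
            constituents[1:])              # we are no longer looking for it
```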

SLIDE 56

Building a left-corner recognizer

Procedures

REMOVE

Precondition: first(SENTENCE) = first(CATEGORIES)

Input: SENTENCE; CATEGORIES; CONSTITUENTS
Output: pop(SENTENCE); pop(CATEGORIES); CONSTITUENTS

SLIDE 57

Building a left-corner recognizer

Procedures

REMOVE => is applied iff first(SENTENCE) is a category ki, a left corner, and this category has been recognized
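The REMOVE step as a pure function on the three stacks (Python lists with index 0 as the top; the encoding is an assumption of this sketch):

```python
def remove_step(sentence, categories, constituents):
    """One REMOVE: the expected category sits on top of the input."""
    assert sentence[0] == categories[0]
    return (sentence[1:], categories[1:], constituents)  # CONSTITUENTS unchanged
```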

SLIDE 62

Building a left-corner recognizer

The Algorithm

RECOGNIZELC

Data: CFG G = <N, T, S, R>; Lexicon L; sentence w = w1 ... wn, n >= 1

Input: SENTENCE = [w1 ... wn]; CATEGORIES = [S]; CONSTITUENTS = [ ]
Output: true / false

SLIDE 63

Building a left-corner recognizer

The Algorithm

RECOGNIZELC

Method:
if (SENTENCE == CATEGORIES == CONSTITUENTS == [ ])
    return true;
else if (there is a procedure P Є {REDUCE, MOVE, REMOVE} whose preconditions are met)
    RECOGNIZELC(P(SENTENCE, CATEGORIES, CONSTITUENTS));
else
    return false;
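The nondeterministic algorithm can be sketched as a depth-first, backtracking recognizer. The grammar and lexicon encoding (a list of (lhs, rhs) rules and a word-to-categories dictionary) is an assumption of this sketch, a lexicon entry is handled like a unary rule, and a naive search like this may fail to terminate on cyclic grammars:

```python
def recognize_lc(sent, cats, cons, rules, lexicon):
    """Depth-first RECOGNIZELC. Stacks are Python lists, index 0 = top."""
    if not sent and not cats and not cons:
        return True                    # all three stacks empty: success
    # REMOVE: the expected category sits on top of the input
    if sent and cats and sent[0] == cats[0]:
        if recognize_lc(sent[1:], cats[1:], cons, rules, lexicon):
            return True
    # MOVE: 't' on top of CATEGORIES signals a completed right-hand side
    if cats and cats[0] == 't' and cons:
        if recognize_lc([cons[0]] + sent, cats[1:], cons[1:], rules, lexicon):
            return True
    # REDUCE: try every rule whose left corner is the first input symbol
    if sent and cats and cats[0] != 't':
        k1 = sent[0]
        for lhs, rhs in rules:
            if rhs[0] == k1 and recognize_lc(
                    sent[1:], list(rhs[1:]) + ['t'] + cats,
                    [lhs] + cons, rules, lexicon):
                return True
        for cat in lexicon.get(k1, ()):    # lexicon entry as unary rule cat -> k1
            if recognize_lc(sent[1:], ['t'] + cats, [cat] + cons, rules, lexicon):
                return True
    return False

# hypothetical grammar and lexicon consistent with the example on the next slides
RULES = [('S', ('NP', 'VP')), ('NP', ('det', 'n')), ('VP', ('v', 'NP'))]
LEXICON = {'der': ['det'], 'einen': ['det'],
           'Meister': ['n'], 'Fehler': ['n'], 'sucht': ['v']}
```

With this grammar, recognize_lc('der Meister sucht einen Fehler'.split(), ['S'], [], RULES, LEXICON) succeeds.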

SLIDE 80

Building a left-corner recognizer

Example

Der Meister sucht einen Fehler

SENTENCE              CATEGORIES            CONSTITUENTS   procedure
[der Meister su...]   [S]                   [ ]            REDUCE
[Meister sucht...]    [t S]                 [det]          MOVE
[det Meister su...]   [S]                   [ ]            REDUCE
[Meister sucht...]    [n t S]               [NP]           REDUCE
[sucht einen F...]    [t n t S]             [n NP]         MOVE
[n sucht einen F...]  [n t S]               [NP]           REMOVE
[sucht einen F...]    [t S]                 [NP]           MOVE
[NP sucht einen...]   [S]                   [ ]            REDUCE
[sucht einen F...]    [VP t S]              [S]            REDUCE
[einen Fehler]        [t VP t S]            [v S]          MOVE
[v einen Fehler]      [VP t S]              [S]            REDUCE
[einen Fehler]        [NP t VP t S]         [VP S]         REDUCE
[Fehler]              [t NP t VP t S]       [det VP S]     MOVE
[det Fehler]          [NP t VP t S]         [VP S]         REDUCE
[Fehler]              [n t NP t VP t S]     [NP VP S]      REDUCE
[ ]                   [t n t NP t VP t S]   [n NP VP S]    MOVE

SLIDE 89

Building a left-corner recognizer

Example

Der Meister sucht einen Fehler

SENTENCE   CATEGORIES            CONSTITUENTS   procedure
[ ]        [t n t NP t VP t S]   [n NP VP S]    MOVE
[n]        [n t NP t VP t S]     [NP VP S]      REMOVE
[ ]        [t NP t VP t S]       [NP VP S]      MOVE
[NP]       [NP t VP t S]         [VP S]         REMOVE
[ ]        [t VP t S]            [VP S]         MOVE
[VP]       [VP t S]              [S]            REMOVE
[ ]        [t S]                 [S]            MOVE
[S]        [S]                   [ ]            REMOVE
[ ]        [ ]                   [ ]            true

SLIDE 94

Building a left-corner recognizer

Why is RECOGNIZELC non-deterministic?

  • there may be several rules whose left corner is equal to first(SENTENCE)
  • there may be configurations where you could either REDUCE or REMOVE:
    • a newly created structure can be used to complete the structure we are working on => REMOVE
    • or it could constitute a new structure of its own => REDUCE

SLIDE 95

Building a left-corner recognizer

=> use breadth-first or depth-first search to check all possible configurations

SLIDE 96

Building a left-corner recognizer

breadth-first search

RECOGNIZELC/BF

Data: CFG G = <N, T, S, R>; Lexicon L; sentence w = w1 ... wn, n >= 1

Input: SENTENCE = [w1 ... wn]; CATEGORIES = [S]; CONSTITUENTS = [ ]
Output: true / false
Structures: CONFIGS – set of configurations, empty at the beginning

SLIDE 97

Building a left-corner recognizer

breadth-first search

RECOGNIZELC/BF

Method:
if (SENTENCE == CATEGORIES == CONSTITUENTS == [ ])
    return true;
else
    CONFIGS = set of all configurations derivable from the current configuration using REMOVE, REDUCE or MOVE;
    if (CONFIGS is empty) return false;
    else for every configuration C Є CONFIGS:
        RECOGNIZELC/BF(SENTENCE_C, CATEGORIES_C, CONSTITUENTS_C);
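The same search can be sketched iteratively with an explicit queue of configurations, mirroring CONFIGS. The grammar/lexicon encoding is an assumption of this sketch, and the seen-set, which keeps a repeated configuration from being expanded twice, is an addition not in the slides:

```python
from collections import deque

def recognize_lc_bf(words, rules, lexicon, start='S'):
    """Breadth-first RECOGNIZELC/BF over a queue of (SENTENCE, CATEGORIES,
    CONSTITUENTS) configurations, each stored as a triple of tuples."""
    queue = deque([(tuple(words), (start,), ())])
    seen = set()
    while queue:
        config = queue.popleft()
        if config in seen:
            continue                       # already expanded
        seen.add(config)
        sent, cats, cons = config
        if not sent and not cats and not cons:
            return True
        # REMOVE
        if sent and cats and sent[0] == cats[0]:
            queue.append((sent[1:], cats[1:], cons))
        # MOVE
        if cats and cats[0] == 't' and cons:
            queue.append(((cons[0],) + sent, cats[1:], cons[1:]))
        # REDUCE (lexicon entries handled like unary rules)
        if sent and cats and cats[0] != 't':
            for lhs, rhs in rules:
                if rhs[0] == sent[0]:
                    queue.append((sent[1:], tuple(rhs[1:]) + ('t',) + cats,
                                  (lhs,) + cons))
            for cat in lexicon.get(sent[0], ()):
                queue.append((sent[1:], ('t',) + cats, (cat,) + cons))
    return False

# hypothetical grammar and lexicon consistent with the earlier example trace
RULES = [('S', ('NP', 'VP')), ('NP', ('det', 'n')), ('VP', ('v', 'NP'))]
LEXICON = {'der': ['det'], 'einen': ['det'],
           'Meister': ['n'], 'Fehler': ['n'], 'sucht': ['v']}
```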

SLIDE 98

Questions?

SLIDE 99

A left-corner parsing algorithm

SLIDE 102

A left-corner parsing algorithm

  • introduce another stack: STRUCTURE
  • empty at the beginning; filled along the way
  • return value: the structure stored in stack STRUCTURE

SLIDE 103

A left-corner parsing algorithm

Modifying the procedures

MOVELC/BF

Preconditions:
1) first(CATEGORIES) = t
2) first(CONSTITUENTS) = A Є (N U T)

Input: SENTENCE; CATEGORIES; CONSTITUENTS; STRUCTURE
Output: push(first(CONSTITUENTS), SENTENCE); pop(CATEGORIES); pop(CONSTITUENTS); STRUCTURE

SLIDE 104

A left-corner parsing algorithm

Modifying the procedures

MOVELC/BF => just insert another parameter for the structure stack

SLIDE 105

A left-corner parsing algorithm

Modifying the procedures

REDUCELC/BF

Preconditions:
1) There is a rule k0 -> k1 ... kn in R, or k1 is a word of the lexical category k0
2) first(CATEGORIES) Є (N U T)

Input: SENTENCE with first = k1; CATEGORIES; CONSTITUENTS; STRUCTURE
Output: pop(SENTENCE); push(k2 ... kn t, CATEGORIES); push(k0, CONSTITUENTS); structure1(k0, k1, STRUCTURE)

SLIDE 109

A left-corner parsing algorithm

Modifying the procedures

REDUCELC/BF – new subprocedure structure1

Input: STRUCTURE; symbols k0, k1 Є (N U T)
Output: modified STRUCTURE'
Method:
if (STRUCTURE == [ ] or first(STRUCTURE) == k'α with k' != k1)
    return push((k0 k1), STRUCTURE)
else
    return push((k0 first(STRUCTURE)), pop(STRUCTURE))

SLIDE 110

A left-corner parsing algorithm

Modifying the procedures

REDUCELC/BF => add structure1(k0, k1, STRUCTURE) to the output

structure1 => if there is already a structure dominated by k1, integrate the new symbols; otherwise build up a new structure description

SLIDE 111

A left-corner parsing algorithm

Modifying the procedures

REMOVELC/BF

Precondition: first(SENTENCE) = first(CATEGORIES)

Input: SENTENCE; CATEGORIES; CONSTITUENTS; STRUCTURE
Output: pop(SENTENCE); pop(CATEGORIES); CONSTITUENTS; structure2(CONSTITUENTS, STRUCTURE)

SLIDE 115

A left-corner parsing algorithm

Modifying the procedures

REMOVELC/BF – subprocedure structure2

Input: CONSTITUENTS; STRUCTURE
Output: modified STRUCTURE'
Method:
if (CONSTITUENTS == [ ])
    return STRUCTURE
else
    return push((second(STRUCTURE) + first(STRUCTURE)), pop(pop(STRUCTURE)))

SLIDE 116

A left-corner parsing algorithm

Modifying the procedures

REMOVELC/BF with subprocedure structure2 => if CONSTITUENTS is not empty, combine the two topmost partial structure descriptions on STRUCTURE

SLIDE 117

A left-corner parsing algorithm

Example

Eva sah Adam am Morgen

SENTENCE            CATEGORIES           CONSTITUENTS  STRUCTURE
[Eva sah Adam...]   [S]                  [ ]           [ ]
[sah Adam...]       [t S]                [n]           [(n1)]
[n sah Adam...]     [S]                  [ ]           [(n1)]
[sah Adam...]       [t S]                [NP]          [(NP(n1))]
[NP sah Adam...]    [S]                  [ ]           [(NP(n1))]
[sah Adam...]       [VP t S]             [S]           [S(NP(n1))]
[Adam am Morgen]    [t VP t S]           [v S]         [(v2)(S(NP(n1)))]
[v Adam am Morgen]  [VP t S]             [S]           [(v2)(S(NP(n1)))]
[Adam am Morgen]    [NP PP t VP t S]     [VP S]        [(VP(v2))(S(NP...]
[am Morgen]         [t NP PP t VP t S]   [n VP S]      [(n3)(VP(v2))(S...]
...                 ...                  ...           ...

(S (NP(n1)) (VP (v2) (NP(n3)) (PP (p4) (NP(n5)))))
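The whole parser can be sketched by threading a STRUCTURE stack through the recognizer, with structure1 and structure2 inlined at the REDUCE and REMOVE steps. Trees are nested lists [root, child, ...]; the grammar and lexicon below are a hypothetical rule set consistent with the earlier 'Der Meister sucht einen Fehler' trace, since the slides do not list them:

```python
RULES = [('S', ('NP', 'VP')), ('NP', ('det', 'n')), ('VP', ('v', 'NP'))]
LEXICON = {'der': ['det'], 'einen': ['det'],
           'Meister': ['n'], 'Fehler': ['n'], 'sucht': ['v']}

def parse_lc(sent, cats, cons, struct, rules, lexicon):
    """Left-corner parser; returns the finished tree, or None."""
    if not sent and not cats and not cons:
        return struct[0] if struct else None
    # REMOVE + structure2: fold the newest subtree into its parent-in-progress
    if sent and cats and sent[0] == cats[0] and (not cons or len(struct) >= 2):
        ns = ([struct[1] + [struct[0]]] + struct[2:]) if cons else struct
        tree = parse_lc(sent[1:], cats[1:], cons, ns, rules, lexicon)
        if tree is not None:
            return tree
    # MOVE: a completed constituent reenters the input; STRUCTURE unchanged
    if cats and cats[0] == 't' and cons:
        tree = parse_lc([cons[0]] + sent, cats[1:], cons[1:],
                        struct, rules, lexicon)
        if tree is not None:
            return tree
    # REDUCE + structure1: shift the left corner, predict the rest
    if sent and cats and cats[0] != 't':
        k1 = sent[0]

        def structure1(k0):
            if struct and struct[0][0] == k1:      # wrap the existing subtree
                return [[k0, struct[0]]] + struct[1:]
            return [[k0, k1]] + struct             # start a new subtree

        for lhs, rhs in rules:
            if rhs[0] == k1:
                tree = parse_lc(sent[1:], list(rhs[1:]) + ['t'] + cats,
                                [lhs] + cons, structure1(lhs), rules, lexicon)
                if tree is not None:
                    return tree
        for cat in lexicon.get(k1, ()):            # lexicon entry as unary rule
            tree = parse_lc(sent[1:], ['t'] + cats, [cat] + cons,
                            structure1(cat), rules, lexicon)
            if tree is not None:
                return tree
    return None
```

On the example sentence this returns the bracketed tree [S [NP [det der] [n Meister]] [VP [v sucht] [NP [det einen] [n Fehler]]]].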

SLIDE 118

Questions?

SLIDE 119

Left-corner parsing with look-ahead

SLIDE 123

Left-corner parsing with look-ahead

  • become more efficient...
  • ... by reducing the number of rules that can be used to generate the next derivation
  • for every nonterminal n, calculate the set of all symbols which are left corners of constituents reachable from n

=> relation “LINK“

SLIDE 128

Left-corner parsing with look-ahead

LINK(G): the set of all ordered pairs <X, Y> with X Є N and Y Є (N U T) which fulfill one of these conditions:

1) X = Y (reflexivity)
2) there is a rule X -> Yα Є R
3) <X, X'> Є LINK(G) and <X', Y> Є LINK(G) for some X' Є N (transitivity)

SLIDE 129

Left-corner parsing with look-ahead

=> LINK(G) should be calculated once, before parsing

slide-130
SLIDE 130

Left-corner parsing with look-ahead

Example

Grammar G with rules:

S -> X2 X3 X4 X2 -> e f X3 -> X1 X1 -> g X4 -> h

slide-131
SLIDE 131

Left-corner parsing with look-ahead

Example

Grammar G with rules:

S -> X2 X3 X4 X2 -> e f X3 -> X1 X1 -> g X4 -> h

slide-132
SLIDE 132

Left-corner parsing with look-ahead

Example

Grammar G with rules:

S -> X2 X3 X4 X2 -> e f X3 -> X1 X1 -> g X4 -> h LINK(G) = {<S,S>, <X1, X1>, <X2, X2>, <X3, X3>, <X4, X4>, <S, X2>, <S, e>, <X2, e>, <X1, g>, <X3, X1>, <X3, g>, <X4, h>

slide-133
SLIDE 133

Left-corner parsing with look-ahead

Example

Grammar G with rules:

S -> X2 X3 X4
X2 -> e f
X3 -> X1
X1 -> g
X4 -> h

LINK(G) = {<S,S>, <X1,X1>, <X2,X2>, <X3,X3>, <X4,X4>, <S,X2>, <S,e>, <X2,e>, <X1,g>, <X3,X1>, <X3,g>, <X4,h>}

=> strings like 'fghe' or 'hefg' needn't be parsed at all!
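LINK(G) can be precomputed by a simple fixpoint iteration over the rule set; a sketch, assuming rules encoded as (lhs, rhs) pairs:

```python
def link_closure(rules):
    """LINK(G): reflexive pairs for nonterminals, direct left corners,
    and their transitive closure."""
    nonterminals = {lhs for lhs, rhs in rules}
    link = {(x, x) for x in nonterminals}            # 1) reflexivity
    link |= {(lhs, rhs[0]) for lhs, rhs in rules}    # 2) X -> Y alpha
    changed = True
    while changed:                                   # 3) transitivity
        changed = False
        for (x, y) in list(link):
            for (y2, z) in list(link):
                if y == y2 and (x, z) not in link:
                    link.add((x, z))
                    changed = True
    return link

# the example grammar from the slide
G = [('S', ('X2', 'X3', 'X4')), ('X2', ('e', 'f')),
     ('X3', ('X1',)), ('X1', ('g',)), ('X4', ('h',))]
```

For this grammar the closure yields exactly the twelve pairs listed above; REDUCELC/LA then admits a rule k0 -> k1 ... kn only if <first(CATEGORIES), k0> Є LINK(G).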

SLIDE 134

Left-corner parsing with look-ahead

Modifying the procedures

  • nly necessary change: REDUCELC/LA

Preconditions:
1) There is a rule k0 -> k1 ... kn in R, or k1 is part of an arbitrary lexical category k0
2) first(CATEGORIES) ∈ (N ∪ T)
3) <first(CATEGORIES), k0> ∈ LINK(G)

Input: SENTENCE with first = k1; CATEGORIES; CONSTITUENTS; STRUCTURE
Output: pop(SENTENCE); push(k2 ... kn t, CATEGORIES); push(k0, CONSTITUENTS); structure1(STRUCTURE)
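As a rough illustration, the precondition checks and stack updates of REDUCELC/LA might look like this (a functional sketch under our own conventions: stacks are Python lists with the top at index 0, and the completion marker 't' follows the slides' notation; this is not the original implementation):

```python
# Illustrative sketch of the REDUCELC/LA step (all names are ours).
# Stacks are Python lists with the top of the stack at index 0.

def reduce_lc_la(sentence, categories, constituents, link, rules):
    """Try to reduce the first input symbol k1 to some k0 via a rule
    k0 -> k1 ... kn, but only if LINK(G) relates the current goal
    to k0 (the look-ahead filter, precondition 3)."""
    k1, goal = sentence[0], categories[0]
    for k0, rhs in rules:
        if rhs[0] != k1:
            continue                              # k1 must be the left corner
        if (goal, k0) not in link:
            continue                              # LINK(G) filter
        return (sentence[1:],                     # pop(SENTENCE)
                rhs[1:] + ["t"] + categories,     # push k2 ... kn and marker t
                [k0] + constituents)              # push(k0, CONSTITUENTS)
    return None                                   # no applicable rule

RULES = [("S", ["X2", "X3", "X4"]), ("X2", ["e", "f"]),
         ("X3", ["X1"]), ("X1", ["g"]), ("X4", ["h"])]
LINK = {("S", "S"), ("X1", "X1"), ("X2", "X2"), ("X3", "X3"),
        ("X4", "X4"), ("S", "X2"), ("S", "e"), ("X2", "e"),
        ("X1", "g"), ("X3", "X1"), ("X3", "g"), ("X4", "h")}

# 'e' reduces to X2 under goal S; 'g' does not, because X1 is no
# left corner of S:
print(reduce_lc_la(["e", "f", "g", "h"], ["S"], [], LINK, RULES))
print(reduce_lc_la(["g", "h"], ["S"], [], LINK, RULES))  # None
```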

slide-135
SLIDE 135

Questions?

slide-136
SLIDE 136

Comparison to other approaches

slide-141
SLIDE 141

Comparison to other approaches

Drawback of top-down:

  • ignores what the actual input string looks like most of the time

Drawback of bottom-up:

  • we don't know what we're trying to build at the moment

=> Left-corner can handle these... examples follow!

slide-142
SLIDE 142

Comparison to other approaches

Example TD vs LC

slide-159
SLIDE 159

Comparison to other approaches

Example TD vs LC

Grammar:
S -> NP VP
NP -> det N
NP -> PN
VP -> IV
det -> the
N -> robber
PN -> Vincent
IV -> died

Input sentence: Vincent died.

Top-down: S -> NP VP -> det N VP -> DEAD END!
'Vincent' isn't a det, and det cannot be expanded => need to backtrack ;-(

Left-corner: predict S (TD) -> recognize PN (BU) -> select rule 'NP -> PN'
-> select rule 'S -> NP VP' -> MATCH!
-> input: died – predict VP (TD) -> recognize IV (BU)
-> select rule 'VP -> IV' -> MATCH! => successful parse
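The left-corner trace above can be reproduced with a small recursive recognizer: each word is recognized bottom-up, then connected to the current top-down goal by projecting rules whose left corner matches. A hedged sketch (all names are ours; it can in principle still try alternative rules, but on this grammar the first choice always fits):

```python
# Minimal left-corner recognizer for the slides' TD-vs-LC grammar.
RULES = [
    ("S",  ["NP", "VP"]),
    ("NP", ["det", "N"]),
    ("NP", ["PN"]),
    ("VP", ["IV"]),
]
LEXICON = {"the": "det", "robber": "N", "Vincent": "PN", "died": "IV"}

def left_corner_parse(words, goal="S"):
    """Recognize `words` against `goal`; True on success."""
    pos = 0

    def parse(goal):
        nonlocal pos
        if pos >= len(words):
            return False
        corner = LEXICON[words[pos]]      # bottom-up: recognize the word
        pos += 1
        return connect(corner, goal)

    def connect(corner, goal):
        nonlocal pos
        if corner == goal:
            return True
        for lhs, rhs in RULES:
            if rhs[0] == corner:          # corner is the rule's left corner
                saved = pos
                # top-down: predict the rule's remaining daughters
                if all(parse(cat) for cat in rhs[1:]) and connect(lhs, goal):
                    return True
                pos = saved
        return False

    return parse(goal) and pos == len(words)

print(left_corner_parse(["Vincent", "died"]))        # True
print(left_corner_parse(["the", "robber", "died"]))  # True
```

Unlike pure top-down parsing, the recognizer never hypothesizes 'det' for 'Vincent': the word itself supplies the left corner PN, so the rule 'NP -> det N' is never tried.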
slide-160
SLIDE 160

Comparison to other approaches

Example BU vs LC

slide-182
SLIDE 182

Comparison to other approaches

Example BU vs LC

Grammar:
S -> NP VP
NP -> det N
NP -> PN
VP -> IV
VP -> TV NP
TV -> plant
IV -> died
det -> the
N -> plant

Input sentence: the plant died

Bottom-up: the plant died -> det plant died -> det TV died -> det TV IV -> det TV VP -> DEAD END! => need to backtrack ;-(

Left-corner: predict S (TD) -> recognize det (BU)
-> select rule 'NP -> det N' -> recognize N (BU) -> MATCH!
-> select rule 'S -> NP VP' -> input: died – predict VP (TD) -> recognize IV (BU) -> select rule 'VP -> IV'
-> MATCH! => successful parse
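With look-ahead, the bottom-up dead end never even arises: once 'NP -> det N' is selected, the open goal is N, and the LINK(G) filter rules out reducing 'plant' to TV. A sketch (the relation below is hand-computed from this grammar and lists only the relevant subset of pairs; names are ours):

```python
# Relevant subset of LINK(G) for the BU-vs-LC grammar, by hand.
LINK = {
    ("S", "S"), ("NP", "NP"), ("VP", "VP"), ("N", "N"),
    ("S", "NP"), ("S", "det"), ("S", "the"),
    ("NP", "det"), ("NP", "the"), ("NP", "PN"),
    ("VP", "IV"), ("VP", "TV"), ("VP", "died"), ("VP", "plant"),
    ("N", "plant"),
}

def allowed(goal, category):
    """Look-ahead filter: reduce to `category` only if it can be
    a left corner of the current goal."""
    return (goal, category) in LINK

# After 'the' is recognized as det, the open goal is N (from NP -> det N):
print(allowed("N", "N"))    # True  -> 'plant' may be reduced to N
print(allowed("N", "TV"))   # False -> the TV reading is filtered out
```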
slide-183
SLIDE 183

Comparison to other approaches

Conclusion and outlook


slide-185
SLIDE 185

Comparison to other approaches

Conclusion and outlook

  • left-corner diminishes the risk of having to backtrack after a series of wrong moves
  • but: it also combines some of the problems TD and BU have => hardly used in practice


slide-187
SLIDE 187

Comparison to other approaches

Conclusion and outlook

  • left-corner parsing might be a good model for the human parser!

Complexity issues:

Strategy   Left-branching   Center embedding   Right-branching
TD         O(n)             O(n)               O(1)
BU         O(1)             O(n)               O(n)
LC         O(1)             O(n)               O(1)

table taken from Shravan Vasishth's HSP slides

slide-188
SLIDE 188

Questions?

slide-189
SLIDE 189

Bibliography

  • Naumann, Sven and Langer, Hagen. 1994. Parsing. Eine Einführung in die maschinelle Analyse natürlicher Sprache. B.G. Teubner, Stuttgart
  • a very short section from the Grune & Jacobs book
  • http://www.coli.uni-saarland.de/~kris/nlp-with-prolog/html/node53.html
  • Shravan Vasishth's slides for the Human Sentence Processing seminar from last semester

slide-190
SLIDE 190

Thanks for your attention!