
SLIDE 1

Natural Language Processing (CSE 490U): Compositional Semantics

Noah Smith

© 2017 University of Washington
nasmith@cs.washington.edu

March 1, 2017

1 / 56

SLIDE 7

Bridging the Gap between Language and the World

In order to link NL to a knowledge base, we might want to design a formal way to represent meaning. Desiderata for a meaning representation language:

◮ represent the state of the world, i.e., a knowledge base
◮ query the knowledge base (e.g., verify that a statement is true, or answer a question)
◮ handle ambiguity, vagueness, and non-canonical forms
  ◮ "I wanna eat someplace that's close to UW"
  ◮ "something not too spicy"
◮ support inference and reasoning
  ◮ "can Karen eat at Schultzy's?"

Eventually (but not today):

◮ deal with non-literal meanings
◮ expressiveness across a wide range of subject matter

SLIDE 9

A (Tiny) World Model

◮ Domain: Adrian, Brook, Chris, Donald, Schultzy's Sausage, Din Tai Fung, Banana Leaf, American, Chinese, Thai
  a, b, c, d, ss, dtf, bl, am, ch, th
◮ Properties: Din Tai Fung has a long wait, Schultzy's is noisy; Adrian, Brook, and Chris are human
  Longwait = {dtf}, Noisy = {ss}, Human = {a, b, c}
◮ Relations: Schultzy's serves American, Din Tai Fung serves Chinese, and Banana Leaf serves Thai
  Serves = {(ss, am), (dtf, ch), (bl, th)}, Likes = {(a, ss), (a, dtf), . . .}

Simple questions are easy:

◮ Is Schultzy's noisy?
◮ Does Din Tai Fung serve Thai?
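The tiny world model above drops straight into code. Here is a minimal sketch encoding it as Python sets of tuples; the dictionary layout and the `holds` helper are my own, not from the slides:

```python
# The tiny world model as a closed-world knowledge base.
properties = {
    "Longwait": {"dtf"},
    "Noisy": {"ss"},
    "Human": {"a", "b", "c"},
}
relations = {
    "Serves": {("ss", "am"), ("dtf", "ch"), ("bl", "th")},
    "Likes": {("a", "ss"), ("a", "dtf")},
}

def holds(pred, *args):
    """Check a ground atom against the model (absent means false)."""
    if len(args) == 1:
        return args[0] in properties.get(pred, set())
    return tuple(args) in relations.get(pred, set())

print(holds("Noisy", "ss"))          # Is Schultzy's noisy? → True
print(holds("Serves", "dtf", "th"))  # Does Din Tai Fung serve Thai? → False
```

The closed-world assumption (anything not listed is false) is what makes such simple questions trivially answerable.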

SLIDE 10

A Quick Tour of First-Order Logic

◮ Term: a constant (ss) or a variable
◮ Formula: defined inductively . . .
  ◮ If R is an n-ary relation and t1, . . . , tn are terms, then R(t1, . . . , tn) is a formula.
  ◮ If φ is a formula, then its negation, ¬φ, is a formula.
  ◮ If φ and ψ are formulas, then binary logical connectives can be used to create formulas: φ ∧ ψ, φ ∨ ψ, φ ⇒ ψ, φ ⊕ ψ
  ◮ If φ is a formula and v is a variable, then quantifiers can be used to create formulas:
    ◮ Universal quantifier: ∀v, φ
    ◮ Existential quantifier: ∃v, φ

Note: Leaving out functions, because we don't need them in a single lecture on FOL for NL.
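Because the model is finite, the two quantifiers can be evaluated by exhaustive enumeration over the domain. A hedged sketch (the helper names and the particular `likes` facts are mine):

```python
# Evaluating quantified formulas over the finite domain by enumeration.
domain = {"a", "b", "c", "d", "ss", "dtf", "bl", "am", "ch", "th"}
human = {"a", "b", "c"}
likes = {("a", "ss"), ("a", "dtf")}

def exists(phi):
    """∃v, φ: true if φ holds for some element of the domain."""
    return any(phi(x) for x in domain)

def forall(phi):
    """∀v, φ: true if φ holds for every element of the domain."""
    return all(phi(x) for x in domain)

# ∃x, Human(x) ∧ Likes(x, dtf)
print(exists(lambda x: x in human and (x, "dtf") in likes))    # → True
# ∀x, Human(x) ⇒ Likes(x, ss)  (implication as ¬Human(x) ∨ Likes(x, ss))
print(forall(lambda x: x not in human or (x, "ss") in likes))  # → False
```

Enumeration is only sound for finite models like this one; general FOL inference needs a theorem prover.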

SLIDE 17

Translating Between FOL and NL

1. Schultzy's is not loud
   ¬Noisy(ss)
2. Some human likes Chinese
   ∃x, Human(x) ∧ Likes(x, ch)
3. If a person likes Thai, then they aren't friends with Donald
   ∀x, Human(x) ∧ Likes(x, th) ⇒ ¬Friends(x, d)
4. ∀x, Restaurant(x) ⇒ (Longwait(x) ∨ ¬Likes(a, x))
   Every restaurant has a long wait or is disliked by Adrian.
5. ∀x, ∃y, ¬Likes(x, y)
   Everybody has something they don't like.
6. ∃y, ∀x, ¬Likes(x, y)
   There exists something that nobody likes.
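Sentences 5 and 6 differ only in quantifier order, and they are not equivalent. A quick finite-model check makes the difference concrete (the toy `likes` relation here is my own, chosen to separate the two formulas):

```python
# ∀x,∃y,¬Likes(x,y) vs. ∃y,∀x,¬Likes(x,y) on a toy model.
people = {"a", "b"}
things = {"ss", "dtf"}
# a likes dtf only; b likes ss only.
likes = {("a", "dtf"), ("b", "ss")}

# Everybody has something they don't like.
forall_exists = all(any((x, y) not in likes for y in things) for x in people)
# There exists something that nobody likes.
exists_forall = any(all((x, y) not in likes for x in people) for y in things)

print(forall_exists)  # → True  (a dislikes ss; b dislikes dtf)
print(exists_forall)  # → False (everything is liked by someone)
```

So ∃y,∀x entails ∀x,∃y but not vice versa; this model witnesses the failure of the converse.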

SLIDE 18

Logical Semantics

(Montague, 1970)

The denotation of an NL sentence is the set of conditions that must hold in the (model) world for the sentence to be true.

"Every restaurant has a long wait or Adrian doesn't like it." is true if and only if
∀x, Restaurant(x) ⇒ (Longwait(x) ∨ ¬Likes(a, x)) is true.

This is sometimes called the logical form of the NL sentence.

SLIDE 21

The Principle of Compositionality

The meaning of an NL phrase is determined by the meanings of its sub-phrases. That is, semantics is derived from syntax. We need a way to express the semantics of phrases, and to compose them together!

SLIDES 22-27

λ-Calculus

(Much more powerful than what we'll see today; ask your PL professor!) Informally, two extensions:

◮ λ-abstraction is another way to "scope" variables.
  ◮ If φ is a FOL formula and v is a variable, then λv.φ is a λ-term, meaning: an unnamed function from values (of v) to formulas (usually involving v).
◮ Application of such functions: if we have λv.φ and ψ, then [λv.φ](ψ) is a formula.
  ◮ It can be reduced by substituting ψ in for every instance of v in φ.

Examples:

◮ λx.Likes(x, dtf) maps things to statements that they like Din Tai Fung.
◮ [λx.Likes(x, dtf)](c) reduces to Likes(c, dtf).
◮ λx.λy.Friends(x, y) maps things x to maps of things y to statements that x and y are friends.
◮ [λx.λy.Friends(x, y)](b) reduces to λy.Friends(b, y).
◮ [[λx.λy.Friends(x, y)](b)](a) reduces to [λy.Friends(b, y)](a), which reduces to Friends(b, a).
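These reductions can be mimicked with Python closures, treating each λ-term as a function that builds a formula string (the string encoding is my own, not from the slides):

```python
# λx.Likes(x, dtf) and λx.λy.Friends(x, y) as Python lambdas over strings.
likes_dtf = lambda x: f"Likes({x}, dtf)"
friends = lambda x: lambda y: f"Friends({x}, {y})"  # currying = nested λs

print(likes_dtf("c"))     # [λx.Likes(x, dtf)](c) → Likes(c, dtf)
print(friends("b")("a"))  # [[λx.λy.Friends(x, y)](b)](a) → Friends(b, a)
```

Function application in the host language plays the role of β-reduction: each call substitutes the argument for the bound variable.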

SLIDE 28

Semantic Attachments to CFG

◮ NNP → Adrian {a}
◮ VBZ → likes {λf.λy.∀x, f(x) ⇒ Likes(y, x)}
◮ JJ → expensive {λx.Expensive(x)}
◮ NNS → restaurants {λx.Restaurant(x)}
◮ NP → NNP {NNP.sem}
◮ NP → JJ NNS {λx.JJ.sem(x) ∧ NNS.sem(x)}
◮ VP → VBZ NP {VBZ.sem(NP.sem)}
◮ S → NP VP {VP.sem(NP.sem)}

SLIDE 29

Example

[S [NP [NNP Adrian]] [VP [VBZ likes] [NP [JJ expensive] [NNS restaurants]]]]

SLIDE 30

Example

[S : VP.sem(NP.sem)
  [NP : NNP.sem [NNP : a Adrian]]
  [VP : VBZ.sem(NP.sem)
    [VBZ : . . . likes]
    [NP : λv.JJ.sem(v) ∧ NNS.sem(v)
      [JJ : λz.Expensive(z) expensive]
      [NNS : λw.Restaurant(w) restaurants]]]]

SLIDE 31

Example

[S : VP.sem(NP.sem)
  [NP : NNP.sem [NNP : a Adrian]]
  [VP : VBZ.sem(NP.sem)
    [VBZ : . . . likes]
    [NP : λv.Expensive(v) ∧ Restaurant(v)
      [JJ : λz.Expensive(z) expensive]
      [NNS : λw.Restaurant(w) restaurants]]]]

λv. [λz.Expensive(z)](v) ∧ [λw.Restaurant(w)](v)   (JJ.sem and NNS.sem applied to v)
reduces to λv.Expensive(v) ∧ Restaurant(v)

SLIDE 32

Example

[VP : VBZ.sem(NP.sem)
  [VBZ : λf.λy.∀x, f(x) ⇒ Likes(y, x) likes]
  [NP : λv.Expensive(v) ∧ Restaurant(v) expensive restaurants]]

SLIDE 33

Example

[VP : λy.∀x, Expensive(x) ∧ Restaurant(x) ⇒ Likes(y, x)
  [VBZ : λf.λy.∀x, f(x) ⇒ Likes(y, x) likes]
  [NP : λv.Expensive(v) ∧ Restaurant(v) expensive restaurants]]

[λf.λy.∀x, f(x) ⇒ Likes(y, x)](λv.Expensive(v) ∧ Restaurant(v))   (VBZ.sem applied to NP.sem)
= λy.∀x, [λv.Expensive(v) ∧ Restaurant(v)](x) ⇒ Likes(y, x)
= λy.∀x, Expensive(x) ∧ Restaurant(x) ⇒ Likes(y, x)

SLIDE 34

Example

[S : VP.sem(NP.sem)
  [NP : NNP.sem [NNP : a Adrian]]
  [VP : λy.∀x, Expensive(x) ∧ Restaurant(x) ⇒ Likes(y, x) likes expensive restaurants]]

SLIDE 35

Example

[S : VP.sem(NP.sem)
  [NP : a [NNP : a Adrian]]
  [VP : λy.∀x, Expensive(x) ∧ Restaurant(x) ⇒ Likes(y, x) likes expensive restaurants]]

SLIDE 36

Example

[S : ∀x, Expensive(x) ∧ Restaurant(x) ⇒ Likes(a, x)
  [NP : a [NNP : a Adrian]]
  [VP : λy.∀x, Expensive(x) ∧ Restaurant(x) ⇒ Likes(y, x) likes expensive restaurants]]

[λy.∀x, Expensive(x) ∧ Restaurant(x) ⇒ Likes(y, x)](a)   (VP.sem applied to NP.sem)
= ∀x, Expensive(x) ∧ Restaurant(x) ⇒ Likes(a, x)
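The whole derivation of "Adrian likes expensive restaurants" can be replayed with Python closures standing in for the semantic attachments, with formulas built as strings (this string encoding is my own sketch, not the course's implementation):

```python
# Semantic attachments for the tiny grammar, composed bottom-up.
nnp_sem = "a"                          # NNP -> Adrian {a}
jj_sem = lambda x: f"Expensive({x})"   # JJ -> expensive {λx.Expensive(x)}
nns_sem = lambda x: f"Restaurant({x})" # NNS -> restaurants {λx.Restaurant(x)}
# VBZ -> likes {λf.λy.∀x, f(x) ⇒ Likes(y, x)}
vbz_sem = lambda f: lambda y: f"∀x, {f('x')} ⇒ Likes({y}, x)"

np_sem = lambda x: f"{jj_sem(x)} ∧ {nns_sem(x)}"  # NP -> JJ NNS
vp_sem = vbz_sem(np_sem)                          # VP -> VBZ NP
s_sem = vp_sem(nnp_sem)                           # S -> NP VP

print(s_sem)  # → ∀x, Expensive(x) ∧ Restaurant(x) ⇒ Likes(a, x)
```

Each grammar rule's attachment is exactly one function application, so the final logical form falls out of the parse tree's structure.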

SLIDE 37

Graph-Based Representations

Abstract Meaning Representation (Banarescu et al., 2013)

[AMR graph for "The boy wants to visit New York City.": a want-01 node with ARG0 boy and ARG1 visit-01; the visit-01 node's ARG0 is the same boy, and its ARG1 is a city node whose name is "New" "York" "City".]

Designed for (1) annotation-ability and (2) eventual use in machine translation.

SLIDE 38

Combinatory Categorial Grammar

(Steedman, 2000)

CCG is a grammatical formalism that is well-suited for tying together syntax and semantics. Formally, it is more powerful than CFG: it can represent some of the context-sensitive languages (which we do not have time to define formally).

SLIDE 39

CCG Types

Instead of the "N" of CFGs, CCGs can have an infinitely large set of structured categories (called types).

◮ Primitive types: typically S, NP, N, and maybe more
◮ Complex types, built with "slashes," for example:
  ◮ S/NP is "an S, except that it lacks an NP to its right"
  ◮ S\NP is "an S, except that it lacks an NP to its left"
  ◮ (S\NP)/NP is "an S, except that it lacks an NP to its right, and another to its left"

You can think of complex types as functions, e.g., S/NP maps NPs to Ss.

SLIDE 40

CCG Combinators

Instead of the production rules of CFGs, CCGs have a very small set of generic combinators that tell us how we can put types together. Convention writes the rule differently from CFG: X Y ⇒ Z means that X and Y combine to form a Z (the "parent" in the tree).

SLIDES 41-44

Application Combinator

Forward (X/Y Y ⇒ X) and backward (Y X\Y ⇒ X)

Examples:

◮ "the dog": the := NP/N and dog := N combine (forward) into NP.
◮ "the yellow dog": yellow := N/N and dog := N combine (forward) into N; then the := NP/N applies forward, giving NP.
◮ "the dog bit John": bit := (S\NP)/NP and John := NP combine (forward) into bit John := S\NP; with the dog := NP on its left, backward application gives S.
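The two application combinators reduce to simple suffix checks if categories are written as strings. A deliberately naive sketch (representation and function names are mine; it ignores parenthesization subtleties in complex argument categories):

```python
# Forward and backward application over string-encoded CCG categories.
def forward_apply(left, right):
    """X/Y combined with Y yields X; None if the rule doesn't apply."""
    if left.endswith("/" + right):
        return left[: -len("/" + right)]
    return None

def backward_apply(left, right):
    """Y combined with X\\Y yields X; None if the rule doesn't apply."""
    if right.endswith("\\" + left):
        return right[: -len("\\" + left)]
    return None

print(forward_apply("NP/N", "N"))         # the + dog → NP
print(forward_apply("(S\\NP)/NP", "NP"))  # bit + John → (S\NP)
print(backward_apply("NP", "S\\NP"))      # the dog + bit John → S
```

Returning `None` when a combinator doesn't fire is what drives search in a real CCG parser: only licensed combinations enter the chart.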

SLIDES 45-47

Conjunction Combinator

X and X ⇒ X

Examples:

◮ "cats and dogs": cats := NP and dogs := NP conjoin into NP.
◮ "John ate anchovies and drank beer": ate anchovies := S\NP and drank beer := S\NP (each by forward application) conjoin into S\NP; with John := NP, backward application gives S.
◮ "the dog bit and infected John": bit := (S\NP)/NP and infected := (S\NP)/NP conjoin into (S\NP)/NP; forward application to John := NP gives S\NP; with the dog := NP, backward application gives S.

SLIDES 48-49

Composition Combinator

Forward (X/Y Y/Z ⇒ X/Z) and backward (Y\Z X\Y ⇒ X\Z)

Example, "I would prefer olives", two derivations of the verb phrase:

◮ With forward composition: would := (S\NP)/(S\NP) composes with prefer := (S\NP)/NP into would prefer := (S\NP)/NP, which applies to olives := NP, giving S\NP.
◮ With application only: prefer := (S\NP)/NP applies to olives := NP, giving S\NP; then would := (S\NP)/(S\NP) applies to it, giving S\NP.

Either way, I := NP combines backward with the S\NP to give S.

SLIDE 50

Type-Raising Combinator

Forward (X ⇒ Y/(Y\X)) and backward (X ⇒ Y\(Y/X))

Example, "I love and Karen hates chocolate": I := NP type-raises to S/(S\NP) and composes with love := (S\NP)/NP into S/NP; likewise Karen hates := S/NP; the two conjoin into S/NP, which applies to chocolate := NP, giving S.

SLIDE 51

Back to Semantics

Each combinator also tells us what to do with the semantic attachments.

◮ Forward application: X/Y : f   Y : g ⇒ X : f(g)
◮ Forward composition: X/Y : f   Y/Z : g ⇒ X/Z : λx.f(g(x))
◮ Forward type-raising: X : g ⇒ Y/(Y\X) : λf.f(g)
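These three semantic rules can be sketched directly as higher-order functions, with denotations as Python callables and strings (the encoding, the toy `likes` denotation, and the ¬ operator below are my own illustrations):

```python
# The semantic side of the three forward combinators.
def application(f, g):  # X/Y : f, Y : g  =>  X : f(g)
    return f(g)

def composition(f, g):  # X/Y : f, Y/Z : g  =>  X/Z : λx.f(g(x))
    return lambda x: f(g(x))

def type_raise(g):      # X : g  =>  Y/(Y\X) : λf.f(g)
    return lambda f: f(g)

likes = lambda y: lambda x: f"Likes({x}, {y})"  # transitive-verb denotation
vp = application(likes, "dtf")                  # "likes Din Tai Fung"
print(application(type_raise("c"), vp))         # type-raised subject → Likes(c, dtf)

neg = lambda s: f"¬{s}"                         # toy one-place modifier
print(composition(neg, vp)("c"))                # composed, then applied → ¬Likes(c, dtf)
```

Note that the type-raised subject takes the verb phrase as its argument, not vice versa, yet the same formula results; that is exactly why type-raising is semantically safe.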

SLIDE 52

CCG Lexicon

Most of the work is done in the lexicon! Syntactic and semantic information is much more formal here.

◮ Slash categories define where all the syntactic arguments are expected to be
◮ λ-expressions define how the expected arguments get "used" to build up a FOL expression

Extensive discussion: Carpenter (1997)

SLIDE 53

Some Topics We Don't Have Time For

◮ Tasks, evaluations, annotated datasets (e.g., CCGbank; Hockenmaier and Steedman, 2007)
◮ Learning for semantic parsing (Zettlemoyer and Collins, 2005) and CCG parsing (Clark and Curran, 2004a)
◮ Using CCG to represent other kinds of semantics (e.g., predicate-argument structures; Lewis and Steedman, 2014)
◮ Integrating continuous representations in semantic parsing (Lewis and Steedman, 2013; Krishnamurthy and Mitchell, 2013)
◮ Supertagging (Clark and Curran, 2004b) and making semantic parsing efficient (Lewis and Steedman, 2014)

SLIDE 54

To-Do List

◮ Compositional semantics chapter in Jurafsky and Martin (2008).

SLIDE 55

References I

Laura Banarescu, Claire Bonial, Shu Cai, Madalina Georgescu, Kira Griffitt, Ulf Hermjakob, Kevin Knight, Philipp Koehn, Martha Palmer, and Nathan Schneider. Abstract meaning representation for sembanking. In Proc. of the Linguistic Annotation Workshop and Interoperability with Discourse, 2013.

Bob Carpenter. Type-Logical Semantics. MIT Press, 1997.

Stephen Clark and James R. Curran. Parsing the WSJ using CCG and log-linear models. In Proc. of ACL, 2004a.

Stephen Clark and James R. Curran. The importance of supertagging for wide-coverage CCG parsing. In Proc. of COLING, 2004b.

Julia Hockenmaier and Mark Steedman. CCGbank: a corpus of CCG derivations and dependency structures extracted from the Penn Treebank. Computational Linguistics, 33(3):355–396, 2007.

Daniel Jurafsky and James H. Martin. Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition. Prentice Hall, second edition, 2008.

Jayant Krishnamurthy and Tom Mitchell. Vector space semantic parsing: A framework for compositional vector space models. 2013.

Mike Lewis and Mark Steedman. Combining distributional and logical semantics. Transactions of the Association for Computational Linguistics, 1:179–192, 2013.

Mike Lewis and Mark Steedman. A* CCG parsing with a supertag-factored model. In Proc. of EMNLP, 2014.

SLIDE 56

References II

Richard Montague. Universal grammar. Theoria, 36:373–398, 1970.

Mark Steedman. The Syntactic Process. MIT Press, 2000.

Luke Zettlemoyer and Michael Collins. Learning to map sentences to logical form: Structured classification with probabilistic categorial grammars. In Proc. of UAI, 2005.