Scott Wen-tau Yih - PowerPoint PPT Presentation


SLIDE 1

Scott Wen-tau Yih

SLIDE 2
SLIDE 3
SLIDE 4

Knowledge Base

πœ‡π‘¦. sister_of(justin_bieber, 𝑦) Who is Justin Bieber’s sister? πœ‡π‘¦. sibling_of(justin_bieber, x) ∧ gender(x, female) semantic parsing query matching Jazmyn Bieber

SLIDE 5

Knowledge Base

Who is Justin Bieber's sister?
  → (semantic parsing) λx. sibling_of(justin_bieber, x) ∧ gender(x, female)
  → (query) Jazmyn Bieber

SLIDE 6

"What was the date that Minnesota became a state?"
"When was the state Minnesota created?"
"Minnesota's date it entered the union?"
  → location.dated_location.date_founded

SLIDE 7

Basic idea

Directly grow the query graph, in stages.

SLIDE 8

Addresses Key Challenges

52.5 (Avg. F1 on the WebQuestions test set)

SLIDE 9
  • Introduction
  • Background
  • Graph knowledge base
  • Query graph
SLIDE 10
  • Family Guy

[Figure: Freebase subgraph around Family Guy: CVT nodes cvt1, cvt2, cvt3 connect the series to Meg Griffin and to voice actors Lacey Chabert (from 1/31/1999) and Mila Kunis (from 12/26/1999).]

SLIDE 11

[Figure: query graph for "Who first voiced Meg on Family Guy?": Family Guy -cast-> y -actor-> x, with constraint nodes Meg Griffin and argmin attached to y. Labeled parts: topic entity, core inferential chain, constraints.]
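One plausible way to represent such a query graph in code; the class and field names below are my own, not from the talk.

```python
from dataclasses import dataclass, field

# Hypothetical representation of a query graph: a topic entity,
# a core inferential chain of predicates leading to the answer x,
# and a list of constraint nodes.
@dataclass
class QueryGraph:
    topic_entity: str
    core_chain: list[str]
    constraints: list[tuple[str, str]] = field(default_factory=list)

    def describe(self) -> str:
        chain = " -> ".join([self.topic_entity, *self.core_chain, "x"])
        cons = "; ".join(f"{k}={v}" for k, v in self.constraints)
        return f"{chain} [{cons}]" if cons else chain

g = QueryGraph(
    topic_entity="Family Guy",
    core_chain=["cast", "actor"],
    constraints=[("character", "Meg Griffin"), ("argmin", "from")],
)
print(g.describe())
# Family Guy -> cast -> actor -> x [character=Meg Griffin; argmin=from]
```

Such a structure can later be translated mechanically into a graph query (e.g., SPARQL-style basic graph patterns) against the knowledge base.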

SLIDE 12
  • Introduction
  • Background
  • Staged Query Graph Generation (Our Approach)
  • Link topic entity
  • Identify core inferential chain
  • Augment constraints
SLIDE 13

Staged Query Graph Generation: (1) Link Topic Entity

[Figure: from the empty state s0 (φ), linking the question mentions "Meg" and "Family Guy" yields candidate topic-entity states s1 = Family Guy and s2 = Meg Griffin.]

SLIDE 14

Staged Query Graph Generation: (2) Identify Core Inferential Chain

[Figure: from s1 (Family Guy), candidate chain states: s3 = Family Guy -cast-> y -actor-> x; s4 = Family Guy -writer-> y -start-> x; s5 = Family Guy -genre-> x.]

SLIDE 15

Staged Query Graph Generation: (3) Augment Constraints

[Figure: from s3 (Family Guy -cast-> y -actor-> x), constraint states: s6 adds the character constraint Meg Griffin on y; s7 adds both Meg Griffin and argmin on y.]
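The three stages can be sketched as successive expansions of a partial state. This is my own simplification; it omits the learned scoring function that ranks the candidate states.

```python
# Illustrative sketch of the three stages of staged query graph
# generation; not the paper's actual code, and the scoring/reward
# model that ranks states is omitted.

def link_topic_entity(question, lexicon):
    # Stage 1: one candidate state per entity whose mention occurs
    # in the question (lexicon maps mention -> KB entity).
    return [ent for mention, ent in lexicon.items()
            if mention.lower() in question.lower()]

def identify_core_chains(topic, chains_by_entity):
    # Stage 2: candidate predicate chains from the topic entity to x.
    return [(topic, chain) for chain in chains_by_entity.get(topic, [])]

def augment_constraints(state, candidate_constraints):
    # Stage 3: optionally attach constraint nodes (entity or argmin).
    topic, chain = state
    return [(topic, chain, c) for c in candidate_constraints]

question = "Who first voiced Meg on Family Guy?"
lexicon = {"Meg": "Meg Griffin", "Family Guy": "Family Guy"}
chains = {"Family Guy": [("cast", "actor"), ("writer", "start"), ("genre",)]}

s1_s2 = link_topic_entity(question, lexicon)          # two entity candidates
s3_s5 = identify_core_chains("Family Guy", chains)    # three chain candidates
s6_s7 = augment_constraints(s3_s5[0], ["character=Meg Griffin", "argmin=from"])
```

Because each stage only extends the best-scoring partial graphs, the search space stays small even though the space of full query graphs is large.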

SLIDE 16

[Figure: topic-entity states: s0 = φ (empty), s1 = Family Guy, s2 = Meg Griffin.]

SLIDE 17

[Figure: core-chain states: s3 = Family Guy -cast-> y -actor-> x; s4 = Family Guy -writer-> y -start-> x; s5 = Family Guy -genre-> x.]

Who first voiced Meg on Family Guy? → candidate chains {cast-actor, writer-start, genre}

SLIDE 18
  • Input is mapped to two l-dimensional vectors: z_Q ∈ ℝ^l and z_S ∈ ℝ^l

P(S | Q) = exp(cos(z_S, z_Q)) / Σ_{S'} exp(cos(z_{S'}, z_Q))

[Figure: CNN architecture mapping the question pattern "who voiced meg on" and the relation cast-actor to vectors: letter-trigram word hashing layer (15K) → convolutional layer (1000) → max pooling → semantic layer (300), over the token sequence <s> w1 w2 ... wT </s>.]
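Assuming the CNN has already produced the question and relation embeddings, the scoring is a softmax over cosine similarities. A numeric sketch, with made-up low-dimensional vectors standing in for the 300-d semantic-layer outputs:

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def p_s_given_q(z_q, candidates):
    # Softmax over cosine similarities between the question vector
    # and each candidate chain vector, i.e. P(S | Q).
    sims = {name: math.exp(cosine(z_q, z)) for name, z in candidates.items()}
    total = sum(sims.values())
    return {name: v / total for name, v in sims.items()}

# Made-up 3-d embeddings for illustration only.
z_q = [0.9, 0.1, 0.2]
candidates = {
    "cast-actor":   [0.8, 0.2, 0.1],
    "writer-start": [0.1, 0.9, 0.3],
    "genre":        [0.2, 0.1, 0.9],
}
probs = p_s_given_q(z_q, candidates)
best = max(probs, key=probs.get)  # 'cast-actor'
```

With these stand-in vectors, cast-actor is closest to the question vector and receives the highest probability, matching the intended ranking for "who voiced meg on ...".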

SLIDE 19
[Figure: "Who voiced Meg on Family Guy?" with the core chain FamilyGuy -cast-> z -actor-> y.]

  • One or more constraint nodes can be added to z or y
  • z: an additional property of the event (e.g., character(z, MegGriffin))
  • y: an additional property of the answer entity (e.g., gender)

[Figure: s3 = Family Guy -cast-> y -actor-> x; s6 adds the constraint Meg Griffin; s7 adds Meg Griffin and argmin.]
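Concretely, the s7-style constraints can be read as a filter plus an argmin over the cast records. The toy data below mirrors the slides' Family Guy figure; everything else is my own illustration.

```python
from datetime import date

# Toy cast CVT records mirroring the slides' figure:
# (actor, character, from_date).
cast = [
    ("Lacey Chabert", "Meg Griffin", date(1999, 1, 31)),
    ("Mila Kunis",    "Meg Griffin", date(1999, 12, 26)),
]

# Entity constraint: keep entries whose character is Meg Griffin.
meg_entries = [e for e in cast if e[1] == "Meg Griffin"]
# argmin constraint: earliest from_date answers "who FIRST voiced Meg".
first = min(meg_entries, key=lambda e: e[2])
print(first[0])  # Lacey Chabert
```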

SLIDE 20

Who first voiced Meg on Family Guy?

[Figure: s3 = Family Guy -cast-> y -actor-> x; s4 = Family Guy -writer-> y -start-> x.]

SLIDE 21

Who first voiced Meg on Family Guy?

[Figure: s3 = Family Guy -cast-> y -actor-> x; s7 = the same chain with constraints Meg Griffin and argmin on y.]

SLIDE 22

π‘Ÿ =Who first voiced Meg on Family Guy?

Family Guy

cast

Meg Griffin

argmin

x y

𝑑 =

SLIDE 23
  • Introduction
  • Background
  • Staged Query Graph Generation (Our Approach)
  • Experiments
  • Data & evaluation metric
  • Creating training data from Q/A pairs
  • Results
SLIDE 24
  • What character did Natalie Portman play in Star Wars?

Padme Amidala

  • What currency do you use in Costa Rica?

Costa Rican colon

  • What did Obama study in school?

political science

  • What do Michelle Obama do for a living?

writer, lawyer

  • What killed Sammy Davis Jr?

throat cancer

[Examples from Berant]

SLIDE 25

Relation Matching (Identifying Core Inferential Chain)

Pattern → Inferential Chain
  what was <e> known for → people.person.profession
  what kind of government does <e> have → location.country.form_of_government
  what year were the <e> established → sports.sports_team.founded
  what city was <e> born in → people.person.place_of_birth
  what did <e> die from → people.deceased_person.cause_of_death
  who married <e> → people.person.spouse_s / people.marriage.spouse
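A toy version of this pattern-to-chain lookup. The table entries come from the slide; the matching code is an illustration, not the system's implementation.

```python
# Map surface patterns (with an <e> entity slot) to Freebase
# inferential chains, using entries from the slide's table.
PATTERNS = {
    "what city was <e> born in": "people.person.place_of_birth",
    "what did <e> die from": "people.deceased_person.cause_of_death",
    "what was <e> known for": "people.person.profession",
}

def match_chain(question, entity):
    # Replace the entity mention with the <e> slot, then look up
    # the resulting pattern in the table.
    pattern = question.lower().rstrip("?").replace(entity.lower(), "<e>")
    return PATTERNS.get(pattern)

print(match_chain("What city was Obama born in?", "Obama"))
# people.person.place_of_birth
```

In practice such patterns are one signal among several; the CNN relation matcher above handles paraphrases the pattern table misses.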

SLIDE 26

Reward Function δ
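The slide does not spell out δ here. A natural choice, given that training data is created from Q/A pairs, is the F1 overlap between the answers a candidate query graph retrieves and the gold answers; the sketch below is written under that assumption.

```python
def f1_reward(predicted: set, gold: set) -> float:
    # F1 overlap between the answers retrieved by a candidate query
    # graph and the gold answers from the Q/A pair. Assumed reward;
    # the slide only names the function.
    if not predicted or not gold:
        return 0.0
    tp = len(predicted & gold)
    if tp == 0:
        return 0.0
    precision = tp / len(predicted)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)

print(f1_reward({"Lacey Chabert", "Mila Kunis"}, {"Lacey Chabert"}))
# 0.6666666666666666
```

A graded reward like this lets partially correct query graphs (e.g., the right chain without the argmin constraint) still serve as positive-ish training signal.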

SLIDE 27

  • Avg. F1 (Accuracy) on WebQuestions Test Set

[Chart: Yao-14: 33, Berant-13: 35.7, Bao-14: 37.5, Bordes-14b: 39.2, Berant-14: 39.9, Yang-14: 41.3, Yao-15: 44.3, Wang-14: 45.3, Yih-15: 52.5]

SLIDE 28

Entity linking: Method / #Entities / Covered Ques. / Labeled Ent. / end-to-end Avg. F1
  Freebase API: 19,485 / 98.8% / 81.2% / 48.4%
  Yang & Chang, ACL-15: 9,147 / 99.8% / 87.8% / 52.5%

SLIDE 29

[Chart: 49.6 vs. 52.5 (Avg. F1)]

SLIDE 30

A random sample of 100 incorrectly answered questions

SLIDE 31

Directly grows the query graph, in stages.

SLIDE 32

http://aka.ms/sent2vec http://aka.ms/codalab-webq http://aka.ms/stagg