
The Generation of Referring Expressions: Where We've Been, How We Got Here, and Where We're Going - PowerPoint PPT Presentation



1. The Generation of Referring Expressions: Where We've Been, How We Got Here, and Where We're Going. Robert Dale, Robert.Dale@mq.edu.au. Athens, 2008-05-21.

2. The Aims of This Talk • To outline what referring expression generation is about • To characterise the current state of the art and developments in the field • To outline an agenda for future work in the area

3. Outline • The Context: Natural Language Generation • The Story So Far: Algorithm Development to Empiricism • Challenges for the Future

4. The Context • Natural Language Generation is concerned with generating linguistic material from some non-linguistic base • Why is this important? – Applications: any situation where it is not practical to construct the full range of required outputs ahead of time – Theory: understanding what drives choice-making in language

5. Natural Language Generation Applications • Generating text from large data sets: weather reports, stock market reports • Information personalisation: tailored web pages that take account of what you know • Context-sensitive generation: dynamic utterance construction in dialog systems • Multilingual generation: multiple languages from a common knowledge source

6. NL Understanding vs NL Generation • The view from Natural Language Understanding: deriving meaning from text means throwing away or ignoring irrelevant detail • The view from Natural Language Generation: very few, if any, surface variations are meaningless; we need to explain their function if we are to understand them properly

7. Mapping Between Representations: NLU • Both "John kissed Mary." and "Mary was kissed by John." map to the same semantic representation: [e, x, y] kissing(e) ∧ past(e) ∧ name(x, John) ∧ name(y, Mary) ∧ agent(e, x) ∧ patient(e, y)

8. Mapping Between Representations: NLG • Going in the other direction, choosing between "John kissed Mary." and "Mary was kissed by John." requires additional information in the representation, such as ... ∧ focus(x) versus ... ∧ focus(y)
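To make the contrast concrete, here is a minimal sketch (mine, not from the talk; the function and variable names are assumptions) of how a focus feature can drive the active/passive choice illustrated on slides 7 and 8:

    # Hypothetical illustration: choose voice according to which entity is in focus.
    def realise_kissing_event(agent: str, patient: str, focus: str) -> str:
        """Realise kissing(e) & past(e) & agent(e, agent) & patient(e, patient),
        putting the focused entity in subject position."""
        if focus == agent:
            return f"{agent} kissed {patient}."          # active voice: agent in focus
        if focus == patient:
            return f"{patient} was kissed by {agent}."   # passive voice: patient in focus
        raise ValueError("focus must be either the agent or the patient")

    print(realise_kissing_event("John", "Mary", focus="John"))  # John kissed Mary.
    print(realise_kissing_event("John", "Mary", focus="Mary"))  # Mary was kissed by John.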

9. The NLGer's Position • If we understand how and why texts are put together the way they are, we will be in a better position to take them apart • Generation provides insights that should improve: – Information extraction: working out what parts of a text are important – Text summarisation: working out how to replace incomplete references in extracted material – Machine translation: making choices that are appropriate to context

10. An Architecture for Generation • Document Planning: content determination, text structuring • Microplanning: lexicalisation, aggregation, referring expression generation • Surface Realization: syntax, morphology, orthography and prosody
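As a rough orientation to the pipeline on slide 10, the following sketch (function names and stubbed bodies are my assumptions, not from the talk) shows the three stages composed end to end:

    # Hypothetical three-stage NLG pipeline: document planning -> microplanning -> realization.
    def document_planning(knowledge_base):
        """Content determination and text structuring: decide what to say.
        Stubbed here to return a fixed list of messages."""
        return [("owns", "m", "j1"), ("wears", "m", "j1", "d1")]

    def microplanning(document_plan):
        """Lexicalisation, aggregation and referring expression generation:
        decide how to say it. Stubbed to return fixed sentence plans."""
        return ["Matt owns a white jacket", "he wears it on Sundays"]

    def surface_realization(sentence_plans):
        """Syntax, morphology and orthography: produce the final text."""
        return " ".join(s[0].upper() + s[1:] + "." for s in sentence_plans)

    print(surface_realization(microplanning(document_planning({}))))
    # Matt owns a white jacket. He wears it on Sundays.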

11. Referring Expression Generation • Input propositions: owns(m, j1), wears(m, j1, d1) • The referring expression generator draws on: – a Domain Model: what there is in the world – a Discourse Model: what has been talked about – a User Model: what the hearer knows about • Output NP semantics: isa(j1, jacket) ∧ colour(j1, white)
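The slide's inputs and output can be pictured with the following data sketch; the concrete representations are my assumptions, and only the propositions and the NP semantics come from the slide:

    # Hypothetical representations for the generator's knowledge sources.
    input_propositions = [("owns", "m", "j1"), ("wears", "m", "j1", "d1")]

    domain_model = {                          # what there is in the world
        "m":  {"type": "person", "name": "Matt"},
        "j1": {"type": "jacket", "colour": "white"},
    }
    discourse_model = {"mentioned": []}       # what has been talked about so far
    user_model = {"hearer_knows": {"person", "jacket"}}  # what the hearer knows about

    # Target output for the first mention of j1: the NP semantics
    # isa(j1, jacket) ∧ colour(j1, white), later realised as "a white jacket".
    np_semantics = {"isa": "jacket", "colour": "white"}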

12. The Effect of Discourse Context on Reference • Example 1: owns(m, j1) → Matt owns a white jacket. wears(m, j1, d) → He wears it on Sundays. • Example 2: owns(m, [j1+c1]) → Matt owns a white jacket and a white coat. wears(m, j1, d) → He wears the jacket on Sundays. • Example 3: owns(m, [j1+j2]) → Matt owns a white jacket and a blue jacket. wears(m, j1, d) → He wears the white one on Sundays. (The owns propositions differ, but the wears proposition is the same in each case; the discourse context determines the form of reference.)

13. Outline • The Context: Natural Language Generation • The Story So Far: Algorithm Development to Empiricism • Challenges for the Future

14. The Consensus Problem Statement • The goal: generate a distinguishing description • Given: – an intended referent; – a knowledge base of entities characterised by properties expressed as attribute–value pairs; and – a context consisting of other entities that are salient • Then: choose a set of attribute–value pairs that uniquely identify the intended referent
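One way to ground the problem statement is a small sketch in the spirit of incremental attribute selection; it is an illustration under my own assumptions, not the specific algorithms the talk goes on to survey:

    # Hypothetical distinguishing-description generator: pick attribute-value
    # pairs that rule out every salient distractor.
    def distinguishing_description(referent, entities, salient, preferred_attrs):
        distractors = {e for e in salient if e != referent}
        description = {}
        for attr in preferred_attrs:
            value = entities[referent].get(attr)
            if value is None:
                continue
            ruled_out = {d for d in distractors if entities[d].get(attr) != value}
            if ruled_out:                      # only keep properties that help
                description[attr] = value
                distractors -= ruled_out
            if not distractors:                # referent is uniquely identified
                return description
        return None                            # no distinguishing description exists

    entities = {
        "j1": {"type": "jacket", "colour": "white"},
        "j2": {"type": "jacket", "colour": "blue"},
        "c1": {"type": "coat",   "colour": "white"},
    }

    # Slide 12, Example 2: white jacket vs white coat -> "the jacket"
    print(distinguishing_description("j1", entities, {"j1", "c1"}, ["type", "colour"]))
    # {'type': 'jacket'}

    # Slide 12, Example 3: white jacket vs blue jacket -> "the white one"
    print(distinguishing_description("j1", entities, {"j1", "j2"}, ["type", "colour"]))
    # {'colour': 'white'}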

