History and goals of NLU; course plan and goals. Bill MacCartney and Christopher Potts. PowerPoint PPT presentation.




SLIDE 1

Goals of NLU The history of NLU Connections with nearby fields NLU present & future Subfields Discourse and context

History and goals of NLU; course plan and goals

Bill MacCartney and Christopher Potts CS 244U: Natural language understanding Jan 10

1 / 25

SLIDE 2

Goals of NLU

A handful of broad goals, ignoring internal tensions:

  • Insights into language and society
  • Insights into computation
  • Insights into human cognition
  • Solve a major sub-problem of Artificial Intelligence
  • Computers that can do our most tedious, dangerous, and/or high-precision language-oriented tasks

SLIDE 3

Technological and cognitive goals

Allen (1987:2): [T]here can be two underlying motivations for building a computational theory. The technological goal is simply to build better computers, and any solution that works would be acceptable. The cognitive goal is to build a computational analog of the human-language-processing mechanism; such a theory would be acceptable only after it had been verified by experiment. [. . . ] Thus, the technological goal cannot be realized without using sophisticated underlying theories that are on the level being developed by theoretical linguists. On the other hand, the present state of knowledge about natural language processing is so preliminary that attempting to build a cognitively correct model is not feasible.

SLIDE 4

What is understanding? Some possible answers

To understand a statement is to

  • determine its truth (perhaps with justification); and/or
  • calculate its entailments; and/or
  • take appropriate action in light of it; and/or
  • translate it accurately into another language; and/or
  • ground it in a cognitively realistic conceptual space; and/or
  • . . .

SLIDE 5

Turing’s (1950) ‘imitation game’ (the Turing test)

Turing replaced “Can machines think?”, which he regarded as “too meaningless to deserve discussion” (p. 442), with the question whether an interrogator could be tricked into thinking that a machine was a human using only conversation (no visuals, no demands for physical performance, etc.). “May not machines carry out something which ought to be described as thinking but which is very different from what a man does? This objection is a very strong one, but at least we can say that if, nevertheless, a machine can be constructed to play the imitation game satisfactorily, we need not be troubled by this objection.” (p. 435)

http://en.wikipedia.org/wiki/Turing_test

SLIDE 7

Objections to the imitation game

Turing considers nine objections. The following have played significant roles in the scientific debate since then:

  • 1. The theological objection (p. 443)

“Thinking is a function of man’s immortal soul. God has given an immortal soul to every man and woman, but not to any other animal or to machines. Hence no animal or machine can think.”

Turing’s reply: we want to allow that God could imbue an elephant with the power to think, so why not a machine?

Proponent: this has no prominent modern proponents, as far as I know, but Chomsky takes the position that, as a matter of usage, we use think only for humans and human-like entities. He adds that this question is as uninteresting as holding birds up as the definitive case of flying and then asking whether jets really fly.

http://www.framingbusiness.net/archives/1366

SLIDE 9

Objections to the imitation game

Turing considers nine objections. The following have played significant roles in the scientific debate since then:

  • 3. The Mathematical Objection (p. 444)

“There are a number of results of mathematical logic which can be used to show that there are limitations to the powers of discrete-state machines.”

Turing’s reply: it hasn’t been shown that the brain doesn’t suffer from the same limitations.

Proponent: Penrose (1990), who calls on inferences from Gödel’s incompleteness results (anticipated by Turing).

SLIDE 11

Objections to the imitation game

Turing considers nine objections. The following have played significant roles in the scientific debate since then:

  • 4. The Argument from Consciousness

The machine must have a rich, human-like cognitive life, and we must be able to verify that.

Turing’s reply: in weak (behavioral) form, this is fine. In strong form, it leads us to question whether other humans can think.

Proponents: Penrose (1990) and Chalmers (1997).

SLIDE 12

Searle’s Chinese Room Argument

The thought experiment (see Searle 1980, 1990; Cole 2009)

Imagine yourself in a room containing a basketful of symbols from a language L that you don’t understand, along with a rule book (written in English) for matching symbols in L with other symbols in L. People outside the room pass you strings of symbols in L, you follow your rules, and pass them back symbols in L. The rule book is so good that the symbols you pass back are indistinguishable from the replies of a native speaker of L. You would pass the Turing test, but (Searle says) no one would say you understand.

SLIDE 14

Searle’s Chinese Room Argument


From Searle’s reply to the Churchlands (Scientific American, January 1990, p. 28):

Computer programs are formal (syntactic). Human minds have mental contents (semantics). [. . . ] That is the principle on which the Chinese room argument works. I emphasize these points here partly because it seems to me the Churchlands [see “Could a Machine Think?” by Paul M. Churchland and Patricia Smith Churchland, page 32] have not quite understood the issues. They think that strong AI is claiming that computers might turn out to think and that I am denying this possibility on commonsense grounds. But that is not the claim of strong AI, and my argument against it has nothing to do with common sense.

I will have more to say about their objections later. Meanwhile I should point out that, contrary to what the Churchlands suggest, the Chinese room argument also refutes any strong-AI claims made for the new parallel technologies that are inspired by and modeled on neural networks. Unlike the traditional von Neumann computer, which proceeds in a step-by-step fashion, these systems have many computational elements that operate in parallel and interact with one another according to rules inspired by neurobiology. Although the results are still modest, these “parallel distributed processing,” or “connectionist,” models raise useful questions about how complex, parallel network systems like those in brains might actually function in the production of intelligent behavior. The parallel, “brainlike” character of the processing, however, is irrelevant to the purely computational aspects of the process. Any function that can be computed on a parallel machine can also be computed on a serial machine. Indeed, because parallel machines are still rare, connectionist programs are usually run on traditional serial machines. Parallel processing, then, does not afford a way around the Chinese room argument.

What is more, the connectionist system is subject even on its own terms to a variant of the objection presented by the original Chinese room argument. Imagine that instead of a Chinese room, I have a Chinese gym: a hall containing many monolingual, English-speaking men. These men would carry out the same operations as the nodes and synapses in a connectionist architecture as described by the Churchlands, and the outcome would be the same as having one man manipulate symbols according to a rule book. No one in the gym speaks a word of Chinese, and there is no way for the system as a whole to learn the meanings of any Chinese words. Yet with appropriate adjustments, the system could give the correct answers to Chinese questions.

There are, as I suggested earlier, interesting properties of connectionist nets that enable them to simulate brain processes more accurately than traditional serial architecture does. But the advantages of parallel architecture for weak AI are quite irrelevant to the issues between the Chinese room argument and strong AI. The Churchlands miss this point when they say that a big enough Chinese gym might have higher-level mental features that emerge from the size and complexity of the system, just as whole brains have mental features that are not had by individual neurons. That is, of course, a possibility, but it has nothing to do with computation. Computationally, serial and parallel systems are equivalent: any computation that can be done in parallel can be done in serial. If the man in the Chinese room is computationally equivalent to both, then if he does not understand Chinese solely by virtue of doing the computations, neither do they. The Churchlands are correct in saying that the original Chinese room argument was designed with traditional AI in mind but wrong in thinking that connectionism is immune to the argument. It applies to any computational system. You can’t get semantically loaded thought contents from formal computations alone, whether they are done in serial or in parallel; that is why the Chinese room argument refutes strong AI in any form.

Many people who are impressed by this argument are nonetheless puzzled about the differences between people and computers. If humans are, at least in a trivial sense, computers, and if humans have a semantics, then why couldn’t we give semantics to other computers? Why couldn’t we program a Vax or a Cray so that it too would have thoughts and feelings? Or why couldn’t some new computer technology overcome the gulf between form and content, between syntax and semantics? What, in fact, are the differences between animal brains and computer systems that enable the Chinese room argument to work against computers but not against brains? The most obvious difference is that the processes that define something as a computer (computational processes) are completely independent of any reference to a specific type of hardware implementation. One could in principle make a computer out of old beer cans strung together with wires and powered by windmills. But when it comes to brains, although science is largely ignorant of how brains function to produce mental states, one is struck by the extreme specificity of the anatomy [. . . ]

Responses:

  • The room understands.
  • There is no reason to believe Searle’s axiom “syntax by itself is neither constitutive of nor sufficient for semantics” (Churchland and Churchland 1990).

SLIDE 15

A worry based on the first Turing Test (Loebner prize)

Shieber (1994) offers an entertaining and insightful overview. Cynthia Clay, the Shakespeare aficionado, was thrice misclassified as a computer. At least one of the judges made her classifications on the premise that “[no] human would have that amount of knowledge about Shakespeare” [26]. Lisette Gozo was honored as the most human of the agents for her discussion of women’s clothing, although one judge rated two computer programs above her. One reporter noted that Weizenbaum himself was “disturbed” by how easily people were fooled by these programs [26], and more than one of the judges reported that they were disappointed in the programs’ capabilities after their expectations had been raised by interacting with ELIZA in the interviewing process. [Egads; see slide 11 —CP]

SLIDE 16

Human-level or complementary to humans?

From ‘The Challenge and Promise of Artificial Intelligence’, with Peter Norvig and Eric Horvitz (Nov 30, 2011; 51 min. in)

http://www.kqed.org/a/radiospecials/R201111302000

Moderator: How far are we away from human intelligence? Just take a gamble.

Peter Norvig: Well, first of all, I object to that, because I think that’s a low target to aim at. [Audience laughs.] Right, ’cause certainly there’s lots of things already that computers are much, much better than people at. [. . . ] We want to be able to say, “What is it that humans can’t do that computers can do better?” Now, part of that may be that the computers want to have some basic competency at the human level in order to interact with us better. But the goal shouldn’t be human-level performance. The goal should be super-human partnership.

SLIDE 17

The history of NLU

Rough overview of the progression:

1. Pattern-matching using a small set of hand-built rules
2. Pattern-matching with large corpora/databases
3. Linguistically rich, logically-informed systems, usually grounded in highly structured databases
4. Statistical models that use mixtures of the above and strive for wide coverage and error tolerance

SLIDE 18

Eliza (Weizenbaum 1966)

1 User: You are like my father in some ways.
2 ELIZA: What resemblance do you see
3 User: You are not very aggressive but I think you don’t want me to notice that.
4 ELIZA: What makes you think I am not aggressive?
5 User: You don’t argue with me.
6 ELIZA: Why do you think I don’t argue with you?
7 User: You are afraid of me.
8 ELIZA: Does it please you to believe I am afraid of you?

Rewrite the user’s reply by (i) swapping first- and second-person expressions, (ii) interpolating stock phrases, and (iii) using scores to rank possible transformations.
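ELIZA’s rewrite-and-rank loop can be sketched in a few lines of Python. The rules, stock phrases, and scores below are invented for illustration (they are not Weizenbaum’s actual script); the pronoun table implements the person swap visible in the dialogue above.

```python
import re

# Minimal ELIZA-style responder. The rules and scores are illustrative,
# NOT Weizenbaum's actual script.

# Person-swapping table applied to the captured part of the utterance
PRONOUN_SWAP = {"i": "you", "me": "you", "my": "your",
                "you": "I", "your": "my", "am": "are"}

# (pattern, stock-phrase template, score); the highest-scoring match wins
RULES = [
    (r"you are (.*)", "What makes you think I am {0}?", 3),
    (r"i feel (.*)",  "Why do you feel {0}?",           2),
    (r"(.*)",         "Please tell me more.",           0),
]

def swap_person(text):
    return " ".join(PRONOUN_SWAP.get(w, w) for w in text.split())

def respond(utterance):
    u = utterance.strip().rstrip(".!?").lower()
    candidates = []   # every applicable transformation, with its score
    for pattern, template, score in RULES:
        m = re.fullmatch(pattern, u)
        if m:
            candidates.append((score, template.format(*map(swap_person, m.groups()))))
    return max(candidates)[1]

print(respond("You are afraid of me."))  # -> What makes you think I am afraid of you?
```

Weizenbaum’s real script language (keywords, decomposition rules, reassembly rules) generalizes this pattern/template/score triple.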

SLIDE 19

STUDENT (Bobrow 1964)

Sample input (p. 54)

“If the number of customers Tom gets is twice the square of 20% of the number of advertisements he runs, and the number of advertisements he runs is 45, what is the number of customers Tom gets?”

Overview of the method

1. Map referential expressions to variables.
2. Use regular expression templates to identify and transform mathematical predications.
3. Let the computer solve the resulting transformed set of statements using a kind of constraint satisfaction.
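Applied to the sample input above, the three steps can be sketched as follows. The phrase templates, variable names, and the crude substitute-and-evaluate solver are illustrative assumptions; Bobrow’s program handled a far wider range of phrasings.

```python
import re

# Toy STUDENT-style pipeline for the sample problem. Illustrative only.

REFERENTS = {"the number of customers tom gets": "x",
             "the number of advertisements he runs": "y"}

def to_equation(sentence):
    """Steps 1-2: variables for referential expressions, then regex
    templates that rewrite mathematical predications as arithmetic."""
    s = sentence.lower().rstrip("?.")
    for phrase, var in REFERENTS.items():
        s = s.replace(phrase, var)
    s = re.sub(r"(\d+)% of (\w+)",
               lambda m: f"({int(m.group(1)) / 100} * {m.group(2)})", s)
    s = re.sub(r"twice the square of (\([^)]*\)|\w+)", r"(2 * \1 ** 2)", s)
    lhs, rhs = s.split(" is ")
    return lhs.strip(), rhs.strip()

def solve(equations):
    """Step 3: bind variables by repeated substitution until every
    equation can be evaluated (a crude constraint-satisfaction loop)."""
    env, pending = {}, list(equations)
    while pending:
        for lhs, rhs in pending:
            try:
                env[lhs] = eval(rhs, {}, env)
            except NameError:   # rhs mentions a still-unknown variable
                continue
            pending.remove((lhs, rhs))
            break
        else:
            raise ValueError("under-constrained problem")
    return env

eqs = [to_equation("The number of customers Tom gets is twice the square "
                   "of 20% of the number of advertisements he runs"),
       to_equation("The number of advertisements he runs is 45")]
print(solve(eqs)["x"])   # -> 162.0
```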

SLIDE 20

Corpus-based approaches

Method

1. Store and index a large collection of texts.
2. (Impose additional structure: tagging, parsing, contextual metadata, etc.)
3. Accept input sentences and match them to relevant sentences using some metric.

For an overview of early examples, see Simmons 1970. This is still how most modern question-answering systems work, including Watson (Ferrucci et al. 2010).
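A minimal sketch of this method, using a TF-IDF-weighted bag-of-words metric over an invented three-sentence corpus. Both the metric and the corpus are illustrative choices, not a description of any deployed system.

```python
import math
from collections import Counter

# Toy corpus, invented for illustration.
CORPUS = [
    "The capital of France is Paris.",
    "Mount Everest is the highest mountain on Earth.",
    "Watson won at Jeopardy! in 2011.",
]

def tokens(text):
    return [w.strip(".,!?").lower() for w in text.split()]

# Steps 1-2: store the texts and impose (minimal) structure: document
# frequencies and IDF-weighted bag-of-words vectors.
DF = Counter(w for doc in CORPUS for w in set(tokens(doc)))

def vector(text):
    tf = Counter(tokens(text))
    # unseen words get a small pseudo-document-frequency of 0.5
    return {w: c * math.log(len(CORPUS) / DF.get(w, 0.5)) for w, c in tf.items()}

def cosine(u, v):
    dot = sum(u[w] * v[w] for w in u if w in v)
    norm = math.sqrt(sum(x * x for x in u.values()) * sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

INDEX = [(doc, vector(doc)) for doc in CORPUS]

# Step 3: match the input sentence to the most relevant stored sentence.
def answer(query):
    q = vector(query)
    return max(INDEX, key=lambda dv: cosine(q, dv[1]))[0]

print(answer("What is the capital of France?"))  # -> The capital of France is Paris.
```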

SLIDE 21

Procedural systems

Winograd (1972): fully grounded system that parses the user’s input, maps it to logical form, interprets that logical form in its world, and then tries to take appropriate action.

http://hci.stanford.edu/winograd/shrdlu/

One project did succeed. Terry Winograd’s program SHRDLU could use English intelligently, but there was a catch: the only subject you could discuss was a micro-world of simulated blocks. Compare with Watson (Ferrucci et al. 2010):

http://www-03.ibm.com/innovation/us/watson/

SLIDE 22

Logical systems

Focus on mapping text to logical forms and interpreting them in a (usually hand-built) database.

  • Chat-80 (Pereira and Warren 1980; Warren and Pereira 1982): Prolog program with a database that allowed users to pose queries about world geography. Chat-80 is still distributed, and there is a Python/NLTK module for working with it:
  • http://www.cis.upenn.edu/~pereira/oldies.html
  • http://www.nltk.org/
  • SRI’s Core Language Engine (Alshawi et al. 1988): Prolog system for mapping texts to a predicate calculus.
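For continuity with the other examples, here is a Chat-80-flavored sketch in Python rather than Prolog: a question is mapped to a logical form, which is then interpreted against a tiny hand-built database. The two question templates and the facts are invented; Chat-80’s actual grammar and geography database are far richer.

```python
import re

# Tiny hand-built database, invented for illustration.
FACTS = {
    ("borders", "france", "spain"),
    ("borders", "france", "germany"),
    ("borders", "spain", "portugal"),
    ("capital_of", "paris", "france"),
}

def parse(question):
    """Map text to a logical form: (predicate, arg, ...), with "X" as
    the queried variable."""
    q = question.lower().rstrip("?")
    m = re.fullmatch(r"which countries border (\w+)", q)
    if m:
        return ("borders", m.group(1), "X")
    m = re.fullmatch(r"what is the capital of (\w+)", q)
    if m:
        return ("capital_of", "X", m.group(1))
    raise ValueError("no parse")

def interpret(lf):
    """Collect every binding of X that makes the logical form true."""
    answers = []
    for fact in FACTS:
        if fact[0] == lf[0] and all(a == "X" or a == b
                                    for a, b in zip(lf[1:], fact[1:])):
            answers.extend(b for a, b in zip(lf[1:], fact[1:]) if a == "X")
    return sorted(answers)

print(interpret(parse("Which countries border France?")))  # -> ['germany', 'spain']
```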

SLIDE 23

Discourse models

Beyond sentence meaning

In the late 1970s, researchers began focusing on discourse-level phenomena like intonational meaning, anaphora and coreference, conversational implicature, presuppositions, connotations, and other phenomena that typically involve going beyond the encoded content to resolve underspecification and extract implicit meaning (Grosz 1977; Hobbs 1979; Sidner 1979; Webber 1979).

Scripts and comprehensive understanding

The Story Understanding Paradigm of Roger Schank and his students and fellow-travelers (Yale, 1969-1988) sought to achieve comprehensive understanding of unconstrained texts. The over-arching idea was to reduce all communication to a set of scripts (Schank 1969, 1977; Schank and Abelson 1977; Lehnert 1977; DeJong 1982).

SLIDE 24

The statistical revolution

  • The statistical revolution of the mid-to-late 1990s profoundly affected all aspects of NLP.
  • It was initially detrimental to NLU; researchers turned to more constrained problems to explore the new approaches. (But see Ng and Zelle 1997.)
  • At present, NLU is enjoying a renaissance. There is a feeling that NLP has sufficient mastery of the statistical techniques and the initial set of problems that it is appropriate to build on them to achieve the goals of NLU.
  • As a result, all of the above approaches are being employed, often with modern new mixes: logical approaches (MacCartney 2009), including probabilistic logics (Richardson and Domingos 2006; McCallum et al. 2008), rule-based and script-based interpretation (Lee et al. 2011; Chambers 2011), typically with the backing of very large semi-structured or unstructured databases (like the Web).

SLIDE 25

Connections with nearby fields

  • NLU is a large part of NLP. The boundaries are unclear; even apparently superficial tasks might be influenced by semantic and pragmatic inference.
  • (Relationship of our class to CS224N: similar structure and style, significant overlap in content, but substantial shift in emphasis. This one omits many NLP topics, expands on others, and includes some topics that don’t get covered at all over there. Also somewhat less math-heavy.)
  • NLU has relatively little overlap with formal semantics, though this might change as NLU re-embraces logical approaches.
  • NLU arguably contains computational semantics; whereas the goal of computational semantics is an understanding of sentence meaning and inference, NLU seeks true utterance understanding.

SLIDE 26

And now over to Bill . . .

SLIDE 27

NLU today & tomorrow

  • It’s an exciting time to be doing NLU!
  • In academia, a resurgence of interest in NLU (after a long winter)
  • Widespread perception that NLU is poised to break through & have huge impact
  • Explosion in businesses, products, and services that do NLU (or promise to)
  • White-hot job market for Stanford grads with mad NLU skillz!
SLIDE 28

Siri

  • The voice-driven personal assistant on your iPhone 4S
  • Perhaps the most visible & exciting application of NLU today
  • If you believe the marketing hype & the breathless press:
  • Siri represents a major breakthrough in artificial intelligence (AI)
  • Sets the standard for the next generation of UI design
SLIDE 29

How does Siri work?

[Pipeline diagram on the slide:]
  • Automatic speech recognition (ASR)
  • NLP annotation (POS tagging, NP chunking, …)
  • NLU: pattern-action mappings
  • Response formulation
  • Text-to-speech (TTS)
  • Service manager: internal & external APIs (email, SMS, maps, weather, stocks, etc.)

SLIDE 30

What Siri can’t do [yet]

Where is A Bug’s Life playing in Mountain View?
A Bug’s Life is playing at the Century 16 Theater.
When is it playing there?
It’s playing at 2pm, 5pm, and 8pm.
OK. I’d like 1 adult and 2 children for the first show. How much would that cost?

Need domain knowledge, discourse knowledge, world knowledge

SLIDE 31

QA systems & answer engines

  • Question-answering (QA) systems: IBM’s Watson
  • Answer engines: Wolfram Alpha
  • Google and Bing are slipping natural-language QA functionality into search unobtrusively

SLIDE 32

Sentiment analysis

  • Analyze user-generated text content
  • Product reviews: Amazon, Yelp, TripAdvisor, …
  • Social media: Twitter, Facebook, Google+, blogs, …
  • Extract coarse- and fine-grained sentiment
  • Coarse-grained: attitudes toward products, brands, personalities, …
  • Fine-grained: salient features of products & attitudes toward them
  • Zillions of startups:
  • For marketers: what do people like/dislike about their products?
  • For consumers: which products do other people prefer?
  • For traders: market sentiment from Twitter feeds [Bollen et al. 2011]
SLIDE 33

W$#9?"#+-'#%"-/)*'

  • ! 19.#'V)")2/"&'#%"-/)*'/.')9@'-9)+'E4'"$#9?"#+-'.4.#+?.'

HnM/*MP8%+o$+)24'#%"-/)*pG'XiSJ'

  • ! 19.#'XiS'.#%"#+*/+.'%+&4'/)'L"%#'9)'"$#9?"#+-'")"&4./.'98'

$).#%$2#$%+-'-"#"'8++-.'q'/^+^G')"#$%"&'&")*$"*+'#+C#'

  • ! h9$'2")'?"N+':".#'L%9V#.'/8'49$'2")'-/.29:+%'")-'"2#'9)'

?"%N+#P?9:/)*')+@.'"'8+@'?/&&/.+29)-.'8".#+%'#M")'49$%'%/:"&.'

  • ! O..+)D"&&4G'#M+4B%+'$./)*'!(,'#9'L%+-/2#'#M+'?"%N+#.'
  • ! 3")'*9'.L+2#"2$&"%&4'@%9)*U'6<<d'/)2/-+)#'@/#M',W('.#92N'
SLIDE 34

Business intelligence

  • Extracting actionable intelligence from millions of unstructured documents
  • Cataphora, H5: legal discovery, compliance, and information management
  • Palantir, Quid: intelligence for government & business
  • Autonomy: “Meaning Based Computing”, acquired by HP in October for $10B

slide-35
SLIDE 35

User modeling & personalization

  • The better an internet service understands you and what you care about, the more value it can deliver to you
  • One of the best ways to figure out what you care about is to understand your natural language
  • Example:
  • You posted something on your social network about the World Series
  • You recently read a book by David Brooks
  • We recommend an op-ed piece by David Brooks about the business of baseball
  • Google, Facebook, Bing, Apple, many others
slide-36
SLIDE 36

Why now?

Why is NLU experiencing a resurgence now? Three key reasons:

  • More data
  • More compute power
  • Better ideas

… and, synergies among these factors.

slide-37
SLIDE 37

More data

  • The explosion of machine-readable natural language text
  • Exabytes (10^18 bytes) of text, doubling every year or two
  • 100x growth of the web in the last ten years
  • Web pages, emails, IMs, SMSs, tweets, docs, PDFs, …
  • Opportunity – and necessity – to extract meaning
slide-38
SLIDE 38

More compute power

  • Surprisingly, it's not so much about faster chips
  • Rather, it's about massive server farms
  • Above all, the software and services to exploit them (e.g., Amazon EC2, Borg, MapReduce, Hadoop, Flume, Pig, Hive, Mahout, …)
  • Also: more memory, more disk, & more network bandwidth

slide-39
SLIDE 39

Better ideas

  • Recent work in NLU builds on advances in statistical NLP
  • Often uses proven NLP components as building blocks
  • Leverages ideas & techniques developed in statistical NLP
  • More sophisticated statistical modeling and machine learning algorithms
  • Focus shifting from supervised to unsupervised learning
slide-40
SLIDE 40

Three levels of meaning

  • One way to carve up NLU is by levels of meaning:
  • Meanings of words: lexical semantics
  • Meanings of sentences: compositional / formal semantics
  • Meanings of paragraphs, dialogs, discourses
  • That's how we've structured this course, too
slide-41
SLIDE 41

Lexical semantics

Why does lexical semantics matter? Because the meanings of sentences are built up from the meanings of words!

There's a large iguana on my sofa.
There's a big lizard on my couch.

In order to determine whether this is a valid inference, you need to know the relation between the meanings of large and big, iguana and lizard, sofa and couch.

slide-42
SLIDE 42

Questions for lexical semantics

What questions do we want to be able to answer?

  • How many distinct senses does each word have?
  • What's the meaning of each word sense? (And, how should we represent these meanings?)
  • How are word senses related to each other? (Synonymy, antonymy, hypernymy, etc.)
  • How can we identify the sense of a word in context?

Some of these questions can be answered with the help of manually constructed resources.

slide-43
SLIDE 43

Dictionaries

Dictionaries enumerate the senses of a word, and indicate what each sense means using glosses and (sometimes) examples.

slide-44
SLIDE 44

Thesauri

Thesauri don't directly enumerate senses, but they do provide meanings in terms of synonyms and (sometimes) antonyms.

slide-45
SLIDE 45

WordNet

WordNet enumerates senses and indicates their meanings using glosses and examples, and several types of relations between senses.

slide-46
SLIDE 46

Problems

Manually constructed resources can be very useful. But they:

  • Require lots of time, money, & expertise to build
  • Leave you pretty much stuck with what's already available
  • Don't cover neologisms: retweet, iPad, blog, ...
  • Don't include jargon: poset, LIBOR, hypervisor, …
  • Don't cover all languages
slide-47
SLIDE 47

Can we learn lexical semantics?

Key idea of the statistical revolution:

  • Don't construct knowledge resources manually – it's too laborious and expensive.
  • Instead, automatically induce knowledge by discovering statistical regularities in large corpora.
slide-48
SLIDE 48

Learning to distinguish senses

Lots of work on unsupervised clustering of word senses [e.g., Schütze 1992, 1998]

Can use features of context (defined in various ways):

The bass guitar is a stringed instrument played primarily …
… the string and wind bass instruments are usually …
Smallmouth bass anglers often use smaller versions …
… coverage of professional tournament bass fishing …

Can also rely on parallel translations:

The bass guitar is a stringed instrument played primarily …
El bajo es un instrumento de cuerdas que se toca principalmente …
… coverage of professional tournament bass fishing …
... la cobertura de los profesionales torneo de pesca de la lubina ...
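The context-clustering idea can be sketched in a few lines: represent each occurrence of "bass" by its bag of context words, then greedily group occurrences whose contexts overlap. The contexts and the 0.1 threshold below are invented for illustration; real systems (like the Schütze work cited above) use richer features and proper clustering algorithms.

```python
# Toy unsupervised sense clustering for "bass" via context-word overlap.
contexts = [
    "the bass guitar is a stringed instrument",
    "a bass instrument in the wind family",
    "smallmouth bass fishing with smaller lures",
    "tournament bass fishing coverage",
]

def bag(s):
    """Bag-of-words context features, excluding the target word itself."""
    return set(s.split()) - {"bass"}

def jaccard(a, b):
    return len(a & b) / len(a | b)

# Greedy single-pass clustering: join a cluster if similar to any member.
clusters = []
for ctx in map(bag, contexts):
    for cluster in clusters:
        if any(jaccard(ctx, member) > 0.1 for member in cluster):
            cluster.append(ctx)
            break
    else:
        clusters.append([ctx])

print(len(clusters))   # 2 induced "senses": music vs. fishing
```

Note that stop words like "the" inflate similarity here; real feature representations weight or filter them.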

slide-49
SLIDE 49

Learning what senses mean

Well, no one has devised a way to construct conventional dictionary definitions automatically (AFAIK). But there are other ways to represent meaning …

slide-50
SLIDE 50

Inducing word similarity

One way to represent the meaning of a word is by listing words with similar meanings – like a thesaurus! And word similarity can be induced from context.

"drank a bottle of ___"

Hi I'm Noreen and I once drank a bottle of wine in under 4 minutes SHE DRANK A BOTTLE OF JACK?! harleyabshireblondie. he drank a bottle of beer like any man I topped off some salted peanuts and drank a bottle of water The partygoers drank a bottle of champagne. MR WEST IS DEAD AS A HAMMER HE DRANK A BOTTLE OF ROGAINE aug 29th 2010 i drank a bottle of Odwalla Pomegranate Juice and got ... The 3 of us drank a bottle of Naga Viper Sauce ... We drank a bottle of Lemelson pinot noir from Oregon ($52) she drank a bottle of bleach nearly killing herself, "to clean herself from her wedding"
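Words that fill the same slot in "drank a bottle of ___" share a distributional context and (often) related meanings. A toy extractor over a miniature corpus:

```python
import re

# Collect the fillers of one distributional slot.
corpus = """
I once drank a bottle of wine in under 4 minutes .
He drank a bottle of beer like any man .
I drank a bottle of water after the run .
The partygoers drank a bottle of champagne .
"""

fillers = re.findall(r"drank a bottle of (\w+)", corpus.lower())
print(fillers)   # ['wine', 'beer', 'water', 'champagne']
```

Aggregated over many patterns and a large corpus, such co-occurrence profiles are what drive induced similarity (and, as the bleach and Rogaine examples show, why the induced neighbors can surprise you).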

slide-51
SLIDE 51

Vector-space models of meaning

Another way we can represent word meanings is in terms of vector spaces of (relatively) low dimension. Many possible approaches here. A super-simple one would look at word frequencies in different news topics. (Real VSMs are considerably more sophisticated.)

[Figure: toy frequency vectors over the topics politics, business, sports, and science for the words treaty, contract, and star]
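The "super-simple" VSM above can be made concrete: hypothetical counts of how often each word appears in stories on four topics (politics, business, sports, science), compared by cosine similarity. The numbers are invented for illustration.

```python
import math

# Toy topic-count vectors: [politics, business, sports, science]
vectors = {
    "treaty":   [80, 15, 2, 3],
    "contract": [20, 70, 25, 5],
    "star":     [5, 10, 60, 55],
}

def norm(u):
    return math.sqrt(sum(a * a for a in u))

def cosine(u, v):
    """High when two words have similar topic profiles."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (norm(u) * norm(v))

print(cosine(vectors["treaty"], vectors["contract"]))  # ~0.43
print(cosine(vectors["treaty"], vectors["star"]))      # ~0.12
```

As hoped, treaty is closer to contract than to star, because both occur mostly in politics and business text.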

slide-52
SLIDE 52

Inducing relations between senses

We've already looked at ways to induce synonymy. What about other lexical semantic relations?

Hearst [1992, 1998] proposed a clever way to infer hypernymy from large corpora using patterns.

"___ such as ___"

Legal scholars such as Erwin Chemerinsky, dean of the UC Irvine School of Law, … … in the form of generalized online communities such as Theglobe.com … Many animals such as tubeworms, vent mussels, vent crabs, and vent shrimps, … Remove all MIME encodings, such as content-transfer encoding, … is abundant in green, leafy vegetables such as collard greens, spinach, and kale. Should performance enhancing drugs (such as steroids) be accepted in sports? Some states, such as Colorado, don't require much farming to get a tax-saving …
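A sketch of the Hearst-pattern idea: "Y such as X" suggests that Y is a hypernym of X. Real implementations use parses and a whole family of patterns; the single regex below is only illustrative.

```python
import re

text = ("Many animals such as tubeworms live near vents. "
        "Leafy vegetables such as spinach and kale are healthy.")

# "<hypernym> such as <hyponym> (and <hyponym>)?"
pairs = []
for m in re.finditer(r"(\w+) such as (\w+)(?:,? and (\w+))?", text):
    hypernym = m.group(1)
    for hyponym in m.groups()[1:]:
        if hyponym:
            pairs.append((hyponym, hypernym))

print(pairs)
# [('tubeworms', 'animals'), ('spinach', 'vegetables'), ('kale', 'vegetables')]
```

The examples above also show the failure modes: "communities such as Theglobe.com" and "encodings, such as content-transfer encoding" require knowing where the hypernym noun phrase actually starts.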

slide-53
SLIDE 53

Word sense disambiguation

  • Lexical ambiguity is pervasive in natural language text
  • Most of the 1000 most common words are multiply ambiguous
  • In WordNet, set has 13 senses as noun, 25 as verb, 7 as adjective!
  • Creates grievous challenges in MT, IR, QA, text cat, …
  • WSD is the task of identifying the sense of a word in context
  • Depends on having a predefined sense inventory for each word
  • Most work in WSD uses the supervised learning paradigm
  • Get annotated training data; define feature representation; apply machine learning algorithms
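The supervised recipe in the last bullet, in miniature: sense-annotated training contexts for "bass", a bag-of-words feature representation, and a trivial overlap count standing in for a real learning algorithm. The training examples are invented for this sketch.

```python
# Sense-annotated training data (toy).
train = [
    ("play the bass guitar with the band", "music"),
    ("the bass line of the song", "music"),
    ("caught a huge bass while fishing", "fish"),
    ("bass fishing on the lake", "fish"),
]

def features(text):
    """Bag-of-words features, excluding the target word."""
    return set(text.split()) - {"bass"}

def classify(text):
    """Score each sense by total feature overlap with its training contexts."""
    scores = {"music": 0, "fish": 0}
    for ctx, sense in train:
        scores[sense] += len(features(text) & features(ctx))
    return max(scores, key=scores.get)

print(classify("a bass solo during the song"))   # music
print(classify("caught a bass on the lake"))     # fish
```

A real WSD system would swap the overlap score for a trained classifier (naive Bayes, SVMs, etc.) over much richer features.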

slide-54
SLIDE 54

Between words and sentences

Next, we'll look at two topics which lie somewhere "above" lexical semantics and "below" sentential semantics:

  • Relation extraction
  • Semantic role labeling

In these tasks, we're extracting meaning from sentence fragments. We're looking at more than individual words, and things like word order and syntax begin to matter, but it's less than full semantic interpretation of complete sentences.

slide-55
SLIDE 55

Relation extraction

In relation extraction, we aim to derive structured information (esp. instances of binary relations) from unstructured text:

Born in 1955, Bill Gates is the co-founder, chairman, and former CEO of Microsoft, the world’s largest software company, with revenues of $70 billion per year.

yearOfBirth(Bill Gates, 1955)
founder(Bill Gates, Microsoft)
chairman(Bill Gates, Microsoft)
revenues(Microsoft, $70 billion)

Can be used to populate databases, to answer questions, …
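A pattern-based sketch of the mapping above: one textual pattern yields one relation instance. Real extractors use many patterns, parses, and statistical models; the regex here is purely illustrative.

```python
import re

def extract_birth(text):
    """Map 'Born in YYYY, <Person> is ...' to a yearOfBirth triple."""
    m = re.search(r"Born in (\d{4}), ([\w ]+?) is", text)
    return ("yearOfBirth", m.group(2), m.group(1)) if m else None

text = ("Born in 1955, Bill Gates is the co-founder, chairman, and "
        "former CEO of Microsoft.")
print(extract_birth(text))   # ('yearOfBirth', 'Bill Gates', '1955')
```

Extracting the founder, chairman, and revenues relations from the same sentence is where naive patterns break down and learned extractors earn their keep.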

slide-56
SLIDE 56

Semantic role labeling

John broke the window. John broke the window with a rock. The rock broke the window. The window broke. The window was broken by John.

breaking:  AGENT: John,  PATIENT: window,  INSTRUMENT: rock

Semantic roles can act as a shallow meaning representation that can let us make simple inferences that aren't possible from the pure surface string of words, or even a syntactic parse tree.

The same event can be described in many different ways, and syntactic roles may not correspond to semantic roles.
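The target output can be illustrated with a toy mapper: two surface templates for "break" (active and passive) mapped to one role frame. Real SRL systems use parses and trained classifiers, not regexes; this only shows why the frame is useful.

```python
import re

def roles(sentence):
    """Toy role labeler for a couple of 'break' templates."""
    s = sentence.rstrip(".")
    m = re.match(r"(\w+) broke the (\w+)(?: with a (\w+))?$", s)
    if m:
        frame = {"AGENT": m.group(1), "PATIENT": m.group(2)}
        if m.group(3):
            frame["INSTRUMENT"] = m.group(3)
        return frame
    m = re.match(r"The (\w+) was broken by (\w+)$", s)
    if m:
        return {"AGENT": m.group(2), "PATIENT": m.group(1)}
    return {}

print(roles("John broke the window with a rock."))
print(roles("The window was broken by John."))
```

The active and passive sentences yield the same frame, which is exactly what makes the representation useful for inference. "The rock broke the window" is the hard case: its syntactic subject is an INSTRUMENT, not an AGENT, which is why template matching is not enough.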

slide-57
SLIDE 57

Sentential semantics

Further up the value chain, we'd like to do full semantic interpretation of complete sentences. This goal was a major focus of early work in NLU.

STUDENT [Bobrow 1964]: If the number of customers Tom gets is twice the square of …
SHRDLU [Winograd 1971]: Move the red block on top of the smaller green one.
CHAT-80 [Warren & Pereira 1982]: What are the capitals of the countries bordering the Baltic?

But these systems were rule-driven and closed-domain.

slide-58
SLIDE 58

Compositional semantics

Sentential semantics is often known as compositional semantics, because of this big idea:

The Principle of Compositionality: The meaning of the whole is a function of the meanings of the parts and the way they are combined.

(Widely attributed to Frege, though anticipated by Yāska and Plato.)

slide-59
SLIDE 59

Compositionality in arithmetic

(4 × 5 − 2 × 4) / 3

[Tree diagram: 4 × 5 evaluates to 20, 2 × 4 to 8, their difference to 12, and 12 / 3 to 4]
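The arithmetic tree above can be evaluated in code: the value of each expression is computed from the values of its parts and the operator that combines them, which is exactly the principle of compositionality.

```python
def evaluate(tree):
    """Compositionally evaluate a binary expression tree."""
    if isinstance(tree, (int, float)):
        return tree                      # leaves denote themselves
    op, left, right = tree
    lv, rv = evaluate(left), evaluate(right)
    return {"+": lv + rv, "-": lv - rv,
            "*": lv * rv, "/": lv / rv}[op]

# (4 * 5 - 2 * 4) / 3
expr = ("/", ("-", ("*", 4, 5), ("*", 2, 4)), 3)
print(evaluate(expr))   # 4.0
```

The hard part for language, taken up on the next slides, is that we first have to decide what the "values" of words and phrases are.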

slide-60
SLIDE 60

Compositionality in language

What are the capitals of the countries bordering the Baltic?

[Tree diagram: the same sentence, with the meaning of each word and constituent still marked ???]

slide-61
SLIDE 61

Formal semantics

But how do we represent meanings during composition? Indeed, when we do full semantic interpretation of a sentence, what kind of final output are we aiming at?

One answer: expressions of formal logic. Or more generally: expressions of the lambda calculus. Hence, compositional semantics ⇒ formal semantics.

What are the capitals of the countries bordering the Baltic?

λx ∃y capital(x, y) ∧ country(y) ∧ border(y, Baltic)

slide-62
SLIDE 62

Montague semantics

What are the capitals of the countries bordering the Baltic?

Baltic + λx λy border(y, x) ⇒ λy border(y, Baltic)
λy country(y) + λy border(y, Baltic) ⇒ λy country(y) ∧ border(y, Baltic)
λx λy capital(x, y) + λy country(y) ∧ border(y, Baltic) ⇒ λx ∃y capital(x, y) ∧ country(y) ∧ border(y, Baltic)
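The composition of these lambda terms can be mimicked directly with Python lambdas over a tiny geography database. The entity set and facts below are a hand-picked toy (one country, one capital); only the composition pattern matters.

```python
ENTITIES = {"Baltic", "Latvia", "Riga"}

# Toy denotations for the content words.
country = lambda y: y == "Latvia"
border  = lambda y, x: (y, x) == ("Latvia", "Baltic")
capital = lambda x, y: (x, y) == ("Riga", "Latvia")

# λy . border(y, Baltic)  -- apply border to the entity Baltic
border_baltic = lambda y: border(y, "Baltic")
# λy . country(y) ∧ border(y, Baltic)
country_on_baltic = lambda y: country(y) and border_baltic(y)
# λx . ∃y capital(x, y) ∧ country(y) ∧ border(y, Baltic)
query = lambda x: any(capital(x, y) and country_on_baltic(y) for y in ENTITIES)

print(sorted(x for x in ENTITIES if query(x)))   # ['Riga']
```

Each intermediate lambda corresponds to one node of the Montague derivation; evaluating the final one against the database answers the question.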

slide-63
SLIDE 63

Can we use MT techniques?

Maybe we can learn how to do full semantic interpretation by borrowing the highly successful strategies of statistical machine translation. But where will we get the parallel corpora (training data)?

Texas borders Kansas.

border(Texas, Kansas)

What is the capital of Utah?

λx capital(x, Utah)

What states border Maine?

λx state(x) ∧ border(x, Maine)

Maine does not border Utah.

¬border(Maine, Utah)

Every state has a capital.

∀x state(x) → ∃y capital(y, x)

Austin is the capital of Texas.

capital(Austin, Texas)

slide-64
SLIDE 64

Natural language inference

  • One test of understanding is the ability to draw inferences
  • The Recognizing Textual Entailment (RTE) challenges
  • A broad variety of approaches have been explored
  • Full semantic interpretation ⇒ theorem provers
  • Measuring lexical similarity between bags of words
  • Intermediate approaches, e.g. shallow semantic interpretation
  • A closely related task: recognizing paraphrases

X causes Y
X can lead to Y
Y is triggered by X
Y is attributed to X
…

P: Twenty-five of the dead were members of the law enforcement agencies and the rest of the 67 were civilians.
H: 25 of the dead were civilians.
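The bag-of-words strategy listed above makes a revealing baseline: predict entailment when a large fraction of hypothesis words occur in the premise. The 0.75 threshold is arbitrary, and note that the baseline happily (and wrongly) accepts this very P/H pair, since word overlap cannot tell which 25 people were civilians.

```python
def overlap_entails(premise, hypothesis, threshold=0.75):
    """Word-overlap entailment baseline: fraction of H's words found in P."""
    p = set(premise.lower().split())
    h = set(hypothesis.lower().split())
    return len(h & p) / len(h) >= threshold

P = ("Twenty-five of the dead were members of the law enforcement "
     "agencies and the rest of the 67 were civilians.")
H = "25 of the dead were civilians."
print(overlap_entails(P, H))   # True -- though the entailment does not hold
```

Pairs like this one, where high lexical overlap coexists with non-entailment, are exactly what the RTE challenges use to separate shallow from deeper approaches.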
slide-65
SLIDE 65

Goals of NLU The history of NLU Connections with nearby fields NLU present & future Subfields Discourse and context

Discourse and context

In the discourse and context unit, we will go (way) beyond the encoded semantic content to try to capture something more like utterance meaning. The specific topics we’ve chosen:

  • Hedges: the methods speakers employ for indicating their

confidence and commitment

  • Sentiment analysis: how information about attitudes,

perspectives, and emotions is conveyed.

  • Textual coherence: the often invisible glue that binds

sentences together

  • Discourse and dialogue: interactional aspects of language

These topics do not exhaust the space, but we think they provide a good sample of the problems and approaches, which should make it possible for you to pursue other areas (coreference, presupposition, connotation, speech acts, . . . ) if you wish.

20 / 25

slide-66
SLIDE 66


References I

Allen, James F. 1987. Natural Language Understanding. Menlo Park, CA: The Benjamin/Cummings Publishing Company, Inc.
Alshawi, H.; Carter, D. M.; van Eijck, J.; Moore, R. C.; Moran, D. B.; Pereira, F. C. N.; Pulman, S. G.; and Smith, A. G. 1988. Interim report on the SRI core language engine. Technical Report CCSRC-5, SRI International, Cambridge Research Centre, Cambridge, England.
Bobrow, Daniel G. 1964. Natural Language Input for a Computer Problem Solving System. Ph.D. thesis, Rensselaer Polytechnic Institute.
Chalmers, David. 1997. The Conscious Mind: In Search of a Fundamental Theory. Oxford University Press.
Chambers, Nathanael. 2011. Inducing Event Schemas and their Participants from Unlabeled Text. Ph.D. thesis, Stanford.
Churchland, Paul M. and Patricia Smith Churchland. 1990. Could a machine think? Scientific American 32–37.
Cole, David. 2009. The Chinese room argument. In Edward N. Zalta, ed., Stanford Encyclopedia of Philosophy (Winter 2003 Edition). CSLI. URL http://plato.stanford.edu/entries/chinese-room/.

21 / 25

slide-67
SLIDE 67


References II

DeJong, Gerald F. 1982. An overview of the FRUMP system. In Wendy G. Lehnert and Martin H. Ringle, eds., Strategies for Natural Language Processing, 149–176. Hillsdale, NJ: Erlbaum.
Ferrucci, David; Eric Brown; Jennifer Chu-Carroll; James Fan; David Gondek; Aditya A. Kalyanpur; Adam Lally; J. William Murdock; Eric Nyberg; John Prager; Nico Schlaefer; and Chris Welty. 2010. Building Watson: An overview of the DeepQA project. AI Magazine 31(3):59–79.
Grosz, Barbara J. 1977. The Representation and Use of Focus in Dialogue Understanding. Ph.D. thesis, UC Berkeley.
Hobbs, Jerry R. 1979. Coherence and coreference. Cognitive Science 3(1):67–90.
Lee, Heeyoung; Yves Peirsman; Angel Chang; Nathanael Chambers; Mihai Surdeanu; and Daniel Jurafsky. 2011. Stanford's multi-pass sieve coreference resolution system at the CoNLL-2011 shared task. In Proceedings of the CoNLL-2011 Shared Task.
Lehnert, Wendy. 1977. Human and computational question answering. Cognitive Science 1(1):47–73.
MacCartney, Bill. 2009. Natural Language Inference. Ph.D. thesis, Stanford University.

22 / 25

slide-68
SLIDE 68


References III

McCallum, Andrew; Khashayar Rohanimanesh; Michael Wick; Karl Schultz; and Sameer Singh. 2008. FACTORIE: Efficient probabilistic programming for relational factor graphs via imperative declarations of structure, inference and learning. In NIPS Workshop on Probabilistic Programming.
Ng, Hwee Tou and John Zelle. 1997. Corpus-based approaches to semantic interpretation in natural language processing. AI Magazine 18(4):45–64.
Penrose, Roger. 1990. The Emperor's New Mind: Concerning Computers, Minds, and The Laws of Physics. Oxford University Press.
Pereira, Fernando C. N. and David H. D. Warren. 1980. Definite clause grammars for language analysis – a survey of the formalism and a comparison with augmented transition networks. Artificial Intelligence 13(3):231–278.
Richardson, Matthew and Pedro Domingos. 2006. Markov logic networks. Machine Learning 62(1–2):107–136.
Schank, Roger. 1969. A conceptual dependency representation for a computer-oriented semantics. Technical Report 83, Stanford Computer Science.
Schank, Roger. 1977. SAM – a story understander. Technical Report 43, Yale University Computer Science, New Haven, CT.

23 / 25

slide-69
SLIDE 69


References IV

Schank, Roger and Robert P. Abelson. 1977. Scripts, Plans, Goals and Understanding: An Inquiry into Human Knowledge Structures. Hillsdale, NJ: Erlbaum.
Searle, John R. 1980. Minds, brains and programs. Behavioral and Brain Sciences 3(3):417–457.
Searle, John R. 1990. Is the brain's mind a computer program? Scientific American 26–31.
Shieber, Stuart. 1994. Lessons from a restricted Turing test. Communications of the ACM 37(6):70–78.
Sidner, Candace L. 1979. Towards a Computational Theory of Definite Anaphora. Ph.D. thesis, MIT.
Simmons, Robert F. 1970. Natural language question-answering systems: 1969. Communications of the ACM 13(1):15–30.
Turing, Alan M. 1950. Computing machinery and intelligence. Mind 59(236):433–460.
Warren, David H. D. and Fernando C. N. Pereira. 1982. An efficient easily adaptable system for interpreting natural language queries. American Journal of Computational Linguistics 8(3–4):110–122.

24 / 25

slide-70
SLIDE 70


References V

Webber, Bonnie Lynn. 1979. A Formal Approach to Discourse Anaphora. New York: Garland. Weizenbaum, Joseph. 1966. ELIZA — a computer program for the study of natural language communication between man and machine. Communications of the ACM 9(1):36–45. Winograd, Terry. 1972. Understanding natural language. Cognitive Psychology 3(1):1–191.

25 / 25