SLIDE 1

Computational Thinking www.ugrad.cs.ubc.ca/~cs100

Administrative Notes March 15, 2018

  • Do you want to present your project to the class? If so, sign up using the following link (also listed on the website)!
  • https://ubc.ca1.qualtrics.com/jfe/form/SV_d6aeH7wmATnJNIh
  • Midterm 2 grading status
  • Just about done. Midterms should be available via handback tonight or tomorrow morning.
  • Look out for an announcement on Canvas
SLIDE 2

While Watson won, it did make an embarrassing mistake. It clearly didn’t fully understand.

http://www.youtube.com/watch?v=7h4baBEi0iA

SLIDE 3

Clicker question: Is Watson intelligent by Strong AI criteria?

  • A. Yes
  • B. No

Reminder: Strong AI is epitomized by the Chinese Room (Section 6 of the reading) – the computer has to be able to THINK

SLIDE 4

Clicker question: Is Watson intelligent by Turing/weak AI criteria?

  • A. Yes
  • B. No

Reminder: Weak AI is epitomized by Turing’s approach – the computer just has to APPEAR intelligent, fooling a person for 5 minutes into thinking it’s human

SLIDE 5

What is Watson up to now? Fighting cancer…

“Using a Watson app developed with Baylor College of Medicine called KnIT (Knowledge Integration Toolkit) that reads and analyzes millions of scientific papers and suggests to researchers where to look and what to look for, a Baylor team has identified six new proteins to target for cancer research. How hard is that? Very. In the last 30 years, scientists have uncovered 28 protein targets, according to IBM. The Baylor team found half a dozen in a month.” http://time.com/3208716/ibm-watson-cancer/

SLIDE 6

What is Watson up to now? Cooking…

“Researchers at IBM have teamed up with the Institute of Culinary Education in New York. They've re-programmed Watson to serve as a sort of sous-chef that can spit out novel ingredient combinations and recipes on command. The IBM researchers call it "creative computing." Chefs can specify a key ingredient and a cuisine, and IBM's computer program will come up with millions of ideas.” http://www.npr.org/blogs/thesalt/2014/03/03/285326611/our-supercomputer-overlord-is-now-running-a-food-truck

SLIDE 7

What is Watson up to now? Debating…

https://www.youtube.com/watch?v=6fJOtAzICzw&t=45m26s

SLIDE 8

The person running the demo asks this question:

“Can a computer take raw information and digest and reason on that information, and understand the context?” Does Watson do that here?

  • A. Yes
  • B. No
SLIDE 9

Group Exercise

Develop a definition of computational intelligence that you're happy with. Consider the examples that we've looked at as well as other examples (e.g., Chrome, Siri) as a way to help make your definition robust.

  • A. My definition is weak AI (Turing's)
  • B. My definition is strong AI
  • C. My definition is neither.
SLIDE 10

By your definition, is Watson intelligent or not?

  • A. Yes
  • B. No
SLIDE 11

AI definition round-up

SLIDE 12

Okay, so that’s what AI is. But how did they do that?

  • There are LOTS of different parts involved
  • We’ll look at a few
  • Note that we’ll cover the general idea of how things work, but not the specific details
  • We’ll start by looking at how Watson “understands”

SLIDE 13

How did they do that? A final look behind the scenes

http://www.youtube.com/watch?v=lI-M7O_bRNg&t=3m20s

SLIDE 14

The overall Watson architecture

SLIDE 15

First part of how Watson works: Knowledge bases

  • The answer sources and evidence sources are stored in Watson’s system; the internet is not used directly
  • These local data stores are called knowledge bases; many applications use them.

SLIDE 16

How are knowledge bases organized?

  • Some knowledge bases are structured databases – the data is entered in a very specific format
  • Other knowledge bases are unstructured or semi-structured – the data is not as rigidly organized
  • E.g., Google stores entire webpages (unstructured). Wikipedia has some structure, but it’s not totally rigid (semi-structured)
  • An index (like in a book) helps find relevant information

But how does Watson know what information to find?
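To make the book-index analogy concrete, here is a minimal sketch of an inverted index in Python. This is a toy illustration only, not how Watson’s knowledge bases are actually implemented; the documents and function names are invented for this example:

```python
# Toy inverted index: maps each word to the set of documents containing it.
def build_index(docs):
    """docs: dict of doc_id -> text. Returns word -> set of doc_ids."""
    index = {}
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(doc_id)
    return index

def lookup(index, *words):
    """Return the ids of documents containing ALL of the given words."""
    sets = [index.get(w.lower(), set()) for w in words]
    return set.intersection(*sets) if sets else set()

docs = {
    "d1": "castling is a chess maneuver invented to speed up the game",
    "d2": "the knight is a chess piece",
    "d3": "crewel is embroidery done with yarn",
}
index = build_index(docs)
print(sorted(lookup(index, "chess")))              # ['d1', 'd2']
print(sorted(lookup(index, "chess", "maneuver")))  # ['d1']
```

Like the index in a book, the lookup is fast because the scanning work was done once, up front, when the index was built.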

SLIDE 17

How does Watson process language? Natural language processing

  • Natural Language Processing (NLP): automatic processing of human language, e.g., by computers
  • Examples:
  • Siri processes human language to respond appropriately to the command “set a 5 minute timer” – that uses NLP.
  • Your web browser just displays the information you request – that does not require NLP… but your search engine does

http://www.aaai.org/ojs/index.php/aimagazine/article/view/2303

SLIDE 18

Group exercise

NLP is needed for many different things that computers do these days. List applications that you have used that need NLP and what they used it for.

  • Google Translate
  • A computer trying to understand commands
  • E.g., Siri, Alexa, Google Home, etc.
  • Call systems
  • Grammar correctors
SLIDE 19

How does NLP work?

  • NLP is challenging!
  • NLP draws on many disciplines: linguistics, cognitive science, psychology, logic, computer science, philosophy, engineering, …

SLIDE 20

Typical NLP steps

  • 1. Recognize speech (Watson skipped this – it received ASCII versions of the questions)
  • 2. Syntax analysis, or parsing: inferring parts of speech and sentence structure, using a lexicon and grammar
  • 3. Semantic analysis: inferring meaning using syntax and semantic rules
  • 4. Pragmatics: inferring meaning from contextual information
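The four steps above can be sketched as a chain of functions. Everything here is a deliberately tiny, hypothetical illustration – the lexicon, the category table, and the pronoun heuristic are all invented for this sketch; real NLP systems implement each stage with large rule sets or statistical models:

```python
# Toy pipeline for steps 2-4 (step 1, speech recognition, is skipped here,
# just as Watson skipped it).
LEXICON = {"the": "article", "cat": "noun", "rat": "noun",
           "chased": "verb", "it": "pronoun"}
CATEGORIES = {"cat": "feline", "rat": "rodent"}  # toy semantic rules

def parse(text):
    """Step 2 (syntax): tag each word with its part of speech."""
    return [(w, LEXICON.get(w, "unknown")) for w in text.lower().split()]

def semantics(tagged):
    """Step 3 (semantics): attach coarse word categories (a cat is a feline)."""
    return [(w, pos, CATEGORIES.get(w)) for w, pos in tagged]

def pragmatics(analysed):
    """Step 4 (pragmatics): resolve a pronoun to the most recent noun."""
    last_noun, resolved = None, []
    for w, pos, cat in analysed:
        if pos == "pronoun" and last_noun:
            resolved.append((w, pos, f"refers to '{last_noun}'"))
        else:
            if pos == "noun":
                last_noun = w
            resolved.append((w, pos, cat))
    return resolved

print(pragmatics(semantics(parse("the cat chased it"))))
```

Running it on “the cat chased it” resolves “it” to the most recently mentioned noun, “cat” – a naive heuristic, but it shows why pragmatics needs context that syntax and semantics alone don’t provide.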

SLIDE 21

Parsing: identifying parts of speech and sentence structure using a lexicon and grammar

Input:

Lexicon:
  Word    Category
  cat     Noun
  cheese  Noun
  ate     Verb
  the     Article

Grammar:
  Sentence → NounPhrase, VerbPhrase
  VerbPhrase → Verb, NounPhrase
  NounPhrase → Article, Noun
  NounPhrase → Noun

Output: a parse tree

[Figure: parse tree for “the rat ate cheese” – the Sentence splits into a Noun phrase (“the” article, “rat” noun) and a Verb phrase (“ate” verb followed by a Noun phrase: “cheese” noun)]
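A grammar like the one on this slide can drive a simple top-down parser. The sketch below is an illustration, not Watson’s parser, and “rat” is added to the lexicon (an assumption) so the slide’s example sentence “the rat ate cheese” parses:

```python
# Toy top-down parser for the slide's grammar. "rat" is an assumed
# addition to the lexicon so the example sentence can be parsed.
LEXICON = {"the": "article", "cat": "noun", "rat": "noun",
           "cheese": "noun", "ate": "verb"}

GRAMMAR = {
    "Sentence":   [["NounPhrase", "VerbPhrase"]],
    "VerbPhrase": [["Verb", "NounPhrase"]],
    "NounPhrase": [["Article", "Noun"], ["Noun"]],
}

def parse(symbol, words, i):
    """Try to expand `symbol` starting at word position i.
    Returns (tree, next_position) on success, or None.
    (A simple greedy parser -- no full backtracking across choices.)"""
    if symbol in ("Article", "Noun", "Verb"):          # terminal category
        if i < len(words) and LEXICON.get(words[i]) == symbol.lower():
            return (symbol, words[i]), i + 1
        return None
    for rule in GRAMMAR[symbol]:                       # try each production
        children, pos = [], i
        for part in rule:
            result = parse(part, words, pos)
            if result is None:
                break
            subtree, pos = result
            children.append(subtree)
        else:                                          # every part matched
            return (symbol, children), pos
    return None

words = "the rat ate cheese".split()
tree, end = parse("Sentence", words, 0)
assert end == len(words)   # the whole sentence was consumed
print(tree)
```

The printed tree mirrors the figure: a Sentence made of a NounPhrase (“the rat”) and a VerbPhrase (“ate” plus the NounPhrase “cheese”).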

SLIDE 22

How parsing helped Watson

The structure of some clues and certain keywords tell Watson what the form of the answer will be – without considering semantics. Consider the following clue that Watson can answer:

Category: Oooh....Chess
Clue: Invented in the 1500s to speed up the game, this maneuver involves two pieces of the same color.
Answer: Castling

Parsing is key in Watson’s ability to answer this question

SLIDE 23

How parsing helped Watson

Parsing takes the sentence and shows how the words are assigned parts of speech and built up to form a sentence. Data mining showed that, given this structure, the noun between the two verb phrases was the type of thing the answer is. In this case, the answer was a “maneuver.”

SLIDE 24

Group exercise: create a parse tree

Using the lexicon and grammar below, parse the sentence: “the large cat chased the rat”. If you have a choice of rules, pick the one that works best. You don’t have to use all the rules.

Lexicon:
  Word    Category
  cat     Noun
  rat     Noun
  chased  Verb
  large   Adjective
  the     Article

Grammar:
  Sentence → NounPhrase, VerbPhrase
  VerbPhrase → Verb, NounPhrase
  NounPhrase → Article, Noun
  NounPhrase → Article, Adjective, Noun

SLIDE 25

Parsing: identifying parts of speech and sentence structure using a lexicon and grammar

Lexicon:
  Word    Category
  cat     Noun
  rat     Noun
  chased  Verb
  large   Adjective
  the     Article

Grammar:
  Sentence → NounPhrase, VerbPhrase
  VerbPhrase → Verb, NounPhrase
  NounPhrase → Article, Noun
  NounPhrase → Article, Adjective, Noun

SLIDE 26

Group exercise: parse “time flies like an arrow”

Write down your tree structure and your algorithm. Note: you don’t have to use all the rules!

Lexicon:
  Word   Category
  an     Article
  arrow  Noun
  flies  Noun
  flies  Verb
  time   Noun
  time   Verb
  like   Adverb
  like   Verb

Grammar:
  Sentence → NounPhrase, VerbPhrase
  NounPhrase → Article, Noun
  NounPhrase → Article, Adjective, Noun
  NounPhrase → Noun
  NounPhrase → Noun, Noun
  VerbPhrase → Verb, Adverb, NounPhrase
  VerbPhrase → Verb, NounPhrase

SLIDE 27

Group exercise: use your algorithm to parse “fruit flies like a banana”

Did the algorithm work?

  • A. Yes
  • B. No
  • C. Kind of… but “flies” wasn’t quite right.
SLIDE 28

The point: Parsing is hard!

  • Those were short, yet tricky examples – natural languages are ambiguous!
  • Imagine trying to write a parsing algorithm that works for a natural language… sentences of 20-30 words may have 10,000 possible syntactic structures!
  • Jeopardy makes the problem much easier, because the structure of Jeopardy clues is relatively simple

SLIDE 29

How good are computers at parsing?

  • A recent Google parser – Parsey McParseface – claims record-setting 94% accuracy on a newspaper dataset… but only 90% on web content
  • This sounds pretty good, but it means that, assuming accuracy is measured per word, you’d expect ~5 words to be parsed incorrectly on this slide.

https://research.googleblog.com/2016/05/announcing-syntaxnet-worlds-most.html

SLIDE 30

Final note on parsing: it’s the basis for computer programming

  • A computer has to "understand" programs in order to execute them
  • Programming languages are designed so that they can be parsed unambiguously
  • A grammar specifies all the possible programs that can be written in a language
  • Designing programming languages (and their grammars) is a fun and important part of computer science
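Python’s standard-library `ast` module shows this in action: the language’s grammar lets the parser turn source text into exactly one tree, with operator precedence resolved unambiguously (2 * 3 groups before the addition):

```python
import ast

# Parse a tiny program; the grammar guarantees exactly one structure.
tree = ast.parse("x = 1 + 2 * 3")
assign = tree.body[0]

# Precedence is encoded in the grammar: the tree groups 2 * 3 first,
# then adds 1, so there is no ambiguity about the meaning.
print(ast.dump(assign.value))
assert isinstance(assign.value.op, ast.Add)
assert isinstance(assign.value.right, ast.BinOp)  # the 2 * 3 subtree
```

Contrast this with “time flies like an arrow”: a natural-language grammar can produce several valid trees for one sentence, while a programming-language grammar is designed to produce exactly one.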

SLIDE 31

Recall: Typical NLP steps

  • 1. Recognize speech (Watson skipped this)
  • 2. Syntax analysis, or parsing: inferring parts of speech and sentence structure, using a lexicon and grammar
  • 3. Semantic analysis: inferring meaning using syntax and semantic rules
  • 4. Pragmatics: inferring meaning from contextual information

SLIDE 32

Semantic analysis: inferring meaning using syntax and semantic rules

Syntax analysis/parsing can sometimes help determine semantics, or meaning. Examples:

  • Knowing whether “flies” is a noun or a verb (the syntax) tells us something about its meaning (the semantics)
  • Semantic rules provide additional information:
  • Word categories: e.g., a cat is a feline
  • Relationships between words, e.g., a semantic rule for the word “like” can help us interpret “the boy likes the cat”
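A toy sketch of such semantic rules in Python – the is-a table and the “likes” rule are invented for illustration, not drawn from any real system:

```python
# Toy semantic rules: word categories (is-a chains) plus one relationship rule.
IS_A = {"cat": "feline", "rat": "rodent",
        "feline": "animal", "rodent": "animal"}

def categories(word):
    """Walk the is-a chain, e.g. cat -> feline -> animal."""
    chain = []
    while word in IS_A:
        word = IS_A[word]
        chain.append(word)
    return chain

def interpret_likes(subject, obj):
    """A semantic rule for 'like': X likes Y means X feels positively about Y."""
    return f"{subject} has a positive attitude toward {obj}"

print(categories("cat"))                     # ['feline', 'animal']
print(interpret_likes("the boy", "the cat"))
```

Rules like these are cheap because they need no context – which is exactly their limit, and why the next step, pragmatics, exists.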

SLIDE 33

Semantic analysis: inferring meaning using syntax and semantic rules

Syntax describes a sentence’s structure. Semantics adds (limited) meaning that can be figured out using simple rules that don’t require much context. Examples:

  • Word categories: e.g., a cat is a feline
  • “gave” is the past tense of “give”
SLIDE 34

Recall: Typical NLP steps

  • 1. Recognize speech (Watson skipped this)
  • 2. Syntax analysis, or parsing: inferring parts of speech and sentence structure, using a lexicon and grammar
  • 3. Semantic analysis: inferring meaning using syntax and semantic rules
  • 4. Pragmatics: inferring meaning from contextual information

SLIDE 35

Pragmatics: inferring meaning from contextual information

  • Most techniques for finding the semantic meaning of words look for clues in the surrounding text to disambiguate word meaning. For example, the real-estate meaning of “lot” might have the words “vacant” or “square foot” nearby.
  • Pragmatics also becomes important when sentences contain pronouns
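The “clues in the surrounding text” idea can be sketched as a simple overlap count, in the spirit of the classic Lesk algorithm. The senses and clue words below are invented for illustration:

```python
# Toy word-sense disambiguation: pick the sense whose clue words
# overlap most with the surrounding sentence.
SENSES = {
    "lot": {
        "real-estate parcel": {"vacant", "square", "foot", "acre", "corner"},
        "large quantity":     {"whole", "quite", "much", "many"},
    },
    "bat": {
        "baseball bat":  {"swing", "baseball", "hit", "plate"},
        "flying mammal": {"cave", "vampire", "nocturnal", "wings"},
    },
}

def disambiguate(word, sentence):
    """Return the sense of `word` whose clue words best match the context."""
    context = set(sentence.lower().replace(".", "").split())
    best, best_overlap = None, -1
    for sense, clues in SENSES[word].items():
        overlap = len(clues & context)
        if overlap > best_overlap:
            best, best_overlap = sense, overlap
    return best

print(disambiguate("lot", "The vacant lot is 5000 square foot"))
# -> 'real-estate parcel'
print(disambiguate("bat", "A vampire bat slept in the cave"))
# -> 'flying mammal'
```

Counting word overlap is crude – it fails as soon as the clue words are absent or shared between senses – which is why real systems use much richer context models.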

SLIDE 36

Pragmatics and Watson: an example Watson can solve

Category: Decorating
Clue: Though it sounds “harsh,” it’s just embroidery, often in a floral pattern, done with yarn on cotton cloth.
Answer: crewel

  • Syntax parses the sentence, determines the parts of speech and the parse tree, and shows that the answer is what “it’s” refers to
  • Semantics provides definitions of terms such as “harsh” and “crewel”
  • Pragmatics determines what “it’s” refers to and differentiates between the different definitions of “harsh” and “crewel”/“cruel”

SLIDE 37

Group Exercise

Write down a list of all the definitions of “bat” that you can think of, and words that might be nearby that would help you disambiguate the meaning.

Bat → swinging, outside, exercise, baseball
Bat → violence
Bat → flying mammal, sleeping, nesting, nursing, vampire
Bat → eyelashes

SLIDE 38

Clicker exercise

Would your definitions disambiguate between a baseball bat and a baseball player being AT bat?

  • A. Yes
  • B. No
  • C. My head hurts
SLIDE 39

Clicker question

Given what we’ve covered so far, would you say the computer really understands what the sentence “the cat chases the rat” means?

  • A. Yes
  • B. No
SLIDE 40

Summary: Watson, Jeopardy and NLP

  • The Jeopardy clues are, again, highly structured, which suits the NLP techniques we’ve talked about.
  • Jeopardy also tends to reuse similar questions and topics, so studying those narrows things down a lot.
  • The categories also help, but as shown in the video, sometimes not enough!

SLIDE 41

Sooo….

Given what we’ve discussed about natural language processing, do you think that Watson can understand general language?

  • A. Yes
  • B. No
SLIDE 42

Group discussion: Two somewhat contradictory takes on Watson. Who do you agree with most?

  • "The illusion is that the computer is doing the same thing that a very good Jeopardy player would do. It's not. It's doing something sort of different that looks the same on the surface. And every so often you see the cracks." – Ken Jennings, Jeopardy player
  • "When I do step back I think it is a very important technical achievement that will reveal both really important applications but it will also reveal a deeper understanding of our intelligence, and that is fascinating." – Dave Ferrucci

SLIDE 43

Two somewhat contradictory takes on Watson. Clicker question: Who do you agree with most?

  • "The illusion is that the computer is doing the same thing that a very good Jeopardy player would do. It's not." – Ken Jennings, Jeopardy player
  • "When I do step back I think it is a very important technical achievement that will reveal […] a deeper understanding of our intelligence" – Dave Ferrucci

  • A. Jennings
  • B. Ferrucci
  • C. Both
  • D. Neither
SLIDE 44

Let’s leave Watson behind

  • Next, let’s look at another application of NLP: language translation
  • To translate languages, the computer needs to be able to “learn” both languages and how to go between them.
  • We’ve covered how to learn a language. How do you go between them?