Pomona College

LCS 11: Cognitive Science

Chinese room argument

Jesse A. Harris February 25, 2013

Jesse A. Harris: LCS 11: Cognitive Science, Chinese room argument 1

Agenda

◮ Turing test review
◮ Searle’s Chinese room argument
◮ GQ 2.3 group discussion
◮ Selection of responses
◮ What makes brains special?
◮ TED talks for next class

⋆ Cynthia Breazeal: The rise of personal robots
⋆ David Hanson: Robots that “show emotion”


Turing test

The question “Can machines think?” is to be replaced with a less ambiguous formulation: could some conceivable digital computer perform well in the imitation game?


Behavior and functionalism

Functionalist slogan

What matters is the software, not the hardware; cognitive systems are multiply realizable.

Behavior

All that matters is that the machine emulates the appropriate behavior of a human.


Varieties of AI

Weak AI
The computer is merely a useful tool in the study of mind.

Strong AI
With the right set of programs, a computer understands and may have genuine cognitive states. More on this next class!


Chinese room

Searle’s slogan

The mind is not a computer program.

Chinese room

Searle uses a thought experiment designed to show that the thesis of Strong AI results in absurdity if taken seriously.



Chinese room

◮ Searle doesn’t understand a word of Chinese.

◮ All he has done is manipulate the formal symbols.

◮ He has no real understanding of Chinese or the task.

◮ The problem is inherent to formal computers: no meaning is associated with the syntax.
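The symbol manipulation Searle describes can be caricatured in a few lines of code (a hypothetical sketch, not from the lecture; the rule book and phrases are invented for illustration). The program produces fluent-looking replies purely by table lookup, with no access to what any symbol means:

```python
# A caricature of Searle's rule book: purely formal symbol manipulation.
# The rules pair uninterpreted input shapes with output shapes; nothing in
# the program represents what any symbol means.
RULE_BOOK = {
    "你好吗": "我很好",        # "How are you?" -> "I am fine"
    "你会说中文吗": "会一点",  # "Do you speak Chinese?" -> "A little"
}

def chinese_room(squiggles: str) -> str:
    """Return the output shape the rule book pairs with the input shape."""
    return RULE_BOOK.get(squiggles, "请再说一遍")  # default: "Please say that again"

# The "room" answers appropriately without understanding either language.
print(chinese_room("你好吗"))  # prints 我很好
```

The point of the caricature is that adding more rules only makes the behavior more convincing; it never adds interpretation of the symbols.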


Chinese room

Open question

What does Searle mean by syntax? How about semantics?


GQ 2.3 – Group discussion

First, is Searle a functionalist? A physicalist? Or something in between? Second, what do you think Searle means when he says that ‘brains cause meaning’? Do you agree or disagree with this assertion? Defend your answers concretely. Group leaders: Shalina, Thomas, Paul, Orren, Joel, Cole, Juliana, Mary Margaret, Natasha


Some objections

Systems reply (c)

It’s not the individual who understands Chinese, but the entire system.

Searle’s response

Even if the Chinese room were internalized, so that the English rule book formed only a subpart of the system, we’ve still provided no way to attach meaning to the symbols the subsystem manipulates.


Objections

The robot reply (f)

Insert a computer into an autonomous robot, so that it could interact with the world.

Searle’s response

Adds a set of causal relations with the world, but we still don’t have any understanding as long as the formal inputs and outputs go uninterpreted.


Objections

The brain simulator reply (g)

Change the character of the computational system so that it simulates the neuronal firings in the brain.

Searle’s response

  • 1. Problematic for the basic thesis of functionalism: we shouldn’t have to know anything about how the brain works to understand cognition.
  • 2. Simulates the wrong things about the brain: modeling the computational properties of the brain still won’t give us intentionality, because we can’t simulate its causal properties.


Objections

The other minds reply

We only know other minds via their behavior; we can’t verify that machines or people really understand something outside of their behavior. What else is there?

Searle’s response (= flat-out rejection)

“It is no answer to this argument to feign anesthesia. In ‘cognitive sciences’ one presupposes the reality and knowability of the mental in the same way that in physical sciences one has to presuppose the reality and knowability of physical objects.” (from original 1980 paper, p.422L)


Objections

Many mansions reply

Eventually, the technology will develop so that we can build in the necessary causal processes for intentionality.

Searle’s response

  • 1. Crucially weakens the Strong AI thesis that the mind is a formal symbol-manipulating device.
  • 2. No purely formal system could ever give rise to cognitive states, because it necessarily lacks real causal import, except to cause the next set of formal processes.


Against symbol-pushing

The argument in brief:

  • 1. Programs are purely formal (syntactic).
  • 2. Human minds have mental contents (semantics).
  • 3. Syntax by itself is neither constitutive of, nor sufficient for, semantic content.
  • 4. Therefore, programs by themselves are neither constitutive of, nor sufficient for, minds.
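The four-step argument has a simple logical shape. As a toy formalization (my own encoding, not anything from the lecture: premises 1 and 3 are folded together by reading “purely formal” as “syntactic but not semantic”), it can be sketched in Lean:

```lean
-- A hypothetical encoding of the argument's shape, not Searle's own wording.
variable (Thing : Type)
variable (Program Syntactic Semantic Mind : Thing → Prop)

theorem searle_schema
    (h1 : ∀ x, Program x → Syntactic x ∧ ¬ Semantic x)  -- premises 1 & 3
    (h2 : ∀ x, Mind x → Semantic x)                     -- premise 2
    : ∀ x, Program x → ¬ Mind x := by                   -- conclusion 4
  intro x hp hm
  exact (h1 x hp).2 (h2 x hm)
```

The formal validity is trivial; the philosophical work is all in whether premises 1–3 are true, which is what the replies above contest.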


Wetware and mental contents

Dependence on brains

“ ...mental phenomena might be dependent on actual physical-chemical properties of actual human brains.” Intrinsic connection between the wetware (brain) and the kinds of functions it performs, namely, cognitive states.

Intentionality

“Whatever else intentionality is, it is a biological phenomenon, and it is as likely to be as causally dependent on the specific biochemistry of its origins as lactation, photosynthesis, or any other biological phenomena.”


Simulation and duplication

◮ Simulation: all you need for a simulation is the right input and output and a program to get from input to output.

◮ No one would mistake a simulation of digestion for the real thing.

“Brains are specific biological organs, and their specific biochemical properties enable them to cause consciousness and other sorts of mental phenomena. Computer simulations of brain processes provide models of the formal aspects of these processes. But the simulation should not be confused with duplication.” (Searle, 1990:29R)


Simulation and duplication

◮ What about an artificial organ?

◮ Artificial hearts can be hearts, despite being constructed of a different set of materials.

◮ Does Searle’s point still stand? What’s different between the case of artificial hearts and the mind?


Open discussion

  • 1. Are you convinced by the Chinese room argument?
  • 2. Do we have to give up Strong AI?
  • 3. What about Searle’s positive views regarding intentionality and the brain? Do you find those convincing?
  • 4. What other metrics do we have for determining whether an individual is capable of conscious thought or genuine mental states?
