The Language of Thought (PowerPoint presentation)


slide-1
SLIDE 1

The Language of Thought

slide-2
SLIDE 2

Folk Psychology

The psychological theory that ordinary people (the folk) use to predict and explain one another’s actions. The most important posits of this theory are mental states like beliefs, desires, intentions, hopes, fears, pains, etc.

slide-3
SLIDE 3

Propositional Attitudes

An agent

  • you
  • me
  • Beyoncé

An attitude

  • belief
  • desire
  • intention
  • hope
  • fear

A propositional content

  • that dogs are better than cats
  • that the midterm is over
  • that …

A propositional attitude = an agent + an attitude + a propositional content.

slide-4
SLIDE 4

Practical Syllogism

For an arbitrary agent, A:

if:
(1) A desires p,
(2) A believes that doing x is a good way to get p,
(3) A doesn’t have any other stronger desires that, in light of A’s beliefs, conflict with doing x;

then:
(4) A will do x (or at least form an intention to do x).
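The schema above can be sketched as a toy Python model. Everything here (the `Agent` class, the field names) is an illustrative assumption, not anything from the slides:

```python
# A toy model of the practical syllogism. All names (Agent, desires,
# means_beliefs, conflicts) are illustrative assumptions for this sketch.

class Agent:
    def __init__(self, desires, means_beliefs, conflicts):
        self.desires = desires              # propositions p the agent wants
        self.means_beliefs = means_beliefs  # p -> action x believed to achieve p
        self.conflicts = conflicts          # actions blocked by stronger desires

    def intended_actions(self):
        """(1) A desires p, (2) A believes doing x is a good way to get p,
        (3) no stronger conflicting desire => (4) A intends to do x."""
        return [x for p, x in self.means_beliefs.items()
                if p in self.desires and x not in self.conflicts]

a = Agent(desires={"good grade"},
          means_beliefs={"good grade": "study"},
          conflicts=set())
print(a.intended_actions())  # ['study']
```

Note how clause (3) works as a defeater: adding "study" to `conflicts` empties the intention list, which is what makes the law hold only ceteris paribus.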

slide-5
SLIDE 5

The Physical Stance

  • From which we explain the behaviors of a system in terms of the physical forces acting on it.
  • There’s no distinction between correct and incorrect behavior from this stance. Whatever happens happens.

The Design Stance

  • Explains the behaviors of a system in terms of the functions for which it was designed, or for which it evolved.
  • From this stance, something going wrong is a malfunction.

The Intentional Stance

  • From which we explain the behaviors of a system in terms of the beliefs and desires it would be rational for it to have.
  • From this stance, something going wrong is an irrationality.

slide-6
SLIDE 6

The Intentional Stance

  • A system has whichever beliefs and desires (etc.) it would make the most sense to interpret it as having.
  • Beliefs and desires are real because interpreters pick up on real patterns of thought and behavior when ascribing them.

slide-7
SLIDE 7

The Intentional Stance

  • A belief needn’t be identical to any particular neural state. Beliefs aren’t (always) “sentences written in the brain”.
  • They are holistic properties of systems.
  • It doesn’t make sense to ask how many beliefs someone has.
  • There’s nothing really wrong with saying that groups, thermometers, Google, etc., have beliefs.
  • It’s just a question of how useful it would be to do so, and how genuine the pattern being picked up on is.

slide-8
SLIDE 8

“The Occam’s Razor answer is that maybe Ryan has calculated, in light of Obamacare’s surprisingly robust poll numbers, that getting rid of it would be worse than keeping it, because getting rid of it would give the opposition a cause. It would create millions of angry voters who’d march to the polls in 2018 to vote against the Republicans.”

slide-9
SLIDE 9

“So, this was our routine — when he wants out, he goes to the front door, and licks it. And then we moved house, and he got very, very confused. We trained the dog so that when he wants out, he goes to the front door and waits. Somehow in his little golden retriever brain, he interpreted this to mean “go to the front door, and lick it.” If he’s at the door, but isn’t licking it, he doesn’t need out, he’s just chilling.”

https://twitter.com/PastelPouts/status/789203468477169664

slide-10
SLIDE 10

“He knew he had to go to the front door when he wants out, but this was a new house with obviously a door that was completely new to him. Despite our condo having only one door that leads outside, and him going out this very same door literally at least five times a day, every day, for about a year…he still has no idea where the front door is in this house. Absolutely no idea at all. Now whenever he needs out, he will go to any random door and start licking it. And I mean any door - the bathroom door, my bedroom door, my closet, the goddamn door of a kitchen cabinet, even.”

https://twitter.com/PastelPouts/status/789203468477169664

slide-11
SLIDE 11
slide-12
SLIDE 12
slide-13
SLIDE 13

“If Nest thinks you’ll be home in the afternoon, it’ll pause Auto-Away and turn down the temperature at 2pm so you’ll come home to a cool house.”

slide-14
SLIDE 14
slide-15
SLIDE 15
slide-16
SLIDE 16

The Language-of-Thought Hypothesis

  • The mind is literally an information-processing device—a piece of software running on the brain.
  • Beliefs, desires, etc. are literally tokens of sentences that our mind uses to represent, store, and compute information.
  • Depending on “where” in the system these sentences are tokened, they count as beliefs, desires, etc.
  • (Fodor often talks about the “desire box” and the “belief box”, etc. Kukla and Walmsley talk about “bins” instead.)

slide-17
SLIDE 17

[Diagram: three boxes labeled belief, desire, and executive (intention)]

slide-18
SLIDE 18

desire box: that I will get a good grade on the test
belief box: that I will get a good grade on the test only if I will study
executive control: I will study

slide-19
SLIDE 19

desire box: that P
belief box: that P only if I do X
executive control: I do X

slide-20
SLIDE 20

Question: Why think that thoughts are like sentences?

slide-21
SLIDE 21

Analogy 1: Productivity and Recursivity

If you speak a natural language, you can use and understand infinitely many sentences:

  • John loves his mother.
  • John loves his mother’s mother.
  • John loves his mother’s mother’s mother. […]

Similarly, you can think an infinite number of thoughts.

  • the thought that John loves his mother
  • the thought that John loves his mother’s mother.
  • the thought that John loves his mother’s mother’s mother. […]
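The productivity point is that a finite, recursive rule generates unboundedly many distinct sentences (and, by analogy, thoughts). A minimal sketch, using a made-up NP → NP “’s mother” rule:

```python
# Productivity sketch: one finite recursive rule yields infinitely many
# sentences. The rule (NP -> NP "'s mother") is an illustrative toy grammar.

def sentence(depth):
    """Embed the possessive rule `depth` extra times."""
    np = "his mother" + "'s mother" * depth
    return f"John loves {np}."

for d in range(3):
    print(sentence(d))
# John loves his mother.
# John loves his mother's mother.
# John loves his mother's mother's mother.
```

Any bound on `depth` is arbitrary: the generating rule itself imposes none, which is the sense in which both the language and the corresponding thoughts are productive.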
slide-22
SLIDE 22

Analogy 2: Systematicity

If you understand this sentence:

  • Jay loves Bey.

Then you also understand this sentence:

  • Bey loves Jay.

Similarly, if you can have this thought:

  • The thought that Jay loves Bey

Then you can also have this thought:

  • The thought that Bey loves Jay
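Systematicity can be put the same way: whoever has the constituents and the combining structure can recombine them freely. A toy illustration (the function is made up for this sketch):

```python
# Systematicity sketch: the same constituents slot into the same structure
# in either order. transitive_thought is an illustrative stand-in for a
# subject-verb-object combining rule.

def transitive_thought(subject, verb, obj):
    return f"{subject} {verb} {obj}."

print(transitive_thought("Jay", "loves", "Bey"))  # Jay loves Bey.
print(transitive_thought("Bey", "loves", "Jay"))  # Bey loves Jay.
```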
slide-23
SLIDE 23

Analogy 3: Vocabulary and Conceptual Repertoire

Socrates didn’t have the following words in his vocabulary:

  • dog
  • therefore

So, he couldn’t understand sentences that contained those words.

Socrates didn’t possess the following concepts:

  • carburetor
  • cell phone

So, he couldn’t have thoughts about things of these kinds.

slide-24
SLIDE 24

Analogy 4: Parts and Structure

[S [NP Jay] [VP [V loves] [NP Bey]]]
[S [NP Bey] [VP [V loves] [NP Jay]]]
[S [NP Bey] [VP [V loves] [NP Blue Ivy]]]

slide-25
SLIDE 25

Analogy 5: Logical Relations

In a language, logical relationships depend on internal sentence structure. Consider the following argument:

  • If Fodor is right, we have computers in our heads.
  • Fodor is right.
  • Therefore: we have computers in our heads.

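The argument is an instance of modus ponens, a rule sensitive only to sentence form, not to what the sentences mean. A minimal sketch (the representation of a conditional as a pair is an assumption of this example):

```python
# Modus ponens as a structure-sensitive rule: it inspects only the
# syntactic form ("if A then B" plus "A"), never the content of A or B.

def modus_ponens(conditional, premise):
    """conditional: a pair (antecedent, consequent) read as 'if A then B'.
    Returns the consequent if the premise matches the antecedent."""
    antecedent, consequent = conditional
    if premise == antecedent:
        return consequent
    return None

result = modus_ponens(
    ("Fodor is right", "we have computers in our heads"),
    "Fodor is right")
print(result)  # we have computers in our heads
```

That the rule works purely on structure is exactly what makes it mechanizable, which is the bridge from logical relations to the computational picture of thought.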
slide-26
SLIDE 26

Analogy 5: Logical Relations

Practical syllogism is sensitive to the structure of our thoughts in the same way:

desire box: that I will get a good grade on the test
belief box: that I will get a good grade on the test only if I will study
executive control: I will study

slide-27
SLIDE 27

The Explanation: Compositionality

The meaning of a sentence is systematically determined by the meanings of its basic parts (~words), together with the syntactic structure in which they’re arranged. The propositional content of a thought is determined by the contents of its parts (concepts) together with the way in which the thought is structured.
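A toy compositional semantics makes the claim concrete: the meaning of the whole is a function of word meanings plus structure. The lexicon and the predicate-argument representation below are made-up illustrations:

```python
# Compositionality sketch: sentence meaning computed from an (assumed)
# lexicon plus the subject-verb-object structure of a toy parse.

lexicon = {"Jay": "JAY", "Bey": "BEY", "loves": "LOVES"}

def meaning(tree):
    """tree: ("S", subject, verb, object). Returns a predicate-argument
    structure built from the word meanings and the tree's arrangement."""
    _, subj, verb, obj = tree
    return (lexicon[verb], lexicon[subj], lexicon[obj])

print(meaning(("S", "Jay", "loves", "Bey")))  # ('LOVES', 'JAY', 'BEY')
print(meaning(("S", "Bey", "loves", "Jay")))  # ('LOVES', 'BEY', 'JAY')
```

Same three word meanings, different structures, different contents: that single function is what underwrites both productivity and systematicity in the earlier slides.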

slide-28
SLIDE 28

Levels of Explanation: Four Perspectives

  1. Dennett’s three stances
  2. Fodor: semantic contents and syntactic vehicles
  3. Marr: levels of abstraction
  4. Fodor: autonomous special sciences

slide-29
SLIDE 29

Contents and Vehicles

Example content: the belief that P only if I do X.

Example vehicles for the content “cat”: the ASCII decimal codes 099 097 116, or the binary 01100011 01100001 01110100.
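The slide’s example can be reproduced directly: one content (“cat”) carried by two different vehicles (decimal ASCII codes and binary strings):

```python
# Same content, different vehicles: the string "cat" as ASCII decimal
# codes and as 8-bit binary strings.

word = "cat"
decimal = [ord(ch) for ch in word]
binary = [format(ord(ch), "08b") for ch in word]
print(decimal)  # [99, 97, 116]
print(binary)   # ['01100011', '01100001', '01110100']
```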

slide-30
SLIDE 30

Marr’s Three Levels of Abstraction

Three levels at which we can describe how a computational system performs a computation.

The Computational Level tells us the function being computed. This might consist of merely its input and its output.

The Algorithmic Level spells out each of the steps that the system takes in order to get from inputs to outputs.

The Implementational Level tells us how the algorithmic process is physically realized in a particular system.
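The levels come apart because one computational-level function can be computed by many algorithms. A standard illustration (sorting; both procedures below are textbook algorithms, not anything from the slides):

```python
# Marr's levels illustrated with sorting. The computational level fixes only
# the input-output function (return the list in ascending order); each
# function below is a distinct algorithmic-level way of computing it; the
# implementational level would be the hardware running either one.

def insertion_sort(xs):
    """One algorithm: grow a sorted list by inserting each element."""
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] < x:
            i += 1
        out.insert(i, x)
    return out

def merge_sort(xs):
    """A different algorithm: split, sort halves, merge."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged = []
    while left and right:
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return merged + left + right

data = [3, 1, 2]
print(insertion_sort(data), merge_sort(data))  # [1, 2, 3] [1, 2, 3]
```

At the computational level the two functions are identical; they differ only at the algorithmic level, and any two machines running either differ again at the implementational level.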

slide-31
SLIDE 31

Special Sciences

  • The cognitive sciences: psychology, linguistics, etc.
  • The social sciences: economics, sociology, etc.
  • Biology?
  • Chemistry (i.e., everything other than physics?)
slide-32
SLIDE 32

Scientific Theories

…consist of a set of statements of the form:

S1x → S2x

“All S1 situations bring about S2 situations.”

slide-33
SLIDE 33

Scientific Theories

S1x → S2x

“S1” and “S2” are predicates: they stand for the properties that our theory deals in. For example:

—physics will include: “has x mass”
—folk psychology will include: “believes that Santa Claus exists”
—economics will include: “has an exchange value of $8”

slide-34
SLIDE 34

Natural Kinds

The predicates/properties of a science are its natural kinds. These are the kinds that have to exist if the science is true. More generally: a natural kind is a kind of thing that is real according to some science—i.e., that plays a role in some scientific law.

slide-35
SLIDE 35

Scientific Reductionism

The laws of special sciences can be reduced to physics by discovering the appropriate bridge laws:

S1x → S2x
 ⇵      ⇵
P1x → P2x

“S1x ⇄ P1x” says something like: “whenever you have an S1 situation, you also have a P1 situation”.

slide-36
SLIDE 36


slide-37
SLIDE 37

Why Reductionism Fails: Multiple Realizability

The natural kinds of one science may correspond to things of many different kinds, described at the level of another science. E.g. there is no general physical characterization of “currency”, “pain”, etc.
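Multiple realizability has a natural software analogue: one special-science kind, many physically unalike realizers. The thermostat classes below are made up for illustration:

```python
# Multiple realizability sketch: one higher-level kind ("thermostat", i.e.
# anything that computes too_cold) realized by physically different systems.
# Both classes are illustrative inventions.

class BimetallicThermostat:
    """Realized by a curling metal strip."""
    def __init__(self, strip_curl):
        self.strip_curl = strip_curl
    def too_cold(self):
        return self.strip_curl > 0.5

class DigitalThermostat:
    """Realized by a sensor reading compared to a setpoint."""
    def __init__(self, reading, setpoint):
        self.reading, self.setpoint = reading, setpoint
    def too_cold(self):
        return self.reading < self.setpoint

# Same kind at the higher level, no shared lower-level description:
devices = [BimetallicThermostat(0.7), DigitalThermostat(17, 20)]
print([d.too_cold() for d in devices])  # [True, True]
```

A law stated over thermostats applies to both devices, yet no single physical predicate picks out exactly the class of realizers, which is the point against reduction.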

slide-38
SLIDE 38

Why Reductionism Fails: Exceptions

The laws of special sciences are designed to have exceptions. They are normally stated with “ceteris paribus” clauses at the end. This means, roughly: “all other things being equal”. E.g. the practical syllogism. But, Fodor maintains: this does not make the laws false or useless. It’s just how non-physical laws work.