Induction and interaction in the evolution of language and conceptual structure (PowerPoint PPT Presentation)



slide-1
SLIDE 1

Induction and interaction in the evolution of language and conceptual structure

Jon W. Carr

slide-2
SLIDE 2

English Northern Paiute

Kinship terms are simple and informative

Kemp & Regier (2012)

slide-3
SLIDE 3

English Northern Paiute

Kinship terms are simple and informative

Kemp & Regier (2012)

[Plot: simplicity–informativeness space; axes ⬅ Informative, ⬅ Simple]

slide-4
SLIDE 4

Pressures operating in simplicity–informativeness space

[Plot: simplicity–informativeness space; axes ⬅ Informative, ⬅ Simple; optimal frontier marked]

slide-5
SLIDE 5

Pressures operating in simplicity–informativeness space

[Plot: simplicity–informativeness space; axes ⬅ Informative, ⬅ Simple; optimal frontier marked]

Learning exerts pressure for simplicity

tuge tuge tuge tuge tuge tuge tuge tuge tuge tupim tupim tupim miniku miniku miniku tupin tupin tupin poi poi poi poi poi poi poi poi poi

Kirby, Cornish, & Smith (2008)

slide-6
SLIDE 6

Pressures operating in simplicity–informativeness space

[Plot: simplicity–informativeness space; axes ⬅ Informative, ⬅ Simple; optimal frontier marked]

Communication exerts pressure for informativeness

newhomo kamone gaku hokako kapa gakho wuwele nepi pihino nemone piga kawake

Kirby, Tamariz, Cornish, & Smith (2015)

slide-7
SLIDE 7

Pressures operating in simplicity–informativeness space

[Plot: simplicity–informativeness space; axes ⬅ Informative, ⬅ Simple; optimal frontier marked]

Learning + communication exert pressure for both simplicity and informativeness

gamenewawu gamenewawa gamenewuwu gamene mega megawawa megawuwu wulagi egewawu egewawa egewuwu ege

Kirby, Tamariz, Cornish, & Smith (2015)

slide-8
SLIDE 8

Semantic category systems

slide-9
SLIDE 9

Semantic category systems

slide-10
SLIDE 10

Semantic category systems

Compactness

slide-11
SLIDE 11

Simplicity and informativeness of semantic category systems

[Plots: Simplicity and Informativeness of category systems; legend: Compact, Random]

slide-12
SLIDE 12

Simplicity and informativeness of semantic category systems

[Plots: Simplicity and Informativeness of category systems; legend: Compact, Random]
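The comparison on these slides can be made concrete with a toy sketch. The measures below are illustrative stand-ins, not the talk's actual metrics: `complexity` counts categories plus category boundaries as a rough description-length proxy, and `communicative_cost` is the listener's expected surprisal when guessing uniformly within the named category (uniform need probabilities assumed).

```python
import math

def communicative_cost(system):
    """Average surprisal, in bits, for a listener who hears a category
    label and guesses uniformly among that category's members."""
    sizes = {}
    for label in system:
        sizes[label] = sizes.get(label, 0) + 1
    return sum(math.log2(sizes[label]) for label in system) / len(system)

def complexity(system):
    """Crude description-length proxy: one unit per category plus one
    per boundary between adjacent meanings with different labels."""
    return len(set(system)) + sum(a != b for a, b in zip(system, system[1:]))

# Eight meanings on a line; three candidate category systems.
compact    = ['A', 'A', 'A', 'A', 'B', 'B', 'B', 'B']   # two compact categories
scattered  = ['A', 'B', 'A', 'B', 'B', 'A', 'B', 'A']   # same sizes, random layout
degenerate = ['A'] * 8                                   # one big category

print(complexity(compact), communicative_cost(compact))        # 3 2.0
print(complexity(scattered), communicative_cost(scattered))    # 8 2.0
print(complexity(degenerate), communicative_cost(degenerate))  # 1 3.0
```

Under these toy measures the compact and scattered systems are equally informative, but the compact one is far simpler; the degenerate system is simplest of all and the most costly, which is the tension the rest of the talk turns on.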

slide-13
SLIDE 13

Hallmark features of simple and informative category systems

Simplicity (pressure from induction) favours: few categories, compactness
Informativeness (pressure from interaction) favours: many categories, compactness
slide-14
SLIDE 14

Hallmark features of simple and informative category systems

Simplicity (pressure from induction) favours: few categories, compactness
Informativeness (pressure from interaction) favours: many categories, compactness

Are semantic categories compact because of simplicity or informativeness?
slide-15
SLIDE 15

Part 2

slide-16
SLIDE 16

Can iterated learning give rise to informative languages?

Carstensen, Xu, Smith, & Regier (2015)

slide-17
SLIDE 17

Can iterated learning give rise to informative languages?

Carstensen, Xu, Smith, & Regier (2015)

slide-18
SLIDE 18

Can iterated learning give rise to informative languages?

Carstensen, Xu, Smith, & Regier (2015)

slide-19
SLIDE 19

Modelling a Bayesian learner

S: simplicity bias
I: informativeness bias

slide-20
SLIDE 20

Bayesian inference

L = { · · ·}

slide-21
SLIDE 21

Bayesian inference

L = { · · ·} D = [⟨m1, s1⟩, ⟨m2, s2⟩, ⟨m3, s3⟩, ..., ⟨mn, sn⟩]

slide-22
SLIDE 22

Bayesian inference

L = { · · ·} D = [⟨m1, s1⟩, ⟨m2, s2⟩, ⟨m3, s3⟩, ..., ⟨mn, sn⟩]

likelihood(D|L) = ∏⟨m,s⟩∈D P(s | L, m)

slide-23
SLIDE 23

Bayesian inference

L = { · · ·} D = [⟨m1, s1⟩, ⟨m2, s2⟩, ⟨m3, s3⟩, ..., ⟨mn, sn⟩]

prior(L) ∝ 2^−complexity(L)

likelihood(D|L) = ∏⟨m,s⟩∈D P(s | L, m)

S (simplicity prior)
slide-24
SLIDE 24

Bayesian inference

L = { · · ·} D = [⟨m1, s1⟩, ⟨m2, s2⟩, ⟨m3, s3⟩, ..., ⟨mn, sn⟩]

prior(L) ∝ 2^−complexity(L)    S (simplicity)
prior(L) ∝ 2^−cost(L)    I (informativeness)

likelihood(D|L) = ∏⟨m,s⟩∈D P(s | L, m)
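The pieces on slides 20–24 assemble into a runnable sketch. Everything here is a toy stand-in under assumed definitions: four meanings, two signals, an error-rate likelihood, `complexity` proxied by the number of distinct signals used, and `cost` by the listener's average uncertainty in bits.

```python
import math
from itertools import product

MEANINGS = [0, 1, 2, 3]
SIGNALS = ['x', 'y']
# Candidate languages L: every mapping from meanings to signals.
LANGUAGES = [dict(zip(MEANINGS, combo))
             for combo in product(SIGNALS, repeat=len(MEANINGS))]

def complexity(lang):
    return len(set(lang.values()))  # toy proxy for description length

def cost(lang):
    # Listener's average uncertainty: log2 of each signal's extension size.
    extension = {}
    for m, s in lang.items():
        extension.setdefault(s, []).append(m)
    return sum(math.log2(len(extension[lang[m]])) for m in MEANINGS) / len(MEANINGS)

def likelihood(data, lang, noise=0.05):
    # P(s | L, m): the language's own signal with probability 1 - noise.
    p = 1.0
    for m, s in data:
        p *= (1 - noise) if lang[m] == s else noise / (len(SIGNALS) - 1)
    return p

def posterior(data, prior):
    scores = [prior(L) * likelihood(data, L) for L in LANGUAGES]
    z = sum(scores)
    return [score / z for score in scores]

prior_S = lambda L: 2 ** -complexity(L)  # simplicity prior
prior_I = lambda L: 2 ** -cost(L)        # informativeness prior

data = [(0, 'x'), (1, 'x'), (2, 'y'), (3, 'y')]
post = posterior(data, prior_S)
map_lang = LANGUAGES[max(range(len(LANGUAGES)), key=post.__getitem__)]
print(map_lang)  # {0: 'x', 1: 'x', 2: 'y', 3: 'y'}: the data-consistent language
```

With this much data the likelihood dominates either prior; the priors matter when data are sparse, which is what the iterated-learning chains on the following slides exploit.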
slide-25
SLIDE 25

Bayesian iterated learning under a simplicity prior

S S S

slide-26
SLIDE 26

Bayesian iterated learning under a simplicity prior

S S S

slide-27
SLIDE 27

Bayesian iterated learning under a simplicity prior

S S S

slide-28
SLIDE 28

Bayesian iterated learning under a simplicity prior

S S S

slide-29
SLIDE 29

Bayesian iterated learning under a simplicity prior

S S S

slide-30
SLIDE 30

Bayesian iterated learning under a simplicity prior

S S S

slide-31
SLIDE 31

Bayesian iterated learning under an informativeness prior

I I I

slide-32
SLIDE 32

Bayesian iterated learning under an informativeness prior

I I I
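A chain like the ones on slides 25–32 can be simulated in a few lines. All of the following are toy assumptions rather than the talk's model: binary signals over four meanings, a small data bottleneck (three utterances per generation), posterior sampling, and a simplicity prior; over generations the prior's bias gets repeated chances to assert itself.

```python
import random
from itertools import product

random.seed(4)
MEANINGS = [0, 1, 2, 3]
SIGNALS = ['x', 'y']
LANGUAGES = [dict(zip(MEANINGS, combo))
             for combo in product(SIGNALS, repeat=len(MEANINGS))]
NOISE = 0.05
BOTTLENECK = 3  # utterances seen by each new learner

def prior_simplicity(lang):
    return 2 ** -len(set(lang.values()))  # fewer distinct signals = more probable

def produce(lang, n):
    # Speak about n random meanings, erring with probability NOISE.
    data = []
    for _ in range(n):
        m = random.choice(MEANINGS)
        s = lang[m] if random.random() > NOISE else random.choice(SIGNALS)
        data.append((m, s))
    return data

def learn(data):
    # Sample a language from the posterior (a sampler, not a MAP learner).
    def lik(lang):
        p = 1.0
        for m, s in data:
            p *= (1 - NOISE) if lang[m] == s else NOISE
        return p
    weights = [prior_simplicity(L) * lik(L) for L in LANGUAGES]
    return random.choices(LANGUAGES, weights=weights, k=1)[0]

lang = random.choice(LANGUAGES)  # generation 0: a random language
history = [lang]
for _ in range(10):
    lang = learn(produce(lang, BOTTLENECK))
    history.append(lang)
print([len(set(L.values())) for L in history])  # distinct signals per generation
```

Swapping `prior_simplicity` for a `2 ** -cost(L)` prior gives the informativeness condition of slides 31–32.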

slide-33
SLIDE 33

Model results

slide-34
SLIDE 34

Experimental stimuli

Angle Size

slide-35
SLIDE 35

Iterated learning with humans

slide-36
SLIDE 36

Iterated learning with humans

slide-37
SLIDE 37

Iterated learning with humans

slide-38
SLIDE 38

Iterated learning with humans

slide-39
SLIDE 39

Iterated learning with humans

slide-40
SLIDE 40

Iterated learning with humans

slide-41
SLIDE 41
slide-42
SLIDE 42
slide-43
SLIDE 43

Converged-on category systems

1 category (2/12) · 2 categories (1/12) · 3 categories (8/12) · 4 categories (1/12)

slide-44
SLIDE 44

Model results under best-fit parameters

slide-45
SLIDE 45

Can iterated learning give rise to informative languages?

Carstensen, Xu, Smith, & Regier (2015)

slide-46
SLIDE 46

Model results under best-fit parameters

slide-47
SLIDE 47

Hallmark features of simple and informative category systems

Simplicity pressure from induction favours Few categories Compactness Informativeness pressure from interaction favours Many categories Compactness

  • Compact
  • Random
slide-48
SLIDE 48

Part 3

slide-49
SLIDE 49

A pressure for informativeness prevents degeneration

Experiment 1 (iterated learning): tuge tuge tuge tuge tuge tuge tuge tuge tuge tupim tupim tupim miniku miniku miniku tupin tupin tupin poi poi poi poi poi poi poi poi poi

Experiment 2 (iterated learning with an informativeness pressure): n-ere-ki l-ere-ki renana n-ehe-ki l-aho-ki r-ene-ki n-eke-ki l-ake-ki r-ahe-ki n-ere-plo l-ane-plo r-e-plo n-eho-plo l-aho-plo r-eho-plo n-eki-plo l-aki-plo r-aho-plo n-e-pilu l-ane-pilu r-e-pilu n-eho-pilu l-aho-pilu r-eho-pilu n-eki-pilu l-aki-pilu r-aho-pilu

Kirby, Cornish, & Smith (2008)
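Experiment 2's informativeness pressure was implemented by filtering homonyms out of the training data, so learners never see evidence for a degenerate many-to-one mapping. A minimal sketch of that filtering step, assuming a language represented as a dict from meanings to strings (the function name and example meanings are ours):

```python
def filter_homonyms(language):
    """Keep only the first meaning carrying each string; the rest are
    dropped from the training data before transmission."""
    seen = set()
    filtered = {}
    for meaning, string in language.items():
        if string not in seen:
            seen.add(string)
            filtered[meaning] = string
    return filtered

degenerate = {('black', 'square'): 'poi', ('black', 'circle'): 'poi',
              ('white', 'square'): 'tuge', ('white', 'circle'): 'poi'}
print(filter_homonyms(degenerate))  # only one meaning per string survives
```

With the ambiguous items gone, a degenerate system cannot be faithfully transmitted, so the chain must keep its meanings distinguishable.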

slide-50
SLIDE 50

Continuous, open-ended stimulus space

slide-51
SLIDE 51
slide-52
SLIDE 52

Transmission design

Dynamic set 0 → Generation 1 → Dynamic set 1 → Generation 2 → Dynamic set 2 → Generation 3 → Dynamic set 3

Iterated learning

slide-53
SLIDE 53

Transmission design

Dynamic set 0 → Generation 1 → Dynamic set 1 + Static set → Generation 2 → Dynamic set 2 + Static set → Generation 3 → Dynamic set 3 + Static set

Iterated learning

slide-54
SLIDE 54

Transmission design

Iterated learning:
Dynamic set 0 → Generation 1 → Dynamic set 1 + Static set → Generation 2 → Dynamic set 2 + Static set → Generation 3 → Dynamic set 3 + Static set

Iterated learning with communicative interaction:
Dynamic set 0 → Generation 1 → Dynamic set 1 + Static set → Generation 2 → Dynamic set 2 + Static set → Generation 3 → Dynamic set 3 + Static set

slide-55
SLIDE 55

Iterated learning gives rise to discrete categories

slide-56
SLIDE 56

Iterated learning gives rise to discrete categories

slide-57
SLIDE 57

Iterated learning gives rise to discrete categories

slide-58
SLIDE 58

Iterated learning gives rise to discrete categories

slide-59
SLIDE 59

Iterated learning gives rise to discrete categories

slide-60
SLIDE 60

Iterated learning gives rise to discrete categories

slide-61
SLIDE 61

Iterated learning gives rise to discrete categories

fama

slide-62
SLIDE 62

Iterated learning gives rise to discrete categories

fama

slide-63
SLIDE 63

Iterated learning gives rise to discrete categories

fama pama

slide-64
SLIDE 64

Iterated learning gives rise to discrete categories

fama pama fod

slide-65
SLIDE 65

Iterated learning gives rise to discrete categories

fama pama fod muaki

slide-66
SLIDE 66

Iterated learning gives rise to discrete categories

fama pama fod muaki kazizui kazizizui

slide-67
SLIDE 67

Communicative interaction gives rise to sublexical structure

slide-68
SLIDE 68

Communicative interaction gives rise to sublexical structure

[Plots: Complexity and Communicative cost against generation number (1–10); chains A–D and I–L; conditions: Iterated learning with interaction, Iterated learning]

slide-69
SLIDE 69

Conclusions

slide-70
SLIDE 70

Conclusions

  • Languages are shaped by competing pressures from induction and interaction
  • The human inductive bias is best characterized by a preference for simplicity
  • Therefore, iterated learning gives rise to simple, inexpressive category systems with compact structure
  • Side-note: Compact structure also happens to be a feature of informativeness, obscuring the mechanism
  • But! The presence of communicative interaction prevents this process getting out of hand by permitting the emergence of higher-level forms of linguistic structure
  • The framework developed in the CLE (which, by the way, has many parallels with a body of work from Regier and colleagues) is resilient to more realistic assumptions about meaning

slide-71
SLIDE 71

Thanks!