Models of Language Evolution: Iterated Learning. Michael Franke. PowerPoint presentation.



SLIDE 1

Models of Language Evolution

Iterated learning
Michael Franke

SLIDE 2

  • Facets of EvoLang
  • Compositionality
  • Iterated Learning

SLIDE 3
SLIDE 4
SLIDE 5
SLIDE 6
SLIDE 7

SLIDE 8


Compositional Semantics

The meaning of a complex utterance depends systematically on the meanings of its parts and the way they are combined.

(1)
  • a. John likes Mary.
  • b. John abhors Mary.
  • c. Mary likes John.

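As a toy illustration of the principle, here is a minimal sketch; the lexicon entries and the subject-verb-object combination rule are invented for illustration, not part of the slides:

```python
# Toy sketch of compositional semantics: the meaning of a sentence is
# assembled from the meanings of its words plus the combination rule.

lexicon = {
    "John": "john", "Mary": "mary",       # individuals
    "likes": "like", "abhors": "abhor",   # two-place relations
}

def interpret(sentence):
    # combination rule: subject-verb-object word order
    subj, verb, obj = sentence.split()
    return (lexicon[verb], lexicon[subj], lexicon[obj])

print(interpret("John likes Mary"))  # ('like', 'john', 'mary')
print(interpret("Mary likes John"))  # ('like', 'mary', 'john') -- order matters
```

Swapping a word or the word order changes the resulting meaning in a predictable way, which is exactly what examples (1a-c) illustrate.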

SLIDE 9
SLIDE 10
SLIDE 11

SLIDE 12


Iterated Learning — Main Idea

  • language learners have some domain-general learning capability, including a (modest) capacity to generalize and extract patterns
  • competent speakers have learned from learners . . .
    . . . who have learned from learners . . .
    . . . who have learned from learners . . .
    . . . who have learned from learners . . .
⇒ iterated learning can create structure which wasn’t there before
  • given a capability for generalization
  • given an appropriately sized “learning bottleneck”


SLIDE 13


Evolution of Compositionality

  • 1 learner, 1 teacher
  • teacher produces n state-signal pairs
  • learner acquires a language based on these
  • (iterate:) learner becomes teacher for a new learner
  • learning model:
      • feed-forward neural network
      • backpropagation (supervised learning)
  • production strategy: “obversion”
      • production is optimized relative to the speaker’s own comprehension


(Kirby and Hurford, 2002)
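The teacher-learner cycle can be sketched as a simple loop. Everything concrete here (the MemorizingAgent placeholder, the four-letter signal alphabet) is an invented stand-in for exposition, not the Kirby and Hurford (2002) learning model:

```python
import random

class MemorizingAgent:
    """Placeholder learner: memorizes observed pairs, guesses randomly otherwise."""
    def __init__(self):
        self.lexicon = {}

    def train(self, data):
        for meaning, signal in data:
            self.lexicon[meaning] = signal

    def produce(self, meaning):
        return self.lexicon.get(meaning, random.choice("abcd"))

def iterate(first_teacher, make_agent, meanings, n, generations):
    teacher = first_teacher
    for _ in range(generations):
        # learning bottleneck: the learner only ever sees n productions
        observed = [random.choice(meanings) for _ in range(n)]
        data = [(m, teacher.produce(m)) for m in observed]
        learner = make_agent()
        learner.train(data)
        teacher = learner  # the learner becomes the teacher of the next learner
    return teacher

meanings = ["m1", "m2", "m3", "m4"]
final = iterate(MemorizingAgent(), MemorizingAgent, meanings, n=3, generations=10)
```

The bottleneck parameter n is the crucial knob: with n much smaller than the meaning space, the learner must generalize rather than memorize.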

SLIDE 14


Learning Model: Feed-Forward Neural Network

  • 8 × 8 × 8 network for interpretation
  • input: signal i = (i1, . . . , i8) ∈ {0, 1}^8
  • output: meaning o = (o1, . . . , o8) ∈ {0, 1}^8
  • initially arbitrary weights

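A minimal sketch of the interpretation network, assuming sigmoid units and weights drawn uniformly from [−1, 1] (the slide only fixes the 8 × 8 × 8 architecture and arbitrary initial weights):

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.uniform(-1.0, 1.0, (8, 8))  # input -> hidden weights, arbitrary at first
W2 = rng.uniform(-1.0, 1.0, (8, 8))  # hidden -> output weights

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def interpret(signal):
    """Map a binary signal i in {0,1}^8 to a graded output o' in [0,1]^8."""
    hidden = sigmoid(signal @ W1)
    return sigmoid(hidden @ W2)

i = np.array([1, 0, 1, 0, 0, 1, 1, 0], dtype=float)
o_prime = interpret(i)                 # graded network output
meaning = (o_prime > 0.5).astype(int)  # threshold to recover a binary meaning
```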

SLIDE 15


Backpropagation

  • training items (i, o) are presented
  • the network computes its output o′ for the given i
  • the error δ = o − o′ is propagated back through all layers
  • weights are adjusted accordingly

picture from http://galaxy.agh.edu.pl/~vlsi/AI/backp_t_en/backprop.html
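A sketch of one such training step for a two-layer sigmoid network; the squared-error gradient and the learning rate are standard textbook choices, not taken from the slides:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(W1, W2, i, o, lr=0.5):
    """One gradient step on the squared error; returns the mean absolute error."""
    # forward pass: compute the network's output o' for the given input i
    h = sigmoid(i @ W1)
    o_prime = sigmoid(h @ W2)
    # backward pass: propagate the error delta = o - o' through both layers
    delta_out = (o - o_prime) * o_prime * (1.0 - o_prime)
    delta_hid = (delta_out @ W2.T) * h * (1.0 - h)
    # adjust the weights accordingly (in place)
    W2 += lr * np.outer(h, delta_out)
    W1 += lr * np.outer(i, delta_hid)
    return float(np.abs(o - o_prime).mean())

rng = np.random.default_rng(1)
W1 = rng.uniform(-1.0, 1.0, (8, 8))
W2 = rng.uniform(-1.0, 1.0, (8, 8))
i = np.array([1, 0, 0, 1, 0, 1, 0, 1], dtype=float)
o = np.array([0, 1, 1, 0, 1, 0, 1, 0], dtype=float)
errors = [train_step(W1, W2, i, o) for _ in range(200)]
# errors shrink as the pair is learned
```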

SLIDE 16


Obverter Strategy

  • the feed-forward net only defines an interpretation strategy
  • production as the best choice given the speaker’s own interpretation:
      • suppose the teacher wants to express meaning o ∈ {0, 1}^8
      • she then chooses the signal ic ∈ {0, 1}^8 that triggers the network output o′ ∈ [0, 1]^8 in which she has maximal confidence:

    ic = arg max_{i ∈ {0, 1}^8} C(o | i)

defined as:

    C(o | i) = ∏_{k=1}^{8} C(ok | o′k)

    C(ok | o′k) = o′k if ok = 1, and 1 − o′k if ok = 0

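Since the signal space has only 2^8 = 256 elements, the arg max can be computed by exhaustive search. A sketch, again assuming the sigmoid network form from above:

```python
import itertools
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def confidence(o, o_prime):
    """C(o|i) = prod_k C(o_k|o'_k), with o'_k if o_k = 1 and 1 - o'_k otherwise."""
    return float(np.prod(np.where(o == 1, o_prime, 1.0 - o_prime)))

def produce(W1, W2, o):
    """Obverter production: pick the signal the speaker herself would best understand as o."""
    best_signal, best_c = None, -1.0
    for bits in itertools.product([0, 1], repeat=8):
        i = np.array(bits, dtype=float)
        o_prime = sigmoid(sigmoid(i @ W1) @ W2)  # the speaker's OWN interpretation
        c = confidence(o, o_prime)
        if c > best_c:
            best_signal, best_c = i, c
    return best_signal

rng = np.random.default_rng(2)
W1 = rng.uniform(-1.0, 1.0, (8, 8))
W2 = rng.uniform(-1.0, 1.0, (8, 8))
o = np.array([1, 0, 1, 1, 0, 0, 1, 0])
i_c = produce(W1, W2, o)
```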

SLIDE 17


Results (20 Training Items)

dotted: difference between teacher and learner language; solid: proportion of meaning space covered

(Kirby and Hurford, 2002)

SLIDE 18


Results (2000 Training Items)

dotted: difference between teacher and learner language; solid: proportion of meaning space covered

(Kirby and Hurford, 2002)

SLIDE 19


Results (50 Training Items)

dotted: difference between teacher and learner language; solid: proportion of meaning space covered

(Kirby and Hurford, 2002)

SLIDE 20


Compositionality

  • compositionality arises for medium-sized bottlenecks, e.g.:
      • o1 = 1 ↔ i3 = 0
      • o2 = 1 ↔ i5 = 0
      • o3 = 1 ↔ i6 = 0
      • o4 = 1 ↔ i1 = 0
      • o5 = 1 ↔ i4 = 1
      • o6 = 1 ↔ i8 = 1
      • o7 = 1 ↔ i2 = 0
      • o8 = 1 ↔ i7 = 1

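Read as an encoder, such a bit-wise code makes the compositionality explicit: each meaning bit controls exactly one signal bit, possibly inverted. A sketch (the table below transcribes the example pairs from this slide):

```python
# (signal position, signal value when the meaning bit o_k is 1); 1-indexed as on the slide
CODE = {1: (3, 0), 2: (5, 0), 3: (6, 0), 4: (1, 0),
        5: (4, 1), 6: (8, 1), 7: (2, 0), 8: (7, 1)}

def encode(meaning):
    """Encode a meaning (tuple of 8 bits) bit by bit into a signal."""
    signal = [None] * 8
    for k, (pos, val) in CODE.items():
        signal[pos - 1] = val if meaning[k - 1] == 1 else 1 - val
    return tuple(signal)

# flipping one meaning bit flips exactly one signal bit --
# each part of the meaning has its own dedicated part of the signal
```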

SLIDE 21


Summary

  • iterated learning “creates” compositional meaning . . .
      • if bottleneck size is appropriate
      • by generalizing over sparse training data
      • by informed innovation (where necessary)
  • other learning mechanisms are possible:
      • other kinds of neural networks (e.g. Smith et al., 2003)
      • finite state transducers (e.g. Brighton, 2002)

SLIDE 22

Homework

Solve the mock exam and prepare questions for the midterm exam.

SLIDE 23

References

Brighton, Henry (2002). “Compositional Syntax from Cultural Transmission”. In: Artificial Life 8, pp. 25–54.

Kirby, Simon (2007). “The Evolution of Language”. In: Oxford Handbook of Evolutionary Psychology. Ed. by Robin Dunbar and Louise Barrett. Oxford University Press, pp. 669–681.

Kirby, Simon, Tom Griffiths, et al. (2014). “Iterated Learning and the Evolution of Language”. In: Current Opinion in Neurobiology 28, pp. 108–114.

Kirby, Simon and James R. Hurford (2002). “The Emergence of Linguistic Structure: An Overview of the Iterated Learning Model”. In: Simulating the Evolution of Language. Ed. by A. Cangelosi and D. Parisi. Springer, pp. 121–148.

Smith, Kenny et al. (2003). “Iterated Learning: A Framework for the Emergence of Language”. In: Artificial Life 9, pp. 371–386.