

SLIDE 1

Models of Language Evolution

Session 10: Iterated Learning and the Evolution of Compositionality & Recursion
Michael Franke

Seminar für Sprachwissenschaft, Eberhard Karls Universität Tübingen

SLIDE 2


Course Overview (definite?)

  • 20-4: MoLE: Aims & Challenges
  • 27-4: Evolutionary Game Theory 1: Statics
  • 04-5: EGT 2: Macro-Dynamics
  • 11-5: EGT 3: Signaling Games (Gerhard Jäger)
  • 18-5: EGT 4: Micro-Dynamics & Multi-Agent Systems
  • 25-5: Evolution of Semantic Meaning
  • 01-6: Semantic Meaning & Conceptual Space
  • 08-6: Evolution of Pragmatic Strategies (Roland Mühlenbernd)
  • 15-6: Pentecost — no class
  • 22-6: Iterated Learning & Compositionality
  • 29-6: assignment of projects
  • 06-7: work on student projects — no class
  • 13-7: work on student projects — consultations, no class
  • 20-7: presentations


SLIDE 3


Compositional Semantics

The meaning of a complex utterance depends systematically on the meanings of its parts and the way they are combined.

(1)

  • a. John likes Mary.
  • b. John abhors Mary.
  • c. Mary likes John.


SLIDE 4


Recursive Syntax & Semantics

A complex expression or meaning of type x can be embedded in another expression to form a larger expression that is again of type x.

(2)

  • a. John smokes.
  • b. Mary knows that John smokes.
  • c. Bill suspects that Mary knows that John smokes.
  • d. . . .

(3)

  • a. Hunde beißen. ('Dogs bite.')
  • b. Hunde, die Hunde beißen, beißen. ('Dogs that dogs bite, bite.')
  • c. Hunde, die Hunde, die Hunde beißen, beißen, beißen. ('Dogs that dogs that dogs bite, bite, bite.')
  • d. . . .


SLIDE 5


Syntactic Structure

Natural languages have seemingly idiosyncratic rules about the “correct” way of forming a sentence and expressing a thought.

(4)

  • a. Hans raucht. ('Hans smokes.')
  • b. Susanne weiß, dass Hans raucht. ('Susanne knows that Hans smokes.')

(5)

  • a. Hans raucht Pfeife. ('Hans smokes a pipe.')
  • b. Susanne weiß, dass Hans Pfeife raucht. ('Susanne knows that Hans smokes a pipe.')
  • c. ∗Susanne weiß, dass Hans raucht Pfeife. (the verb is not in the final position of the embedded clause)

(6)

  • a. Hans radelt viel, denn das ist gesund. ('Hans cycles a lot, for that is healthy.')
  • b. Hans radelt viel, weil das gesund ist. ('Hans cycles a lot because that is healthy.')
  • c. ∗Hans radelt viel, weil das ist gesund. (the verb is not in the final position of the weil-clause)


SLIDE 6


The Innateness Hypothesis (Chomsky, 1965, and later)

Humans are biologically endowed with some knowledge of certain universal elements of the structure of human languages.

  • ⇒ an innate “language faculty”; domain-specific?
  • ⇒ a specialized “language acquisition device” (LAD)?
  • ⇒ shaped by biological evolution? (cf. Pinker and Bloom, 1990)


SLIDE 7


The Poverty of the Stimulus Argument (Chomsky, 1980)

Consider the grammar GL of a language L.

  • P1: children can rapidly and faithfully learn GL
  • P2: the data available during language acquisition underdetermine GL
  • P3: adult competence matches GL also for unfamiliar expressions
  • C: some parts of grammatical competence must be innate


SLIDE 8


Argument for Biological Evolution

  • P1: what is innate is genetically encoded
  • P2: what is genetically encoded must have been shaped by biological evolution
  • C: the LAD is a product of biological evolution


SLIDE 9


The Iterated Learning Model (ILM) — Main Idea

  • poverty of the stimulus ⇔ “learning bottleneck”
  • grammatical competence passes the bottleneck repeatedly
  • repeated “bottlenecking” shapes language, not vice versa


SLIDE 10


The Iterated Learning Model (ILM) — Main Idea (second shot)

  • language learners have some domain-general learning capability, including a (modest) capacity to generalize and extract patterns
  • competent speakers have learned from learners . . . who have learned from learners . . . who have learned from learners . . .
  • iterated learning can create structure which wasn't there before
  • given a capability for generalization
  • given an appropriately sized bottleneck


SLIDE 11


Interdependencies in Language Evolution

[Figure from Kirby (2007)]


SLIDE 12


Evolution of Compositionality (Kirby and Hurford, 2002)

  • 1 learner, 1 teacher
  • the teacher produces n state-signal pairs
  • the learner acquires a language based on these
  • (iterate:) the learner becomes the teacher for a new learner (sketched below)
  • learning model: a feed-forward neural network trained by backpropagation (supervised learning)
  • production strategy: “obversion”: production optimizes based on the speaker's own comprehension
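
To make the transmission loop concrete, here is a minimal sketch in Python. All names (make_agent, produce, learn) and the generation and bottleneck sizes are illustrative assumptions, not details from Kirby and Hurford (2002).

```python
# Minimal sketch of the iterated-learning loop (illustrative names and
# parameters, not the original implementation).
import random

def iterated_learning(make_agent, meanings, n_generations=50, n_items=50):
    """Each learner sees only n_items state-signal pairs (the bottleneck)
    produced by the previous generation's teacher."""
    teacher = make_agent()                    # generation 0: untrained agent
    for _ in range(n_generations):
        states = random.choices(meanings, k=n_items)
        sample = [(m, teacher.produce(m)) for m in states]
        learner = make_agent()                # fresh agent, arbitrary weights
        learner.learn(sample)                 # e.g. backpropagation training
        teacher = learner                     # the learner becomes the teacher
    return teacher
```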


SLIDE 13


Learning Model: Feed-Forward Neural Network

  • 8 × 8 × 8 network for interpretation
  • input: signal i = ⟨i1, . . . , i8⟩ ∈ {0, 1}^8
  • output: meaning o = ⟨o1, . . . , o8⟩ ∈ {0, 1}^8
  • initially arbitrary weights


SLIDE 14


Backpropagation

  • training items ⟨i, o⟩ are presented
  • the network computes its output o′ for a given i
  • the error δ = o − o′ is propagated back through all layers
  • weights are adjusted accordingly (sketched below)

from http://galaxy.agh.edu.pl/~vlsi/AI/backp_t_en/backprop.html
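
As a concrete illustration, here is one backpropagation step for an 8 × 8 × 8 interpretation network, assuming sigmoid units, a squared-error loss, and an arbitrarily chosen learning rate; none of these details are fixed on the slide.

```python
# Sketch of one backpropagation update for the 8-8-8 network (assumptions:
# sigmoid activations, squared-error loss, plain gradient descent).
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(8, 8))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(8, 8))   # hidden -> output weights

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(i, o, lr=0.5):
    """One update on a single (signal i, meaning o) pair of numpy arrays."""
    global W1, W2
    h = sigmoid(W1 @ i)                            # hidden activations
    o_hat = sigmoid(W2 @ h)                        # network output o'
    delta_out = (o - o_hat) * o_hat * (1 - o_hat)  # output error, scaled
    delta_hid = (W2.T @ delta_out) * h * (1 - h)   # error propagated back
    W2 += lr * np.outer(delta_out, h)              # adjust weights
    W1 += lr * np.outer(delta_hid, i)
```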

SLIDE 15


Obverter Strategy

  • the feed-forward net only defines an interpretation strategy
  • production is the best choice given the speaker's own interpretation:
  • suppose the teacher wants to express meaning o ∈ {0, 1}^8
  • she then chooses the signal ic ∈ {0, 1}^8 that triggers a network output o′ ∈ [0, 1]^8 with maximal confidence:

ic = arg max over i ∈ {0, 1}^8 of C(o|i),  with C(o|i) = ∏ k=1..8 C(ok | o′k)

C(ok | o′k) = o′k if ok = 1;  1 − o′k if ok = 0
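
A sketch of obverter production under these definitions: the speaker enumerates all 2^8 candidate signals, runs each through her own interpretation network (represented here by a hypothetical function interpret), and emits the most confident one.

```python
# Sketch of obverter production; `interpret` is an assumed helper standing
# for a forward pass through the speaker's own interpretation network.
import itertools
import numpy as np

def confidence(o, o_prime):
    """C(o|i) = product over k of (o'_k if o_k = 1 else 1 - o'_k);
    o and o_prime are numpy arrays of length 8."""
    return np.prod(np.where(o == 1, o_prime, 1.0 - o_prime))

def produce(o, interpret):
    """Return the signal ic maximizing C(o|i) over all i in {0,1}^8."""
    candidates = [np.array(bits) for bits in itertools.product((0, 1), repeat=8)]
    return max(candidates, key=lambda i: confidence(o, interpret(i)))
```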


SLIDE 16


Results (20 Training Items)

[Figure from Kirby and Hurford (2002). Dotted: difference between teacher and learner language; solid: proportion of the meaning space covered.]

SLIDE 17


Results (2000 Training Items)

[Figure from Kirby and Hurford (2002). Dotted: difference between teacher and learner language; solid: proportion of the meaning space covered.]

SLIDE 18


Results (50 Training Items)

[Figure from Kirby and Hurford (2002). Dotted: difference between teacher and learner language; solid: proportion of the meaning space covered.]

SLIDE 19


Compositionality

  • compositionality arises for medium-sized bottlenecks, e.g.:
  • o1 = 1 ↔ i3 = 0
  • o2 = 1 ↔ i5 = 0
  • o3 = 1 ↔ i6 = 0
  • o4 = 1 ↔ i1 = 0
  • o5 = 1 ↔ i4 = 1
  • o6 = 1 ↔ i8 = 1
  • o7 = 1 ↔ i2 = 0
  • o8 = 1 ↔ i7 = 1
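
The language is compositional in that each meaning bit is expressed by one fixed signal bit, sometimes with inverted polarity. Below is a sketch of a deterministic encoder for the mapping above; the slide only gives the ok = 1 cases, so treating ok = 0 as the flipped bit is an assumption.

```python
# Sketch of the emerged compositional code. The slide specifies which signal
# bit expresses ok = 1; we assume ok = 0 simply flips that bit.
import numpy as np

# (signal position, signal value expressing ok = 1), for k = 1..8
CODE = [(3, 0), (5, 0), (6, 0), (1, 0), (4, 1), (8, 1), (2, 0), (7, 1)]

def encode(o):
    """Deterministically produce the signal for a meaning o in {0,1}^8."""
    i = np.zeros(8, dtype=int)
    for k, (pos, val) in enumerate(CODE):
        i[pos - 1] = val if o[k] == 1 else 1 - val
    return i
```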


SLIDE 20


Summary: Compositionality from Iterated Learning

  • iterated learning creates compositionality . . .
  • if the bottleneck size is appropriate, given the size of the meaning and signal spaces
  • by generalizing over sparse training data
  • by informed innovation (where necessary)
  • other learning mechanisms are possible:
  • other kinds of neural networks (e.g. Smith et al., 2003)
  • finite state transducers (e.g. Brighton, 2002)


SLIDE 21


Evolution of Recursive Structure (Kirby and Hurford, 2002)

  • ILM mainly as before
  • state space: a small logical language with
  • individual constants C (“John”, “Mary”, . . . )
  • 2-place predicates P (“loves(·,·)”, . . . )
  • propositional attitude predicates Q (“thinks(·,·)”)
  • |C| = |P| = |Q| = 5
  • language: S → P(c, c) | Q(c, S) (enumerated in the sketch below)
  • signal space: finite strings over the alphabet Σ = {a, b, c, . . . , z}
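
Enumerating this meaning space is straightforward; here is a sketch up to a fixed embedding depth. The concrete constant and predicate names are placeholders; only the grammar and the counts (|C| = |P| = |Q| = 5) come from the slide.

```python
# Sketch of the meaning space S -> P(c, c) | Q(c, S); names are placeholders.
C = ["john", "mary", "bill", "sue", "pete"]            # individual constants
P = ["loves", "hates", "sees", "knows_of", "admires"]  # 2-place predicates
Q = ["thinks", "believes", "hopes", "fears", "says"]   # attitude predicates

def meanings(depth):
    """All meanings with at most `depth` levels of attitude embedding."""
    basic = [(p, c1, c2) for p in P for c1 in C for c2 in C]
    if depth == 0:
        return basic
    return basic + [(q, c, s) for q in Q for c in C for s in meanings(depth - 1)]

assert len(meanings(0)) == 5 * 5 * 5          # 125 basic meanings
assert len(meanings(1)) == 125 + 5 * 5 * 125  # each level multiplies by 25
```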


SLIDE 22


Representation of Production Competence (≈ Teacher Behavior)

  • definite clause grammar, e.g.:

S/P(c1, c2) → N/c1 V/P N/c2
V/love → g
N/John → ff
N/Mary → h
S/love(Mary, Mary) → lkjaa

  • informed innovation: if no rule is available for coding a given meaning, then . . .
  • choose the most similar meaning that you can express
  • make the smallest (“lowest”) change to the parse tree to express the new meaning


SLIDE 23


Grammar Induction (≈ Learner Behavior)

  • input: string-meaning pairs, e.g.: observe ⟨ffgh, love(John, Mary)⟩
  • add a holistic rule, e.g.: add the rule S/love(John, Mary) → ffgh
  • if possible, merge or chunk rules (examples on the next slides)


SLIDE 24


Chunk (example)

  • suppose we have the rules:

S/love(John, Mary) → abc
S/love(John, Sue) → abd

  • replace them by the least general rule that subsumes both:

S/love(John, c1) → ab N/c1

  • add appropriate rules for the invented category N (sketched below):

N/Mary → c
N/Sue → d
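
Here is a toy sketch of the chunk operation on exactly these two rules. It only handles the simple case shown (meanings differing in one argument, strings differing in their final character); the alignment used in the actual model is more general.

```python
# Toy chunking: factor the single point of difference between two holistic
# rules into a new non-terminal N. Handles only the slide's simple case.
def chunk(rule_a, rule_b):
    """A rule is (meaning, string), e.g. (('love', 'john', 'mary'), 'abc')."""
    (m_a, s_a), (m_b, s_b) = rule_a, rule_b
    diffs = [k for k in range(len(m_a)) if m_a[k] != m_b[k]]
    if len(diffs) != 1 or s_a[:-1] != s_b[:-1] or s_a[-1] == s_b[-1]:
        return None                      # not chunkable by this toy alignment
    k = diffs[0]
    general = (m_a[:k] + ("X",) + m_a[k + 1:], s_a[:-1] + "N")   # S-rule
    lexicon = [(m_a[k], s_a[-1]), (m_b[k], s_b[-1])]             # new N-rules
    return general, lexicon

print(chunk((("love", "john", "mary"), "abc"),
            (("love", "john", "sue"), "abd")))
# -> ((('love', 'john', 'X'), 'abN'), [('mary', 'c'), ('sue', 'd')])
```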


SLIDE 25


Merge (example)

  • suppose we have the rules:

N/Mary → g
M/Mary → g

  • then drop, e.g., the M-based rule
  • and replace the non-terminal M with N throughout (sketched below)
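
A toy sketch of merge over a flat list of lexical rules. In the full model, renaming M to N must also touch rules that mention M on their right-hand side; this sketch ignores that for brevity.

```python
# Toy merging: if two categories share a rule with identical meaning and
# string, drop the duplicate and rename its category throughout.
def merge(rules):
    """rules: list of (category, meaning, string) lexical rules."""
    for a, (cat_a, m_a, s_a) in enumerate(rules):
        for cat_b, m_b, s_b in rules[a + 1:]:
            if cat_a != cat_b and (m_a, s_a) == (m_b, s_b):
                merged = [(cat_a if c == cat_b else c, m, s)
                          for c, m, s in rules
                          if not (c == cat_b and (m, s) == (m_a, s_a))]
                return merge(merged)      # repeat until no merges remain
    return rules

print(merge([("N", "mary", "g"), ("M", "mary", "g"), ("M", "sue", "d")]))
# -> [('N', 'mary', 'g'), ('N', 'sue', 'd')]
```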


SLIDE 26


Results

[Figure from Kirby and Hurford (2002)]

SLIDE 27


Reflection

How good an account of emerging structure is the ILM?


SLIDE 28


Homework

  • read my project proposals: http://www.sfs.uni-tuebingen.de/~mfranke/MoLE2011/material/project_proposals.pdf
  • find out more about your favorites
  • if none of my proposals suits you, prepare a proposal yourself to present in class next week

Background Reading Material for this Session

  • Simon Kirby and James R. Hurford (2002). “The Emergence of Linguistic Structure: An Overview of the Iterated Learning Model”. In: Simulating the Evolution of Language. Ed. by A. Cangelosi and D. Parisi. Springer, pp. 121–148


SLIDE 29


Please Join Our Experiment!

  • 8 Euro for ca. 40 min
  • make an appointment: versucheb1@sfb833.uni-tuebingen.de

“Genau eine Schere ist mit keiner ihrer blauen Formen verbunden.” ('Exactly one pair of scissors is connected to none of its blue shapes.')

Wahr ('true') / Falsch ('false')


SLIDE 30

References

Brighton, Henry (2002). “Compositional Syntax from Cultural Transmission”. In: Artificial Life 8, pp. 25–54.

Chomsky, Noam (1965). Aspects of the Theory of Syntax. MIT Press.

Chomsky, Noam (1980). Rules and Representations. Columbia University Press.

Kirby, Simon (2007). “The Evolution of Language”. In: Oxford Handbook of Evolutionary Psychology. Ed. by Robin Dunbar and Louise Barrett. Oxford University Press, pp. 669–681.

Kirby, Simon and James R. Hurford (2002). “The Emergence of Linguistic Structure: An Overview of the Iterated Learning Model”. In: Simulating the Evolution of Language. Ed. by A. Cangelosi and D. Parisi. Springer, pp. 121–148.

Pinker, Steven and Paul Bloom (1990). “Natural Language and Natural Selection”. In: Behavioral and Brain Sciences 13.4, pp. 707–784.

Smith, Kenny et al. (2003). “Iterated Learning: A Framework for the Emergence of Language”. In: Artificial Life 9, pp. 371–386.