
Models of Language Evolution: Iterated Learning (Michael Franke)



  1. Models of Language Evolution: Iterated Learning. Michael Franke

  2. Outline: Facets of EvoLang, Compositionality, Iterated Learning

  3. Outline, repeated as a section divider (next: Compositionality)

  4. Compositional Semantics: the meaning of a complex utterance depends systematically on the meanings of its parts and on the way they are combined. (1) a. John likes Mary. b. John abhors Mary. c. Mary likes John.

  5. Outline, repeated as a section divider (next: Iterated Learning)

  6. Iterated Learning — Main Idea
     • language learners have some domain-general learning capability, including a (modest) capacity to generalize and extract patterns
     • competent speakers have learned from learners ... who have learned from learners ... who have learned from learners ...
     ⇒ iterated learning can create structure that wasn't there before, given a capability for generalization and an appropriately sized "learning bottleneck"

  7. Evolution of Compositionality
     • 1 learner, 1 teacher
     • the teacher produces n state-signal pairs
     • the learner acquires a language based on these
     • (iterate:) the learner becomes the teacher for a new learner
     • learning model: feed-forward neural network, trained by backpropagation (supervised learning)
     • production strategy: "obversion", i.e. production is optimized relative to the speaker's own comprehension
     (Kirby and Hurford, 2002)
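
To make the loop concrete, here is a minimal Python sketch of the transmission chain. The names `fresh_network`, `train_network`, and `obverter_produce` are placeholders, filled in by the sketches on the next three slides; the bottleneck size of 50 is just one of the values explored in the results below.

```python
import random
from itertools import product

MEANINGS = list(product([0, 1], repeat=8))  # the full meaning space {0,1}^8
BOTTLENECK = 50                             # state-signal pairs per generation

def iterated_learning(n_generations, seed=0):
    """Chain of teacher-learner transmissions, schematically."""
    random.seed(seed)
    teacher = fresh_network()  # arbitrary initial language
    for _ in range(n_generations):
        # the teacher produces signals for a random subset of meanings
        shown = random.sample(MEANINGS, BOTTLENECK)
        data = [(obverter_produce(teacher, o), o) for o in shown]
        # a fresh learner acquires its language from these pairs alone,
        # then becomes the teacher of the next generation
        teacher = train_network(fresh_network(), data)
    return teacher
```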

  8. Learning Model: Feed-Forward Neural Network
     • an 8 × 8 × 8 network for interpretation
     • input: a signal $i = \langle i_1, \dots, i_8 \rangle \in \{0,1\}^8$
     • output: a meaning $o = \langle o_1, \dots, o_8 \rangle \in \{0,1\}^8$
     • initially arbitrary weights
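
A minimal sketch of such an interpretation network. The slide only fixes the 8-8-8 layout; one hidden layer of sigmoid units with biases and the weight initialization are assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fresh_network(seed=None):
    """An 8 x 8 x 8 interpretation network with initially arbitrary weights."""
    rng = np.random.default_rng(seed)
    return {"W1": rng.normal(0.0, 0.5, (8, 8)), "b1": np.zeros(8),
            "W2": rng.normal(0.0, 0.5, (8, 8)), "b2": np.zeros(8)}

def interpret(net, signal):
    """Map a signal i in {0,1}^8 to a graded output o' in [0,1]^8."""
    h = sigmoid(np.asarray(signal, dtype=float) @ net["W1"] + net["b1"])
    return sigmoid(h @ net["W2"] + net["b2"])
```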

  9. Backpropagation
     • training items $\langle i, o \rangle$ are presented
     • the network computes its output $o'$ for the given $i$
     • the error $\delta = o - o'$ is propagated back through all layers
     • weights are adjusted accordingly
     (picture from http://galaxy.agh.edu.pl/~vlsi/AI/backp_t_en/backprop.html)
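
A sketch of the corresponding training routine, reusing `sigmoid` and the network layout from the previous sketch: plain per-item gradient descent on a squared-error loss. The epoch count and learning rate are assumptions, not taken from the slide.

```python
import numpy as np

def train_network(net, data, epochs=100, lr=0.5):
    """Supervised training on (signal, meaning) pairs by backpropagation."""
    for _ in range(epochs):
        for signal, meaning in data:
            i = np.asarray(signal, dtype=float)
            o = np.asarray(meaning, dtype=float)
            # forward pass, keeping activations for the backward pass
            h = sigmoid(i @ net["W1"] + net["b1"])
            o_prime = sigmoid(h @ net["W2"] + net["b2"])
            # the error o - o' is propagated back through both layers
            delta_out = (o - o_prime) * o_prime * (1 - o_prime)
            delta_hid = (delta_out @ net["W2"].T) * h * (1 - h)
            # weights are adjusted accordingly
            net["W2"] += lr * np.outer(h, delta_out)
            net["b2"] += lr * delta_out
            net["W1"] += lr * np.outer(i, delta_hid)
            net["b1"] += lr * delta_hid
    return net
```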

  10. Obverter Strategy
     • the feed-forward net only defines an interpretation strategy
     • production is the best choice given the speaker's own interpretation: suppose the teacher wants to express meaning $o \in \{0,1\}^8$; she then chooses the signal $i_c \in \{0,1\}^8$ that triggers network output $o' \in [0,1]^8$ with maximal confidence:
       $$i_c = \arg\max_{i \in \{0,1\}^8} C(o \mid i)$$
       defined as
       $$C(o \mid i) = \prod_{k=1}^{8} C(o_k \mid o'_k), \qquad C(o_k \mid o'_k) = \begin{cases} o'_k & \text{if } o_k = 1 \\ 1 - o'_k & \text{if } o_k = 0 \end{cases}$$
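
The signal space is small enough to search exhaustively over all 2^8 = 256 candidates, so the obverter choice can be implemented directly from the definitions above (reusing `interpret` from the network sketch):

```python
from itertools import product
import numpy as np

ALL_SIGNALS = list(product([0, 1], repeat=8))  # all 2^8 = 256 candidate signals

def confidence(o, o_prime):
    """C(o | i): product over bits of o'_k if o_k = 1, else 1 - o'_k."""
    o = np.asarray(o)
    return float(np.prod(np.where(o == 1, o_prime, 1.0 - o_prime)))

def obverter_produce(net, meaning):
    """Choose the signal the speaker herself would interpret as the
    intended meaning with maximal confidence."""
    return max(ALL_SIGNALS, key=lambda i: confidence(meaning, interpret(net, i)))
```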

  11. Results (20 training items). [Figure from Kirby and Hurford (2002); dotted line: difference between teacher and learner language; solid line: proportion of meaning space covered.]

  12. Results (2000 training items). [Figure from Kirby and Hurford (2002); dotted line: difference between teacher and learner language; solid line: proportion of meaning space covered.]

  13. Results (50 training items). [Figure from Kirby and Hurford (2002); dotted line: difference between teacher and learner language; solid line: proportion of meaning space covered.]

  14. Compositionality
     • compositionality arises for medium-sized bottlenecks, e.g.:
       $o_1 = 1 \leftrightarrow i_3 = 0$ \quad $o_2 = 1 \leftrightarrow i_5 = 0$
       $o_3 = 1 \leftrightarrow i_6 = 0$ \quad $o_4 = 1 \leftrightarrow i_1 = 0$
       $o_5 = 1 \leftrightarrow i_4 = 1$ \quad $o_6 = 1 \leftrightarrow i_8 = 1$
       $o_7 = 1 \leftrightarrow i_2 = 0$ \quad $o_8 = 1 \leftrightarrow i_7 = 1$
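
Read as code, this table says the emergent language is a bitwise cipher: each meaning bit is determined by exactly one signal bit, possibly negated. A small sketch, transcribing the table directly:

```python
# The emergent code from the slide: each meaning bit o_k is determined by
# one signal bit, given as (position, value of that position making o_k = 1).
CODE = {1: (3, 0), 2: (5, 0), 3: (6, 0), 4: (1, 0),
        5: (4, 1), 6: (8, 1), 7: (2, 0), 8: (7, 1)}

def decode(signal):
    """Interpret a signal under the emergent compositional code."""
    return tuple(1 if signal[CODE[k][0] - 1] == CODE[k][1] else 0
                 for k in range(1, 9))

def encode(meaning):
    """The inverse: the unique signal expressing a given meaning (the eight
    positions used by CODE are a permutation of 1..8, so this is well-defined)."""
    signal = [0] * 8
    for k, (pos, val) in CODE.items():
        signal[pos - 1] = val if meaning[k - 1] == 1 else 1 - val
    return tuple(signal)
```

Because every meaning bit is read off a single signal bit, a learner who has seen only a fraction of the 256 state-signal pairs can generalize to all the rest; this is exactly the kind of structure the medium-sized bottleneck rewards.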

  15. Summary
     • iterated learning "creates" compositional meaning ...
       • if the bottleneck size is appropriate
       • by generalizing over sparse training data
       • by informed innovation (where necessary)
     • other learning mechanisms are possible:
       • other kinds of neural networks (e.g. Smith et al., 2003)
       • finite-state transducers (e.g. Brighton, 2002)

  16. Homework. Solve the mock exam and prepare questions for the midterm exam.

  17. References
     Brighton, Henry (2002). "Compositional Syntax from Cultural Transmission". In: Artificial Life 8, pp. 25–54.
     Kirby, Simon (2007). "The Evolution of Language". In: Oxford Handbook of Evolutionary Psychology. Ed. by Robin Dunbar and Louise Barrett. Oxford University Press, pp. 669–681.
     Kirby, Simon, Tom Griffiths, et al. (2014). "Iterated Learning and the Evolution of Language". In: Current Opinion in Neurobiology 28, pp. 108–114.
     Kirby, Simon and James R. Hurford (2002). "The Emergence of Linguistic Structure: An Overview of the Iterated Learning Model". In: Simulating the Evolution of Language. Ed. by A. Cangelosi and D. Parisi. Springer, pp. 121–148.
     Smith, Kenny et al. (2003). "Iterated Learning: A Framework for the Emergence of Language". In: Artificial Life 9, pp. 371–386.
