  1. Grammatical inference and subregular phonology Adam Jardine Rutgers University December 11, 2019 · Tel Aviv University

  2. Review

  3. Outline of course
  • Day 1: Learning, languages, and grammars
  • Day 2: Learning strictly local grammars
  • Day 3: Automata and input strictly local functions
  • Day 4: Learning functions and stochastic patterns, other open questions

  4. Review of days 1 & 2
  • Phonological patterns are governed by restrictive computational universals
  • We studied one such universal, strict locality

  5. Review of days 1 & 2
  • We studied learning SL languages under the paradigm of identification in the limit from positive data
  (Figure: a presentation p of the target language L⋆, with p(0) = abab, p(1) = ababab, p(2) = ab, ...; a learner A maps each finite prefix p[i] to a grammar G_i.)

  6. Today
  • Learning with finite-state automata for
  – strictly local languages
  – input strictly local functions

  7. Strictly local acceptors

  8. Strictly local acceptors
  Engelfriet & Hoogeboom (2001): “It is always a pleasant surprise when two formalisms, introduced with different motivations, turn out to be equally powerful, as this indicates that the underlying concept is a natural one.” (p. 216)

  9. Strictly local acceptors
  • A finite-state acceptor (FSA) is a set of states and transitions between states
  (Figure: a two-state FSA with states 0 and 1; on a the machine moves between the states, on b it stays put.)

  10-15. Strictly local acceptors
  Running the FSA on abba: 0 → 1 → 1 → 1 → 0. The run ends in the accepting state 0, so abba is accepted ✓.

  16-23. Strictly local acceptors
  Running the same FSA on baabba: 0 → 0 → 1 → 0 → 0 → 0 → 1. The run ends in the non-accepting state 1, so baabba is rejected ✗.
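  As a minimal sketch, the acceptor can be encoded as a transition table in Python; the table below is reconstructed from the two runs above (a toggles between states 0 and 1, b loops, and only state 0 accepts):

      # Deterministic FSA: (state, symbol) -> next state.
      DELTA = {
          (0, "a"): 1, (0, "b"): 0,
          (1, "a"): 0, (1, "b"): 1,
      }
      START, ACCEPTING = 0, {0}

      def accepts(word: str) -> bool:
          """Follow transitions and check whether the final state accepts."""
          state = START
          for symbol in word:
              state = DELTA[(state, symbol)]
          return state in ACCEPTING

      print(accepts("abba"))    # True:  0 -> 1 -> 1 -> 1 -> 0
      print(accepts("baabba"))  # False: 0 -> 0 -> 1 -> 0 -> 0 -> 0 -> 1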

  24. Strictly local acceptors
  • An SL_k FSA's states represent the (k − 1)-factors of Σ*
  (Figure: two FSAs side by side. The a-toggling machine from above is not SL_k for any k; a second machine, whose states record the last symbol read, 0 = b and 1 = a, is SL_2.)
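  To make this concrete, here is a sketch of building the canonical SL_2 acceptor whose states record the previous symbol; the constructor and the *ab grammar in the example are illustrative, not taken from the slides:

      from itertools import product

      def sl2_fsa(alphabet, forbidden):
          """States are the previous symbol (or the boundary marker ⋊);
          a transition exists only if the 2-factor it spells out is allowed."""
          start = "⋊"
          states = [start] + list(alphabet)
          delta = {(prev, nxt): nxt
                   for prev, nxt in product(states, alphabet)
                   if (prev, nxt) not in forbidden}
          # a state accepts unless ending there forms a forbidden factor with ⋉
          accepting = {q for q in states if (q, "⋉") not in forbidden}
          return start, delta, accepting

      def run(word, fsa):
          start, delta, accepting = fsa
          state = start
          for sym in word:
              if (state, sym) not in delta:  # forbidden factor: missing transition
                  return False
              state = delta[(state, sym)]
          return state in accepting

      fsa = sl2_fsa("ab", {("a", "b")})  # forbid the 2-factor ab
      print(run("bba", fsa))  # True
      print(run("bab", fsa))  # False: contains the factor ab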

  25-30. Strictly local acceptors
  • Traversing an SL_k FSA is equivalent to scanning for k-factors
  (Figure: a width-2 window moving left to right across ⋊abab⋉, checking each 2-factor in turn.)
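  The scanning view can be sketched directly: pad the word with boundary markers and check every width-k window. Using bb as the forbidden factor is an illustrative choice (it matches the rejected ⋊abb⋉ on the next slide):

      def sl_scan(word: str, k: int, forbidden: set) -> bool:
          """Accept iff no forbidden k-factor occurs in ⋊^(k-1) w ⋉^(k-1)."""
          padded = "⋊" * (k - 1) + word + "⋉" * (k - 1)
          factors = {padded[i:i + k] for i in range(len(padded) - k + 1)}
          return factors.isdisjoint(forbidden)

      print(sl_scan("abab", 2, {"bb"}))  # True:  factors ⋊a, ab, ba, ab, b⋉
      print(sl_scan("abb", 2, {"bb"}))   # False: contains the forbidden bb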

  31. Strictly local acceptors
  • Forbidden k-factors are expressed by missing transitions/accepting states
  (Figure: scanning ⋊abb⋉ reaches a forbidden 2-factor and rejects ✗.)

  32. Strictly local acceptors
  • SL FSAs describe exactly the SL languages
  • Thus, they capture the same concept of locality as SL grammars, but in a different way

  33. Learning with strictly local acceptors

  34. Learning with strictly local acceptors
  • Finite-state automata are useful because a number of learning techniques exist for them (de la Higuera, 2010)
  • We'll use a ‘transition-filling’ technique from Heinz and Rogers (2013)

  35. Learning with strictly local acceptors
  (Figure: a two-state FSA over {a, b}, states 0 and 1.)

  36. Learning with strictly local acceptors
  (Figure: the FSA above, with each transition and state annotated ⊤ or ⊥.)
  • output function Q × Σ → {⊤, ⊥}
  • ending function Q → {⊤, ⊥}

  37. Learning with strictly local acceptors
  (Figure: the ‘empty’ SL_2 FSA over {C, V}, with every transition and ending marked ⊥.)
  Learning procedure:
  • Start with the ‘empty’ SL_k FSA
  • Change ⊥ transitions to ⊤ when traversed by input data

  38-42. Learning with strictly local acceptors
  Data: 0 CV, 1 V, 2 CVCV.
  (Figure: the same SL_2 FSA, with the transitions and endings traversed by each datum flipped from ⊥ to ⊤ in turn: CV licenses ⋊C, CV, and ending at V; V licenses ⋊V; CVCV additionally licenses VC.)
  Learning procedure:
  • Start with the ‘empty’ SL_k FSA
  • Change ⊥ transitions to ⊤ when traversed by input data
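  A sketch of the transition-filling learner for k = 2, following the procedure above; since an SL_2 machine's state is just the last symbol read (with ⋊ at the start), traversal is easy to simulate:

      def learn_sl2(sample, alphabet):
          """Start with every transition and ending at ⊥ (False); flip to
          ⊤ (True) whatever the positive data traverses."""
          start = "⋊"
          states = [start] + list(alphabet)
          trans = {(q, a): False for q in states for a in alphabet}
          ending = {q: False for q in states}
          for word in sample:
              state = start
              for sym in word:
                  trans[(state, sym)] = True  # this 2-factor is attested
                  state = sym                 # SL-2 state = last symbol read
              ending[state] = True            # a word may end in this state
          return trans, ending

      def accepts(word, grammar):
          trans, ending = grammar
          state = "⋊"
          for sym in word:
              if not trans.get((state, sym), False):
                  return False
              state = sym
          return ending[state]

      g = learn_sl2(["CV", "V", "CVCV"], "CV")
      print(accepts("CVCVCV", g))  # True: every 2-factor is attested
      print(accepts("CVC", g))     # False: no datum ends in C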

  43-45. Learning with strictly local acceptors
  • Any SL_2 language can be described by varying {⊤, ⊥} on this structure
  (Figure: the SL_2 skeleton over {C, V} shown with several different ⊤/⊥ annotations.)

  46. Learning with strictly local acceptors
  • Any SL_3 language can be described by this structure
  (Figure: the SL_3 skeleton over {C, V}, whose states are the 2-factors CC, CV, VC, VV plus the boundary state ⋊, with all transitions marked ⊥.)

  47. Learning with strictly local acceptors
  • This procedure identifies any SL_k language, for a given k, in the limit from positive data
  • It is distinct from the SL grammar learner of day 2, yet based on the same notion of locality

  48. Input strictly local functions

  49. Input strictly local functions
  • Generative phonology is primarily interested in maps, e.g. /kam-pa/ → [kamba], whether stated as a rule, C → [+voi] / N __, or as constraint interaction:

      /kam-pa/     Faith   *NC̥   ID(voi)
       [kampa]              *!
       [kama]       *!
      ☞[kamba]                      *

  50. Input strictly local functions
  • Maps are (functional) relations. /NC̥/ → [NC]:
  { (an, an), (anda, anda), (anta, anda), (lalalalampa, lalalalamba), ... }
  • We can study classes of relations like we studied classes of formal languages
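  As a sketch, the post-nasal voicing map can be written directly as a string function; the segment inventory here (nasals m, n; p, t, k voicing to b, d, g) is an illustrative assumption:

      VOICE = {"p": "b", "t": "d", "k": "g"}   # voiceless -> voiced (assumed inventory)
      NASALS = {"m", "n"}

      def postnasal_voicing(word: str) -> str:
          """Voice a voiceless obstruent whenever it follows a nasal."""
          out = []
          for i, seg in enumerate(word):
              if seg in VOICE and i > 0 and word[i - 1] in NASALS:
                  out.append(VOICE[seg])
              else:
                  out.append(seg)
          return "".join(out)

      for w in ["an", "anda", "anta", "lalalalampa"]:
          print(w, "->", postnasal_voicing(w))
      # an -> an, anda -> anda, anta -> anda, lalalalampa -> lalalalamba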

  51. Input strictly local functions
  • Johnson (1972); Kaplan and Kay (1994): phonological maps are regular
  (Figure: two plots of memory against the length of the input w; memory stays bounded for a regular map but grows with w for a non-regular one.)
  • Regular functions ≠ regular languages!

  52. Input strictly local functions
  (Figure: the class Reg nested inside the computable functions.)
  • How do we extend strict locality to functions?

  53. Input strictly local functions
  (Figure: Subseq nested inside Reg, inside the computable functions.)
  • How do we extend strict locality to functions?
  • Phonological maps are subsequential... (Mohri, 1997; Heinz and Lai, 2013, et seq.)

  54. Subsequential transducers
  (Figure: the ⊤/⊥-annotated FSA from before.)
  Deterministic acceptor:
  • output function Q × Σ → {⊤, ⊥}
  • ending function Q → {⊤, ⊥}

  55. Subsequential transducers
  (Figure: a two-state transducer with transitions a:a, b:b, a:a, b:c and ending outputs 0:d, 1:λ.)
  Subsequential transducer:
  • output function Q × Σ → Γ*
  • ending function Q → Γ*

  56-61. Subsequential transducers
  Running the transducer on babb: the state sequence is 0 → 0 → 1 → 0 → 0, emitting b, a, c, b along the way; reaching the end of the input at state 0 appends its ending output d, so the output is bacbd.
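  A sketch of this transducer as code; the (1, a) transition is not exercised by the run above, so its target and output are an assumption read off the slide's edge labels:

      DELTA = {                 # (state, symbol) -> (next state, output string)
          (0, "a"): (1, "a"),
          (0, "b"): (0, "b"),
          (1, "a"): (0, "a"),   # assumed: not used in the run above
          (1, "b"): (0, "c"),
      }
      ENDING = {0: "d", 1: ""}  # ending function Q -> Γ*; λ = empty string

      def transduce(word: str, start: int = 0) -> str:
          """Read the input deterministically, concatenating transition
          outputs, then append the final state's ending output."""
          state, out = start, []
          for sym in word:
              state, emitted = DELTA[(state, sym)]
              out.append(emitted)
          return "".join(out) + ENDING[state]

      print(transduce("babb"))  # bacbd: b, a, c, b, then ending d at state 0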

  62. Subsequential transducers
  Let's do some examples...
