
Cognitive Compositional Semantics using Continuation Dependencies



  1. Cognitive Compositional Semantics using Continuation Dependencies
     William Schuler, Adam Wheeler
     Dept. of Linguistics, The Ohio State University
     August 25, 2014

  2. Introduction
     Goal: model how brains represent complex scoped quantified propositions
     ◮ Use only cued associations (dependencies from cue to target state) [Marr, 1971, Anderson et al., 1977, Murdock, 1982, McClelland et al., 1995, Howard and Kahana, 2002] (no direct implementation of unconstrained beta reduction)
     ◮ Interpret by traversing the cued associations in a sentence and matching them to memory (assume a learned traversal process, sensitive to upward/downward entailment)
     ◮ Despite this austerity, the model can handle scope using 'continuation' dependencies
     ◮ Seems to make reassuring predictions:
       ◮ conjunct matching is easy, even in the presence of quantifiers
       ◮ quantifier upward/downward entailment (monotone increasing/decreasing) is hard
       ◮ disjunction is as hard as quantifier upward/downward entailment
     ◮ Empirical evaluation shows no coverage or learnability gaps:
       ◮ the cognitively motivated model is about as accurate as the state of the art

  3. Background: why dependencies?
     Model connections in associative memory with a matrix [Anderson et al., 1977]:
       v = M u                                            (1)
       (M u)[i]  def=  Σ_{j=1}^{J} M[i,j] · u[j]           (1′)
     Build cued associations using the outer product [Marr, 1971]:
       M_t = M_{t−1} + v ⊗ u                              (2)
       (v ⊗ u)[i,j]  def=  v[i] · u[j]                     (2′)
     Merge the results of cued associations using the pointwise / diagonal product:
       w = diag(u) v                                      (3)
       (diag(v) u)[i]  def=  v[i] · u[i]                   (3′)
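The three operations above are easy to exercise directly. Below is a minimal NumPy sketch, assuming random high-dimensional vectors as cue and target patterns; the dimensionality and the vectors themselves are illustrative choices, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 256                                   # illustrative dimensionality

u = rng.standard_normal(D) / np.sqrt(D)   # cue pattern (assumed)
v = rng.standard_normal(D) / np.sqrt(D)   # target pattern (assumed)

# Eq. (2): store the cued association as an outer product (starting from M_0 = 0).
M = np.outer(v, u)

# Eq. (1): retrieve by matrix-vector product; M @ u = v * (u . u), which is ≈ v.
v_hat = M @ u

# Eq. (3): merge two patterns with the pointwise / diagonal product.
w = np.diag(u) @ v                        # same as the elementwise product u * v

print(np.corrcoef(v, v_hat)[0, 1])        # ≈ 1.0: the cue retrieves the target
```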

  4. Background: why dependencies?
     Dependency relations with label ℓ_i from u_i to v_i can be stored as vectors r_i:
       R    def=  Σ_i v_i ⊗ r_i          (4a)
       R′   def=  Σ_i r_i ⊗ ℓ_i          (4b)
       R′′  def=  Σ_i r_i ⊗ u_i          (4c)
     and retrieved/traversed using the accessor matrices R, R′, R′′ [Schuler, 2014]:
       v_i ≈ R diag(R′ ℓ_i) R′′ u_i      (5)
     This cue sequence can be simplified as a dependency function:
       v_i = (f_{ℓ_i} u_i)               (6)
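Continuing the sketch, eqs. (4a)-(5) can be exercised as follows. I use sparse nonnegative relation vectors r_i so that the diagonal (pointwise) product acts as a selector; that choice, like the dimensions and random patterns, is an assumption of this illustration rather than something stated on the slide.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 512

def pattern():
    # Random dense pattern; nearly orthogonal to other patterns in high dimensions.
    return rng.standard_normal(D) / np.sqrt(D)

def relation():
    # Sparse nonnegative relation vector (assumption), so r ⊙ r ≈ r.
    r = np.zeros(D)
    r[rng.choice(D, size=16, replace=False)] = 1.0
    return r

# Three dependencies u_i --ℓ_i--> v_i, each with its own relation vector r_i.
deps = [(pattern(), pattern(), pattern(), relation()) for _ in range(3)]

# Eqs. (4a-c): accessor matrices as sums of outer products.
R   = sum(np.outer(v, r) for u, l, v, r in deps)
Rp  = sum(np.outer(r, l) for u, l, v, r in deps)   # R′
Rpp = sum(np.outer(r, u) for u, l, v, r in deps)   # R′′

# Eq. (5): traverse dependency 0 from its source u_0 and label ℓ_0.
u0, l0, v0, _ = deps[0]
v_hat = R @ (np.diag(Rp @ l0) @ (Rpp @ u0))

print(np.corrcoef(v0, v_hat)[0, 1])                # near 1: v_0 is recovered
```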

  5. Background: predications and graph matching
     Dependencies can combine into predications [Copestake et al., 2005]:
       (f u v_1 v_2 v_3 …)  ⇔  (f_0 u) = v_f ∧ (f_1 u) = v_1 ∧ (f_2 u) = v_2 ∧ (f_3 u) = v_3 ∧ …   (7)
     For example:
       (Contain u v_1 v_2)  ⇔  (f_0 u) = v_Contain ∧ (f_1 u) = v_1 ∧ (f_2 u) = v_2                  (8)
     Dependencies are incrementally matched to memory during comprehension:
       v_t = R R′′ v_{t−1} + R diag(R′ R′⊤ R′′ v_{t−1}) R′′ A_{t−1}   (9a)
       A_t = A_{t−1} + v_{t−1} ⊗ v_t                                   (9b)
     (or in reverse, during production). Conditional traversal is needed for entailment [MacCartney and Manning, 2009].
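Equation (7) reads a predication as a bundle of labeled dependencies that share a source referent. Here is a purely symbolic sketch of that reading (no vectors); the helper name `predication` and the string labels are assumptions of this illustration.

```python
from typing import Hashable, List, Tuple

Dep = Tuple[Hashable, str, Hashable]   # (source referent, argument label, target)

def predication(pred: str, u: Hashable, *args: Hashable) -> List[Dep]:
    """Expand (pred u v_1 v_2 ...) into labeled dependencies, following eq. (7):
    the f0 dependency cues the predicate symbol, f_n cues the n-th argument."""
    deps: List[Dep] = [(u, "f0", pred)]
    deps += [(u, f"f{n}", v) for n, v in enumerate(args, start=1)]
    return deps

# Eq. (8): (Contain e_C d'_L d'_N) becomes three dependencies from e_C.
print(predication("Contain", "e_C", "d_L'", "d_N'"))
# [('e_C', 'f0', 'Contain'), ('e_C', 'f1', "d_L'"), ('e_C', 'f2', "d_N'")]
```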

  6. Scoped quantified predications: 'direct' style
     Can implement a 'direct' semantics based on the lambda calculus [Koller, 2004]:
       (Every p_L s_L s′_L) ∧ (Set s_L d_L e_L) ∧ (Line e_L d_L)
       ∧ (Set s′_L d′_L p_N) ∧ (Two p_N s_N s′_N) ∧ (Set s_N d_N e_N) ∧ (Number e_N d_N)
       ∧ (Set s′_N d′_N e_C) ∧ (Contain e_C d′_L d′_N)
     [Figure: direct-style dependency graphs for this conjunct (left) and for a larger conjoined representation also involving And, Space, and BeginsWith predications (right); nodes are the p, s, d, e referents and λ terms, edges are the numbered argument dependencies]
     Hard to learn to match the conjunct (left) in the conjoined representation (right).
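For concreteness, the conjunct above can be transcribed into dependency triples in the same symbolic style as the earlier sketch; the list literal below just transcribes the predications shown and is not from the authors' implementation.

```python
# Each entry is (predicate, source referent, argument referents).
predications = [
    ("Every",   "p_L",  ["s_L", "s_L'"]),
    ("Set",     "s_L",  ["d_L", "e_L"]),
    ("Line",    "e_L",  ["d_L"]),
    ("Set",     "s_L'", ["d_L'", "p_N"]),
    ("Two",     "p_N",  ["s_N", "s_N'"]),
    ("Set",     "s_N",  ["d_N", "e_N"]),
    ("Number",  "e_N",  ["d_N"]),
    ("Set",     "s_N'", ["d_N'", "e_C"]),
    ("Contain", "e_C",  ["d_L'", "d_N'"]),
]

# Flatten to labeled dependencies as in eq. (7).
deps = [(u, "f0", p) for p, u, _ in predications] + \
       [(u, f"f{n}", v) for p, u, args in predications
                        for n, v in enumerate(args, start=1)]

print(len(deps))   # 9 predicate cues + 16 argument cues = 25 dependencies
```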

  7. Scoped quantified predications: 'continuation' style
     Change the redundant dependency '2' at the lambdas to instead point up to the context:
     [Figure: continuation-style dependency graphs for the conjunct (left) and the conjoined representation (right), involving Every, Two, Some, Line, Space, BeginsWith, Number, and Contain predications; the lambdas' '2' dependencies now point upward to their enclosing context]
     The upward dependencies look like 'continuation-passing' style [Barker, 2002].

  8. Bestiary of referential states
     Set referents are now context-sensitive…
     ◮ ordinary discourse referents d ∈ D [Karttunen, 1976]:
       ◮ referents with no arguments
     ◮ eventualities e ∈ E [Davidson, 1967, Parsons, 1990]:
       ◮ referents with a beginning, an end, and a duration
       ◮ one argument for each participant, ordered arbitrarily
     ◮ reified sets or groups s ∈ S [Hobbs, 1985]:
       ◮ referents with cardinalities; can be co-referred to by plural anaphora
       ◮ have an iterator argument d_1
       ◮ have a scope argument s_2, similar to continuation parameters [Barker, 2002]
       ◮ have a superset argument s_3 specifying the superset
     ◮ propositions p ∈ P [Thomason, 1980]:
       ◮ referents that can be believed or doubted
       ◮ a form of generalized quantifier [Barwise and Cooper, 1981]
       ◮ have a restrictor argument s_1
       ◮ have a nuclear scope argument s_2
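Read as data structures, the bestiary might look like the following sketch; the class and field names are assumptions chosen to mirror the slide, not the authors' implementation.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Referent:
    name: str                                   # e.g. "d_L", "e_C", "s_N", "p_L"

@dataclass
class Discourse(Referent):                      # d ∈ D: no arguments
    pass

@dataclass
class Eventuality(Referent):                    # e ∈ E: beginning, end, duration
    participants: List[Referent] = field(default_factory=list)  # arbitrary order

@dataclass
class SetReferent(Referent):                    # s ∈ S: reified set or group
    iterator: Optional[Referent] = None         # d_1: iterator argument
    scope: Optional["SetReferent"] = None       # s_2: continuation-like scope
    superset: Optional["SetReferent"] = None    # s_3: superset argument

@dataclass
class Proposition(Referent):                    # p ∈ P: generalized-quantifier form
    restrictor: Optional[SetReferent] = None    # s_1: restrictor argument
    nuclear_scope: Optional[SetReferent] = None # s_2: nuclear scope argument
```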

  9. Translation to lambda calculus
     Lambda calculus terms ∆ can be derived from predications Γ:
     ◮ Initialize ∆ with lambda terms (sets) that have no outscoped sets in Γ:
         Γ, (Set s i) ; ∆   ⟹   Γ, (Set s i) ; (λ_i True), ∆        if (Set … s) ∉ Γ
     ◮ Add constraints to the appropriate sets in ∆:
         Γ, (f i_0 .. i .. i_N) ; (λ_i o), ∆   ⟹   Γ ; (λ_i o ∧ (h_f i_0 .. i .. i_N)), ∆        if i_0 ∈ E
     ◮ Add constraints of supersets as constraints on subsets in ∆:
         Γ, (Set s i), (Set s′ i′ s″ s) ; (λ_i o ∧ (h_f i_0 .. i .. i_N)), (λ_i′ o′), ∆
           ⟹   Γ, (Set s i), (Set s′ i′ s″ s) ; (λ_i o ∧ (h_f i_0 .. i .. i_N)), (λ_i′ o′ ∧ (h_f i_0 .. i′ .. i_N)), ∆
     ◮ Add quantifiers over completely constrained sets in ∆:
         Γ, (Set s i), (f p s′ s″), (Set s′ i′ s), (Set s″ i″ s′ s′) ; (λ_i o), (λ_i′ o′), (λ_i″ o″), ∆
           ⟹   Γ, (Set s i) ; (λ_i o ∧ (h_f (λ_i′ o′) (λ_i″ o″))), ∆        if p ∈ P, (f′ .. i′ ..) ∉ Γ, (f″ .. i″ ..) ∉ Γ
     For example:
       (Every (λ d_L Some (λ e_L BeingALine e_L d_L))
              (λ d′_L Two (λ d_N Some (λ e_N BeingANum e_N d_N))
                          (λ d′_N Some (λ e_H Having e_H d′_L d′_N))))
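To make the derived term concrete, here is a hand-coded evaluation of that reading ('every line has two numbers') over an invented toy model. The entities, the event relations, and the reading of Two as 'at least two' are all assumptions of this illustration, not content from the slides.

```python
entities = ["l1", "l2", "n1", "n2", "n3",
            "eL1", "eL2", "eN1", "eN2", "eN3",
            "eH1", "eH2", "eH3", "eH4"]

being_a_line = {("eL1", "l1"), ("eL2", "l2")}                 # (event, individual)
being_a_num  = {("eN1", "n1"), ("eN2", "n2"), ("eN3", "n3")}
having       = {("eH1", "l1", "n1"), ("eH2", "l1", "n2"),      # (event, haver, had)
                ("eH3", "l2", "n2"), ("eH4", "l2", "n3")}

def some(pred):                       # single-argument Some: existential closure
    return any(pred(x) for x in entities)

def every(restrictor, scope):
    return all(scope(x) for x in entities if restrictor(x))

def two(restrictor, scope):           # read here as "at least two" (assumption)
    return sum(1 for x in entities if restrictor(x) and scope(x)) >= 2

# (Every (λd_L Some (λe_L BeingALine e_L d_L))
#        (λd_L' Two (λd_N Some (λe_N BeingANum e_N d_N))
#                   (λd_N' Some (λe_H Having e_H d_L' d_N'))))
result = every(
    lambda d_L: some(lambda e_L: (e_L, d_L) in being_a_line),
    lambda d_Lp: two(
        lambda d_N: some(lambda e_N: (e_N, d_N) in being_a_num),
        lambda d_Np: some(lambda e_H: (e_H, d_Lp, d_Np) in having)))

print(result)   # True: in this toy model every line has two numbers
```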

  10. Predictions
      This model makes reassuring predictions (to be evaluated in future work)…
      ◮ Conjunct matching is easy, automatic, and learned early.
        Evidence: errors until about 21 months [Gertner and Fisher, 2012].
      ◮ Upward/downward entailment on the 1st/2nd argument is much harder:
          More than two perl scripts work. ⊢ More than two scripts work.
          Fewer than two scripts work. ⊢ Fewer than two perl scripts work.
        This is not simple matching; the speaker must learn conditional matching rules.
        Evidence: 'quantifier spreading' [Inhelder and Piaget, 1958, Philip, 1995]
        (children until ~10 yrs don't reliably constrain the restrictor with the noun, etc.).
      ◮ Disjunction is similarly difficult:
          Every line begins with at least 1 space or contains at least 2 dashes.
        It can be translated to a conjunction using de Morgan's law (checked in the sketch below):
          No line begins with less than 1 space and contains less than 2 dashes.
        This yields downward-entailing quantifiers, requiring conditional matching.
      ◮ Other phenomena? Evaluation shows no coverage/learnability gaps.
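A quick toy check of the de Morgan rewrite above: over randomly generated "documents" whose lines are described only by a count of leading spaces and a count of dashes, the disjunctive sentence and its rewritten conjunctive form always agree. The random model is purely an assumption of this sanity check.

```python
import random

random.seed(0)

def equivalent_once():
    # A random "document": each line is (leading_spaces, dashes).
    doc = [(random.randint(0, 2), random.randint(0, 3)) for _ in range(8)]
    # "Every line begins with at least 1 space or contains at least 2 dashes."
    disj = all(spaces >= 1 or dashes >= 2 for spaces, dashes in doc)
    # "No line begins with less than 1 space and contains less than 2 dashes."
    conj = not any(spaces < 1 and dashes < 2 for spaces, dashes in doc)
    return disj == conj

print(all(equivalent_once() for _ in range(1000)))   # True
```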

  11. Dependency graph composition: lexical items
      The semantics here extends the categorial grammar of [Nguyen et al., 2012]…
      Lexical items associate syntactic arguments with semantic arguments:
        x ⇒ u-φ_1…-φ_n : λ_i (f_0 i) = x
                           ∧ (f_0 (f_1 (f_1 (f_1 i)))) = x_0
                           ∧ (f_1 (f_1 (f_1 (f_1 i)))) = (f_1 (f_3 i))
                           ∧ … ∧ (f_n (f_1 (f_1 (f_1 i)))) = (f_1 (f_{2n+1} i))
      For example:
        with ⇒ A-aN-bN : λ_i (f_0 i) = x_with
                           ∧ (f_0 (f_1 (f_1 (f_1 i)))) = With
                           ∧ (f_1 (f_1 (f_1 (f_1 i)))) = (f_1 (f_3 i))
                           ∧ (f_2 (f_1 (f_1 (f_1 i)))) = (f_1 (f_5 i))
      [Figure: dependency graph of the 'with' sign i, with referents p, s, s′, s″, d′, d″, e, a With predication, and numbered arguments 1, 3, 5]
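The 'with' entry above can be written down symbolically as a list of path equations over dependency-function applications; the tuple encoding, the outermost-first path convention, and the entry name below are assumptions made for illustration only.

```python
# Each constraint is (path of labels applied to the sign i, value), with the path
# listed outermost-first: ("f0", "f1", "f1", "f1") stands for (f0 (f1 (f1 (f1 i)))).
# A value may itself be a path, expressing structure sharing with another argument.
with_entry = {
    "category": "A-aN-bN",
    "constraints": [
        (("f0",),                  "x_with"),
        (("f0", "f1", "f1", "f1"), "With"),
        (("f1", "f1", "f1", "f1"), ("f1", "f3")),   # shared with (f1 (f3 i))
        (("f2", "f1", "f1", "f1"), ("f1", "f5")),   # shared with (f1 (f5 i))
    ],
}

print(len(with_entry["constraints"]))   # the four constraints listed on the slide
```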
