


  1. Cross-Domain Scruffy Inference
Kenneth C. Arnold and Henry Lieberman
MIT Mind Machine Project; Software Agents Group, MIT Media Laboratory
AAAI Fall Symposium on Common Sense Knowledge (CSK 2010), November 2010

  2. Vision
Informal ("scruffy") inductive reasoning over non-formalized knowledge. Use multiple knowledge bases without tedious alignment.

  3. We've Done This...
ConceptNet and WordNet (Havasi et al. 2009); topics and opinions in text (Speer et al. 2010); code and descriptions of purpose (Arnold and Lieberman 2010). But how does it work?

  4. This Talk
Background. Blending is Collective Matrix Factorization. Singular vectors rotate. Other blending layouts work too.

  5. Background: Matrix Representations of Knowledge

  6. Background: Factored Inference
Filling in missing values is inference.

  7. Background: Factored Inference
Represent each concept i and each feature j by k-dimensional vectors c_i and f_j such that, whenever A(i, j) is known,

    A(i, j) ≈ c_i · f_j.

If A(i, j) is unknown, infer it as c_i · f_j. Equivalently, stack the c_i as rows of C and the f_j as rows of F; then A ≈ C F^T.
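As a concrete illustration (my own aside, not from the slides), here is a minimal numpy sketch of factored inference: factor a tiny concept-by-feature matrix at rank k with a truncated SVD, then read off a predicted value for an entry that was never observed. The matrix, concept names, and feature names are invented for the example.

```python
import numpy as np

# Toy concept-by-feature matrix; 0.0 stands in for "unknown" in this sketch.
concepts = ["dog", "cat", "car"]
features = ["is a pet", "has four legs", "has wheels"]
A = np.array([[1.0, 1.0, 0.0],   # dog
              [1.0, 0.0, 0.0],   # cat ("has four legs" left unobserved)
              [0.0, 0.0, 1.0]])  # car

# Rank-k factorization A ~= C F^T via truncated SVD.
k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
C = U[:, :k] * s[:k]   # one k-dimensional vector per concept (rows of C)
F = Vt[:k, :].T        # one k-dimensional vector per feature (rows of F)

# Every entry, observed or not, now has a value c_i . f_j.
A_hat = C @ F.T
print("inferred 'cat has four legs':", round(float(A_hat[1, 1]), 2))
```

In this toy case the unobserved entry comes out positive because "cat" and "dog" have similar rows; filling it in from c_cat · f_"has four legs" is the inference step.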

  8. Collective Factorization: Quantifying Factorization Quality
Quantify the "≈" in A ≈ C F^T as a divergence D(C F^T | A). Minimizing this loss ensures that the factorization fits the data. Many divergence functions are possible; for example, SVD minimizes squared error:

    D(Â | A) = Σ_ij (a_ij − â_ij)².
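A short numerical aside (mine, not the slides'): the squared-error divergence is just the sum of squared residuals, and the rank-k truncated SVD attains the smallest value of it among rank-k factorizations, so any other C F^T of the same rank does at least as badly. The random test matrix below is only for illustration.

```python
import numpy as np

def squared_error(A_hat, A):
    """D(A_hat | A) = sum_ij (a_ij - a_hat_ij)^2."""
    return float(np.sum((A - A_hat) ** 2))

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 30))
k = 5

# Rank-k reconstruction from the truncated SVD.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_svd = (U[:, :k] * s[:k]) @ Vt[:k, :]

# An arbitrary rank-k factorization C F^T has at least as much loss.
C = rng.standard_normal((20, k))
F = rng.standard_normal((30, k))
print(squared_error(A_svd, A), "<=", squared_error(C @ F.T, A))
```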

  9. Collective Factorization: Collective Matrix Factorization
An analogy: let people p rate restaurants r, represented by positive or negative values in a |p| × |r| matrix A. Restaurants also have characteristics c (e.g., "serves vegetarian food", "takes reservations"), represented by a matrix B. Incorporating the characteristics may improve rating prediction. Use the same restaurant vectors to factor both the preferences and the characteristics:

    A ≈ P R^T
    B ≈ R C^T

  10. Collective Factorization: Collective Matrix Factorization

    A ≈ P R^T        (A is person × restaurant)
    B ≈ R C^T        (B is restaurant × characteristic)

Collective Matrix Factorization (Singh and Gordon 2008) gives a framework for solving this type of problem: spread out the approximation loss,

    α D(P R^T | A) + (1 − α) D(R C^T | B).

At α = 1, it factors as if the characteristics were just patterns of ratings. At α = 0, it factors as if only the qualities, not the individual restaurants, mattered for the ratings.
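One simple way to optimize this weighted objective under squared-error loss is alternating least squares; the ALS choice, the toy data, and the variable sizes below are my own, not the slides'.

```python
import numpy as np

rng = np.random.default_rng(0)
p, r, c, k = 8, 10, 5, 3            # people, restaurants, characteristics, rank
A = rng.standard_normal((p, r))      # toy ratings: person x restaurant
B = rng.standard_normal((r, c))      # toy traits: restaurant x characteristic
alpha = 0.7

# Alternating least squares for  alpha*||A - P R^T||^2 + (1-alpha)*||B - R C^T||^2.
P, R, C = (rng.standard_normal((n, k)) for n in (p, r, c))
for _ in range(50):
    # With R fixed, P and C are independent least-squares problems.
    P = np.linalg.lstsq(R, A.T, rcond=None)[0].T      # R P^T ~= A^T
    C = np.linalg.lstsq(R, B, rcond=None)[0].T        # R C^T ~= B
    # With P and C fixed, the shared factor R is fit against a
    # sqrt-weighted stack of both datasets at once.
    M = np.vstack([np.sqrt(alpha) * P, np.sqrt(1 - alpha) * C])
    T = np.hstack([np.sqrt(alpha) * A.T, np.sqrt(1 - alpha) * B])
    R = np.linalg.lstsq(M, T.T, rcond=None)[0].T      # R M^T ~= T

loss = alpha * np.sum((A - P @ R.T) ** 2) + (1 - alpha) * np.sum((B - R @ C.T) ** 2)
print("weighted loss:", round(float(loss), 3))
```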

  11. Collective Factorization: Blending is a CMF

    A ≈ P R^T        (A is person × restaurant)
    B ≈ R C^T        (B is restaurant × characteristic)

The same problem can also be solved with Blending: stack the two matrices along the shared restaurant dimension,

    Z = [ α A^T   (1 − α) B ]  ≈  R [ P ; C ]^T,

where [ P ; C ] stacks P on top of C. If the decomposition is an SVD, the loss is separable by component:

    D(R [ P ; C ]^T | Z) = D(R P^T | α A^T) + D(R C^T | (1 − α) B),

so Blending is a kind of Collective Matrix Factorization.
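A quick numerical check of this equivalence, with made-up matrices and squared-error loss (sizes and seed are arbitrary): factor the stacked blend with one SVD, split the right factors back into P and C, and confirm that the loss on Z is the sum of the per-dataset terms.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 10))    # person x restaurant
B = rng.standard_normal((10, 5))    # restaurant x characteristic
alpha, k = 0.6, 3

# Blend: concatenate along the shared restaurant dimension.
Z = np.hstack([alpha * A.T, (1 - alpha) * B])      # 10 x (8 + 5)

U, s, Vt = np.linalg.svd(Z, full_matrices=False)
R = U[:, :k] * s[:k]            # shared restaurant factors
V = Vt[:k, :].T                 # the stacked [P; C]
P, C = V[:8], V[8:]             # split back into the per-dataset factors

# Squared-error loss on Z separates into one term per source matrix.
total = np.sum((Z - R @ V.T) ** 2)
parts = (np.sum((alpha * A.T - R @ P.T) ** 2)
         + np.sum(((1 - alpha) * B - R @ C.T) ** 2))
print(bool(np.isclose(total, parts)))   # True: blending is a collective factorization
```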

  12. Blended Data Rotates the Factorization: Veering

  13. Blended Data Rotates the Factorization
What happens at an intersection point? Suppose you are blending X and Y. Start with X ≈ A B^T; what happens as you add in Y? First, add in the new space that only Y covered. Now the data is off-axis, so rotate the axes to align with the data.

[Figure: the factorization axes shown at (a = 1, θ = 0), (a = 1, θ = 0.25π), (a = 1, θ = 0.5π), and (a = 0, θ = 0.5π).]

  14. Blended Data Rotates the Factorization: Veering
"Veering" is caused by singular vectors of the blend rotating between corresponding singular vectors of the source matrices.

[Figure: singular values of the blend plotted against α, annotated with rotation angles θ_U = 1.1, θ_U = 0.26, and θ_U = 1.3.]
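The sketch below (toy matrices, my own construction, not the data behind the slide's figure) reproduces the flavor of that plot: sweep α, track the top singular values of the blend, and measure the angle θ_U between the top left singular vectors at successive values of α. The angle tends to be largest where singular values approach each other, which is the veering the slide describes.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((10, 6))    # two source matrices sharing their rows (e.g. concepts)
Y = rng.standard_normal((10, 8))

prev_u = None
for alpha in np.linspace(0.0, 1.0, 11):
    blend = np.hstack([alpha * X, (1 - alpha) * Y])
    U, s, _ = np.linalg.svd(blend, full_matrices=False)
    u = U[:, 0]                      # top left singular vector of the blend
    if prev_u is not None:
        # Angle between successive top singular vectors (sign-invariant).
        theta = float(np.arccos(min(1.0, abs(float(prev_u @ u)))))
        print(f"alpha={alpha:.1f}  top singular values={s[:2].round(2)}  theta_U={theta:.2f}")
    prev_u = u
```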

  15. Layout Tricks: Bridge Blending
A general bridge blend factors a 2 × 2 block matrix with one empty corner:

    [ X  Y ]  ≈  [ U_XY ] [ V_X0  V_YZ ]^T  =  [ U_XY V_X0^T  U_XY V_YZ^T ]
    [ 0  Z ]     [ U_0Z ]                      [ U_0Z V_X0^T  U_0Z V_YZ^T ]

In the example, X is English ConceptNet (English concepts × English features), Y is an English ↔ French dictionary, and Z is French ConceptNet, transposed so that it shares its columns (French concepts) with Y.

Again, the loss factors:

    D(Â | A) = D(U_XY V_X0^T | X) + D(U_XY V_YZ^T | Y) + D(U_0Z V_X0^T | 0) + D(U_0Z V_YZ^T | Z).

V_YZ ties the factorizations of X and Z together through the bridge data Y. A weighted loss could be used in the empty corner.
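A sketch of this layout with random stand-ins for the three datasets (all sizes, names, and data are invented; squared-error loss via a plain SVD): build the block matrix with a zero corner, factor it once, and split the factors back into U_XY, U_0Z, V_X0, and V_YZ.

```python
import numpy as np

rng = np.random.default_rng(3)
ne, fe = 12, 7     # number of English concepts, English features (toy sizes)
nf, ff = 9, 6      # number of French concepts, French features
X = rng.standard_normal((ne, fe))   # stand-in for English ConceptNet
Y = rng.standard_normal((ne, nf))   # stand-in for the En <-> Fr dictionary bridge
Z = rng.standard_normal((ff, nf))   # stand-in for French ConceptNet, transposed

# Bridge blend: [[X, Y], [0, Z]] with an empty lower-left corner.
blend = np.vstack([np.hstack([X, Y]),
                   np.hstack([np.zeros((ff, fe)), Z])])

k = 4
U, s, Vt = np.linalg.svd(blend, full_matrices=False)
U_k = U[:, :k] * s[:k]
V_k = Vt[:k, :].T
U_XY, U_0Z = U_k[:ne], U_k[ne:]     # row factors: English concepts, French features
V_X0, V_YZ = V_k[:fe], V_k[fe:]     # column factors: English features, French concepts

# V_YZ is shared by the Y and Z blocks, so it ties the factorizations of X and Z
# together through the bridge data Y: everything lands in one k-dimensional space.
print(U_XY.shape, U_0Z.shape, V_X0.shape, V_YZ.shape)
```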

  16. Summary
Blending is a Collective Matrix Factorization. "Veering" indicates singular vectors rotating between datasets.

What's next? CMF permits many objective functions, even different ones for different input data; what's appropriate for commonsense inference? Incremental? Can CMF do things we thought we needed 3rd-order for?
