Tensor-Based Abduction in Horn Propositional Programs - Yaniv Aspis - PowerPoint presentation


SLIDE 1

Tensor-Based Abduction in Horn Propositional Programs

Yaniv Aspis, Krysia Broda, Alessandra Russo

September 2018

Department of Computing, Imperial College London

{yaniv.aspis17,k.broda,a.russo}@imperial.ac.uk

SLIDE 2

Why Linear Algebra?

  • AI software is moving towards GPU-based solutions
  • GPUs are optimised for matrix/tensor multiplication
  • Highly parallel computations
  • Goal: develop logical inference algorithms for the GPU
SLIDE 3

Abduction

  • Logical inference through explanations

⟨Q, h, Ab⟩

  • Assume Q is a Horn logic program
  • Observation h
  • Find Δ ⊆ Ab such that h ∈ MIN(Q ∪ Δ)
  • Q ∪ Δ is consistent
SLIDE 4

Background: Embedding Atoms[1]

𝑕 ← π‘ž ∧ π‘Ÿ π‘ž ← π‘Ÿ π‘Ÿ ← ← 𝑕

𝑀 𝑕 = 1 𝑀 π‘ž = 1 𝑀 π‘Ÿ = 1 𝑀 βŠ₯ = 1 𝑀 ⊀ = 1 𝑀 𝑕,π‘Ÿ = 1 1 1

  • [1] Sakama, C., Inoue, K., Sato, T.: Linear Algebraic Characterization of Logic Programs. In: KSEM 2017. Lecture Notes in Computer Science, vol. 10412, pp. 520-533. Springer (2017)
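The embedding above can be sketched in a few lines of NumPy. The atom ordering (⊥, ⊤, h, q, r) and every name here (`embed`, `bot`, `top`) are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Assumed index order for the example program: bot (⊥), top (⊤), h, q, r.
atoms = ["bot", "top", "h", "q", "r"]
index = {a: i for i, a in enumerate(atoms)}

def embed(interpretation):
    """Map a set of atoms to its 0/1 vector: v[i] = 1 iff atom i is in the set."""
    v = np.zeros(len(atoms))
    for a in interpretation:
        v[index[a]] = 1.0
    return v

v_h = embed({"h"})         # one-hot vector for the atom h
v_hr = embed({"h", "r"})   # interpretation {h, r}: 1s at h and r
```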

SLIDE 5

Background: Embedding Programs[1]

𝑕 ← π‘ž ∧ π‘Ÿ π‘ž ← π‘Ÿ π‘Ÿ ← ← 𝑕 𝐸𝑄 = 1 1 1 1 1 1 1 1 1 1 2 1 2 1 1 1 1 1 1 βŠ₯ ⊀ 𝑕 π‘ž π‘Ÿ βŠ₯ ⊀ 𝑕 π‘ž π‘Ÿ 𝐾 = π‘ˆπ‘„ 𝐽 βˆͺ 𝐽 ⇔ 𝑀 𝐾 = 𝐼1 𝐸𝑄 β‹… 𝑀 𝐽

𝐼1 𝑦 = 0, 𝑦 < 1 1, 𝑦 β‰₯ 1 Immediate Consequence via matrix multiplication:

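A minimal NumPy sketch of the thresholded matrix product, under the same assumed atom ordering (⊥, ⊤, h, q, r); the matrix entries below encode the example rules in the style of Sakama et al., with an identity part so the output keeps J (this is a reconstruction, not the paper's code):

```python
import numpy as np

# Atom order: bot, top, h, q, r (assumed).
E = np.eye(5)             # identity part: J is kept in the output
E[2, 3] = E[2, 4] = 0.5   # h <- q & r  (two body atoms, 1/2 each)
E[3, 4] = 1.0             # q <- r
E[4, 1] = 1.0             # r <-        (body is top)
E[0, 2] = 1.0             # <- h        (constraint: bot <- h)

def I1(y):
    """Threshold: 1 where y >= 1, else 0."""
    return (y >= 1.0).astype(float)

def step(v):
    """One step: v_K = I1(E_Q . v_J)."""
    return I1(E @ v)

v = np.array([0., 1., 0., 0., 0.])   # J = {top}
v = step(v)                          # derives r
v = step(v)                          # derives q
```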

SLIDE 6

Abduction - Main Idea

Invert implications:

h ← q ∧ r    h ← r    h ← e

h ⇒ {q, r}, {r}, {e}

Potential solutions: {q, r}, {r}, {e}, {h}

SLIDE 7

Tensor Embedding

[6×6 slice, rows and columns indexed by ⊥, ⊤, h, q, r, e: the identity everywhere except column h, which carries 1s at rows q and r]

h → q ∧ r

SLIDE 8

B::1 = [identity on ⊥, ⊤, q, r, e; column h carries 1s at rows q and r]    h → q ∧ r

B::2 = [identity on ⊥, ⊤, q, r, e; column h carries a 1 at row r]    h → r

B::3 = [identity on ⊥, ⊤, q, r, e; column h carries a 1 at row e]    h → e

B::4 = [the 6×6 identity]    Keep h

(rows and columns indexed by ⊥, ⊤, h, q, r, e)

SLIDE 9

Frontal Slices

π΅βˆ·π‘™ Third-Order Tensor 𝐡

1 1 1 1 1 1 1

SLIDE 10

Abductive Step

I₁(B ×₂ v_h) = [6×4 matrix, rows indexed by ⊥, ⊤, h, q, r, e, whose columns are the candidate explanations]

h ⇒ {q, r}, {r}, {e}, {h}
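The abductive step can be sketched as a mode-2 tensor-vector product followed by thresholding. To keep the sketch short, ⊥ and ⊤ are omitted and the atom order (h, q, r, e) is assumed; this is an illustration of the idea, not the authors' code:

```python
import numpy as np

atoms = ["h", "q", "r", "e"]   # assumed order, bot/top omitted for brevity
n = len(atoms)

# One frontal slice per way of explaining h, plus a "keep h" slice.
B = np.zeros((n, n, 4))
for l in range(4):
    B[:, :, l] = np.eye(n)     # atoms other than h pass through unchanged
    B[0, 0, l] = 0.0           # column h is rewritten per slice...
B[1, 0, 0] = B[2, 0, 0] = 1.0  # slice 0: h -> q & r
B[2, 0, 1] = 1.0               # slice 1: h -> r
B[3, 0, 2] = 1.0               # slice 2: h -> e
B[0, 0, 3] = 1.0               # slice 3: keep h (h may be abduced itself)

v_h = np.array([1.0, 0.0, 0.0, 0.0])

# Mode-2 product (B x_2 v)[i, k] = sum_j B[i, j, k] * v[j], then threshold.
M = (np.einsum('ijk,j->ik', B, v_h) >= 1.0).astype(float)
# Each column of M is one candidate explanation of h.
```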

SLIDE 11

Inconsistencies

𝑕 ← π‘ž ∧ π‘Ÿ 𝑕 ← π‘Ÿ 𝑕 ← 𝑒 ← 𝑒

1 1 1 1 1 1 1 1 1

So far:

Idea: Compute 𝑀𝐼𝑁(𝑄 βˆͺ Ξ”) for each column

βŠ₯ ⊀ 𝑕 π‘ž π‘Ÿ 𝑒

SLIDE 12

Inconsistencies

[candidate matrix, rows indexed by ⊥, ⊤, h, q, r, e] ⇒ [matrix whose columns are the least models MIN(Q ∪ Δ), one per candidate Δ]

SLIDE 13

Inconsistencies

[candidate matrix, rows indexed by ⊥, ⊤, h, q, r, e] ⇒ [least-model matrix: the MIN column for Δ = {e} has a 1 in the ⊥ row]

Inconsistent!

SLIDE 14

Inconsistencies

[candidate matrix, rows indexed by ⊥, ⊤, h, q, r, e] ⇒ [least-model matrix]

Inconsistent! Remove inconsistencies and duplicates, and continue to the Abductive Step…
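The consistency filter can be sketched by computing MIN(Q ∪ Δ) with plain forward chaining and discarding any candidate whose least model contains ⊥. The rule representation and names below are assumptions for illustration:

```python
# Rules of Q as (body, head) pairs; "bot" stands for the constraint head ⊥.
rules = [({"q", "r"}, "h"), ({"r"}, "h"), ({"e"}, "h"), ({"e"}, "bot")]

def least_model(facts):
    """Forward-chain Q ∪ facts to a fixpoint: the least model MIN(Q ∪ Δ)."""
    model = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if body <= model and head not in model:
                model.add(head)
                changed = True
    return model

candidates = [{"q", "r"}, {"r"}, {"e"}, {"h"}]
# Drop inconsistent candidates: Q ∪ {e} derives bot via the constraint <- e.
consistent = [d for d in candidates if "bot" not in least_model(d)]
```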

SLIDE 15

Abducibles

  • Δ ⊆ Ab if and only if:

v_Δ × v_Ab = v_Δ   (elementwise product)

  • Post-filtering
  • If abducibles are not defined, then Δ ⊆ Ab implies I₁(B_Q ×₂ v_Δ) = v_Δ after duplicates are removed (filter during the run)
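The subset test lends itself to a one-line vectorised check: Δ ⊆ Ab exactly when masking v_Δ by v_Ab changes nothing. The atom order (h, q, r, e) and the choice of abducibles below are hypothetical:

```python
import numpy as np

# Assumed atom order h, q, r, e; abducibles Ab = {q, r, e} (hypothetical choice).
v_Ab = np.array([0.0, 1.0, 1.0, 1.0])

def is_subset(v_delta, v_ab):
    """Δ ⊆ Ab iff the elementwise product leaves v_Δ unchanged."""
    return bool(np.array_equal(v_delta * v_ab, v_delta))

ok = is_subset(np.array([0., 1., 1., 0.]), v_Ab)    # {q, r} ⊆ Ab
bad = is_subset(np.array([1., 0., 0., 0.]), v_Ab)   # {h} not ⊆ Ab
```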

SLIDE 16

Discussion and Future Work

  • A proof of correctness has been completed
  • An unoptimised implementation has been made

Future Work:

  • Clauses with negation
  • First-order predicate logic
  • Optimisation and scalability testing
SLIDE 17

Tensor Multiplication

[figure: the mode-2 tensor-vector product is flattened into an ordinary matrix-vector product]

Flatten
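The flattening idea can be sketched as follows: a mode-2 tensor-vector product equals a single matrix-vector product over an unfolded tensor, which is exactly the operation GPUs are optimised for. The shapes below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.random((4, 4, 3))   # third-order tensor (arbitrary example shape)
v = rng.random(4)

# Direct mode-2 product: direct[i, k] = sum_j B[i, j, k] * v[j]
direct = np.einsum('ijk,j->ik', B, v)

# Flattened: unfold B into a (4*3, 4) matrix, multiply once, fold back.
flat = (B.transpose(0, 2, 1).reshape(-1, 4) @ v).reshape(4, 3)

same = np.allclose(direct, flat)
```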

SLIDE 18

Multiple Definitions

𝑕 ← π‘ž ∧ π‘Ÿ 𝑕 ← π‘ž ∧ 𝑠

𝐼 1 1 1 1 1 1 1 1 1 2 1 2 1 2 1 1 1 β‹… 1 1 1 = 1 1

Introduce auxiliary variables:

h1 ← q ∧ r    h2 ← q ∧ s    h ← h1    h ← h2

(rows and columns indexed by ⊥, ⊤, h, q, r, s)
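The auxiliary-variable fix can be checked numerically. If the two bodies shared row h, the interpretation {r, s} would sum two halves and wrongly fire h; with auxiliary heads h1, h2 it does not. The atom order (h, q, r, s, h1, h2) is assumed, ⊥/⊤ omitted, and only the bare rule part of the operator is applied (no identity, so each step yields only newly derived atoms):

```python
import numpy as np

# Assumed atom order: h, q, r, s, h1, h2.
E = np.zeros((6, 6))
E[4, 1] = E[4, 2] = 0.5   # h1 <- q & r
E[5, 1] = E[5, 3] = 0.5   # h2 <- q & s
E[0, 4] = E[0, 5] = 1.0   # h <- h1,  h <- h2

def step(v):
    """One thresholded step: only atoms derived by a rule appear."""
    return (E @ v >= 1.0).astype(float)

v = np.array([0., 1., 1., 0., 0., 0.])   # interpretation {q, r}
v1 = step(v)                             # fires h1
v2 = step(np.maximum(v, v1))             # then h, via h <- h1

# Mixing the two bodies must NOT fire anything: {r, s} derives nothing.
mixed = step(np.array([0., 0., 1., 1., 0., 0.]))
```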