Probabilistic Logic Programming and its Applications

Luc De Raedt, with many slides from Angelika Kimmig
The Turing, London, September 11, 2017

A key question in AI: dealing with uncertainty, reasoning with relational data, and learning.


  1–7. ProbLog by example: A bit of gambling
 • toss a (biased) coin & draw a ball from each urn
 • win if (heads and a red ball) or (two balls of the same color)

 probabilistic fact: heads is true with probability 0.4 (and false with 0.6)
 0.4 :: heads.

 annotated disjunction: the first ball is red with probability 0.3 and blue with 0.7
 0.3 :: col(1,red); 0.7 :: col(1,blue).

 annotated disjunction: the second ball is red with probability 0.2, green with 0.3, and blue with 0.5
 0.2 :: col(2,red); 0.3 :: col(2,green); 0.5 :: col(2,blue).

 logical rules encoding the background knowledge (the winning conditions):
 win :- heads, col(_,red).
 win :- col(1,C), col(2,C).

 The probabilistic facts and annotated disjunctions are the probabilistic choices; the logical rules derive their consequences.

  8–11. Questions (De Raedt, Kersting, Natarajan, Poole: Statistical Relational AI)
 0.4 :: heads.
 0.3 :: col(1,red); 0.7 :: col(1,blue).
 0.2 :: col(2,red); 0.3 :: col(2,green); 0.5 :: col(2,blue).
 win :- heads, col(_,red).
 win :- col(1,C), col(2,C).

 • Probability of win? (marginal probability; win is the query)
 • Probability of win given col(2,green)? (conditional probability; col(2,green) is the evidence)
 • Most probable world where win is true? (MPE inference)

  12–18. Possible Worlds
 0.4 :: heads.
 0.3 :: col(1,red); 0.7 :: col(1,blue).
 0.2 :: col(2,red); 0.3 :: col(2,green); 0.5 :: col(2,blue).
 win :- heads, col(_,red).
 win :- col(1,C), col(2,C).

 A possible world makes one choice for each probabilistic fact and annotated disjunction, and then adds every consequence the rules derive. For example: choosing heads (H) has probability 0.4; also choosing first ball red (R) gives 0.4 × 0.3; also choosing second ball green (G) gives 0.4 × 0.3 × 0.3; in this world the rules derive win (W). Other worlds are built the same way, e.g. tails, red, green with probability (1 − 0.4) × 0.3 × 0.3, and tails, red, red with probability (1 − 0.4) × 0.3 × 0.2 (a win: two balls of the same color).

  19. All Possible Worlds (coin, ball 1, ball 2 → probability; W marks worlds where win holds)
 H R R 0.024 W    T R R 0.036 W    H B R 0.056 W    T B R 0.084
 H R G 0.036 W    T R G 0.054      H B G 0.084      T B G 0.126
 H R B 0.060 W    T R B 0.090      H B B 0.140 W    T B B 0.210 W

  20–21. MPE Inference: the most likely world where win is true?
 Among the winning worlds, the most probable is tails, ball 1 blue, ball 2 blue (a win because both balls have the same color), with probability 0.6 × 0.7 × 0.5 = 0.210.

  22–24. Marginal Probability
 P(win) = sum of the probabilities of all worlds where win is true
 = 0.024 + 0.036 + 0.036 + 0.056 + 0.060 + 0.140 + 0.210 = 0.562

  25–28. Conditional Probability
 P(win | col(2,green)) = P(win ∧ col(2,green)) / P(col(2,green)) = 0.036 / 0.3 = 0.12
 (the only winning world with a green second ball is heads, red, green, with probability 0.036)
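 All three inference tasks on this small program can be checked by brute-force enumeration of the twelve worlds. The sketch below is plain Python rather than the ProbLog engine, with the probabilistic choices hard-coded from the slides:

```python
from itertools import product

# Probabilistic choices from the program, as (value, probability) lists.
coin  = [("heads", 0.4), ("tails", 0.6)]
ball1 = [("red", 0.3), ("blue", 0.7)]
ball2 = [("red", 0.2), ("green", 0.3), ("blue", 0.5)]

def win(c, b1, b2):
    # win :- heads, col(_,red).   win :- col(1,C), col(2,C).
    return (c == "heads" and "red" in (b1, b2)) or b1 == b2

worlds = [((c, b1, b2), pc * p1 * p2)
          for (c, pc), (b1, p1), (b2, p2) in product(coin, ball1, ball2)]

# Marginal probability of win.
p_win = sum(p for (c, b1, b2), p in worlds if win(c, b1, b2))

# Conditional probability of win given col(2,green).
p_evid = sum(p for (c, b1, b2), p in worlds if b2 == "green")
p_both = sum(p for (c, b1, b2), p in worlds if b2 == "green" and win(c, b1, b2))
p_cond = p_both / p_evid

# MPE: the most probable world where win is true.
mpe = max((w for w in worlds if win(*w[0])), key=lambda w: w[1])

print(round(p_win, 3), round(p_cond, 2), mpe[0])
# 0.562 0.12 ('tails', 'blue', 'blue')
```

 The enumeration reproduces the numbers on the slides: P(win) = 0.562, P(win | col(2,green)) = 0.12, and the most likely winning world is tails with two blue balls.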

  29. Distribution Semantics (with probabilistic facts) [Sato, ICLP 95]
 P(Q) = Σ_{F ∪ R ⊨ Q}  ∏_{f ∈ F} p(f)  ·  ∏_{f ∉ F} (1 − p(f))
 sum over the possible worlds where the query Q is true: each subset F of the probabilistic facts, combined with the Prolog rules R, is one possible world, and its probability is the product over the chosen facts and the complements of the unchosen ones.
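 For independent probabilistic facts, Sato's formula can be transcribed almost literally: enumerate every subset F of the facts and sum the products over those subsets that entail the query. In the sketch below the entailment check F ∪ R ⊨ Q is passed in as a function, and the two facts and the rules (q :- a. q :- b.) are an illustrative example, not from the slides:

```python
from itertools import combinations

def prob(query_holds, facts):
    """Sato's distribution semantics for probabilistic facts:
    P(Q) = sum over subsets F with F u R |= Q of
           prod_{f in F} p(f) * prod_{f not in F} (1 - p(f))."""
    total = 0.0
    names = list(facts)
    for r in range(len(names) + 1):
        for chosen in combinations(names, r):
            F = set(chosen)
            p = 1.0
            for f in names:
                p *= facts[f] if f in F else 1 - facts[f]
            if query_holds(F):
                total += p
    return total

# Illustrative rules R: q :- a.  q :- b.  So q holds iff a or b is chosen.
facts = {"a": 0.4, "b": 0.5}
p_q = prob(lambda F: "a" in F or "b" in F, facts)
print(round(p_q, 2))  # 0.7
```

 The result agrees with the direct calculation 1 − (1 − 0.4)(1 − 0.5) = 0.7. This enumeration is exponential in the number of facts; real ProbLog inference uses knowledge compilation instead.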

  30. Flexible and Compact Relational Model for Predicting Grades
 "Program" abstraction:
 • S, C are logical variables representing students and courses
 • the set of individuals of a type is called a population
 • Int(S), Grade(S,C), Diff(C) are parametrized random variables
 Grounding:
 • for every student s, there is a random variable Int(s)
 • for every course c, there is a random variable Diff(c)
 • for every (s,c) pair, there is a random variable Grade(s,c)
 • all instances share the same structure and parameters
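 Grounding a parametrized random variable is plain enumeration over the populations. A minimal sketch, using the student and course names that appear on the next slides:

```python
from itertools import product

students = ["john", "anna", "bob"]
courses = ["ai", "ml", "cs"]

# One random variable per grounding of each parametrized RV.
int_vars   = [f"int({s})" for s in students]
diff_vars  = [f"diff({c})" for c in courses]
grade_vars = [f"grade({s},{c})" for s, c in product(students, courses)]

print(len(int_vars), len(diff_vars), len(grade_vars))  # 3 3 9
```

 Three students and three courses thus yield 3 + 3 + 9 = 15 random variables, all sharing the same structure and parameters.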

  31–48. ProbLog by example: Grading
 0.4 :: int(S) :- student(S).
 0.5 :: diff(C) :- course(C).

 student(john). student(anna). student(bob).
 course(ai). course(ml). course(cs).

 gr(S,C,a) :- int(S), not diff(C).
 0.3::gr(S,C,a); 0.5::gr(S,C,b); 0.2::gr(S,C,c) :-
     int(S), diff(C).
 0.1::gr(S,C,b); 0.2::gr(S,C,c); 0.2::gr(S,C,f) :-
     student(S), course(C), not int(S), not diff(C).
 0.3::gr(S,C,c); 0.2::gr(S,C,f) :-
     not int(S), diff(C).

 unsatisfactory(S) :- student(S), grade(S,C,f).
 excellent(S) :- student(S), not grade(S,C,G), below(G,a).
 excellent(S) :- student(S), grade(S,C,a).
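 For a single student-course pair, the grade distribution defined by the four gr/3 rules can be checked by enumerating the four (int, diff) cases and weighting each rule's annotated disjunction accordingly. A plain-Python sketch (not the ProbLog engine):

```python
# P(int) = 0.4, P(diff) = 0.5; each (int, diff) case selects one rule,
# whose head gives the grade distribution in that case.
p_int, p_diff = 0.4, 0.5

cases = {
    (True,  False): {"a": 1.0},                      # gr(S,C,a) :- int, not diff.
    (True,  True):  {"a": 0.3, "b": 0.5, "c": 0.2},
    (False, False): {"b": 0.1, "c": 0.2, "f": 0.2},
    (False, True):  {"c": 0.3, "f": 0.2},
}

dist = {}
for (i, d), grades in cases.items():
    p_case = (p_int if i else 1 - p_int) * (p_diff if d else 1 - p_diff)
    for g, p in grades.items():
        dist[g] = dist.get(g, 0.0) + p_case * p

print({g: round(p, 2) for g, p in sorted(dist.items())})
```

 Note that the last two annotated disjunctions sum to less than 1, so the grade probabilities do not sum to 1 either: the remaining mass covers worlds in which no gr/3 atom is true for that pair.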

  49–57. ProbLog by example: Rain or sun?
 A Markov chain over days 0, 1, 2, …: day 0 is sun or rain with probability 0.5 each; after a sunny day, the next day is sun with probability 0.6 and rain with 0.4; after a rainy day, sun with 0.2 and rain with 0.8.

 0.5::weather(sun,0) ; 0.5::weather(rain,0) <- true.
 0.6::weather(sun,T) ; 0.4::weather(rain,T)
     <- T>0, Tprev is T-1, weather(sun,Tprev).
 0.2::weather(sun,T) ; 0.8::weather(rain,T)
     <- T>0, Tprev is T-1, weather(rain,Tprev).

 infinite possible worlds! BUT: finitely many partial worlds suffice to answer any given ground query
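 For this chain, the marginal P(weather(sun,T)) can be computed by the standard forward recursion instead of enumerating worlds. A plain-Python sketch of that recursion:

```python
def p_sun(t):
    # Day 0: sun with 0.5; then sun -> sun with 0.6, rain -> sun with 0.2.
    p = 0.5
    for _ in range(t):
        p = p * 0.6 + (1 - p) * 0.2
    return p

print(round(p_sun(1), 3), round(p_sun(2), 3))  # 0.4 0.36
```

 As T grows, p_sun(T) converges to the stationary value 0.2 / (0.2 + 0.4) = 1/3, which is one way to see why finitely many partial worlds suffice for any given ground query.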

  58–63. Probabilistic Databases [Suciu et al 2011]
 Dealing with uncertainty while reasoning with relational data. A relational database describes one world:

 bornIn (person, city): ann london; bob york; eve new york; tom paris
 cityIn (city, country): london uk; york uk; paris usa

 select x.person, y.country from bornIn x, cityIn y where x.city = y.city

 In a probabilistic database, tuples are random variables, giving several possible worlds:

 bornIn (person, city, P): ann london 0.87; bob york 0.95; eve new york 0.9; tom paris 0.56
 cityIn (city, country, P): london uk 0.99; york uk 0.75; paris usa 0.4

 probabilistic tables + database queries → a distribution over possible worlds
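 Under the usual tuple-independence assumption, and because each answer of this particular join has exactly one derivation, an answer's probability is simply the product of its two tuple probabilities. A plain-Python sketch of the query over the tables above:

```python
born_in = {("ann", "london"): 0.87, ("bob", "york"): 0.95,
           ("eve", "new york"): 0.9, ("tom", "paris"): 0.56}
city_in = {("london", "uk"): 0.99, ("york", "uk"): 0.75,
           ("paris", "usa"): 0.4}

# select x.person, y.country from bornIn x, cityIn y where x.city = y.city
answers = {(person, country): p1 * p2
           for (person, city), p1 in born_in.items()
           for (city2, country), p2 in city_in.items()
           if city == city2}

print({a: round(p, 4) for a, p in answers.items()})
# e.g. ('ann', 'uk') with probability 0.87 * 0.99 = 0.8613
```

 Note that eve contributes no answer, since new york has no matching cityIn tuple; in general, answers with multiple derivations would require disjunction over derivations rather than a single product.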

  64. Example: Information Extraction
 NELL extracts instances of many different relations, each annotated with a degree of certainty. NELL: http://rtw.ml.cmu.edu/rtw/
