Extending the CLP engine for reasoning under uncertainty


  1. Extending the CLP engine for reasoning under uncertainty. Nicos Angelopoulos, nicos@cs.york.ac.uk, http://www.cs.york.ac.uk/~nicos, Department of Computer Science, University of York. Ismis 2003 – p.1

  2. talk structure: Logic Programming (LP), Uncertainty and LP, Constraint LP, clp(pfd(Y)), clp(pfd(c)), Caesar’s coding experiments. Ismis 2003 – p.2

  3. logic programming. Used in AI for crisp problem solving and for building executable models and intelligent systems. Programs are formed from logic-based rules, for example:

    member( H, [H|T] ).
    member( El, [H|T] ) :- member( El, T ).

Ismis 2003 – p.3

  4. execution tree. For the query ?- member( X, [a,b,c] ). the execution tree, built with the two member/2 clauses above, yields the answers X = a, X = b and X = c in turn. Ismis 2003 – p.4

  5. uncertainty in logic programming. Most approaches use Probability Theory, but fundamental questions remain unresolved. In general:

    0.5 : member( H, [H|T] ).
    0.5 : member( El, [H|T] ) :- member( El, T ).

Ismis 2003 – p.5

  6. stochastic tree. For the query ?- member( X, [a,b,c] ). each clause choice in the tree carries probability 1/2, so the answers are obtained with probabilities 1/2 : X = a, 1/4 : X = b and 1/8 : X = c, using

    0.5 : member( H, [H|T] ).
    0.5 : member( El, [H|T] ) :- member( El, T ).

Ismis 2003 – p.6
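
To make the stochastic reading concrete, the following plain-Prolog sketch (not the pfd engine; p_member/4 is a hypothetical helper introduced here for illustration) threads the 1/2 probability of each clause choice through the derivation and reproduces the answer probabilities above.

    % Minimal sketch: each clause choice contributes a factor of 0.5,
    % so the k-th answer is returned with probability (1/2)^k.
    p_member( X, [X|_], P0, P ) :- P is P0 * 0.5.
    p_member( X, [_|T], P0, P ) :- P1 is P0 * 0.5, p_member( X, T, P1, P ).

    % ?- p_member( X, [a,b,c], 1, P ).
    % X = a, P = 0.5 ;
    % X = b, P = 0.25 ;
    % X = c, P = 0.125.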

  7-9. constraints in lp. In Logic Programming the execution model is inflexible, and its relational nature discourages the use of state information. Constraints add specialised algorithms and state information. Ismis 2003 – p.7

  10. constraint store. [Diagram: a query ?- Q. is run by the Logic Programming engine, which interacts with a constraint store holding constraints such as X # Y.] Ismis 2003 – p.8

  11. constraints inference. For a query ?- Q. with X in {a,b} and Y in {b,c} in the store, adding the constraint X = Y lets the solver infer X = Y = b. Ismis 2003 – p.9
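
The same propagation can be reproduced with a standard finite-domain solver. A minimal sketch using SWI-Prolog's library(clpfd), under the assumed (illustrative) encoding a = 1, b = 2, c = 3:

    :- use_module(library(clpfd)).

    % X in {a,b}, Y in {b,c}, X = Y  ==>  X = Y = b
    % ?- X in 1..2, Y in 2..3, X #= Y.
    % X = Y, Y = 2.        % both variables are forced to the value encoding b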

  12-13. finite domain distributions. For discrete probabilistic models clp(pfd(Y)) extends the idea of finite domains to admit distributions: from clp(fd) X in { a, b } (i.e. X = a or X = b) to clp(pfd(Y)) p( X = a ) + p( X = b ) = 1. Ismis 2003 – p.10
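
Outside the pfd engine, such a domain can be pictured as a list of Value-Prob pairs whose probabilities sum to 1; the representation and the predicate below are illustrative assumptions, not the system's API.

    :- use_module(library(lists)).    % member/2, sum_list/2

    % X in {a,b} with p(X=a) + p(X=b) = 1, e.g. [a-0.5, b-0.5].
    valid_distribution( Dist ) :-
        findall( P, member(_-P, Dist), Ps ),
        sum_list( Ps, Sum ),
        abs(Sum - 1.0) < 1.0e-9.

    % ?- valid_distribution([a-0.5, b-0.5]).
    % true.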

  14. constraint based integration. Execution assembles the probabilistic model in the store according to the program and query. Dedicated algorithms can then be used for probabilistic inference on the model present in the store. Ismis 2003 – p.11

  15. probability of predicates. pvars(E) - the variables in predicate E; e - a vector of finite domain elements; p(e_i) - the probability of element e_i; S - a constraint store; E/e - E with its variables replaced by e. The probability of predicate E with respect to store S is

    P_S(E) = \sum_{\forall e :\, S \vdash E/e} P_S(e) = \sum_{\forall e :\, S \vdash E/e} \prod_i p(e_i)

Ismis 2003 – p.12
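
A plain-Prolog sketch of this definition for independent variables with discrete Value-Prob domains (hypothetical helper names; this is not the clp(pfd) implementation): enumerate the tuples e that satisfy the goal and sum the products of element probabilities.

    :- use_module(library(lists)).    % member/2, sum_list/2
    :- use_module(library(apply)).    % maplist/4, foldl/4

    % goal_probability(+Vars, +Dists, +Goal, -P):
    % P_S(E) = sum over satisfying tuples e of prod_i p(e_i).
    goal_probability( Vars, Dists, Goal, P ) :-
        findall( Prod,
                 ( maplist(pick, Dists, Vars, Ps),  % pick one element per variable
                   call(Goal),                      % keep tuples that satisfy the goal
                   foldl(times, Ps, 1, Prod) ),     % product of element probabilities
                 Prods ),
        sum_list( Prods, P ).

    pick( Dist, Val, Pr ) :- member( Val-Pr, Dist ).
    times( X, Acc0, Acc ) :- Acc is Acc0 * X.

    % ?- goal_probability([X,Y], [[a-0.5,b-0.5],[b-0.7,c-0.3]], X = Y, P).
    % P = 0.35.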

  16. clp(pfd(c)). Probabilistic variable definitions, e.g. Coin ∈p finite_geometric([ head, tail ], 2). Conditional constraints of the form Dependent π Qualifier, relating a dependent variable to a qualifier with probability π. Ismis 2003 – p.13

  17. labelling. In clp(fd) systems labelling instantiates a group of variables to elements from their domains. In clp(pfd(Y)) the probabilities can be used to guide labelling. For instance, we have implemented label( Vars, mua, Prbs, Total ). The selector mua approximates a best-first algorithm, instantiating a group of variables to their most likely combinations. Ismis 2003 – p.14
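
As an illustration of probability-guided labelling, the sketch below orders one variable's values by decreasing probability (a simple greedy ordering over Value-Prob pairs, not the actual mua selector):

    :- use_module(library(lists)).    % member/2, reverse/2

    % Try domain values most-probable first.
    label_most_likely_first( Var, Dist ) :-
        findall( P-V, member(V-P, Dist), PVs ),
        keysort( PVs, Ascending ),
        reverse( Ascending, Descending ),
        member( _-Var, Descending ).

    % ?- label_most_likely_first(X, [a-0.2, b-0.5, c-0.3]).
    % X = b ;  X = c ;  X = a.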

  18. Caesar’s coding. Each letter is encrypted to a random letter. Words drawn from a dictionary are encrypted, and programs try to decode them. We compared a clp(fd) solution to a clp(pfd(c)) one: clp(fd) uses no probabilistic information and labels in lexicographical order; clp(pfd(c)) uses distributions based on letter frequencies and labels with mua. Ismis 2003 – p.15

  19. proximity functions. X_i - the variable for the i-th encoded letter; D_j - a dictionary letter; freq() - the frequency of a letter.

    px(X_i, D_j) = \frac{1 / |freq(X_i) - freq(D_j)|}{\sum_k 1 / |freq(X_i) - freq(D_k)|}

Ismis 2003 – p.16
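
A plain-Prolog sketch of this computation (the helper name, the Letter-Frequency input format and the epsilon guard against equal frequencies are assumptions): given the observed frequency of an encoded letter, it builds the normalised proximity distribution over dictionary letters.

    :- use_module(library(lists)).    % member/2, sum_list/2

    % proximity_distribution(+FX, +DictFreqs, -Dist):
    % px = (1 / |FX - freq(Dj)|) normalised over all dictionary letters Dk.
    proximity_distribution( FX, DictFreqs, Dist ) :-
        findall( L-W,
                 ( member(L-FD, DictFreqs),
                   W is 1 / max(abs(FX - FD), 1.0e-6) ),   % epsilon avoids division by zero
                 Weights ),
        findall( W, member(_-W, Weights), Ws ),
        sum_list( Ws, Total ),
        findall( L-Px, ( member(L-W, Weights), Px is W / Total ), Dist ).

    % ?- proximity_distribution(0.12, [e-0.127, t-0.091, a-0.082], D).
    % letters whose frequency is closest to 0.12 (here e) get the largest px.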

  20. execution times. [Plot: execution times of the clp(fd) and clp(pfd(c)) solutions on SICStus 3.8.6; y-axis 0-1200, x-axis 0-100.] Ismis 2003 – p.17

  21. bottom line. Constraint-LP-based techniques can be used to build frameworks that support probabilistic problem solving. clp(pfd(Y)) can be used to take advantage of probabilistic information at an abstract level. Ismis 2003 – p.18

  22. Monty Hall. Three curtains hide a car and two goats. The contestant chooses an initial curtain. A closed curtain opens to reveal a goat. The contestant is asked for their final choice. What is the best strategy? Stay or Switch? Ismis 2003 – p.19

  23. Monty Hall solution. If the probability of switching is Swt (Swt = 0 for strategy Stay, Swt = 1 for Switch), then the probability of a win is P(γ) = (1 + Swt) / 3. Ismis 2003 – p.20

  24. Monty Hall in clp(pfd(c)).

    curtains( gamma, Swt, Prb ) :-
        Gift   ∈p uniform([ a, b, c ]),
        First  ∈p uniform([ a, b, c ]),
        Reveal ∈p uniform([ a, b, c ]),
        Second ∈p uniform([ a, b, c ]),
        Reveal ≠ Gift,
        Reveal ≠ First,
        Second ≠ Reveal,
        Second ≠Swt First,
        Prb is p( Second = Gift ).

Ismis 2003 – p.21

  25-27. Strategy γ Query. Querying this program:

    ?- curtains(gamma, 1/2, Prb).
    Prb = 1/2.

    ?- curtains(gamma, 1, Prb).
    Prb = 2/3.

    ?- curtains(gamma, 0, Prb).
    Prb = 1/3.

Ismis 2003 – p.22
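
These results can be checked independently by enumerating the equiprobable worlds in plain Prolog (a hand-rolled sketch with a hypothetical helper, not the clp(pfd(c)) engine): each world is weighted by the product of its choice probabilities, and the weights of the winning worlds are summed.

    :- use_module(library(lists)).    % member/2, length/2, sum_list/2

    % curtains_prob(+Swt, -Prb): probability of winning when the
    % contestant switches with probability Swt.
    curtains_prob( Swt, Prb ) :-
        Curtains = [a, b, c],
        findall( W,
                 ( member(Gift,  Curtains),                   % car placed uniformly, 1/3
                   member(First, Curtains),                   % first pick uniform, 1/3
                   findall(R, ( member(R, Curtains),
                                R \== Gift, R \== First ), Rs),
                   length(Rs, NR),
                   member(Reveal, Rs),                        % a goat curtain is opened, 1/NR
                   member(Stay-PS, [yes-(1 - Swt), no-Swt]),  % switch with probability Swt
                   ( Stay == yes -> Second = First
                   ; member(Second, Curtains),
                     Second \== First, Second \== Reveal
                   ),
                   Second == Gift,                            % keep winning worlds only
                   W is (1/3) * (1/3) * (1/NR) * PS ),
                 Ws ),
        sum_list( Ws, Prb ).

    % ?- curtains_prob(0,   P).    % P = 0.333...
    % ?- curtains_prob(1,   P).    % P = 0.666...
    % ?- curtains_prob(0.5, P).    % P = 0.5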

  28. clp(pfd(BN)). Other kinds of discrete probabilistic inference are also possible, for instance Bayesian network representation and inference. Ismis 2003 – p.23

  29. example BN. Structure: A → B, A → C.

    P(B | A)      A = y    A = n
    B = y         0.80     0.10
    B = n         0.20     0.90

    P(C | A)      A = y    A = n
    C = y         0.60     0.90
    C = n         0.40     0.10

Ismis 2003 – p.24

  30-32. clp(pfd(BN)) program.

    example_bn( A, B, C ) :-
        cpt( A, [], [y,n] ),
        cpt( B, [A], [(y,y,0.8),(y,n,0.2),(n,y,0.1),(n,n,0.9)] ),
        cpt( C, [A], [(y,y,0.6),(y,n,0.4),(n,y,0.9),(n,n,0.1)] ).

    ?- example_bn(X,Y,Z),
       evidence(X, [(y,0.8),(n,0.2)]),
       Zy is p(Z = y).
    Zy = 0.66

Ismis 2003 – p.25-27
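
The value 0.66 follows from direct enumeration over the CPT for C and the soft evidence on A; a plain-Prolog sketch (hypothetical helper, not the clp(pfd(BN)) machinery):

    :- use_module(library(lists)).    % member/2, sum_list/2

    % P(C=y) = P(A=y) P(C=y|A=y) + P(A=n) P(C=y|A=n)
    %        = 0.8 * 0.6 + 0.2 * 0.9 = 0.66
    p_c_yes( Zy ) :-
        findall( W,
                 ( member(A-PA, [y-0.8, n-0.2]),   % evidence on A
                   member(A-PC, [y-0.6, n-0.9]),   % P(C = y | A) from the CPT
                   W is PA * PC ),
                 Ws ),
        sum_list( Ws, Zy ).

    % ?- p_c_yes(Zy).
    % Zy = 0.66 (up to floating-point rounding)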

  33. current inference scheme. [Diagram: the CLP side (Logic engine and Constraint Store) interacting with Probabilistic Inference & Learning.] Ismis 2003 – p.28

  34. bottom line. [Diagram: CLP combining the Logic engine with Constraint Propagation & Probabilistic Inference.] Ismis 2003 – p.29
