For Tuesday


  1. For Tuesday • Read chapter 12, sections 1-4 • Homework: – Chapter 10, exercise 9 – Chapter 13, exercise 8

  2. Program 2 • Any questions? • Due Friday night

  3. Constructing the planning graph
  • Level P1: all literals from the initial state
  • Add an action to level Ai if all its preconditions are present in level Pi
  • Add a proposition to level Pi if it is the effect of some action in level Ai-1 (including no-ops)
  • Maintain a set of exclusion relations to eliminate incompatible propositions and actions (thus reducing the graph size)
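A minimal sketch of one expansion step of this construction, assuming literals are plain strings with a leading '¬' marking negation; the `Action` tuple and `expand_level` name are illustrative, not from the slides:

```python
from typing import NamedTuple

class Action(NamedTuple):
    name: str
    preconds: frozenset  # literals that must hold in level Pi
    effects: frozenset   # literals added to level Pi+1

def expand_level(props, actions):
    """One planning-graph step: proposition level Pi -> (Ai, Pi+1)."""
    # An action enters level Ai only if all its preconditions
    # are present in proposition level Pi.
    applicable = [a for a in actions if a.preconds <= props]
    # No-ops carry every proposition forward unchanged.
    noops = [Action(f"noop({p})", frozenset({p}), frozenset({p}))
             for p in props]
    # Pi+1 is the union of all effects plus the no-op carries,
    # so proposition levels grow monotonically.
    next_props = props | frozenset(e for a in applicable for e in a.effects)
    return applicable + noops, next_props
```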

  4. Planning graph [layered diagram of alternating proposition and action levels]

  5. Mutual Exclusion relations
  • Two actions (or literals) are mutually exclusive (mutex) at some stage if no valid plan could contain both.
  • Two actions are mutex if:
  – Interference: one clobbers the other's effect or precondition
  – Competing needs: they have mutex preconditions
  • Two propositions are mutex if:
  – All ways of achieving them are mutex
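The same string-literal convention lets these mutex tests be sketched directly; `prop_mutex` (mutex proposition pairs), `achievers` (actions producing each literal), and `action_mutex` are assumed bookkeeping structures, not from the slides:

```python
def negation(lit):
    """'p' <-> '¬p' under the string convention above."""
    return lit[1:] if lit.startswith('¬') else '¬' + lit

def actions_mutex(a1, a2, prop_mutex):
    """Actions at the same level are mutex if they interfere
    or have competing needs."""
    # Interference: one clobbers the other's effect or precondition.
    for x, y in ((a1, a2), (a2, a1)):
        for e in x.effects:
            if negation(e) in y.effects or negation(e) in y.preconds:
                return True
    # Competing needs: some pair of their preconditions is mutex.
    return any(frozenset((p, q)) in prop_mutex
               for p in a1.preconds for q in a2.preconds)

def props_mutex(p, q, achievers, action_mutex):
    """Propositions are mutex if all ways of achieving them are mutex."""
    return all(frozenset((a.name, b.name)) in action_mutex
               for a in achievers[p] for b in achievers[q])
```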

  6. Dinner Date example
  • Initial Conditions: garbage ∧ cleanHands ∧ quiet
  • Goal: dinner ∧ present ∧ ¬garbage
  • Actions:
  – Cook precondition: cleanHands; effect: dinner
  – Wrap precondition: quiet; effect: present
  – Carry precondition: (none); effect: ¬garbage ∧ ¬cleanHands
  – Dolly precondition: (none); effect: ¬garbage ∧ ¬quiet
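Written down in the representation sketched above (the encoding is illustrative; the facts are from the slide):

```python
# Dinner Date: initial state, goal, and the four actions.
init = frozenset({'garbage', 'cleanHands', 'quiet'})
goal = frozenset({'dinner', 'present', '¬garbage'})

actions = [
    Action('Cook',  frozenset({'cleanHands'}), frozenset({'dinner'})),
    Action('Wrap',  frozenset({'quiet'}),      frozenset({'present'})),
    Action('Carry', frozenset(),               frozenset({'¬garbage', '¬cleanHands'})),
    Action('Dolly', frozenset(),               frozenset({'¬garbage', '¬quiet'})),
]
```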

  7. Dinner Date example [planning graph diagram]

  8. Dinner Date example [planning graph diagram]

  9. Observation 1 [planning graph diagram] Propositions monotonically increase (always carried forward by no-ops)

  10. Observation 2 [planning graph diagram] Actions monotonically increase

  11. Observation 3 [planning graph diagram] Proposition mutex relationships monotonically decrease

  12. Observation 4 [planning graph diagram] Action mutex relationships monotonically decrease

  13. Observation 5 Planning Graph 'levels off'. • After some stage k, all levels are identical • Because it's a finite space, the set of literals never decreases and mutexes don't reappear.
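Observation 5 gives a simple stopping test; a minimal sketch, assuming a level is the pair of sets built in the earlier sketches:

```python
def leveled_off(prev_props, props, prev_mutex, mutex):
    """True when a proposition level and its mutex relations are
    identical to the previous level's: the graph has leveled off."""
    return props == prev_props and mutex == prev_mutex
```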

  14. Valid plan A valid plan is a subgraph of the planning graph where: • Actions at the same level don't interfere • Each action's preconditions are made true by the plan • Goals are satisfied

  15. GraphPlan algorithm • Grow the planning graph (PG) until all goals are reachable and not mutex. (If PG levels off first, fail) • Search the PG for a valid plan • If none is found, add a level to the PG and try again
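A sketch of that loop, with the mutex bookkeeping elided to keep it short; `extract_solution` is the backward search sketched after the next slide:

```python
def graphplan(init, goal, actions):
    """GraphPlan outer loop: expand, test goals, search, repeat."""
    props = frozenset(init)
    prop_levels, action_levels = [props], []
    while True:
        if goal <= props:  # all goals reachable (mutex test elided)
            plan = extract_solution(prop_levels, action_levels, goal)
            if plan is not None:
                return plan           # a list of per-level action sets
        level_a, next_props = expand_level(props, actions)
        if next_props == props:       # leveled off without a plan
            return None               # fail: the goals are unreachable
        action_levels.append(level_a)
        prop_levels.append(next_props)
        props = next_props
```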

  16. Searching for a solution plan
  • Backward chain on the planning graph
  • Achieve goals level by level
  • At level k, pick a subset of non-mutex actions to achieve the current goals; their preconditions become the goals at level k-1
  • Build the goal subset by picking each goal and choosing an action to achieve it, reusing an already-selected action if possible; do forward checking on the remaining goals (backtrack if no non-mutex action can be picked)
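A sketch of this backward search under the same conventions; goal-set memoization and the mutex check against already-chosen actions (both described on the surrounding slides) are elided here:

```python
def extract_solution(prop_levels, action_levels, goals, k=None):
    """Backward-chain from the goals: at level k choose actions that
    cover the goals; their preconditions become the level k-1 goals."""
    k = len(action_levels) if k is None else k
    if k == 0:
        # Success iff the remaining goals hold in the initial level.
        return [] if goals <= prop_levels[0] else None

    def assign(remaining, chosen):
        if not remaining:
            subgoals = frozenset(p for a in chosen for p in a.preconds)
            rest = extract_solution(prop_levels, action_levels,
                                    subgoals, k - 1)
            return None if rest is None else rest + [chosen]
        g = next(iter(remaining))
        # Reuse an already-selected action when it achieves g.
        for a in chosen:
            if g in a.effects:
                return assign(remaining - {g}, chosen)
        # Otherwise try each achiever, backtracking on failure.
        for a in action_levels[k - 1]:
            if g in a.effects:
                result = assign(remaining - {g}, chosen + [a])
                if result is not None:
                    return result
        return None

    return assign(frozenset(goals), [])
```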

  17. Plan Graph Search
  If goals are present & non-mutex:
  – Choose an action to achieve each goal
  – Add its preconditions to the next goal set

  18. Termination for unsolvable problems
  • Graphplan records (memoizes) sets of unsolvable goals:
  – U(i,t) = unsolvable goals at level i after stage t
  • More efficient: early backtracking
  • Also provides necessary and sufficient conditions for termination:
  – Assume the plan graph levels off at level n, stage t > n
  – If U(n, t-1) = U(n, t) then we know we're in a loop and can terminate safely.
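A minimal sketch of the memo table; the U(i,t) indexing by stage is simplified here to "goal sets that have already failed at level i" (the names are illustrative):

```python
unsolvable = {}  # level i -> set of goal sets already proven unsolvable

def known_unsolvable(level, goals):
    """Early backtracking: this goal set already failed here."""
    return frozenset(goals) in unsolvable.get(level, set())

def record_failure(level, goals):
    """Memoize a failed goal set at this level."""
    unsolvable.setdefault(level, set()).add(frozenset(goals))
```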

  19. Dinner Date example
  • Initial Conditions: garbage ∧ cleanHands ∧ quiet
  • Goal: dinner ∧ present ∧ ¬garbage
  • Actions:
  – Cook precondition: cleanHands; effect: dinner
  – Wrap precondition: quiet; effect: present
  – Carry precondition: (none); effect: ¬garbage ∧ ¬cleanHands
  – Dolly precondition: (none); effect: ¬garbage ∧ ¬quiet

  20. Dinner Date example [planning graph diagram]

  21. Dinner Date example [planning graph diagram]

  22. Dinner Date example [planning graph diagram]

  23. Shopping Example
  Op( Action: Go(there); Precond: At(here); Effects: At(there), ¬At(here) )
  Op( Action: Buy(x); Precond: At(store), Sells(store,x); Effects: Have(x) )
  • A0 (initial facts): At(Home), Sells(SM,Banana), Sells(SM,Milk), Sells(HWS,Drill)
  • A∞ (goals): Have(Drill), Have(Milk), Have(Banana), At(Home)
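These operator schemas ground into the string-literal `Action` representation used earlier; the helper names and the particular bindings below are illustrative:

```python
def go(here, there):
    """A ground instance of Go for one here/there binding."""
    return Action(f'Go({there})',
                  frozenset({f'At({here})'}),
                  frozenset({f'At({there})', f'¬At({here})'}))

def buy(store, x):
    """A ground instance of Buy for one store/item binding."""
    return Action(f'Buy({x})',
                  frozenset({f'At({store})', f'Sells({store},{x})'}),
                  frozenset({f'Have({x})'}))

shopping_actions = [go('Home', 'SM'), go('Home', 'HWS'),
                    go('SM', 'Home'), go('HWS', 'SM'),
                    buy('SM', 'Milk'), buy('SM', 'Banana'),
                    buy('HWS', 'Drill')]
```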

  24. Uncertainty • Everyday reasoning and decision making are based on uncertain evidence and inferences. • Classical logic only allows conclusions to be strictly true or strictly false. • We need to account for this uncertainty and for the need to weigh and combine conflicting evidence.

  25. Coping with Uncertainty • A straightforward application of probability theory is impractical, since the large number of conditional probabilities required is rarely, if ever, available. • Therefore, early expert systems employed fairly ad hoc methods for reasoning under uncertainty and for combining evidence. • More recently, methods rigorously founded in probability theory that reduce the number of conditional probabilities required have flourished.

  26. Probability • Probabilities are real numbers between 0 and 1 representing the a priori likelihood that a proposition is true: P(Cold) = 0.1, P(¬Cold) = 0.9 • Probabilities can also be assigned to all values of a random variable (continuous or discrete) with a specific range of values (domain), e.g. low, normal, high: P(temperature=normal) = 0.99, P(temperature=98.6) = 0.99

  27. Probability Vectors • The vector form gives probabilities for all values of a discrete variable, i.e. its probability distribution: P(temperature) = <0.002, 0.99, 0.008> • This is the prior distribution, which applies when no other information is known.

  28. Conditional Probability • Conditional probability specifies the probability given that the values of some other random variables are known. P(Sneeze | Cold) = 0.8 P(Cold | Sneeze) = 0.6 • The probability of a sneeze given a cold is 80%. • The probability of a cold given a sneeze is 60%.

  29. Cond. Probability cont. • Assumes that the given information is all that is known, so all known information must be given: P(Sneeze | Cold ∧ Allergy) = 0.95 • Also allows for conditional distributions: P(X | Y) gives a 2-D array of values for all P(X=xi | Y=yj) • Defined as: P(A | B) = P(A ∧ B) / P(B)

  30. Axioms of Probability Theory • All probabilities are between 0 and 1: 0 ≤ P(A) ≤ 1 • Necessarily true propositions have probability 1, necessarily false have probability 0: P(true) = 1, P(false) = 0 • The probability of a disjunction is given by: P(A ∨ B) = P(A) + P(B) − P(A ∧ B)

  31. Joint Probability Distribution
  • The joint probability distribution for a set of random variables X1,…,Xn gives the probability of every combination of values (an n-dimensional array with v^n values if each variable has v values): P(X1,...,Xn)

              Sneeze   ¬Sneeze
    Cold       0.08     0.01
    ¬Cold      0.01     0.90

  • The probability of all possible cases (assignments of values to some subset of variables) can be calculated by summing the appropriate subset of values from the joint distribution.
  • All conditional probabilities can therefore also be calculated.
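Any such sum can be read straight off the table; a small sketch using the joint distribution above:

```python
# Joint distribution keyed by (cold, sneeze) truth values.
joint = {(True, True): 0.08, (True, False): 0.01,
         (False, True): 0.01, (False, False): 0.90}

# Marginal: sum the entries where Sneeze is true.
p_sneeze = sum(p for (cold, sneeze), p in joint.items() if sneeze)

# Conditional, straight from the definition P(A | B) = P(A ∧ B) / P(B).
p_cold_given_sneeze = joint[(True, True)] / p_sneeze

print(p_sneeze)             # 0.09
print(p_cold_given_sneeze)  # 0.888...
```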

  32. Bayes Theorem P(H | e) = P(e | H) P(H) / P(e) • Follows from the definition of conditional probability: P(A | B) = P(A ∧ B) / P(B)

  33. Other Basic Theorems • If events A and B are independent then: P(A ∧ B) = P(A)P(B) • If events A and B are incompatible then: P(A ∨ B) = P(A) + P(B)

  34. Simple Bayesian Reasoning • If we assume there are n possible disjoint diagnoses, d1 … dn: P(di | e) = P(e | di) P(di) / P(e) • P(e) may not be known, but since the diagnoses are disjoint and exhaustive, their posterior probabilities must sum to 1 • Thus, we can determine the most probable diagnosis without knowing P(e).

  35. Efficiency • This method requires knowing, for each disease, the probability that it will cause every possible combination of symptoms, and the number of possible symptom sets, e, is exponential in the number of basic symptoms. • This huge amount of data is usually not available.

  36. Bayesian Reasoning with Independence (“Naïve” Bayes) • If we assume that each piece of evidence (symptom) is independent given the diagnosis (conditional independence), then, given evidence e as a sequence {e1, e2, …, ed} of observations, P(e | di) is the product of the probabilities of the individual observations given di. • The conditional probability of each individual symptom for each possible diagnosis can then be computed from a set of data or estimated by the expert. • However, symptoms are usually not independent and frequently correlate, in which case the assumptions of this simple model are violated and it is not guaranteed to give reasonable results.

  37. Bayes Independence Example
  • Imagine there are diagnoses ALLERGY, COLD, and WELL and symptoms SNEEZE, COUGH, and FEVER

    Prob           Well   Cold   Allergy
    P(d)           0.9    0.05   0.05
    P(sneeze|d)    0.1    0.9    0.9
    P(cough|d)     0.1    0.8    0.7
    P(fever|d)     0.01   0.7    0.4

  38. Calculations • If symptoms sneeze & cough & no fever, what is the diagnosis?
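Working it through (a sketch using the table above): the posteriors are normalized at the end, so P(e) is never needed, and the absent fever contributes a factor of 1 − P(fever|d).

```python
priors   = {'Well': 0.9,  'Cold': 0.05, 'Allergy': 0.05}
p_sneeze = {'Well': 0.1,  'Cold': 0.9,  'Allergy': 0.9}
p_cough  = {'Well': 0.1,  'Cold': 0.8,  'Allergy': 0.7}
p_fever  = {'Well': 0.01, 'Cold': 0.7,  'Allergy': 0.4}

# Naive Bayes score: prior times each conditional, with
# (1 - P(fever|d)) because fever is observed to be absent.
scores = {d: priors[d] * p_sneeze[d] * p_cough[d] * (1 - p_fever[d])
          for d in priors}
total = sum(scores.values())
posteriors = {d: s / total for d, s in scores.items()}

print(posteriors)  # Well ≈ 0.23, Cold ≈ 0.28, Allergy ≈ 0.49
print(max(posteriors, key=posteriors.get))  # -> Allergy
```

On these numbers the most probable diagnosis is Allergy, even though sneezing and coughing taken individually point most strongly to a cold.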
