
SLIDE 1

For Tuesday

  • Read chapter 12, sections 1-4
  • Homework:

– Chapter 10, exercise 9
– Chapter 13, exercise 8

SLIDE 2

Program 2

  • Any questions?
  • Due Friday night
SLIDE 3

Constructing the planning graph

  • Level P1: all literals from the initial state
  • Add an action in level Ai if all its preconditions are present in level Pi
  • Add a proposition in level Pi if it is the effect of some action in level Ai-1 (including no-ops)
  • Maintain a set of exclusion relations to eliminate incompatible propositions and actions (thus reducing the graph size)
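As a concrete illustration, here is a minimal Python sketch of one expansion step, using the Dinner Date domain that appears later in these slides. The string-literal encoding and the `expand` helper are my own, and mutex bookkeeping is omitted:

```python
# One round of planning-graph expansion for the Dinner Date domain.
# Literals are strings; "¬" marks negation. No-ops carry every literal
# of level Pi forward into Pi+1.
ACTIONS = {
    # name: (preconditions, effects)
    "Cook":  ({"cleanHands"}, {"dinner"}),
    "Wrap":  ({"quiet"}, {"present"}),
    "Carry": (set(), {"¬garbage", "¬cleanHands"}),
    "Dolly": (set(), {"¬garbage", "¬quiet"}),
}

def expand(props):
    """Given proposition level Pi, return (Ai, Pi+1)."""
    # An action enters level Ai if all its preconditions are in Pi.
    applicable = {a for a, (pre, _) in ACTIONS.items() if pre <= props}
    nxt = set(props)                 # no-ops carry literals forward
    for a in applicable:
        nxt |= ACTIONS[a][1]         # add each action's effects
    return applicable, nxt

p1 = {"garbage", "cleanHands", "quiet"}
a1, p2 = expand(p1)
print(sorted(a1))  # all four actions are applicable at level A1
print(sorted(p2))
```

Note that `p2` still contains everything in `p1`, matching Observation 1 below (propositions monotonically increase).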

SLIDE 4

Planning graph


SLIDE 5

Mutual Exclusion relations

  • Two actions (or literals) are mutually exclusive (mutex) at some stage if no valid plan could contain both.
  • Two actions are mutex if:

– Interference: one clobbers the other's effect or precondition
– Competing needs: their preconditions are mutex

  • Two propositions are mutex if:

– All ways of achieving them are mutex
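The two action-mutex tests can be sketched as follows. This is a simplified illustration: the `(preconditions, effects)` encoding and the helper names are assumptions, not from the slides:

```python
# Sketch of the action-mutex tests. An action is a (preconds, effects)
# pair of sets of string literals; "¬" marks negation.
def neg(lit):
    """Return the negation of a literal."""
    return lit[1:] if lit.startswith("¬") else "¬" + lit

def interference(a, b):
    """One action's effect clobbers the other's precondition or effect."""
    pre_a, eff_a = a
    pre_b, eff_b = b
    return any(neg(e) in (pre_b | eff_b) for e in eff_a) or \
           any(neg(e) in (pre_a | eff_a) for e in eff_b)

def competing_needs(a, b, prop_mutex):
    """Some precondition of a is mutex with some precondition of b."""
    return any((p, q) in prop_mutex or (q, p) in prop_mutex
               for p in a[0] for q in b[0])

# Example: Carry deletes cleanHands, which Cook needs -> interference.
cook = ({"cleanHands"}, {"dinner"})
carry = (set(), {"¬garbage", "¬cleanHands"})
print(interference(cook, carry))  # True
```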

SLIDE 6

Dinner Date example

  • Initial Conditions: garbage ∧ cleanHands ∧ quiet
  • Goal: dinner ∧ present ∧ ¬garbage
  • Actions:

– Cook: precondition: cleanHands; effect: dinner
– Wrap: precondition: quiet; effect: present
– Carry: precondition: (none); effect: ¬garbage ∧ ¬cleanHands
– Dolly: precondition: (none); effect: ¬garbage ∧ ¬quiet

SLIDE 7

Dinner Date example

SLIDE 8

Dinner Date example

SLIDE 9

Observation 1

Propositions monotonically increase

(always carried forward by no-ops)

SLIDE 10

Observation 2

Actions monotonically increase


SLIDE 11

Observation 3

Proposition mutex relationships monotonically decrease


SLIDE 12

Observation 4

Action mutex relationships monotonically decrease


SLIDE 13

Observation 5

Planning Graph "levels off".

  • After some time k all levels are identical
  • Because it's a finite space, the set of literals never decreases and mutexes don't reappear.

SLIDE 14

Valid plan

A valid plan is a planning graph where:

  • Actions at the same level don't interfere
  • Each action's preconditions are made true by the plan
  • Goals are satisfied
SLIDE 15

GraphPlan algorithm

  • Grow the planning graph (PG) until all goals are reachable and not mutex. (If the PG levels off first, fail.)
  • Search the PG for a valid plan
  • If none is found, add a level to the PG and try again
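The control loop above might be sketched as follows. This is schematic only: the three callbacks are placeholders standing in for the graph machinery described on the surrounding slides:

```python
# The GraphPlan control loop (schematic). The callbacks are
# hypothetical stand-ins: expand(level) grows the graph and returns
# True once it has leveled off; extract_plan(level) does the backward
# search (next slide) and returns a plan or None; goals_ok(level)
# checks that all goals are present and pairwise non-mutex.
def graphplan(expand, extract_plan, goals_ok, max_levels=50):
    for level in range(max_levels):
        if goals_ok(level):
            plan = extract_plan(level)
            if plan is not None:
                return plan
        if expand(level):       # graph leveled off with no plan
            return None         # problem is unsolvable
    return None
```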

SLIDE 16

Searching for a solution plan

  • Backward chain on the planning graph
  • Achieve goals level by level
  • At level k, pick a subset of non-mutex actions to achieve the current goals. Their preconditions become the goals for level k-1.
  • Build the goal subset by picking each goal and choosing an action to add. Use one already selected if possible. Do forward checking on the remaining goals (backtrack if no non-mutex action can be picked).

SLIDE 17

Plan Graph Search

If goals are present & non-mutex:

  • Choose an action to achieve each goal
  • Add its preconditions to the next goal set

SLIDE 18

Termination for unsolvable problems

  • Graphplan records (memoizes) sets of unsolvable goals:

– U(i,t) = unsolvable goals at level i after stage t.

  • More efficient: early backtracking
  • Also provides necessary and sufficient conditions for termination:

– Assume the plan graph levels off at level n, stage t > n
– If U(n, t-1) = U(n, t) then we know we're in a loop and can terminate safely.

SLIDE 19

Dinner Date example

  • Initial Conditions: garbage ∧ cleanHands ∧ quiet
  • Goal: dinner ∧ present ∧ ¬garbage
  • Actions:

– Cook: precondition: cleanHands; effect: dinner
– Wrap: precondition: quiet; effect: present
– Carry: precondition: (none); effect: ¬garbage ∧ ¬cleanHands
– Dolly: precondition: (none); effect: ¬garbage ∧ ¬quiet

SLIDE 20

Dinner Date example

SLIDE 21

Dinner Date example

SLIDE 22

Dinner Date example

SLIDE 23

Shopping Example

Op( Action: Go(there); Precond: At(here); Effects: At(there), ¬At(here) )
Op( Action: Buy(x); Precond: At(store), Sells(store,x); Effects: Have(x) )

  • A0:

– At(Home) Sells(SM,Banana) Sells(SM,Milk) Sells(HWS,Drill)

  • A

– Have(Drill) Have(Milk) Have(Banana) At(Home)

SLIDE 24

Uncertainty

  • Everyday reasoning and decision making are based on uncertain evidence and inferences.
  • Classical logic only allows conclusions to be strictly true or strictly false.
  • We need to account for this uncertainty and for the need to weigh and combine conflicting evidence.

SLIDE 25

Coping with Uncertainty

  • Straightforward application of probability theory is impractical since the large number of conditional probabilities required is rarely, if ever, available.
  • Therefore, early expert systems employed fairly ad hoc methods for reasoning under uncertainty and for combining evidence.
  • More recently, methods more rigorously founded in probability theory, which attempt to decrease the number of conditional probabilities required, have flourished.

SLIDE 26

Probability

  • Probabilities are real numbers in the range 0-1 representing the a priori likelihood that a proposition is true.

P(Cold) = 0.1
P(¬Cold) = 0.9

  • Probabilities can also be assigned to all values of a random variable (continuous or discrete) with a specific range of values (domain), e.g. low, normal, high.

P(temperature=normal) = 0.99
P(temperature=98.6) = 0.99

SLIDE 27

Probability Vectors

  • The vector form gives probabilities for all values of a discrete variable, i.e. its probability distribution.

P(temperature) = <0.002, 0.99, 0.008>

  • This indicates the prior probability distribution, in which no other information is known.

SLIDE 28

Conditional Probability

  • Conditional probability specifies the probability of a proposition given that the values of some other random variables are known.

P(Sneeze | Cold) = 0.8
P(Cold | Sneeze) = 0.6

  • The probability of a sneeze given a cold is 80%.
  • The probability of a cold given a sneeze is 60%.

SLIDE 29
Conditional Probability (cont.)

  • Assumes that the given information is all that is known, so all known information must be given.

P(Sneeze | Cold ∧ Allergy) = 0.95

  • Also allows for conditional distributions:

P(X | Y) gives a 2-D array of values for all P(X=xi | Y=yj)

  • Defined as:

P(A | B) = P(A ∧ B) / P(B)

SLIDE 30

Axioms of Probability Theory

  • All probabilities are between 0 and 1.

0 ≤ P(A) ≤ 1

  • Necessarily true propositions have probability 1; necessarily false propositions have probability 0.

P(true) = 1
P(false) = 0

  • The probability of a disjunction is given by:

P(A ∨ B) = P(A) + P(B) - P(A ∧ B)

SLIDE 31

Joint Probability Distribution

  • The joint probability distribution for a set of random variables X1…Xn gives the probability of every combination of values (an n-dimensional array with v^n values if each variable has v values).

P(X1,...,Xn):

           Sneeze   ¬Sneeze
Cold        0.08     0.01
¬Cold       0.01     0.90

  • The probability of any possible case (an assignment of values to some subset of the variables) can be calculated by summing the appropriate subset of values from the joint distribution.
  • All conditional probabilities can therefore also be calculated.
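For example, the marginal and conditional probabilities implied by the table above can be computed directly; the dictionary encoding here is illustrative, the numbers are from the table:

```python
# The Cold x Sneeze joint distribution from the slide.
joint = {
    ("cold", "sneeze"): 0.08, ("cold", "no_sneeze"): 0.01,
    ("no_cold", "sneeze"): 0.01, ("no_cold", "no_sneeze"): 0.90,
}

# Marginal: sum over the values of the other variable.
p_sneeze = sum(p for (c, s), p in joint.items() if s == "sneeze")

# Conditional, via the definition P(A|B) = P(A ^ B) / P(B).
p_cold_given_sneeze = joint[("cold", "sneeze")] / p_sneeze

print(round(p_sneeze, 2))             # 0.09
print(round(p_cold_given_sneeze, 3))  # 0.889
```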

SLIDE 32

Bayes Theorem

P(H | e) = P(e | H) P(H) / P(e)

  • Follows from the definition of conditional probability:

P(A | B) = P(A ∧ B) / P(B)

SLIDE 33

Other Basic Theorems

  • If events A and B are independent then:

P(A ∧ B) = P(A)P(B)

  • If events A and B are incompatible (mutually exclusive) then:

P(A ∨ B) = P(A) + P(B)

SLIDE 34

Simple Bayesian Reasoning

  • If we assume there are n possible disjoint diagnoses d1 … dn:

P(di | e) = P(e | di) P(di) / P(e)

  • P(e) may not be known, but the total probability of all diagnoses must always be 1, so the posteriors must sum to 1.
  • Thus, we can determine the most probable diagnosis without knowing P(e).

SLIDE 35

Efficiency

  • This method requires knowing, for each disease, the probability that it will cause any possible combination of symptoms, and the number of possible symptom sets, e, is exponential in the number of basic symptoms.
  • This huge amount of data is usually not available.

SLIDE 36

Bayesian Reasoning with Independence (“Naïve” Bayes)

  • If we assume that each piece of evidence (symptom) is independent given the diagnosis (conditional independence), then, given evidence e as a sequence {e1,e2,…,ed} of observations, P(e | di) is the product of the probabilities of the observations given di.
  • The conditional probability of each individual symptom for each possible diagnosis can then be computed from a set of data or estimated by the expert.
  • However, symptoms are usually not independent and frequently correlate, in which case the assumptions of this simple model are violated and it is not guaranteed to give reasonable results.

SLIDE 37

Bayes Independence Example

  • Imagine there are diagnoses ALLERGY, COLD, and WELL and symptoms SNEEZE, COUGH, and FEVER.

Prob          Well   Cold   Allergy
P(d)          0.9    0.05   0.05
P(sneeze|d)   0.1    0.9    0.9
P(cough|d)    0.1    0.8    0.7
P(fever|d)    0.01   0.7    0.4

SLIDE 38

Calculations

  • If the symptoms are sneeze & cough & no fever, what is the diagnosis?
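Under the naïve Bayes assumption, the answer follows by multiplying down each column of the table on the previous slide and normalizing; a sketch, with the table transcribed into dictionaries:

```python
# Naive Bayes diagnosis for the sneeze & cough & no-fever case,
# using the probability table from the Bayes Independence Example.
priors   = {"well": 0.9,  "cold": 0.05, "allergy": 0.05}
p_sneeze = {"well": 0.1,  "cold": 0.9,  "allergy": 0.9}
p_cough  = {"well": 0.1,  "cold": 0.8,  "allergy": 0.7}
p_fever  = {"well": 0.01, "cold": 0.7,  "allergy": 0.4}

# Unnormalized posterior: P(d) * P(sneeze|d) * P(cough|d) * P(~fever|d).
scores = {d: priors[d] * p_sneeze[d] * p_cough[d] * (1 - p_fever[d])
          for d in priors}

# Normalize so the posteriors sum to 1 (P(e) cancels out).
total = sum(scores.values())
posterior = {d: s / total for d, s in scores.items()}
print(posterior)  # allergy comes out most probable
```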

SLIDE 39

Problems with Probabilistic Reasoning

  • If no assumptions of independence are made, then an exponential number of parameters is needed for sound probabilistic reasoning.
  • There is almost never enough data or patience to reliably estimate so many very specific parameters.
  • If a blanket assumption of conditional independence is made, efficient probabilistic reasoning is possible, but such a strong assumption is rarely warranted.

SLIDE 40

Practical Naïve Bayes

  • We're going to assume independence, so what numbers do we need?
  • Where do the numbers come from?
  • What about zeroes?
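One common answer to the zeroes question is add-one (Laplace) smoothing: add a pseudo-count to every observation so that an unseen symptom/diagnosis pair gets a small nonzero probability instead of zeroing out the whole product. A hypothetical sketch with made-up counts:

```python
# Add-one (Laplace) smoothing when estimating P(symptom | d) from
# training data. The counts below are invented for illustration.
def smoothed_prob(count_symptom_and_d, count_d, num_values=2, alpha=1):
    """Estimate P(symptom | d) with alpha added to every count."""
    return (count_symptom_and_d + alpha) / (count_d + alpha * num_values)

# A symptom never seen with a diagnosis still gets nonzero probability:
print(smoothed_prob(0, 100))   # 1/102, not 0
print(smoothed_prob(90, 100))  # 91/102
```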
SLIDE 41

Program 3