CS325 Artificial Intelligence Ch 14b – Probabilistic Inference (PowerPoint PPT presentation)



Slide 1

CS325 Artificial Intelligence Ch 14b – Probabilistic Inference

Cengiz Günay Spring 2013

Günay Ch 14b – Probabilistic Inference

Slide 2

Inference tasks

Simple queries: compute posterior marginal P(Xi|E =e) Conjunctive queries: P(Xi, Xj|E =e) = P(Xi|E = e)P(Xj|Xi, E = e) Optimal decisions: decision networks include utility information; probabilistic inference required for P(outcome|action, evidence) Value of information: which evidence to seek next? Sensitivity analysis: which probability values are most critical? Explanation: why do I need a new starter motor?


Slide 3

Inference by Enumeration

With no dependency information, we need 2^n entries in the joint distribution over Cavity, Toothache, and Catch:

             toothache           ¬toothache
          catch   ¬catch      catch   ¬catch
cavity    .108    .012        .072    .008
¬cavity   .016    .064        .144    .576


Slide 4

Inference by Enumeration

(Joint distribution table over Cavity, Toothache, and Catch repeated from the slide above.)

For any proposition φ, sum the events where it is true: P(φ) = Σ_{ω : ω ⊨ φ} P(ω)


Slide 5

Inference by Enumeration

(Joint distribution table over Cavity, Toothache, and Catch repeated from the slide above.)

For any proposition φ, sum the events where it is true: P(φ) = Σ_{ω : ω ⊨ φ} P(ω)

P(toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2
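As a sketch, this enumeration takes only a few lines of Python. The eight joint entries come from the table on the slide; the helper name `prob_of` is made up for illustration.

```python
# Full joint distribution over (cavity, toothache, catch), from the slide.
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

def prob_of(pred):
    """P(phi): sum P(w) over worlds w where the proposition pred holds."""
    return sum(p for w, p in joint.items() if pred(*w))

# P(toothache): sum the four entries where toothache is true.
p_toothache = prob_of(lambda cavity, toothache, catch: toothache)
print(round(p_toothache, 3))  # 0.2
```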


Slide 6

Inference by Enumeration

(Joint distribution table over Cavity, Toothache, and Catch repeated from the slide above.)

For any proposition φ, sum the events where it is true: P(φ) = Σ_{ω : ω ⊨ φ} P(ω)

P(cavity ∨ toothache) = ?


Slide 7

Inference by Enumeration

(Joint distribution table over Cavity, Toothache, and Catch repeated from the slide above.)

For any proposition φ, sum the events where it is true: P(φ) = Σ_{ω : ω ⊨ φ} P(ω)

P(cavity ∨ toothache) = 0.108 + 0.012 + 0.072 + 0.008 + 0.016 + 0.064 = 0.28


Slide 8

Inference by Enumeration

(Joint distribution table over Cavity, Toothache, and Catch repeated from the slide above.)

Can also compute conditional probabilities: P(¬cavity|toothache) = ?


Slide 9

Inference by Enumeration

(Joint distribution table over Cavity, Toothache, and Catch repeated from the slide above.)

Can also compute conditional probabilities: P(¬cavity|toothache) = P(¬cavity ∧ toothache) / P(toothache) = (0.016 + 0.064) / (0.108 + 0.012 + 0.016 + 0.064) = 0.4
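Conditional queries reduce to dividing two such summations over the joint. A sketch (joint entries from the slide; helper names are made up):

```python
# Full joint distribution over (cavity, toothache, catch), from the slide.
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

def prob_of(pred):
    """Sum P(w) over worlds w satisfying the proposition pred."""
    return sum(p for w, p in joint.items() if pred(*w))

# P(~cavity | toothache) = P(~cavity & toothache) / P(toothache)
p_cond = (prob_of(lambda cavity, toothache, catch: not cavity and toothache)
          / prob_of(lambda cavity, toothache, catch: toothache))
print(round(p_cond, 3))  # 0.4
```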


Slide 10

Joint probability with known dependencies

(Network: Cavity is the parent of both Toothache and Catch.)

P(toothache, catch, cavity) = P(toothache|cavity) P(catch|cavity) P(cavity)

In general, P(x_1, . . . , x_n) = ∏_{i=1}^{n} P(x_i | parents(X_i))
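The chain-rule factorization can be sketched in Python. The CPT values below (e.g. P(toothache|cavity) = 0.6) are derived from the joint table earlier in the deck and are an assumption for illustration:

```python
# CPTs for the network Cavity -> Toothache, Cavity -> Catch.
# Values derived from the earlier joint table (assumption for illustration).
p_cavity = 0.2
p_toothache_given = {True: 0.6, False: 0.1}   # P(toothache | cavity)
p_catch_given     = {True: 0.9, False: 0.2}   # P(catch | cavity)

def joint_prob(toothache, catch, cavity):
    """P(toothache, catch, cavity) = P(toothache|cavity) P(catch|cavity) P(cavity)."""
    pc = p_cavity if cavity else 1 - p_cavity
    pt = p_toothache_given[cavity] if toothache else 1 - p_toothache_given[cavity]
    pk = p_catch_given[cavity] if catch else 1 - p_catch_given[cavity]
    return pt * pk * pc

print(round(joint_prob(True, True, True), 3))  # 0.108, matching the joint table entry
```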



Slide 12

Burglary or Earthquake: inference from joint

(Network: Burglary → Alarm ← Earthquake; Alarm → JohnCalls and Alarm → MaryCalls.)

P(B) = .001        P(E) = .002

B  E | P(A|B,E)
T  T |  .95
T  F |  .94
F  T |  .29
F  F |  .001

A | P(J|A)       A | P(M|A)
T |  .90         T |  .70
F |  .05         F |  .01

P(j ∧ m ∧ a ∧ ¬b ∧ ¬e) = ?


Slide 13

Burglary or Earthquake: inference from joint

(Burglary network diagram and CPTs repeated from the slide above.)

P(j ∧ m ∧ a ∧ ¬b ∧ ¬e) = P(j|a)P(m|a)P(a|¬b, ¬e)P(¬b)P(¬e) = 0.9 × 0.7 × 0.001 × 0.999 × 0.998 ≈ 0.00063
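This product can be checked numerically; the dictionary names below are hypothetical, with CPT values taken from the slide:

```python
# Burglary-network parameters from the slide.
P_b, P_e = 0.001, 0.002
P_a_given = {(True, True): 0.95, (True, False): 0.94,
             (False, True): 0.29, (False, False): 0.001}  # P(a | B, E)
P_j_given = {True: 0.90, False: 0.05}                     # P(j | A)
P_m_given = {True: 0.70, False: 0.01}                     # P(m | A)

# P(j, m, a, ~b, ~e) = P(j|a) P(m|a) P(a|~b,~e) P(~b) P(~e)
p = (P_j_given[True] * P_m_given[True] * P_a_given[(False, False)]
     * (1 - P_b) * (1 - P_e))
print(round(p, 5))  # 0.00063
```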


Slide 14

Burglary or Earthquake: inference by enumeration

(Burglary network diagram and CPTs repeated from the slide above.)

P(B|j, m) = P(B, j, m)/P(j, m) = α P(B, j, m) = α Σ_e Σ_a P(B, j, m, e, a)


Slide 15

Burglary or Earthquake: inference by enumeration

(Burglary network diagram and CPTs repeated from the slide above.)

P(B|j, m) = α Σ_e Σ_a P(B) P(e) P(a|B, e) P(j|a) P(m|a)
          = α P(B) Σ_e P(e) Σ_a P(a|B, e) P(j|a) P(m|a)
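The nested summation can be coded directly; names below are hypothetical, with CPT values from the slide. Normalizing by α gives the posterior:

```python
from itertools import product

# Burglary-network parameters from the slide.
P_b, P_e = 0.001, 0.002
P_a = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}  # P(a | B, E)
P_j = {True: 0.90, False: 0.05}                     # P(j | A)
P_m = {True: 0.70, False: 0.01}                     # P(m | A)

def unnormalized(b):
    """Sum out e and a: sum_e sum_a P(b) P(e) P(a|b,e) P(j|a) P(m|a)."""
    pb = P_b if b else 1 - P_b
    total = 0.0
    for e, a in product([True, False], repeat=2):
        pe = P_e if e else 1 - P_e
        pa = P_a[(b, e)] if a else 1 - P_a[(b, e)]
        total += pb * pe * pa * P_j[a] * P_m[a]
    return total

# alpha normalizes over both values of B.
alpha = 1 / (unnormalized(True) + unnormalized(False))
p_burglary = alpha * unnormalized(True)
print(round(p_burglary, 3))  # 0.284
```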


Slide 16

Burglary or Earthquake: inference by enumeration

(Burglary network diagram and CPTs repeated from the slide above.)

Variable elimination improves on plain enumeration by joining factors and eliminating (summing out) hidden variables, avoiding repeated computation.


Slide 17

What if we cannot infer exactly?

Exact inference is expensive. What else can we do?
- Observe random events and record outcomes to approximate probabilities
- Also called a Monte Carlo method
- With ∞ samples, it is consistent
- Rejection sampling: for rare events
- Likelihood weighting: to avoid inconsistency
- Gibbs sampling: random walk through state space
- Monty Hall letter
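As an illustration of the sampling idea, here is a minimal rejection-sampling sketch for the burglary network (function names are hypothetical; CPT values come from the earlier slides). Note how few samples survive when the evidence j ∧ m is rare, which is the problem likelihood weighting addresses:

```python
import random

random.seed(0)

# Burglary-network parameters from the earlier slides.
P_b, P_e = 0.001, 0.002
P_a = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_j = {True: 0.90, False: 0.05}
P_m = {True: 0.70, False: 0.01}

def sample():
    """Draw one world from the network in topological order (prior sampling)."""
    b = random.random() < P_b
    e = random.random() < P_e
    a = random.random() < P_a[(b, e)]
    j = random.random() < P_j[a]
    m = random.random() < P_m[a]
    return b, e, a, j, m

def rejection_sample_b_given_jm(n):
    """Estimate P(B=true | j, m): keep only samples consistent with the evidence."""
    kept = hits = 0
    for _ in range(n):
        b, e, a, j, m = sample()
        if j and m:          # reject samples that contradict the evidence
            kept += 1
            hits += b
    return hits / kept if kept else float('nan')

est = rejection_sample_b_given_jm(200_000)
print(round(est, 2))
```

With evidence probability around 0.002, only a few hundred of the 200,000 samples are kept, so the estimate of ≈0.28 is noisy; this is exactly the rare-event weakness mentioned above.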

