

SLIDE 1

Reasoning under Uncertainty

Course: CS40022
Instructor: Dr. Pallab Dasgupta
Department of Computer Science & Engineering
Indian Institute of Technology Kharagpur

SLIDE 2

CSE, IIT Kharagpur

Handling uncertain knowledge

∀p Symptom(p, Toothache) ⇒ Disease(p, Cavity)

  • Not correct: a toothache can be caused in many other cases

∀p Symptom(p, Toothache) ⇒ Disease(p, Cavity) ∨ Disease(p, GumDisease) ∨ Disease(p, ImpactedWisdom) ∨ …

SLIDE 3

Handling uncertain knowledge

∀p Disease(p, Cavity) ⇒ Symptom(p, Toothache)

  • This is not correct either, since not all cavities cause toothache

SLIDE 4

Reasons for using probability

  • Specification becomes too large
      • It is too much work to list the complete set of antecedents or consequents needed to ensure an exception-less rule
  • Theoretical ignorance
      • The complete set of antecedents is not known
  • Practical ignorance
      • The truth of the antecedents is not known, but we still wish to reason

SLIDE 5

Axioms of Probability

  • 1. All probabilities are between 0 and 1: 0 ≤ P(A) ≤ 1
  • 2. P(True) = 1 and P(False) = 0
  • 3. P(A ∨ B) = P(A) + P(B) − P(A ∧ B)

Bayes’ Rule

P(A ∧ B) = P(A | B) P(B)
P(A ∧ B) = P(B | A) P(A)

Equating the two right-hand sides:

P(B | A) = P(A | B) P(B) / P(A)
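
The rule can be exercised numerically. A minimal Python sketch, with made-up probabilities chosen only for illustration (a rare event B with a noisy indicator A):

```python
# Hypothetical numbers, chosen only to exercise the identity.
P_B = 0.01            # prior P(B)
P_A_given_B = 0.9     # P(A | B)
P_A_given_notB = 0.2  # P(A | ¬B)

# Total probability: P(A) = P(A|B) P(B) + P(A|¬B) P(¬B)
P_A = P_A_given_B * P_B + P_A_given_notB * (1 - P_B)

# Bayes' rule: P(B | A) = P(A | B) P(B) / P(A)
P_B_given_A = P_A_given_B * P_B / P_A

print(round(P_B_given_A, 4))  # 0.0435
```

Even with a fairly reliable indicator, the posterior stays small because the prior is small, which is the usual lesson of Bayes' rule.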

SLIDE 6

Belief Networks

A belief network is a graph with the following:

1. Nodes: a set of random variables
2. Directed links: the intuitive meaning of a link from node X to node Y is that X has a direct influence on Y
3. Each node has a conditional probability table that quantifies the effects that the parents have on the node
4. The graph has no directed cycles (it is a DAG)

SLIDE 7

Example

  • Burglar alarm at home
  • Fairly reliable at detecting a burglary
  • Responds at times to minor earthquakes
  • Two neighbors who, on hearing the alarm, call the police
  • John always calls when he hears the alarm, but sometimes confuses the telephone ringing with the alarm and calls then, too
  • Mary likes loud music and sometimes misses the alarm altogether

SLIDE 8

Belief Network Example

[Figure: Burglary → Alarm ← Earthquake; Alarm → JohnCalls; Alarm → MaryCalls]

P(B) = 0.001        P(E) = 0.002

B  E  | P(A)
T  T  | 0.95
T  F  | 0.95
F  T  | 0.29
F  F  | 0.001

A | P(J)        A | P(M)
T | 0.90        T | 0.70
F | 0.05        F | 0.01

SLIDE 9

The joint probability distribution

  • A generic entry in the joint probability distribution P(x1, …, xn) is given by:

P(x1, …, xn) = Π_i P(xi | Parents(Xi))

SLIDE 10

The joint probability distribution

  • Probability of the event that the alarm has sounded but neither a burglary nor an earthquake has occurred, and both Mary and John call:

P(J ∧ M ∧ A ∧ ¬B ∧ ¬E)
  = P(J | A) P(M | A) P(A | ¬B ∧ ¬E) P(¬B) P(¬E)
  = 0.9 × 0.7 × 0.001 × 0.999 × 0.998
  = 0.00062
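
The arithmetic can be checked directly as a product of the five local factors, using the CPT entries quoted from the example network:

```python
# Product of the five local factors for J ∧ M ∧ A ∧ ¬B ∧ ¬E,
# with CPT entries from the example-network slide.
p = (0.90      # P(J | A)
     * 0.70    # P(M | A)
     * 0.001   # P(A | ¬B ∧ ¬E)
     * 0.999   # P(¬B)
     * 0.998)  # P(¬E)
print(p)  # ≈ 0.000628 (the slide quotes this truncated to 0.00062)
```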

SLIDE 11

Conditional independence

By the chain rule:

P(x1, …, xn)
  = P(xn | xn−1, …, x1) P(xn−1, …, x1)
  = P(xn | xn−1, …, x1) P(xn−1 | xn−2, …, x1) … P(x2 | x1) P(x1)
  = Π_i P(xi | xi−1, …, x1)

  • The belief network represents conditional independence:

P(Xi | Xi−1, …, X1) = P(Xi | Parents(Xi))

SLIDE 12

Incremental Network Construction

  • 1. Choose the set of relevant variables Xi that describe the domain
  • 2. Choose an ordering for the variables (very important step)
  • 3. While there are variables left:
      a) Pick a variable X and add a node for it
      b) Set Parents(X) to some minimal set of existing nodes such that the conditional independence property is satisfied
      c) Define the conditional probability table for X

SLIDE 13

Conditional Independence Relations

  • If every undirected path from a node in X to a node in Y is d-separated by a given set of evidence nodes E, then X and Y are conditionally independent given E.
  • A set of nodes E d-separates two sets of nodes X and Y if every undirected path from a node in X to a node in Y is blocked given E.

SLIDE 14

Conditional Independence Relations

  • A path is blocked given a set of nodes E if there is a node Z on the path for which one of three conditions holds:
      1. Z is in E and Z has one arrow on the path leading in and one arrow leading out
      2. Z is in E and Z has both path arrows leading out
      3. Neither Z nor any descendant of Z is in E, and both path arrows lead in to Z

SLIDE 15

Conditional Independence in belief networks

[Figure: nodes Battery, Radio, Ignition, Petrol, Starts]

  • Whether there is petrol and whether the radio plays are independent given evidence about whether the ignition takes place
  • Petrol and Radio are independent if it is known whether the battery works

SLIDE 16

Conditional Independence in belief networks

  • Petrol and Radio are independent given no evidence at all. But they are dependent given evidence about whether the car starts: if the car does not start, then the radio playing is increased evidence that we are out of petrol.
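
The three blocking conditions of slide 14 can be sketched as a small checker and run on this car network. The edge directions below are an assumption reconstructed from the prose (the battery powers the radio and the ignition; starting needs ignition and petrol):

```python
# Sketch of the path-blocking test from slide 14, assuming the edge set:
# Battery -> Radio, Battery -> Ignition, Ignition -> Starts, Petrol -> Starts.

def descendants(node, edges):
    """All nodes reachable from `node` along directed edges."""
    seen, stack = set(), [node]
    while stack:
        n = stack.pop()
        for parent, child in edges:
            if parent == n and child not in seen:
                seen.add(child)
                stack.append(child)
    return seen

def path_blocked(path, edges, E):
    """Is the undirected `path` blocked by evidence set E?
    For each interior node Z, look at the directions of its two path edges
    and apply the three conditions from the slide."""
    for i in range(1, len(path) - 1):
        prev, Z, nxt = path[i - 1], path[i], path[i + 1]
        arrow_in_from_prev = (prev, Z) in edges
        arrow_in_from_next = (nxt, Z) in edges
        if Z in E and arrow_in_from_prev != arrow_in_from_next:
            return True   # 1. chain through an evidence node
        if Z in E and not arrow_in_from_prev and not arrow_in_from_next:
            return True   # 2. common cause that is in the evidence
        if arrow_in_from_prev and arrow_in_from_next:
            if Z not in E and not (descendants(Z, edges) & E):
                return True   # 3. common effect with no evidence at or below it
    return False

edges = {("Battery", "Radio"), ("Battery", "Ignition"),
         ("Ignition", "Starts"), ("Petrol", "Starts")}
path = ["Petrol", "Starts", "Ignition", "Battery", "Radio"]  # the only Petrol-Radio path

print(path_blocked(path, edges, set()))         # True: independent a priori
print(path_blocked(path, edges, {"Starts"}))    # False: dependent given Starts
print(path_blocked(path, edges, {"Ignition"}))  # True: independent given Ignition
```

The three printed answers match the independence claims on these two slides, including the collider behaviour: observing Starts unblocks the path.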

SLIDE 17

Inferences using belief networks

  • Diagnostic inferences (from effects to causes)
      • Given JohnCalls, infer that P(Burglary | JohnCalls) = 0.016
  • Causal inferences (from causes to effects)
      • Given Burglary, infer that P(JohnCalls | Burglary) = 0.86 and P(MaryCalls | Burglary) = 0.67

SLIDE 18

Inferences using belief networks

  • Intercausal inferences (between causes of a common effect)
      • Given Alarm, we have P(Burglary | Alarm) = 0.376.
      • If we add evidence that Earthquake is true, then P(Burglary | Alarm ∧ Earthquake) goes down to 0.003
  • Mixed inferences
      • Setting the effect JohnCalls to true and the cause Earthquake to false gives P(Alarm | JohnCalls ∧ ¬Earthquake) = 0.03
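
The figures on these two slides can be reproduced by brute-force enumeration over the joint distribution: a direct sum over the factorization, using the CPT values of the earlier example network. (This is not the polytree algorithm the later slides develop, just the naive baseline.)

```python
from itertools import product

# CPTs from the example-network slide
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.95,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}   # P(JohnCalls | Alarm)
P_M = {True: 0.70, False: 0.01}   # P(MaryCalls | Alarm)

def pr(prob_true, val):
    return prob_true if val else 1.0 - prob_true

def joint(b, e, a, j, m):
    """P(b,e,a,j,m) via the factorization  Π_i P(xi | Parents(Xi))."""
    return (pr(P_B, b) * pr(P_E, e) * pr(P_A[(b, e)], a)
            * pr(P_J[a], j) * pr(P_M[a], m))

def query(target, evidence):
    """P(target = True | evidence); evidence maps names in 'BEAJM' to bools."""
    num = den = 0.0
    for vals in product([True, False], repeat=5):
        w = dict(zip("BEAJM", vals))
        if any(w[k] != v for k, v in evidence.items()):
            continue
        p = joint(w["B"], w["E"], w["A"], w["J"], w["M"])
        den += p
        if w[target]:
            num += p
    return num / den

print(round(query("B", {"J": True}), 3))             # 0.016 (diagnostic)
print(round(query("B", {"A": True}), 3))             # 0.376 (intercausal)
print(round(query("B", {"A": True, "E": True}), 3))  # 0.003 (explaining away)
```

Enumeration costs time exponential in the number of variables, which is exactly why the polytree algorithm on the following slides matters.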

SLIDE 19

The four patterns

[Figure: the four query/evidence configurations: Diagnostic (evidence E below query Q), Causal (E above Q), InterCausal (E and Q are causes of a common effect), Mixed (E both above and below Q)]

SLIDE 20

Answering queries

  • We consider cases where the belief network is a poly-tree
      • There is at most one undirected path between any two nodes

SLIDE 21

Answering queries

[Figure: the query node X with parents U1, …, Um above it and children Y1, …, Yn below it; each Yi also has other parents Zij. The evidence above X, reached through its parents, is EX+; the evidence below X, reached through its children, is EX−.]

SLIDE 22

Answering queries

  • U = U1 … Um are parents of node X
  • Y = Y1 … Yn are children of node X
  • X is the query variable
  • E is a set of evidence variables
  • The aim is to compute P(X | E)

SLIDE 23

Definitions

  • EX+ is the causal support for X
      • The evidence variables “above” X that are connected to X through its parents
  • EX− is the evidential support for X
      • The evidence variables “below” X that are connected to X through its children
  • EUi\X refers to all the evidence connected to node Ui except via the path from X
  • EYi\X refers to all the evidence connected to node Yi except via the path from X; EYi\X+ is the part of it connected to Yi through Yi’s parents

SLIDE 24

The computation of P(X | E)

P(X | E) = P(X | EX−, EX+)
         = P(EX− | X, EX+) P(X | EX+) / P(EX− | EX+)

  • Since X d-separates EX+ from EX−, we can use conditional independence to simplify the first term in the numerator
  • We can treat the denominator as a constant

P(X | E) = α P(EX− | X) P(X | EX+)

SLIDE 25

The computation of P(X | EX+)

We consider all possible configurations of the parents of X and how likely they are given EX+. Let U be the vector of parents U1, …, Um, and let u be an assignment of values to them.

P(X | EX+) = Σ_u P(X | u, EX+) P(u | EX+)

SLIDE 26

The computation of P(X | EX+)

P(X | EX+) = Σ_u P(X | u, EX+) P(u | EX+)

U d-separates X from EX+, so the first term simplifies to P(X | u). We can simplify the second term by noting that
  – EX+ d-separates each Ui from the others,
  – the probability of a conjunction of independent variables is equal to the product of their individual probabilities:

P(X | EX+) = Σ_u P(X | u) Π_i P(ui | EX+)

SLIDE 27

The computation of P(X | EX+)

P(X | EX+) = Σ_u P(X | u) Π_i P(ui | EX+)

The last term can be simplified by partitioning EX+ into EU1\X, …, EUm\X and noting that EUi\X d-separates Ui from all the other evidence in EX+:

P(X | EX+) = Σ_u P(X | u) Π_i P(ui | EUi\X)

  • P(X | u) is a lookup in the conditional probability table of X
  • P(ui | EUi\X) is a recursive (smaller) sub-problem

SLIDE 28

The computation of P(EX− | X)

Let Zi be the parents of Yi other than X, and let zi be an assignment of values to those parents.
  – The evidence in each Yi box is conditionally independent of the others given X:

P(EX− | X) = Π_i P(EYi\X | X)

SLIDE 29

The computation of P(EX− | X)

P(EX− | X) = Π_i P(EYi\X | X)

Averaging over yi and zi yields:

P(EX− | X) = Π_i Σ_yi Σ_zi P(EYi\X | X, yi, zi) P(yi, zi | X)

SLIDE 30

The computation of P(EX− | X)

P(EX− | X) = Π_i Σ_yi Σ_zi P(EYi\X | X, yi, zi) P(yi, zi | X)

Breaking EYi\X into the two independent components EYi− and EYi\X+:

P(EX− | X) = Π_i Σ_yi Σ_zi P(EYi− | X, yi, zi) P(EYi\X+ | X, yi, zi) P(yi, zi | X)

SLIDE 31

The computation of P(EX− | X)

P(EX− | X) = Π_i Σ_yi Σ_zi P(EYi− | X, yi, zi) P(EYi\X+ | X, yi, zi) P(yi, zi | X)

EYi− is independent of X and zi given yi, and EYi\X+ is independent of X and yi:

P(EX− | X) = Π_i Σ_yi P(EYi− | yi) Σ_zi P(EYi\X+ | zi) P(yi, zi | X)

SLIDE 32

The computation of P(EX− | X)

P(EX− | X) = Π_i Σ_yi P(EYi− | yi) Σ_zi P(EYi\X+ | zi) P(yi, zi | X)

Apply Bayes’ rule to P(EYi\X+ | zi):

P(EX− | X) = Π_i Σ_yi P(EYi− | yi) Σ_zi [ P(zi | EYi\X+) P(EYi\X+) / P(zi) ] P(yi, zi | X)

SLIDE 33

The computation of P(EX− | X)

P(EX− | X) = Π_i Σ_yi P(EYi− | yi) Σ_zi [ P(zi | EYi\X+) P(EYi\X+) / P(zi) ] P(yi, zi | X)

  • Rewriting the conjunction of yi and zi:

P(EX− | X) = Π_i Σ_yi P(EYi− | yi) Σ_zi [ P(zi | EYi\X+) P(EYi\X+) / P(zi) ] P(yi | X, zi) P(zi | X)

SLIDE 34

The computation of P(EX− | X)

P(EX− | X) = Π_i Σ_yi P(EYi− | yi) Σ_zi [ P(zi | EYi\X+) P(EYi\X+) / P(zi) ] P(yi | X, zi) P(zi | X)

P(zi | X) = P(zi) because Zi and X are d-separated. Also, P(EYi\X+) is a constant:

P(EX− | X) = β Π_i Σ_yi P(EYi− | yi) Σ_zi P(zi | EYi\X+) P(yi | X, zi)

SLIDE 35

The computation of P(EX− | X)

P(EX− | X) = β Π_i Σ_yi P(EYi− | yi) Σ_zi P(zi | EYi\X+) P(yi | X, zi)

  • The parents of Yi (the Zij) are independent of each other.
  • We also combine the βi into one single β.

SLIDE 36

The computation of P(EX− | X)

P(EX− | X) = β Π_i Σ_yi P(EYi− | yi) Σ_zi P(yi | X, zi) Π_j P(zij | EZij\Yi)

  • P(EYi− | yi) is a recursive instance of P(EX− | X)
  • P(yi | X, zi) is a conditional probability table entry for Yi
  • P(zij | EZij\Yi) is a recursive sub-instance of the P(X | E) calculation

SLIDE 37

Inference in multiply connected belief networks

  • Clustering methods
      • Transform the net into a probabilistically equivalent (but topologically different) poly-tree by merging offending nodes
  • Conditioning methods
      • Instantiate variables to definite values, and then evaluate a poly-tree for each possible instantiation

SLIDE 38

Inference in multiply connected belief networks

  • Stochastic simulation methods
      • Use the network to generate a large number of concrete models of the domain that are consistent with the network distribution.
      • They give an approximation of the exact evaluation.
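
As an illustration (not from the slides), the simplest such method is prior sampling on the burglary network: each call below draws one concrete model by sampling every node given its already-sampled parents, and the frequency of Alarm = True approximates P(Alarm).

```python
import random

def sample_world(rng):
    """Draw one complete assignment, sampling each node given its parents
    (CPT values are the ones from the example-network slide)."""
    b = rng.random() < 0.001                  # Burglary
    e = rng.random() < 0.002                  # Earthquake
    p_a = {(True, True): 0.95, (True, False): 0.95,
           (False, True): 0.29, (False, False): 0.001}[(b, e)]
    a = rng.random() < p_a                    # Alarm
    j = rng.random() < (0.90 if a else 0.05)  # JohnCalls
    m = rng.random() < (0.70 if a else 0.01)  # MaryCalls
    return b, e, a, j, m

rng = random.Random(0)
n = 200_000
alarm_freq = sum(sample_world(rng)[2] for _ in range(n)) / n
print(alarm_freq)  # close to the exact P(Alarm) ≈ 0.0025
```

Conditioning on evidence then amounts to keeping only the consistent samples (rejection sampling), which is wasteful for unlikely evidence; likelihood weighting is the usual refinement.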

SLIDE 39

Default reasoning

  • Some conclusions are made by default unless counter-evidence is obtained
  • Non-monotonic reasoning
  • Points to ponder
      • What is the semantic status of default rules?
      • What happens when the evidence matches the premises of two default rules with conflicting conclusions?
      • If a belief is retracted later, how can a system keep track of which conclusions need to be retracted as a consequence?

SLIDE 40

Issues in Rule-based methods for Uncertain Reasoning

  • Locality
      • In logical reasoning systems, if we have A ⇒ B, then we can conclude B given evidence A, without worrying about any other rules. In probabilistic systems, we need to consider all available evidence.

SLIDE 41

Issues in Rule-based methods for Uncertain Reasoning

  • Detachment
      • Once a logical proof is found for proposition B, we can use it regardless of how it was derived (it can be detached from its justification). In probabilistic reasoning, the source of the evidence is important for subsequent reasoning.

SLIDE 42

Issues in Rule-based methods for Uncertain Reasoning

  • Truth functionality
      • In logic, the truth of complex sentences can be computed from the truth of the components. Probability combination does not work this way, except under strong independence assumptions.
      • A famous example of a truth-functional system for uncertain reasoning is the certainty factors model, developed for the Mycin medical diagnostic program.

SLIDE 43

Dempster-Shafer Theory

  • Designed to deal with the distinction between uncertainty and ignorance.
  • We use a belief function Bel(X): the probability that the evidence supports the proposition.
  • When we do not have any evidence about X, we assign Bel(X) = 0 as well as Bel(¬X) = 0.

SLIDE 44

Dempster-Shafer Theory

For example, if we do not know whether a coin is fair, then:
  Bel(Heads) = Bel(¬Heads) = 0
If we are given that the coin is fair with 90% certainty, then:
  Bel(Heads) = 0.9 × 0.5 = 0.45
  Bel(¬Heads) = 0.9 × 0.5 = 0.45
Note that we still have a gap of 0.1 that is not accounted for by the evidence.
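
The coin example in code: 0.9 of the belief mass behaves like a fair coin, and the remaining 0.1 stays uncommitted, assigned to neither outcome.

```python
# Belief assignments for the coin example on this slide.
certainty_fair = 0.9  # evidence that the coin is fair

bel_heads = certainty_fair * 0.5
bel_not_heads = certainty_fair * 0.5
gap = 1.0 - (bel_heads + bel_not_heads)  # mass committed to neither proposition

print(bel_heads, bel_not_heads, round(gap, 2))  # 0.45 0.45 0.1
```

The gap is what distinguishes this from an ordinary probability assignment, where Bel(Heads) + Bel(¬Heads) would have to equal 1.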

SLIDE 45

Fuzzy Logic

  • Fuzzy set theory is a means of specifying how well an object satisfies a vague description
  • Truth is a value between 0 and 1
  • Uncertainty stems from lack of evidence, but given the dimensions of a man, concluding whether he is fat has no uncertainty involved

SLIDE 46

Fuzzy Logic

  • The rules for evaluating the fuzzy truth, T, of a complex sentence are:
      T(A ∧ B) = min( T(A), T(B) )
      T(A ∨ B) = max( T(A), T(B) )
      T(¬A) = 1 − T(A)
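
The three rules as functions, with truth degrees in [0, 1]; the Tall/Heavy degrees below are made up for illustration:

```python
# The min/max/complement combination rules from the slide.
def t_and(a, b): return min(a, b)
def t_or(a, b):  return max(a, b)
def t_not(a):    return 1.0 - a

tall, heavy = 0.6, 0.4  # hypothetical degrees T(Tall), T(Heavy)
print(t_and(tall, heavy))  # 0.4
print(t_or(tall, heavy))   # 0.6
print(t_not(tall))         # 0.4
# Note: T(Tall ∨ ¬Tall) = max(0.6, 0.4) = 0.6, not 1,
# so the law of the excluded middle need not hold in fuzzy logic.
```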