  1. UNCERTAINTY IN KNOWLEDGE (Ch. 9)

  2. Sources of Uncertainty
     • Uncertainty in problem solving can be attributed to
       – Imperfect domain knowledge
       – Imperfect case data
     • Experts employ inexact methods for two reasons:
       – Exact methods are not known; or
       – Exact methods are known but are impractical, owing to a lack of data, problems with data collection, or difficulties with data processing.

  3. Conditional Probability for Expert System Rules
     If:   the patient has the symptom chest pain
     Then: conclude myocardial infarction with probability P
     where P is the conditional probability P(myocardial infarction | chest pain)

  4. Definition of Conditional Probability
     P(d | s) = P(d ∧ s) / P(s)
     Thus, to compute P(myocardial infarction | chest pain), one needs P(myocardial infarction ∧ chest pain), which is usually unavailable.

  5. Bayes' Rule
     By definition of conditional probability:
       P(d | s) = P(d ∧ s) / P(s)
       P(s | d) = P(d ∧ s) / P(d)
     Thus we have Bayes' rule:
       P(d | s) = P(s | d) P(d) / P(s)

  6. The Use of Bayes' Rule
     • Doctors have a consistent notion of how many heart attack patients have chest pain, and so can give an estimate of P(chest pain | myocardial infarction).
     • Medical statistics provide an estimate of P(myocardial infarction), while a doctor's own records can provide an estimate of P(chest pain).
     • Thus Bayes' rule allows an expert system rule to infer myocardial infarction from an observation of chest pain, as in the sketch below.
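
     A minimal Python sketch (not from the original slides) of the Bayes' rule computation above; the three input estimates are hypothetical placeholders of the kind a doctor or a medical database might supply.

        def bayes(p_s_given_d, p_d, p_s):
            # P(d | s) = P(s | d) * P(d) / P(s)
            return p_s_given_d * p_d / p_s

        # Hypothetical estimates (illustrative values only):
        p_pain_given_mi = 0.85   # P(chest pain | myocardial infarction), clinical experience
        p_mi = 0.01              # P(myocardial infarction), medical statistics
        p_pain = 0.05            # P(chest pain), the doctor's own records

        print(bayes(p_pain_given_mi, p_mi, p_pain))   # P(MI | chest pain) ≈ 0.17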

  7. General Form of Bayes' Rule
     P(d | s1 ∧ … ∧ sk) = P(s1 ∧ … ∧ sk | d) P(d) / P(s1 ∧ … ∧ sk)
     In order to compute P(s1 ∧ … ∧ sk), we must compute
       P(s1 | s2 ∧ … ∧ sk) P(s2 | s3 ∧ … ∧ sk) … P(sk)
     However, we can assume conditional independence, such that
       P(s1 ∧ s2 ∧ … ∧ sk) = P(s1) P(s2) … P(sk)
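
     A small Python sketch of the general form, using the independence assumption above to replace P(s1 ∧ … ∧ sk) with the product of the individual symptom probabilities; P(s1 ∧ … ∧ sk | d) is treated here as a single given estimate, and all numbers are invented for illustration.

        from math import prod

        def posterior(p_symptoms_given_d, p_d, p_each_symptom):
            # P(d | s1 ∧ ... ∧ sk) = P(s1 ∧ ... ∧ sk | d) * P(d) / (P(s1) * ... * P(sk))
            return p_symptoms_given_d * p_d / prod(p_each_symptom)

        # e.g. two symptoms with P(s1) = 0.2 and P(s2) = 0.1:
        print(posterior(0.6, 0.01, [0.2, 0.1]))   # ≈ 0.3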

  8. Certainty Factors in MYCIN
     If:   the patient has signs and symptoms s1 ∧ … ∧ sk, and
           certain background conditions t1 ∧ … ∧ tm hold
     Then: conclude that the patient has disease di, with certainty τ
     Thus the degree of certainty associated with the conclusion is:
       CF(di, s1 ∧ … ∧ sk ∧ t1 ∧ … ∧ tm) = τ × min(CF(s1), …, CF(sk), CF(t1), …, CF(tm))

  9. Measures of Belief and Disbelief
     • If, by adding supporting evidence e, P(h | e) > P(h), then the measure of belief is:
       MB(h, e) = (P(h | e) − P(h)) / (1 − P(h))
     • If, on the other hand, e constitutes evidence against h, such that P(h | e) < P(h), then the measure of disbelief is:
       MD(h, e) = (P(h) − P(h | e)) / P(h)
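
     An illustrative Python sketch of these two measures, together with the combination CF(h, e) = MB(h, e) − MD(h, e) defined on the next slide; the probabilities in the example calls are made-up values.

        def mb(p_h, p_h_given_e):
            # Measure of belief: non-zero only when e supports h, i.e. P(h|e) > P(h).
            return (p_h_given_e - p_h) / (1.0 - p_h) if p_h_given_e > p_h else 0.0

        def md(p_h, p_h_given_e):
            # Measure of disbelief: non-zero only when e counts against h, i.e. P(h|e) < P(h).
            return (p_h - p_h_given_e) / p_h if p_h_given_e < p_h else 0.0

        def cf(p_h, p_h_given_e):
            # Certainty factor: CF(h, e) = MB(h, e) - MD(h, e)
            return mb(p_h, p_h_given_e) - md(p_h, p_h_given_e)

        print(cf(0.3, 0.6))   # supporting evidence:  (0.6 - 0.3) / (1 - 0.3) ≈  0.43
        print(cf(0.3, 0.1))   # opposing evidence:   -(0.3 - 0.1) / 0.3       ≈ -0.67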

  10. Certainty Factor Algebra
      • Either 1 > MB(h, e) > 0 while MD(h, e) = 0, or 1 > MD(h, e) > 0 while MB(h, e) = 0.
      • MB and MD constrain each other in that e is either for or against h. Once that distinction is established, they can be tied together by:
        CF(h, e) = MB(h, e) − MD(h, e)
      • As CF approaches 1, the evidence e more strongly supports hypothesis h; as CF approaches −1, the confidence against the hypothesis gets stronger.

  11. Combining CFs within a Rule
      • For conjunctive premises: CF(P1 and P2) = MIN(CF(P1), CF(P2))
      • For disjunctive premises: CF(P1 or P2) = MAX(CF(P1), CF(P2))
      • Example (see the sketch below): (P1 and P2) or P3 ⇒ R1 (0.7) and R2 (0.3)
        CF(P1 (0.6) and P2 (0.4)) = MIN(0.6, 0.4) = 0.4
        CF((P1 and P2) (0.4) or P3 (0.2)) = MAX(0.4, 0.2) = 0.4
        CF for R1 = 0.7 × 0.4 = 0.28
        CF for R2 = 0.3 × 0.4 = 0.12
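
     A short Python sketch of the within-rule combination above: MIN for conjunction, MAX for disjunction, then multiplication by each rule's attenuation factor. It simply reproduces the example numbers.

        cf_p1, cf_p2, cf_p3 = 0.6, 0.4, 0.2

        cf_and = min(cf_p1, cf_p2)        # CF(P1 and P2) = 0.4
        cf_premise = max(cf_and, cf_p3)   # CF((P1 and P2) or P3) = 0.4

        cf_r1 = 0.7 * cf_premise          # ≈ 0.28
        cf_r2 = 0.3 * cf_premise          # ≈ 0.12
        print(cf_r1, cf_r2)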

  12. Combining CFs from Different Rules
      • Assume that two rules support the same result R. Let R1 represent the result R from one rule and R2 the result R from the other. One of the following situations applies:
        – CF(R1) and CF(R2) are both positive:
          CF(R) = CF(R1) + CF(R2) − (CF(R1) × CF(R2))
        – CF(R1) and CF(R2) are both negative:
          CF(R) = CF(R1) + CF(R2) + (CF(R1) × CF(R2))
        – Otherwise:
          CF(R) = (CF(R1) + CF(R2)) / (1 − MIN(|CF(R1)|, |CF(R2)|))
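
     A minimal Python sketch of the three-case combination above; the example inputs are illustrative (the first call reuses the 0.28 and 0.12 from the previous slide).

        def combine(cf1, cf2):
            # Combine two certainty factors that support the same result R.
            if cf1 >= 0 and cf2 >= 0:
                return cf1 + cf2 - cf1 * cf2
            if cf1 < 0 and cf2 < 0:
                return cf1 + cf2 + cf1 * cf2
            return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

        print(combine(0.28, 0.12))   # ≈ 0.37: two positive rules reinforce each other
        print(combine(0.7, -0.4))    # ≈ 0.5:  conflicting evidence partially cancels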

  13. The Use of CFs in MYCIN
      • To guide the program in its reasoning
      • To cause the current goal to be deemed unpromising and pruned from the search space if its CF falls in the range [−0.2, +0.2]
      • To rank hypotheses after all the evidence has been considered

  14. CFs and Probabilities
      • Suppose that P(d1) = 0.8 and P(d1 | e) = 0.9, while P(d2) = 0.2 and P(d2 | e) = 0.8.
      • The increase in belief in d1 is:
        CF(d1, e) = MB(d1, e) − 0 = (P(d1 | e) − P(d1)) / (1 − P(d1)) = (0.9 − 0.8) / (1 − 0.8) = 0.5
      • The increase in belief in d2 is:
        CF(d2, e) = MB(d2, e) − 0 = (P(d2 | e) − P(d2)) / (1 − P(d2)) = (0.8 − 0.2) / (1 − 0.2) = 0.75
      • Thus CF(d1, e) < CF(d2, e) even though P(d1 | e) > P(d2 | e).
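
     A self-contained Python check of the comparison above, restating the MB formula from slide 9 (MD is 0 here because the evidence raises both posteriors):

        def mb(p_h, p_h_given_e):
            # MB(h, e) = (P(h|e) - P(h)) / (1 - P(h))
            return (p_h_given_e - p_h) / (1.0 - p_h)

        cf_d1 = mb(0.8, 0.9)   # ≈ 0.5
        cf_d2 = mb(0.2, 0.8)   # ≈ 0.75
        print(cf_d1 < cf_d2)   # True: CF(d1,e) < CF(d2,e) although P(d1|e) > P(d2|e)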

  15. Implementing CFs in CLIPS
      • Consider the following MYCIN rule:
        IF   the stain of the organism is gramneg, and
             the morphology of the organism is rod, and
             the patient is a compromised host
        THEN there is suggestive evidence (0.6) that the identity of the organism is pseudomonas
      • Represent facts as object-attribute-value (OAV) triples.
      • The OAV template allows two identical OAV triples to be asserted only if they have different CFs. To allow identical OAV triples with the same CF as well, use:
        (set-fact-duplication TRUE)

  16. Implementing CFs in CLIPS (contd.)
      CLIPS> (defmodule OAV (export deftemplate oav))
      CLIPS> (deftemplate OAV::oav
               (multislot object (type SYMBOL))
               (multislot attribute (type SYMBOL))
               (multislot value)
               (slot CF (type FLOAT) (range -1.0 +1.0)))
      CLIPS> (deffacts data
               (oav (object organism) (attribute stain)
                    (value gramneg) (CF 0.3))
               (oav (object organism) (attribute morphology)
                    (value rod) (CF 0.7))
               (oav (object patient) (attribute is a)
                    (value compromised host) (CF 0.8)))

  17. Combining CFs of Different Facts
      CLIPS> (defrule OAV::combine-CFs-both-positive
               (declare (auto-focus TRUE))
               ?fact1 <- (oav (object $?o) (attribute $?a) (value $?v)
                              (CF ?C1&:(>= ?C1 0)))
               ?fact2 <- (oav (object $?o) (attribute $?a) (value $?v)
                              (CF ?C2&:(>= ?C2 0)))
               (test (neq ?fact1 ?fact2))
               =>
               (retract ?fact1)
               (bind ?C3 (- (+ ?C1 ?C2) (* ?C1 ?C2)))
               (modify ?fact2 (CF ?C3)))

  18. Combining CFs within a Rule
      CLIPS> (defmodule IDENTIFY (import OAV deftemplate oav))
      CLIPS> (defrule IDENTIFY::MYCIN-to-CLIPS-translation
               (oav (object organism) (attribute stain)
                    (value gramneg) (CF ?C1))
               (oav (object organism) (attribute morphology)
                    (value rod) (CF ?C2))
               (oav (object patient) (attribute is a)
                    (value compromised host) (CF ?C3))
               (test (> (min ?C1 ?C2 ?C3) 0.2))
               =>
               (bind ?C4 (* (min ?C1 ?C2 ?C3) 0.6))
               (assert (oav (object organism) (attribute identity)
                            (value pseudomonas) (CF ?C4))))

  19. Classic Sets (Crisp Sets)
      • A crisp set A is characterized by a membership function f with f(X) = true if and only if X ∈ A.
      • Characterizing the set of cars capable of more than 150 mph:
        GT150(X) = true,  if CAR(X) and TOP-SPEED(X) > 150
                   false, otherwise
      • The set may also be written as: { X ∈ CAR | TOP-SPEED(X) > 150 }
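
     A tiny Python sketch of the crisp characteristic function above; the cars and top speeds in the example dictionary are hypothetical, not data from the chapter.

        def gt150(top_speed_mph):
            # Crisp membership: a car is in the set iff its top speed exceeds 150 mph.
            return top_speed_mph > 150

        cars = {"Porsche-944": 162, "BMW-316": 130, "Chevy-Nova": 105}   # illustrative figures
        fast_cars = {name for name, speed in cars.items() if gt150(speed)}
        print(fast_cars)   # {'Porsche-944'}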

  20. Fuzzy Sets
      • A fuzzy set is characterized by a membership function f whose values lie in [0,1]; the value denotes degree of membership, with 0 denoting that X is not a member and 1 denoting that X is definitely a member.
      • Car example: f_FAST(80) = 0 and f_FAST(180) = 1
      • f_FAST-CAR(X) = f_FAST(TOP-SPEED(X))
      • FAST-CAR = { (Porsche-944, 0.9), (BMW-316, 0.5), (Chevy-Nova, 0.1) }

  21. Fuzzy Logic
      • f_¬F(X) = 1 − f_F(X)
      • f_(F ∧ G)(X) = MIN(f_F(X), f_G(X))
      • f_(F ∨ G)(X) = MAX(f_F(X), f_G(X))
      • Examples:
        FAST-CAR(Porsche-944) = 0.9
        ¬FAST-CAR(Porsche-944) = 0.1
        PRETENTIOUS-CAR(Porsche-944) = 0.7
        FAST-CAR(Porsche-944) ∧ PRETENTIOUS-CAR(Porsche-944) = 0.7
        FAST-CAR(Porsche-944) ∧ ¬FAST-CAR(Porsche-944) = 0.1
        FAST-CAR(Porsche-944) ∨ ¬FAST-CAR(Porsche-944) = 0.9
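
     A brief Python sketch of the fuzzy connectives above (negation as 1 − x, MIN for ∧, MAX for ∨), reusing the membership degrees from the Porsche-944 example.

        def fuzzy_not(a):
            return 1.0 - a

        def fuzzy_and(a, b):
            return min(a, b)

        def fuzzy_or(a, b):
            return max(a, b)

        fast = 0.9          # FAST-CAR(Porsche-944)
        pretentious = 0.7   # PRETENTIOUS-CAR(Porsche-944)

        print(fuzzy_not(fast))                    # ≈ 0.1
        print(fuzzy_and(fast, pretentious))       # 0.7
        print(fuzzy_and(fast, fuzzy_not(fast)))   # ≈ 0.1 (not 0, unlike classical logic)
        print(fuzzy_or(fast, fuzzy_not(fast)))    # 0.9  (not 1, unlike classical logic)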
