

1. Estimation of Conflict and Decreasing of Ignorance in Dempster-Shafer Theory. Alexander Lepskiy, National Research University - Higher School of Economics, Moscow, Russia. The 1st International Conference on Information Technology and Quantitative Management (ITQM 2013), May 16-18, 2013, Suzhou, China.

2. Outline of presentation
   1. Theory of evidence: belief function and body of evidence; combining rules in Dempster-Shafer theory
   2. Changing of ignorance after application of combining rules: imprecision indices; index of decreasing of ignorance
   3. Conflict measure
   4. Studying the relation between the measure of conflict and the index of decreasing of ignorance: statistical analysis; theoretical analysis
   5. Summary and conclusion

3. Theory of evidence. Belief function and body of evidence

Let X be a finite universal set and 2^X the power set of X, and consider a belief measure g : 2^X → [0, 1]. In evidence theory a belief function g is defined by a set function m_g, called the basic probability assignment (bpa):

$$m_g : 2^X \to [0,1], \qquad m_g(\emptyset) = 0, \qquad \sum_{A \subseteq X} m_g(A) = 1.$$

Then

$$g(A) = \sum_{B \,:\, B \subseteq A} m_g(B).$$

Let the set of all belief measures on 2^X be denoted by Bel(X). The belief function g and its dual, the plausibility function $\bar g(A) = 1 - g(\bar A)$, are considered together in evidence theory. The basic probability assignment m_g can be computed from the belief function g with the help of the so-called Möbius transform of g:

$$m_g(B) = \sum_{A \,:\, A \subseteq B} (-1)^{|B \setminus A|}\, g(A).$$
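As a minimal illustration of these definitions (not part of the presentation), the sketch below encodes a bpa as a Python dict from focal sets to masses and computes the belief, the plausibility, and the Möbius transform; the universal set and the masses are made up for the example.

```python
from itertools import combinations

# Hypothetical body of evidence on X = {a, b, c}; the masses are illustrative only.
X = frozenset({"a", "b", "c"})
m = {frozenset({"a"}): 0.5, frozenset({"a", "b"}): 0.3, X: 0.2}

def bel(m, A):
    """Belief: g(A) = sum of m(B) over all focal elements B contained in A."""
    return sum(v for B, v in m.items() if B <= A)

def pl(m, A, X=X):
    """Plausibility, the dual of belief: pl(A) = 1 - g(X \\ A)."""
    return 1.0 - bel(m, X - A)

def mobius(g, X):
    """Recover the bpa from a belief function g via the Moebius transform:
    m_g(B) = sum over A subset of B of (-1)^{|B \\ A|} * g(A)."""
    subsets = [frozenset(c) for r in range(len(X) + 1) for c in combinations(sorted(X), r)]
    return {B: sum((-1) ** len(B - A) * g(A) for A in subsets if A <= B) for B in subsets}

# Round trip: the Moebius transform of bel recovers the original masses.
m_back = mobius(lambda A: bel(m, A), X)
assert abs(m_back[frozenset({"a", "b"})] - 0.3) < 1e-12
```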

4. Theory of evidence. Combining rules in Dempster-Shafer theory

A subset A ∈ 2^X is called a focal element if m(A) > 0. Let $\mathcal{A}$ be the set of all focal elements; then the pair $F = (\mathcal{A}, m)$ is called a body of evidence. Let $\mathcal{A}(g)$ denote the set of all focal elements and $F(g)$ the body of evidence associated with the belief function g. Suppose we have two bodies of evidence $(\mathcal{A}^{(1)}, m^{(1)})$ and $(\mathcal{A}^{(2)}, m^{(2)})$ defined on the set X. In general, a combining rule is an operator R : Bel(X) × Bel(X) → Bel(X).

Dempster's rule (1967):

$$m_D(A) = \frac{1}{1-K} \sum_{A_1 \cap A_2 = A} m^{(1)}(A_1)\, m^{(2)}(A_2), \quad A \neq \emptyset, \qquad m_D(\emptyset) = 0,$$
$$K = \sum_{A_1 \cap A_2 = \emptyset} m^{(1)}(A_1)\, m^{(2)}(A_2).$$

The value K characterizes the amount of conflict between the two sources of information.
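A short sketch of Dempster's rule under the same dict-of-frozensets encoding (an assumed representation, not from the slides); it returns both the combined bpa and the conflict K.

```python
def dempster(m1, m2):
    """Dempster's rule: conjunctive combination of two bpas, with the mass
    falling on the empty set (the conflict K) renormalised away."""
    q = {}
    for A1, v1 in m1.items():
        for A2, v2 in m2.items():
            A = A1 & A2
            q[A] = q.get(A, 0.0) + v1 * v2
    K = q.pop(frozenset(), 0.0)          # total mass on empty intersections
    if K >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {A: v / (1.0 - K) for A, v in q.items()}, K

# Illustrative bodies of evidence on X = {x, y}; the masses are made up.
m1 = {frozenset({"x"}): 0.7, frozenset({"x", "y"}): 0.3}
m2 = {frozenset({"y"}): 0.6, frozenset({"x", "y"}): 0.4}
m_D, K = dempster(m1, m2)                # here K = 0.7 * 0.6 = 0.42
```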

5. Theory of evidence. Combining rules in Dempster-Shafer theory

Discount rule (Shafer, 1976):

$$m^{\alpha}(A) = (1-\alpha)\, m(A), \quad A \neq X; \qquad m^{\alpha}(X) = \alpha + (1-\alpha)\, m(X).$$

Dempster's rule is applied after discounting. The coefficient α ∈ [0, 1] characterizes the degree of reliability of the source of information: if α = 0 the source is absolutely reliable; if α = 1 the source is absolutely unreliable.

Yager's modified Dempster's rule (1987):

$$q(A) = \sum_{A_1 \cap A_2 = A} m^{(1)}(A_1)\, m^{(2)}(A_2), \quad A \in 2^X,$$
$$m_Y(A) = q(A), \quad A \neq \emptyset, X; \qquad m_Y(\emptyset) = 0; \qquad m_Y(X) = q(X) + q(\emptyset), \quad \text{where } q(\emptyset) = K.$$

The value q(X) = m^{(1)}(X) m^{(2)}(X) characterizes the amount of ignorance in the two bodies of evidence $(\mathcal{A}^{(1)}, m^{(1)})$ and $(\mathcal{A}^{(2)}, m^{(2)})$.
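The following sketch (same assumed encoding of bpas as dicts of frozensets) implements the discount rule and Yager's rule, where the conflict q(∅) is moved onto the whole frame X instead of being renormalised away.

```python
def discount(m, alpha, X):
    """Shafer's discounting: scale every mass by (1 - alpha) and move the
    remaining mass alpha onto the whole frame X."""
    out = {A: (1.0 - alpha) * v for A, v in m.items() if A != X}
    out[X] = alpha + (1.0 - alpha) * m.get(X, 0.0)
    return out

def yager(m1, m2, X):
    """Yager's rule: conjunctive combination in which the conflict q(empty set)
    is reassigned to the frame X instead of being used for renormalisation."""
    q = {}
    for A1, v1 in m1.items():
        for A2, v2 in m2.items():
            A = A1 & A2
            q[A] = q.get(A, 0.0) + v1 * v2
    K = q.pop(frozenset(), 0.0)
    q[X] = q.get(X, 0.0) + K             # the ignorance term absorbs the conflict
    return q
```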

6. Theory of evidence. Combining rules in Dempster-Shafer theory

Inagaki's unified combination rule (1991):

$$m_I(A) = q(A)\,(1 + k\,q(\emptyset)), \quad A \neq \emptyset, X,$$
$$m_I(X) = q(X)\,(1 + k\,q(\emptyset)) + q(\emptyset)\,(1 + k\,q(\emptyset) - k),$$

where 0 ≤ k ≤ 1/(1 − q(∅) − q(X)). If k = 0 we obtain Yager's rule; if k = 1/(1 − q(∅)) we obtain Dempster's rule.

Zhang's center combination rule (1994):

$$m_Z(A) = \sum_{A_1 \cap A_2 = A} r(A_1, A_2)\, m^{(1)}(A_1)\, m^{(2)}(A_2), \quad A \in 2^X,$$

where r(A_1, A_2) is a measure of intersection of the sets A_1 and A_2, for example $r(A_1, A_2) = c\,\frac{|A_1 \cap A_2|}{|A_1 \cup A_2|}$, a scaled Jaccard similarity coefficient (c is a normalizing constant).

Dubois and Prade's disjunctive consensus rule (1992):

$$m_{DP}(A) = \sum_{A_1 \cup A_2 = A} m^{(1)}(A_1)\, m^{(2)}(A_2), \quad A \in 2^X.$$
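For completeness, hedged sketches of two of these rules under the same assumed encoding: Dubois and Prade's disjunctive consensus and Zhang's centre rule with the Jaccard coefficient. Taking the constant c to be whatever factor renormalises the result to total mass one is an assumption made for this sketch.

```python
def dubois_prade(m1, m2):
    """Disjunctive consensus: masses flow to unions, so no conflict can arise."""
    out = {}
    for A1, v1 in m1.items():
        for A2, v2 in m2.items():
            A = A1 | A2
            out[A] = out.get(A, 0.0) + v1 * v2
    return out

def zhang(m1, m2):
    """Zhang's centre rule with r(A1, A2) = |A1 & A2| / |A1 | A2| (Jaccard);
    the constant c is realised here as a final renormalisation."""
    out = {}
    for A1, v1 in m1.items():
        for A2, v2 in m2.items():
            A = A1 & A2
            if A:                                    # empty intersections get weight 0
                r = len(A) / len(A1 | A2)
                out[A] = out.get(A, 0.0) + r * v1 * v2
    total = sum(out.values())
    return {A: v / total for A, v in out.items()}
```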

7. Quantity of information ignorance. Measure of uncertainty

Suppose we know only that the "true" alternative lies in a nonempty set B ⊆ X. This situation can be described by the non-additive measure (the so-called primitive belief function)

$$\eta_{\langle B \rangle}(A) = \begin{cases} 1, & B \subseteq A, \\ 0, & B \not\subseteq A, \end{cases} \qquad A \subseteq X, \ B \neq \emptyset.$$

Hartley's measure $H(\eta_{\langle B \rangle}) = \log_2 |B|$ characterizes the degree of imprecision of the information about where the "true" alternative lies. Let $g = \sum_{B \in 2^X \setminus \{\emptyset\}} m_g(B)\, \eta_{\langle B \rangle}$ be a belief function. Then the generalized Hartley measure is defined by

$$GH(g) = \sum_{B \in 2^X \setminus \{\emptyset\}} m_g(B) \log_2 |B|.$$
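A small sketch of the generalized Hartley measure under the same encoding; the example bpa is hypothetical.

```python
from math import log2

def generalized_hartley(m):
    """GH(g) = sum over focal B of m(B) * log2|B|: the mass-weighted
    imprecision of the focal elements."""
    return sum(v * log2(len(B)) for B, v in m.items() if B)

# With a single focal element, GH reduces to Hartley's measure log2|B|;
# for a probability measure (only singleton focal elements) it equals 0.
m = {frozenset({"a", "b"}): 0.6, frozenset({"a"}): 0.4}
print(generalized_hartley(m))   # 0.6 * log2(2) = 0.6
```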

8. Quantity of information ignorance. Imprecision indices

Definition 1. A functional f : Bel(X) → [0, 1] is called an imprecision index if the following conditions are fulfilled:
   1. if g is a probability measure then f(g) = 0;
   2. f(g_1) ≥ f(g_2) for all g_1, g_2 ∈ Bel(X) such that g_1 ≤ g_2;
   3. $f(\eta_{\langle X \rangle}) = 1$.

An imprecision index f on Bel(X) is called linear if for any linear combination $\sum_{j=1}^{k} \alpha_j g_j \in Bel(X)$, $g_j \in Bel(X)$, j = 1, ..., k, we have

$$f\Big(\sum_{j=1}^{k} \alpha_j g_j\Big) = \sum_{j=1}^{k} \alpha_j f(g_j).$$

9. Quantity of information ignorance

Proposition 1. A functional f : Bel(X) → [0, 1] is a linear imprecision index on Bel(X) iff

$$f(g) = \sum_{B \in 2^X \setminus \{\emptyset\}} m_g(B)\, \mu_f(B),$$

where the set function µ_f satisfies the conditions:
   1. µ_f({x}) = 0 for any x ∈ X;
   2. $\mu_f(X) = f(\eta_{\langle X \rangle}) = 1$;
   3. µ_f is a monotone set function, i.e. µ_f(B′) ≤ µ_f(B″) if B′ ⊆ B″.

Suppose we have two bodies of evidence $F(g_1) = (\mathcal{A}^{(1)}, m^{(1)})$ and $F(g_2) = (\mathcal{A}^{(2)}, m^{(2)})$ corresponding to the belief functions g_1 and g_2. Let f : Bel(X) → [0, 1] be some linear imprecision index that estimates the degree of ignorance contained in the measure g. Suppose we apply some combining rule R to the evidence F(g_1) and F(g_2), obtaining a new belief function g = R(g_1, g_2). The question is then by how much the ignorance decreases after the combining rule R is applied.
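A sketch of the representation in Proposition 1: a linear imprecision index is a mass-weighted sum of a set function µ_f. The particular µ_f below, the Hartley weight normalised by log2|X|, is just one admissible choice and is assumed here purely for illustration.

```python
from math import log2

def linear_imprecision_index(m, mu):
    """f(g) = sum over focal B of m_g(B) * mu_f(B), with mu_f vanishing on
    singletons, equal to 1 on X, and monotone (Proposition 1)."""
    return sum(v * mu(B) for B, v in m.items() if B)

# One admissible mu_f (an assumption for the example): normalised Hartley weight.
X = frozenset({"a", "b", "c", "d"})
mu = lambda B: log2(len(B)) / log2(len(X))

m = {frozenset({"a"}): 0.5, frozenset({"a", "b"}): 0.3, X: 0.2}
f_g = linear_imprecision_index(m, mu)   # 0.3 * 0.5 + 0.2 * 1.0 = 0.35
```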

10. Quantity of information ignorance. Index of decreasing of ignorance

The degree of this decrease can be estimated by comparing f(g) with f(g_1) and f(g_2). For example, we may introduce the following indices of decreasing of ignorance:

$$I_R(g_i \mid g_j) = f(g_i) - f(R(g_i, g_j)), \quad i, j \in \{1, 2\},$$
$$I_R(g_1, g_2) = \min\{ I_R(g_1 \mid g_2),\ I_R(g_2 \mid g_1) \}.$$

Ignorance decreases exactly when the index I_R(g_1, g_2) is positive.
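The two indices translate directly into code. The sketch below assumes a combining rule `combine` that maps two bpas to a combined bpa and an imprecision index `f` that maps a bpa to a number; both names are placeholders rather than anything defined in the presentation.

```python
def ignorance_decrease(m1, m2, combine, f):
    """I_R(g_i | g_j) = f(g_i) - f(R(g_i, g_j)), and
    I_R(g_1, g_2) = min of the two one-sided indices.
    Ignorance decreases under the rule exactly when the result is > 0."""
    m = combine(m1, m2)
    i_12 = f(m1) - f(m)
    i_21 = f(m2) - f(m)
    return min(i_12, i_21)
```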

11. Quantity of information ignorance. Some particular cases of evidence: consensual evidences

Let $\mathcal{A}^{(1)}$ and $\mathcal{A}^{(2)}$ be two sets of focal elements satisfying the conditions:
   1. A′ ∩ A″ = ∅ and B′ ∩ B″ = ∅ for all distinct A′, A″ ∈ $\mathcal{A}^{(1)}$ and all distinct B′, B″ ∈ $\mathcal{A}^{(2)}$;
   2. for every A ∈ $\mathcal{A}^{(1)}$ there exists a unique B ∈ $\mathcal{A}^{(2)}$ such that A ∩ B ≠ ∅;
   3. for every B ∈ $\mathcal{A}^{(2)}$ there exists a unique A ∈ $\mathcal{A}^{(1)}$ such that A ∩ B ≠ ∅.

We call this situation "consensual evidences". In this case there is a one-to-one correspondence φ between the elements of the sets $\mathcal{A}^{(1)}$ and $\mathcal{A}^{(2)}$.

[Slide figure: focal elements A_1, ..., A_k of the first body paired with the overlapping focal elements B_1, ..., B_k of the second.]

12. Quantity of information ignorance. Some particular cases of evidence: clarifying evidences

If two bodies of evidence satisfy conditions 1)-3) and the additional condition
   4. A ⊆ φ(A) for all A ∈ $\mathcal{A}^{(1)}$,
then we call this situation "clarifying evidences".

[Slide figure: each focal element A_i of the first body contained in the corresponding focal element B_i = φ(A_i) of the second.]

13. Quantity of information ignorance. Decreasing of ignorance: Dempster's rule

Proposition 2. Let $F(g_1) = (\mathcal{A}^{(1)}, m^{(1)})$ and $F(g_2) = (\mathcal{A}^{(2)}, m^{(2)})$ be two bodies of evidence satisfying conditions 1)-3). Then I_D(g_1, g_2) > 0 if

$$\sum_{A \in \mathcal{A}^{(1)}} m^{(1)}(A)\, m^{(2)}(\varphi(A)) \;>\; \max_{A \in \mathcal{A}^{(1)}} \left\{ \mu_f(A \cap \varphi(A)) \max\left\{ \frac{m^{(1)}(A)}{\mu_f(\varphi(A))},\ \frac{m^{(2)}(\varphi(A))}{\mu_f(A)} \right\} \right\}.$$

Corollary 1. Let two bodies of evidence $F(g_1) = (\mathcal{A}^{(1)}, m^{(1)})$ and $F(g_2) = (\mathcal{A}^{(2)}, m^{(2)})$ satisfy conditions 1)-4). Then I_D(g_1, g_2) > 0 if the following condition is true:

$$\sum_{A \in \mathcal{A}^{(1)}} m^{(1)}(A)\, m^{(2)}(\varphi(A)) \;>\; \max_{A \in \mathcal{A}^{(1)}} \max\left\{ m^{(1)}(A)\, \frac{\mu_f(A)}{\mu_f(\varphi(A))},\ m^{(2)}(\varphi(A)) \right\}.$$
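As a small end-to-end illustration of the consensual case (the frame X, the focal elements, the masses, and the normalised-Hartley choice of µ_f are all assumptions made for this example, not data from the presentation), the sketch below combines two consensual bodies of evidence with Dempster's rule and checks that the imprecision index of the result is smaller than that of either source, i.e. I_D(g_1, g_2) > 0.

```python
from math import log2

X = frozenset("abcdefgh")
mu = lambda B: log2(len(B)) / log2(len(X))          # normalised Hartley weight

def f(m):
    """Linear imprecision index f(g) = sum over focal B of m(B) * mu_f(B)."""
    return sum(v * mu(B) for B, v in m.items() if B)

def dempster(m1, m2):
    """Dempster's rule (conflict renormalised away), as on slide 4."""
    q = {}
    for A1, v1 in m1.items():
        for A2, v2 in m2.items():
            A = A1 & A2
            q[A] = q.get(A, 0.0) + v1 * v2
    K = q.pop(frozenset(), 0.0)
    return {A: v / (1.0 - K) for A, v in q.items()}

# Consensual evidences: focal elements are pairwise disjoint within each body,
# and each focal element of one body meets exactly one focal element of the other.
m1 = {frozenset("abcd"): 0.8, frozenset("gh"): 0.2}
m2 = {frozenset("cdef"): 0.8, frozenset("gh"): 0.2}

m_D = dempster(m1, m2)
I_D = min(f(m1) - f(m_D), f(m2) - f(m_D))
print(f(m1), f(m2), f(m_D), I_D)   # 0.6, 0.6, 0.333..., so I_D = 0.266... > 0
```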
