How the Concept of Information Can Be Extended to Intervals, P-Boxes, and More General Uncertainty

Vladik Kreinovich and Gang Xiang
Pan-American Center for Earth and Environmental Studies
University of Texas at El Paso, El Paso, TX 79968, USA
vladik@cs.utep.edu

Scott Ferson
Applied Biomathematics, 100 North Country Road
Setauket, NY 11733, USA
scott@ramas.com

1. Types of Uncertainty: In Brief

• Problem: the measurement result (estimate) x̃ differs from the actual value x.
• Probabilistic uncertainty: we know which values of ∆x = x̃ − x are possible; we also know the frequency of each value, i.e., we know F(t) ≝ Prob(x ≤ t).
• Interval uncertainty: we only know an upper bound ∆ on |∆x|; then x ∈ [x̃ − ∆, x̃ + ∆].
• p-boxes: for every t, we only know an interval [F̲(t), F̄(t)] containing F(t).
• Fuzzy uncertainty: we may also have expert estimates that provide better bounds on ∆x and on F(t), but with limited confidence.
• A nested family of intervals corresponding to different levels of certainty forms a fuzzy number.
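The interval and p-box representations above can be illustrated in code. Below is a minimal sketch; all names (`x_est`, `delta`, `cdf_lower`, `cdf_upper`) and the example bounds are our own illustrations, not anything from the slides.

```python
x_est = 1.52      # measurement result (estimate)
delta = 0.05      # known upper bound on |x_est - x| (interval uncertainty)

# Interval uncertainty: the actual value x lies in [x_est - delta, x_est + delta].
interval = (x_est - delta, x_est + delta)

# p-box: for each t we keep bounds [cdf_lower(t), cdf_upper(t)] guaranteed to
# contain the unknown cdf value F(t).  Here: a cdf known only to lie between
# two shifted uniform cdfs (a made-up example).
def cdf_lower(t):
    return min(max(t - 0.1, 0.0), 1.0)

def cdf_upper(t):
    return min(max(t + 0.1, 0.0), 1.0)

def contains(t, f_value):
    """Check that a candidate cdf value is consistent with the p-box."""
    return cdf_lower(t) <= f_value <= cdf_upper(t)

print(interval)
print(contains(0.5, 0.5))   # F(0.5) = 0.5 fits inside [0.4, 0.6]
```

A fuzzy number would then be a nested family of such intervals, one per confidence level.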

2. Need to Compare Different Types of Uncertainty

• Problem: often, there is a need to compare different types of uncertainty.
• Example: we have two sensors:
  – one with a smaller bound on the systematic (interval) component of the measurement error,
  – the other with a smaller bound on the standard deviation of the random component of the measurement error.
• Question: if we can only afford one of these sensors, which one should we buy?
• Question: which of the two sensors brings us more information about the measured signal?
• Problem: to gauge the amount of information.

3. Traditional Amount of Information: Brief Reminder

• Shannon's idea: the (average) number of "yes"-"no" (binary) questions that we need to ask to determine the object.
• Fact: after q binary questions, we can distinguish 2^q possible results.
• Discrete case: if we have n alternatives, we need q questions, where 2^q ≥ n, i.e., q ∼ log₂(n).
• Discrete probability distribution: q = −∑ p_i · log₂(p_i).
• Continuous case – definition: the number of questions needed to find the object with a given accuracy ε.
• Interval uncertainty: if x ∈ [a, b], then q ∼ S − log₂(ε), with S = log₂(b − a).
• Probabilistic uncertainty: S = −∫ ρ(x) · log₂(ρ(x)) dx.
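The discrete and interval formulas above are easy to check numerically. A minimal sketch (the function names are ours):

```python
import math

def discrete_info(probs):
    """Average number of yes/no questions: -sum p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def interval_questions(a, b, eps):
    """Questions to locate x in [a, b] to accuracy eps: S - log2(eps), S = log2(b - a)."""
    return math.log2(b - a) - math.log2(eps)

# Four equally likely alternatives take exactly 2 questions:
print(discrete_info([0.25, 0.25, 0.25, 0.25]))  # 2.0

# Locating x in [0, 8] to accuracy 1 takes log2(8) - log2(1) = 3 questions:
print(interval_questions(0.0, 8.0, 1.0))        # 3.0
```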

4. How to Extend These Formulas to p-Boxes etc.

• Problem: extend the formulas for the amount of information to more general types of uncertainty.
• Axiomatic approach – idea:
  – find properties of information;
  – look for generalizations that satisfy as many of these properties as possible.
• Problem: sometimes, there are several possible generalizations.
• Which generalization should we choose?
• Our idea: define information as the worst-case average number of questions.

5. Shannon's Derivation: Reminder

• Situation: we know the probabilities p_1, ..., p_n of the different alternatives.
• We repeat the selection N times.
• Let N_i be the number of times when we get the alternative A_i.
• For large N, the value N_i is approximately normally distributed with mean a = p_i · N and standard deviation σ = √(p_i · (1 − p_i) · N).
• With certainty depending on k_0, we conclude that N_i ∈ [a − k_0 · σ, a + k_0 · σ].
• Let N_cons(N) be the number of situations for which every N_i is within these intervals.
• Then, for N repetitions, we need q(N) = log₂(N_cons) questions.
• Per repetition, we need S = q(N)/N questions.
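The claim that, with high certainty, each count N_i stays within k_0 standard deviations of p_i · N can be checked by simulation. A small sketch; the function name and parameter values are our own:

```python
import math
import random

random.seed(0)

def fraction_in_bounds(probs, N, k0, trials=200):
    """Fraction of simulated runs in which every count N_i lies within
    [p_i*N - k0*sigma_i, p_i*N + k0*sigma_i], sigma_i = sqrt(p_i*(1-p_i)*N)."""
    n = len(probs)
    ok = 0
    for _ in range(trials):
        counts = [0] * n
        for _ in range(N):
            r = random.random()
            acc = 0.0
            for i, p in enumerate(probs):
                acc += p
                if r < acc:
                    counts[i] += 1
                    break
            else:
                counts[-1] += 1  # guard against float rounding in the cumulative sum
        if all(abs(counts[i] - probs[i] * N)
               <= k0 * math.sqrt(probs[i] * (1 - probs[i]) * N)
               for i in range(n)):
            ok += 1
    return ok / trials

# With k0 = 3, almost all runs keep every N_i within three sigma of p_i * N:
print(fraction_in_bounds([0.5, 0.3, 0.2], N=1000, k0=3.0))
```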

6. Shannon's Derivation (cont-d)

• Shannon's theorem: S → −∑ p_i · log₂(p_i).
• Proof:

  N_cons ∼ N!/(N_1! · (N − N_1)!) · (N − N_1)!/(N_2! · (N − N_1 − N_2)!) · ... = N!/(N_1! · N_2! · ... · N_n!),

  where k! ∼ (k/e)^k. So,

  N_cons ∼ (N/e)^N / ((N_1/e)^(N_1) · ... · (N_n/e)^(N_n)).

  Since ∑ N_i = N, the terms e^N and e^(N_i) cancel each other.
• Substituting N_i = N · f_i and taking logarithms, we get

  log₂(N_cons) ≈ −N · f_1 · log₂(f_1) − ... − N · f_n · log₂(f_n).
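The limit log₂(N_cons)/N → −∑ p_i · log₂(p_i) can be verified numerically by computing the multinomial coefficient exactly (via `math.lgamma`) and comparing the per-repetition question count with the entropy; the helper names are ours:

```python
import math

def log2_multinomial(counts):
    """log2 of N! / (N_1! * ... * N_n!), computed via lgamma (log-Gamma)."""
    N = sum(counts)
    lg = math.lgamma(N + 1) - sum(math.lgamma(c + 1) for c in counts)
    return lg / math.log(2)

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [0.5, 0.3, 0.2]
for N in (100, 1000, 10000):
    counts = [round(p * N) for p in probs]
    per_repetition = log2_multinomial(counts) / N
    # per-repetition questions approach the entropy as N grows
    print(N, per_repetition, entropy(probs))
```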

7. Case of a Continuous Probability Distribution

• Once an approximate value r is determined, the possible actual values of x form an interval [r − ε, r + ε] of width 2ε.
• So, we divide the real line into intervals [x_i, x_{i+1}] of width 2ε and find the interval that contains x.
• The average number of questions is S = −∑ p_i · log₂(p_i), where the probability p_i that x ∈ [x_i, x_{i+1}] is p_i ≈ 2ε · ρ(x_i).
• So, for small ε, we have

  S = −∑ ρ(x_i) · log₂(ρ(x_i)) · 2ε − ∑ ρ(x_i) · 2ε · log₂(2ε),

  where the first sum is the integral sum for the integral S(ρ) ≝ −∫ ρ(x) · log₂(ρ(x)) dx, so

  S ≈ −∫ ρ(x) · log₂(ρ(x)) dx − log₂(2ε).
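The relation S ≈ S(ρ) − log₂(2ε) can be checked numerically for a simple density. A sketch, assuming a uniform density on [0, 4] (so S(ρ) = log₂(4) = 2 bits); the function names are ours:

```python
import math

def discretized_entropy(rho, a, b, eps):
    """Entropy of the discretization of density rho on [a, b] into bins of
    width 2*eps, with p_i ~ 2*eps * rho(x_i) as on the slide."""
    width = 2 * eps
    n = round((b - a) / width)
    ps = [width * rho(a + (i + 0.5) * width) for i in range(n)]
    total = sum(ps)
    ps = [p / total for p in ps]   # renormalize the midpoint approximation
    return -sum(p * math.log2(p) for p in ps if p > 0)

rho = lambda x: 0.25               # uniform density on [0, 4]
eps = 0.01
S_discrete = discretized_entropy(rho, 0.0, 4.0, eps)
S_differential = 2.0               # -integral of rho*log2(rho) = log2(4)
print(S_discrete, S_differential - math.log2(2 * eps))  # the two should agree
```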

8. Partial Information about Probability Distribution

• Ideal case: complete information about the probabilities p = (p_1, ..., p_n) of the different alternatives.
• In practice: often, we only have partial information about these probabilities, i.e., we only know the set P of possible values of p.
• Convexity of P: if it is possible to have p ∈ P and p′ ∈ P, then it is also possible that we have p with some probability α and p′ with probability 1 − α.
• Definition: by the entropy S(P) of a probabilistic knowledge P, we mean the largest possible entropy among all distributions p ∈ P: S(P) ≝ max_{p ∈ P} S(p).
• Proposition: when N → ∞, the average number of questions tends to S(P).
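For a simple concrete P — all distributions (p_1, 1 − p_1) with p_1 restricted to an interval — S(P) can be found by brute force. A sketch with our own function names; for two alternatives, the maximum is at p_1 = 1/2 when the interval contains it, and at the nearest endpoint otherwise:

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def max_entropy_two_alternatives(p1_low, p1_high, steps=100000):
    """Brute-force S(P) = max entropy over P = {(p1, 1-p1) : p1 in [p1_low, p1_high]}."""
    best = -1.0
    for i in range(steps + 1):
        p1 = p1_low + (p1_high - p1_low) * i / steps
        best = max(best, entropy([p1, 1 - p1]))
    return best

# If 0.5 is inside the interval, the maximum is the uniform distribution (~1 bit):
print(max_entropy_two_alternatives(0.3, 0.7))   # ~1.0

# If the whole interval lies below 0.5, the maximum is at the right endpoint:
print(max_entropy_two_alternatives(0.1, 0.3))   # entropy([0.3, 0.7]) ~ 0.881
```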
