
CS70: Jean Walrand: Lecture 36. Review: CDF and PDF. Expectation



1. CS70: Jean Walrand: Lecture 36. Continuous Probability 3

Outline:
1. Review: CDF, PDF
2. Review: Expectation
3. Review: Independence
4. Meeting at a Restaurant
5. Breaking a Stick
6. Maximum of Exponentials
7. Quantization Noise
8. Replacing Light Bulbs
9. Expected Squared Distance
10. Geometric and Exponential

Review: CDF and PDF.
Key idea: For a continuous RV, Pr[X = x] = 0 for all x ∈ ℝ. Examples: uniform in [0, 1]; throwing a dart at a target. Thus, one cannot define Pr[outcome] first and then Pr[event]. Instead, one starts by defining Pr[event].
Thus, one defines Pr[X ∈ (−∞, x]] = Pr[X ≤ x] =: F_X(x), x ∈ ℝ. Then, one defines f_X(x) := (d/dx) F_X(x). Hence, f_X(x) ε ≈ Pr[X ∈ (x, x + ε)].
F_X(·) is the cumulative distribution function (CDF) of X.
f_X(·) is the probability density function (PDF) of X.

Review: Expectation.
Definitions:
(a) The expectation of a random variable X with pdf f_X(x) is defined as
E[X] = ∫_{−∞}^{∞} x f_X(x) dx.
(b) The expectation of a function of a random variable is defined as
E[h(X)] = ∫_{−∞}^{∞} h(x) f_X(x) dx.
(c) The expectation of a function of multiple random variables X = (X_1, ..., X_n) is defined as
E[h(X)] = ∫ ··· ∫ h(x) f_X(x) dx_1 ··· dx_n.
Justification: think of the discrete approximations of the continuous RVs.

Review: Independence.
Definition: The continuous RVs X and Y are independent if
Pr[X ∈ A, Y ∈ B] = Pr[X ∈ A] Pr[Y ∈ B], ∀ A, B.
Theorem: The continuous RVs X and Y are independent if and only if
f_{X,Y}(x, y) = f_X(x) f_Y(y).
Proof: As in the discrete case.
Definition: The continuous RVs X_1, ..., X_n are mutually independent if
Pr[X_1 ∈ A_1, ..., X_n ∈ A_n] = Pr[X_1 ∈ A_1] ··· Pr[X_n ∈ A_n], ∀ A_1, ..., A_n.
Theorem: The continuous RVs X_1, ..., X_n are mutually independent if and only if
f_X(x_1, ..., x_n) = f_{X_1}(x_1) ··· f_{X_n}(x_n).
Proof: As in the discrete case.

Meeting at a Restaurant.
Two friends go to a restaurant independently, uniformly at random between noon and 1pm. They agree they will wait for 10 minutes. What is the probability they meet?
Here, (X, Y) are the times when the friends reach the restaurant. The shaded area in the slide's figure is the set of pairs with |X − Y| < 1/6, i.e., such that they meet. The complement is the union of two triangles; when you put them together, they form a square with sides 5/6.
Thus, Pr[meet] = 1 − (5/6)² = 11/36.

Breaking a Stick.
You break a stick at two points chosen independently, uniformly at random. What is the probability you can make a triangle with the three pieces?
Let X, Y be the two break points along the [0, 1] stick. You can make a triangle with pieces A, B, C if A < B + C, B < A + C, and C < A + B. If X < Y, this means X < 0.5, Y < X + 0.5, Y > 0.5; this is the blue triangle in the figure. If X > Y, we get the red triangle, by symmetry.
Thus, Pr[make triangle] = 1/4.
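The 11/36 answer to the restaurant problem is easy to sanity-check by simulation. A minimal sketch (the function name `meet_probability` is mine, not from the lecture):

```python
import random

# Monte Carlo check of the restaurant problem: the friends meet iff
# their arrival times, uniform in [0, 1] hour, differ by < 10 min = 1/6 h.
def meet_probability(trials=200_000, seed=0):
    rng = random.Random(seed)
    meets = sum(abs(rng.random() - rng.random()) < 1 / 6
                for _ in range(trials))
    return meets / trials

print(meet_probability())   # close to 11/36 ≈ 0.3056
```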

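The stick-breaking answer of 1/4 can be checked the same way. A sketch (helper name is mine); it uses the equivalent condition that a triangle exists iff every piece is shorter than 1/2:

```python
import random

# Break the unit stick at two uniform points and test the triangle
# inequality on the three pieces A, B, C.
def triangle_probability(trials=200_000, seed=0):
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        x, y = sorted((rng.random(), rng.random()))
        a, b, c = x, y - x, 1 - y
        # Each piece must be shorter than the sum of the other two,
        # i.e. every piece must be shorter than 1/2.
        if max(a, b, c) < 0.5:
            ok += 1
    return ok / trials

print(triangle_probability())   # close to 1/4
```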
2. Maximum of Two Exponentials

Let X = Expo(λ) and Y = Expo(µ) be independent. Define Z = max{X, Y}. Calculate E[Z].
We compute f_Z, then integrate. One has
Pr[Z < z] = Pr[X < z, Y < z] = Pr[X < z] Pr[Y < z] = (1 − e^{−λz})(1 − e^{−µz}) = 1 − e^{−λz} − e^{−µz} + e^{−(λ+µ)z}.
Thus,
f_Z(z) = λ e^{−λz} + µ e^{−µz} − (λ+µ) e^{−(λ+µ)z}, ∀ z > 0.
Hence,
E[Z] = ∫_0^∞ z f_Z(z) dz = 1/λ + 1/µ − 1/(λ+µ).

Maximum of n i.i.d. Exponentials

Let X_1, ..., X_n be i.i.d. Expo(1). Define Z = max{X_1, X_2, ..., X_n}. Calculate E[Z].
We use a recursion. The key idea is as follows:
Z = min{X_1, ..., X_n} + V,
where V is the maximum of n − 1 i.i.d. Expo(1). This follows from the memoryless property of the exponential.
Let A_n = E[Z]. We see that
A_n = E[min{X_1, ..., X_n}] + A_{n−1} = 1/n + A_{n−1},
because the minimum of exponentials is exponential with the sum of the rates.
Hence, E[Z] = A_n = 1 + 1/2 + ··· + 1/n = H(n).

Quantization Noise

In digital video and audio, one represents a continuous value by a finite number of bits. This introduces an error perceived as noise: the quantization noise. What is the power of that noise?
Model: X = U[0, 1] is the continuous value. Y is the closest multiple of 2^{−n} to X. Thus, we can represent Y with n bits. The error is Z := X − Y. The power of the noise is E[Z²].
Analysis: Z is uniform in [−a, a] with a = 2^{−(n+1)}. Thus,
E[Z²] = a²/3 = (1/3) 2^{−2(n+1)}.
The power of the signal X is E[X²] = 1/3.
The signal-to-noise ratio (SNR) is the power of the signal divided by the power of the noise. Thus,
SNR = E[X²]/E[Z²] = 2^{2(n+1)}.
Expressed in decibels, one has
SNR(dB) = 10 log₁₀(SNR) = 20(n+1) log₁₀(2) ≈ 6(n+1).
For instance, if n = 16, then SNR(dB) ≈ 102 dB.

Replacing Light Bulbs

Say that light bulbs have i.i.d. Expo(1) lifetimes. We turn a light on, and replace it as soon as it burns out. How many light bulbs do we need to replace in t units of time?
Theorem: The number X_t of replaced light bulbs is P(t), i.e., Poisson with mean t. That is,
Pr[X_t = n] = (tⁿ/n!) e^{−t}.
Proof: We see how X_t increases over the next ε ≪ 1 time units. Let A be the event that a bulb burns out during [t, t + ε]. Then,
Pr[X_{t+ε} = n] ≈ Pr[X_t = n, Aᶜ] + Pr[X_t = n − 1, A]
= Pr[X_t = n] Pr[Aᶜ] + Pr[X_t = n − 1] Pr[A]
≈ Pr[X_t = n](1 − ε) + Pr[X_t = n − 1] ε.
Hence, g(n, t) := Pr[X_t = n] is such that
g(n, t + ε) ≈ g(n, t) − g(n, t) ε + g(n − 1, t) ε.
Subtracting g(n, t), dividing by ε, and letting ε → 0, one gets
g′(n, t) = −g(n, t) + g(n − 1, t).
You can check that these equations are solved by g(n, t) = (tⁿ/n!) e^{−t}. Indeed, then
g′(n, t) = (t^{n−1}/(n−1)!) e^{−t} − g(n, t) = g(n − 1, t) − g(n, t).
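The recursion's conclusion E[max{X_1, ..., X_n}] = H(n) can be checked numerically. A sketch (function name is mine), compared against the harmonic number:

```python
import random

# Estimate E[max of n i.i.d. Expo(1)] by simulation; the lecture's
# recursion says it equals H(n) = 1 + 1/2 + ... + 1/n.
def mean_max_expo(n, trials=100_000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += max(rng.expovariate(1.0) for _ in range(n))
    return total / trials

n = 5
harmonic = sum(1 / k for k in range(1, n + 1))
print(mean_max_expo(n), harmonic)   # both close to H(5) ≈ 2.283
```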

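The Poisson claim for the light-bulb process is also easy to test empirically: sum i.i.d. Expo(1) lifetimes until they exceed t and count the replacements. A sketch (names are mine, not from the lecture):

```python
import math
import random

# Count how many Expo(1) lifetimes fit within t time units; the lecture's
# theorem says this count is Poisson with mean t.
def bulbs_replaced(t, rng):
    count, elapsed = 0, 0.0
    while True:
        elapsed += rng.expovariate(1.0)
        if elapsed > t:
            return count
        count += 1

rng = random.Random(0)
t, trials = 4.0, 50_000
counts = [bulbs_replaced(t, rng) for _ in range(trials)]
print(sum(counts) / trials)                        # mean: close to t = 4
empirical = sum(c == 3 for c in counts) / trials
poisson = (t ** 3 / math.factorial(3)) * math.exp(-t)
print(empirical, poisson)                          # both close to 0.195
```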
3. Expected Squared Distance

Problem 1: Pick two points X and Y independently and uniformly at random in [0, 1]. What is E[(X − Y)²]?
Analysis: One has
E[(X − Y)²] = E[X² + Y² − 2XY] = 1/3 + 1/3 − 2 · (1/2) · (1/2) = 2/3 − 1/2 = 1/6.
Problem 2: What about in a unit square?
Analysis: One has
E[‖X − Y‖²] = E[(X_1 − Y_1)²] + E[(X_2 − Y_2)²] = 2 × 1/6 = 1/3.
Problem 3: What about in n dimensions? By the same argument, the answer is n/6.

Geometric and Exponential

The geometric and exponential distributions are similar. They are both memoryless.
Consider flipping a coin every 1/N second with Pr[H] = p/N, where N ≫ 1. Let X be the time until the first H.
Fact: X ≈ Expo(p).
Analysis: Note that
Pr[X > t] ≈ Pr[first Nt flips are tails] = (1 − p/N)^{Nt} ≈ exp{−pt}.
Indeed, (1 − a/N)^N ≈ exp{−a}.

Summary: Continuous Probability 3

◮ Continuous RVs are essentially the same as discrete RVs.
◮ Think that X ≈ x with probability f_X(x) ε.
◮ Sums become integrals, ....
◮ The exponential distribution is magical: memoryless.
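The squared-distance answers (1/6 on the interval, n/6 in n dimensions) can be confirmed by a short simulation. A sketch (helper name is mine):

```python
import random

# Monte Carlo estimate of E[||X - Y||^2] for two uniform points in the
# unit cube of the given dimension; the lecture's answer is dim/6.
def mean_sq_dist(dim, trials=200_000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += sum((rng.random() - rng.random()) ** 2
                     for _ in range(dim))
    return total / trials

print(mean_sq_dist(1))   # close to 1/6
print(mean_sq_dist(2))   # close to 2/6 = 1/3
```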

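The geometric-to-exponential limit can be seen directly by simulating the coin flips. A sketch (names are mine) comparing the empirical tail with the Expo(p) tail:

```python
import math
import random

# Flip a coin every 1/N second with Pr[H] = p/N; the time X of the
# first H is geometric, and for large N, Pr[X > t] ≈ exp(-p t).
def tail_prob(p, N, t, trials=100_000, seed=0):
    rng = random.Random(seed)
    flips = int(N * t)                  # flips that fit in t seconds
    survive = sum(
        all(rng.random() >= p / N for _ in range(flips))
        for _ in range(trials)
    )
    return survive / trials             # estimate of Pr[X > t]

p, t = 1.0, 1.0
print(tail_prob(p, 100, t), math.exp(-p * t))   # both close to exp(-1) ≈ 0.37
```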
