
Expectation Continued: Tail Sum, Coupon Collector, and Functions of RVs - PowerPoint PPT Presentation



Expectation Continued: Tail Sum, Coupon Collector, and Functions of RVs
CS 70, Summer 2019
Lecture 20, 7/29/19

Last Time...
◮ Expectation describes the weighted average of a RV.
◮ For more complicated RVs, use linearity.

Today:
◮ Proof of linearity of expectation
◮ The tail sum formula
◮ Expectations of Geometric and Poisson
◮ Expectation of a function of an RV

Sanity Check
Let X be a RV that takes on values in A.
Let Y be a RV that takes on values in B.
Let c ∈ R be a constant.
Both c · X and X + Y are also RVs!

Proof of Linearity of Expectation I
Recall linearity of expectation:
E[X_1 + ... + X_n] = E[X_1] + ... + E[X_n]
For constant c, E[cX_i] = c · E[X_i].
First, we show E[cX_i] = c · E[X_i]:

Proof of Linearity of Expectation II
Next, we show E[X + Y] = E[X] + E[Y].
Two variables to n variables?

The Tail Sum Formula
Let X be a RV with values in {0, 1, 2, ..., n}.
We use "tail" to describe P[X ≥ i].
What does ∑_{i=1}^∞ P[X ≥ i] look like?
Small example: X only takes values {0, 1, 2}:
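The tail sum idea in the small example above (a RV on {0, 1, 2}) can be checked numerically. A minimal sketch, where the specific probabilities (0.5, 0.3, 0.2) are made-up placeholders, not values from the slides:

```python
# Numeric check of the tail sum formula E[X] = sum_{i>=1} P[X >= i]
# for a small RV taking values in {0, 1, 2}.
dist = {0: 0.5, 1: 0.3, 2: 0.2}  # illustrative probabilities

# Definition of expectation: weighted average of values.
expectation = sum(x * p for x, p in dist.items())

# Tail sum: P[X >= 1] + P[X >= 2] (tails beyond the max value are 0).
tail_sum = sum(
    sum(p for x, p in dist.items() if x >= i)
    for i in range(1, 3)
)

print(expectation, tail_sum)  # both ≈ 0.7
```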

The Tail Sum Formula
The tail sum formula states that:
E[X] = ∑_{i=1}^∞ P[X ≥ i]
Proof: Let p_i = P[X = i].

Expectation of a Geometric I
Let X ~ Geometric(p).
Apply the tail sum formula:

Expectation of a Geometric II
Use memorylessness: the fact that the geometric RV "resets" after each trial.
Two Cases:

Expectation of a Geometric III
Lastly, an intuitive but non-rigorous idea. (Note 19.)
Let X_i be an indicator variable for success in a single trial. Recall trials are i.i.d.
X_i ~
E[X_1 + X_2 + ... + X_k] =

Coupon Collector I
I'm out collecting trading cards. There are n types total. I get a random trading card every time I buy a cereal box.
What is the expected number of boxes I need to buy in order to get all n trading cards?
High level picture:

Coupon Collector II
Let X_i =
What is the dist. of X_1?
What is the dist. of X_2?
What is the dist. of X_3?
In general, what is the dist. of X_i?
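For the geometric, the tails are P[X ≥ i] = (1 − p)^(i−1), a geometric series, so a truncated numeric tail sum should land near 1/p. A quick sketch (p = 0.3 and the truncation point are arbitrary choices):

```python
# For X ~ Geometric(p), P[X >= i] = (1-p)**(i-1), so the tail sum
# is a geometric series converging to 1/p. Truncate at a large N.
p = 0.3
N = 10_000
tail_sum = sum((1 - p) ** (i - 1) for i in range(1, N + 1))
print(tail_sum)  # ≈ 1/p = 3.333...
```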

Coupon Collector III
Let X =
E[X] =

Aside: (Partial) Harmonic Series
Harmonic Series: ∑_{k=1}^∞ 1/k
Approximation for ∑_{k=1}^n 1/k in terms of n?

Break
A Bad Harmonic Series Joke...
A countably infinite number of mathematicians walk into a bar. The first one orders a pint of beer, the second one orders a half pint, the third one orders a third of a pint, the fourth one orders a fourth of a pint, and so on. The bartender says ...

Expectation of a Poisson I
Recall the Poisson distribution: values 0, 1, 2, ...,
P[X = i] = (λ^i / i!) · e^{−λ}
We can use the definition to find E[X]!

Expectation of a Poisson II
Optional but intuitive / non-rigorous approach:
Think of a Poisson(λ) as a Bin(n, λ/n) distribution, taken as n → ∞.
Let X ~ Bin(n, λ/n).

Rest of Today: Functions of RVs!
Recall X from Lecture 19:
X = 1 wp 0.4, 1/2 wp 0.25, −1/2 wp 0.35
Refresh your memory: What is X²?
X² =
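The coupon-collector expectation, written as a sum of geometrics, works out to n · H_n, where H_n = 1 + 1/2 + ... + 1/n is the partial harmonic sum the aside refers to. A Monte Carlo sketch can confirm this (n = 10 and the trial count are arbitrary choices):

```python
import random

# Simulate buying boxes until all n card types have been seen,
# and compare the average to n * H_n.
def boxes_needed(n):
    seen = set()
    boxes = 0
    while len(seen) < n:
        seen.add(random.randrange(n))  # each box: uniform card type
        boxes += 1
    return boxes

n, trials = 10, 20_000
avg = sum(boxes_needed(n) for _ in range(trials)) / trials
harmonic = n * sum(1 / k for k in range(1, n + 1))
print(avg, harmonic)  # avg should be close to n * H_10 ≈ 29.29
```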

Example: Functions of RVs
X² = 1 wp 0.4, 1/4 wp 0.6
What is E[X²]?
What is E[3X² − 5]?

In General: Functions of RVs
Let X be a RV with values in A.
Distribution of f(X):
E[f(X)] =

Square of a Bernoulli
Let X ~ Bernoulli(p).
Write out the distribution of X.
What is X²? E[X²]?

Product of RVs
Let X be a RV with values in A.
Let Y be a RV with values in B.
XY is also a RV! What is its distribution?
(Use the joint distribution!)

Product of Two Bernoullis
Let X ~ Bernoulli(p_1), and Y ~ Bernoulli(p_2).
X and Y are independent.
What is the distribution of XY?
What is E[XY]?

Square of a Binomial I
Let X ~ Bin(n, p).
Decompose into X_i ~ Bernoulli(p).
X =
E[X] =
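The rule E[f(X)] = ∑_x f(x) · P[X = x] can be applied directly to the Lecture 19 distribution as reconstructed above (1 wp 0.4, 1/2 wp 0.25, −1/2 wp 0.35). A short sketch:

```python
# E[f(X)] = sum over x of f(x) * P[X = x], applied to the
# three-point distribution from the slides.
dist = {1.0: 0.4, 0.5: 0.25, -0.5: 0.35}

def expect(f, dist):
    return sum(f(x) * p for x, p in dist.items())

e_x2 = expect(lambda x: x * x, dist)            # E[X^2]
e_3x2_minus_5 = expect(lambda x: 3 * x * x - 5, dist)  # E[3X^2 - 5]
print(e_x2, e_3x2_minus_5)  # ≈ 0.55 and ≈ -3.35
```

Note that linearity gives the same second answer: E[3X² − 5] = 3 · E[X²] − 5.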

Square of a Binomial II
Recall, E[X_i²] = p, and E[X_i X_j] = p² (for i ≠ j).

Summary
Today:
◮ Proof of linearity of expectation: did not use independence, but did use the joint distribution.
◮ Tail sum for non-negative integer-valued RVs!
◮ Coupon Collector: break the problem down into a sum of geometrics.
◮ Expectation of a function of an RV: can apply the definition, and linearity of expectation (after expanding) as well!
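Expanding (X_1 + ... + X_n)² and taking expectations gives n diagonal terms E[X_i²] = p and n(n−1) cross terms E[X_i X_j] = p², so E[X²] = np + n(n−1)p². This can be checked against the exact binomial pmf (n = 8 and p = 0.3 are arbitrary choices):

```python
from math import comb

# Check E[X^2] = n*p + n*(n-1)*p**2 for X ~ Bin(n, p)
# by summing k^2 * P[X = k] over the exact pmf.
n, p = 8, 0.3
exact = sum(
    k * k * comb(n, k) * p**k * (1 - p) ** (n - k)
    for k in range(n + 1)
)
formula = n * p + n * (n - 1) * p**2
print(exact, formula)  # both ≈ 7.44
```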
