Foundations of Computing II, Lecture 20: Continuous Random Variables

  1. CSE 312 Foundations of Computing II, Lecture 20: Continuous Random Variables. Stefano Tessaro, tessaro@cs.washington.edu.

  2. Review – Continuous RVs. Probability Density Function (PDF): a function f: ℝ → ℝ such that f(x) ≥ 0 for all x ∈ ℝ and ∫_{−∞}^{+∞} f(x) dx = 1. Cumulative Distribution Function (CDF): F(y) = ∫_{−∞}^{y} f(x) dx. Theorem. f(x) = dF(x)/dx.

  3. Review – Continuous RVs. For a continuous random variable X with density f and CDF F, ℙ(X ∈ [a, b]) = ∫_a^b f(x) dx = F(b) − F(a).
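
A quick numerical illustration (my addition, not from the slides) of the identity ℙ(X ∈ [a, b]) = F(b) − F(a), using a hypothetical density f(x) = 2x on [0, 1] whose CDF is F(y) = y²; the interval [0.25, 0.75] is arbitrary:

    # Hypothetical example density (not from the slides): f(x) = 2x on [0, 1],
    # with CDF F(y) = y^2 on [0, 1].
    def f(x):
        return 2 * x if 0 <= x <= 1 else 0.0

    def F(y):
        return 0.0 if y < 0 else (y * y if y <= 1 else 1.0)

    a, b = 0.25, 0.75
    n = 100_000
    dx = (b - a) / n
    # P(X in [a, b]) via a midpoint Riemann sum of the density ...
    riemann = sum(f(a + (i + 0.5) * dx) * dx for i in range(n))
    # ... and via the CDF difference F(b) - F(a).
    print(riemann, F(b) - F(a))   # both approx 0.5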

  4. Uniform Distribution. f_X(x) = 1/(b − a) for x ∈ [a, b], and 0 else. We write X ∼ Unif(a, b) and say that X follows the uniform distribution / is uniformly distributed. Check: ∫_{−∞}^{+∞} f_X(x) dx = (b − a) · 1/(b − a) = 1.

  5. Uniform Density – Expectation. For X ∼ Unif(a, b): E[X] = ∫_{−∞}^{+∞} f_X(x) · x dx = 1/(b − a) ∫_a^b x dx = 1/(b − a) · [x²/2]_a^b = (b² − a²)/(2(b − a)) = (b − a)(a + b)/(2(b − a)) = (a + b)/2.
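
As a sanity check (not part of the original slides), the closed form E[X] = (a + b)/2 can be compared against a direct numerical integration of x · f_X(x); the endpoints a = 2, b = 5 below are arbitrary:

    a, b = 2.0, 5.0                   # arbitrary endpoints for illustration
    n = 100_000
    dx = (b - a) / n
    density = 1 / (b - a)             # Unif(a, b) density on [a, b]
    # Midpoint rule for E[X] = integral over [a, b] of x * f_X(x) dx.
    expectation = sum((a + (i + 0.5) * dx) * density * dx for i in range(n))
    print(expectation, (a + b) / 2)   # both approx 3.5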

  6. Uniform Density – Variance. For X ∼ Unif(a, b): E[X²] = ∫_{−∞}^{+∞} f_X(x) · x² dx = 1/(b − a) ∫_a^b x² dx = 1/(b − a) · [x³/3]_a^b = (b³ − a³)/(3(b − a)) = (b − a)(b² + ab + a²)/(3(b − a)) = (a² + ab + b²)/3.

  7. Uniform Density – Variance (continued). With E[X] = (a + b)/2 and E[X²] = (a² + ab + b²)/3, Var(X) = E[X²] − E[X]² = (a² + ab + b²)/3 − (a² + 2ab + b²)/4 = (4a² + 4ab + 4b²)/12 − (3a² + 6ab + 3b²)/12 = (a² − 2ab + b²)/12 = (b − a)²/12.
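
The variance formula can be checked the same way by sampling; a minimal sketch, again with the arbitrary endpoints a = 2, b = 5:

    import random
    import statistics

    a, b = 2.0, 5.0                       # arbitrary endpoints for illustration
    samples = [random.uniform(a, b) for _ in range(200_000)]
    print(statistics.pvariance(samples))  # approx (b - a)**2 / 12 = 0.75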

  8. Exponential Density. Assume the expected number of occurrences of an event per unit of time is λ. Examples: cars going through an intersection, lightning strikes, requests to a web server, patients admitted to the ER. The number of occurrences of the event follows the Poisson distribution (discrete): ℙ(X = k) = e^{−λ} λ^k / k!. How long do we wait until the next event? The exponential density! Let's define it and then derive it.

  9. Exponential Distribution. Definition. An exponential random variable X with parameter λ ≥ 0 follows the exponential density f_X(x) = λe^{−λx} for x ≥ 0, and 0 for x < 0. We write X ∼ Exp(λ) and say that X follows the exponential distribution. CDF: for y ≥ 0, F_X(y) = ∫_0^y λe^{−λx} dx = [−e^{−λx}]_0^y = 1 − e^{−λy}.
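
A small empirical check of the CDF formula (my addition), comparing the fraction of samples below y with 1 − e^{−λy}; the values λ = 1.5 and y = 2 are arbitrary:

    import math
    import random

    lam, y = 1.5, 2.0                 # arbitrary rate and cutoff for illustration
    samples = [random.expovariate(lam) for _ in range(200_000)]
    empirical = sum(s <= y for s in samples) / len(samples)
    print(empirical, 1 - math.exp(-lam * y))   # both approx 0.95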

  10. Exponential Distribution. [Plot of the exponential densities for λ = 0.5, 1, 1.5, 2; all densities are 0 on the negative reals.]

  11. Derivation – Number of Cars. Discretize time: in each interval of length 1/n, a car arrives with probability p = λ/n (infinitely many intervals). Let X_n = time until the first arrival, a (scaled) geometric random variable. Then ℙ(X_n > y) = (1 − λ/n)^{yn}, and lim_{n→∞} ℙ(X_n > y) = lim_{n→∞} (1 − λ/n)^{yn} = e^{−λy} = 1 − F_X(y).
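
The limit in this derivation can be observed numerically; a short sketch (not from the slides) with arbitrary λ = 2 and y = 1.5, showing (1 − λ/n)^{yn} approaching e^{−λy} as the discretization gets finer:

    import math

    lam, y = 2.0, 1.5                 # arbitrary rate and time for illustration
    for n in (10, 100, 1_000, 10_000, 100_000):
        # P(X_n > y) for the discretized (geometric) waiting time.
        print(n, (1 - lam / n) ** (y * n))
    print("limit:", math.exp(-lam * y))   # approx 0.0498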

  12. Expectation. For X ∼ Exp(λ), with f_X(x) = λe^{−λx} for x ≥ 0 and 0 for x < 0: E[X] = ∫_{−∞}^{+∞} f_X(x) · x dx = ∫_0^{+∞} λx e^{−λx} dx = lim_{T→∞} [−(x + 1/λ)e^{−λx}]_0^T = 1/λ. Also Var(X) = 1/λ².
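
A sampling check of E[X] = 1/λ and Var(X) = 1/λ² (my own sketch; the rate λ = 0.5 is arbitrary):

    import random
    import statistics

    lam = 0.5                             # arbitrary rate for illustration
    samples = [random.expovariate(lam) for _ in range(200_000)]
    print(statistics.mean(samples))       # approx 1 / lam = 2
    print(statistics.pvariance(samples))  # approx 1 / lam**2 = 4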

  13. Memorylessness. Definition. A random variable X is memoryless if for all s, t > 0, ℙ(X > s + t | X > s) = ℙ(X > t). Fact. X ∼ Exp(λ) is memoryless. Intuition: assuming the exponential distribution, if you have already waited s minutes, the probability of waiting t more minutes is exactly the same as if s = 0.

  14. Memorylessness of Exponential. Fact. X ∼ Exp(λ) is memoryless. Proof. ℙ(X > s + t | X > s) = ℙ({X > s + t} ∩ {X > s}) / ℙ(X > s) = ℙ(X > s + t) / ℙ(X > s) = e^{−λ(s+t)} / e^{−λs} = e^{−λt} = ℙ(X > t).
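
Memorylessness can also be seen by simulation; a minimal sketch (not part of the slides) estimating ℙ(X > s + t | X > s) and ℙ(X > t) with arbitrary λ = 1, s = 1, t = 2:

    import random

    lam, s, t = 1.0, 1.0, 2.0             # arbitrary parameters for illustration
    samples = [random.expovariate(lam) for _ in range(500_000)]
    survivors = [x for x in samples if x > s]
    conditional = sum(x > s + t for x in survivors) / len(survivors)
    unconditional = sum(x > t for x in samples) / len(samples)
    print(conditional, unconditional)     # both approx exp(-lam * t) = 0.135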

  15. The Normal Distribution. Definition. A Gaussian (or normal) random variable with parameters μ ∈ ℝ and σ ≥ 0 has density f_X(x) = (1 / (√(2π) σ)) e^{−(x − μ)² / (2σ²)}. (We say that X follows the normal distribution, and write X ∼ N(μ, σ²).) [Portrait: Carl Friedrich Gauss.] We will see next time why the normal distribution is (in some sense) the most important distribution.

  16. The Normal Distribution, aka a “Bell Curve” (imprecise name). [Plot of normal densities for (μ = 7, σ² = 1), (μ = 0, σ² = 3), (μ = −7, σ² = 6), and (μ = 0, σ² = 8).]

  17. Two Facts. Fact. If X ∼ N(μ, σ²), then E[X] = μ. The proof is easy because the density curve is symmetric about μ, and x · f_X(μ − x) = x · f_X(μ + x). Fact. If X ∼ N(μ, σ²), then Var(X) = σ².
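
Both facts can be checked empirically; a short sketch of my own, with arbitrary μ = 3 and σ = 2:

    import random
    import statistics

    mu, sigma = 3.0, 2.0                  # arbitrary parameters for illustration
    samples = [random.gauss(mu, sigma) for _ in range(200_000)]
    print(statistics.mean(samples))       # approx mu = 3
    print(statistics.pvariance(samples))  # approx sigma**2 = 4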

  18. Shifting and Scaling. Fact. If X ∼ N(μ, σ²), then Y = aX + b ∼ N(aμ + b, a²σ²). How do we prove this? (Likely) see next week for a simple proof. Standard (unit) normal: N(0, 1). Definition. Φ(z) = ℙ(Z ≤ z) = (1/√(2π)) ∫_{−∞}^{z} e^{−x²/2} dx for Z ∼ N(0, 1). Note: Φ(z) has no closed form; it is generally given via tables. If X ∼ N(μ, σ²), then F_X(z) = ℙ(X ≤ z) = ℙ((X − μ)/σ ≤ (z − μ)/σ) = Φ((z − μ)/σ).
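
Although Φ has no closed form, it can be evaluated numerically via the error function, Φ(z) = (1 + erf(z/√2))/2; the sketch below (my addition) uses this to compute ℙ(X ≤ z) for X ∼ N(μ, σ²) by standardizing, with arbitrary μ = 5, σ = 2, z = 7:

    import math

    def phi(z):
        # Standard normal CDF via the error function: Phi(z) = (1 + erf(z / sqrt(2))) / 2.
        return 0.5 * (1 + math.erf(z / math.sqrt(2)))

    mu, sigma, z = 5.0, 2.0, 7.0          # arbitrary parameters for illustration
    # P(X <= z) for X ~ N(mu, sigma^2), by standardizing: Phi((z - mu) / sigma).
    print(phi((z - mu) / sigma))          # Phi(1) approx 0.8413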

  19. Table of the Standard Cumulative Normal Distribution. [Table not reproduced here.]
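
A table like this can be reproduced with the same erf-based formula; a minimal sketch (my own example, values not taken from the original table):

    import math

    # Print Phi(z) for a few z values, using Phi(z) = (1 + erf(z / sqrt(2))) / 2.
    for z in (0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0):
        print(f"Phi({z:.1f}) = {0.5 * (1 + math.erf(z / math.sqrt(2))):.4f}")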
