CSE 312
Foundations of Computing II
Lecture 20: Continuous Random Variables
Stefano Tessaro
tessaro@cs.washington.edu
Review: Continuous RVs

Probability Density Function (PDF): f : ℝ → ℝ s.t.

    ∫_{−∞}^{+∞} f(x) dx = 1

Cumulative Distribution Function (CDF):

    F(y) = ∫_{−∞}^{y} f(x) dx        f(y) = (d/dy) F(y)
[Plot: uniform density f(x) on [a, b], constant at height 1/(b − a)]

    f_X(x) = 1/(b − a)  if x ∈ [a, b],   0 else
Check: ∫_{−∞}^{+∞} f_X(x) dx = (b − a) ⋅ 1/(b − a) = 1

X ∼ Unif(a, b)
We also say that X follows the uniform distribution / is uniformly distributed.
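As a quick sanity check, a midpoint Riemann sum of this density should come out to 1. A small Python sketch, with hypothetical endpoints a = 2, b = 5 chosen only for illustration:

```python
# Numerically verify that the Unif(a, b) density integrates to 1.
# The endpoints a = 2, b = 5 are hypothetical, chosen for illustration.
a, b = 2.0, 5.0

def f(x):
    """Uniform(a, b) density: 1/(b - a) on [a, b], 0 elsewhere."""
    return 1.0 / (b - a) if a <= x <= b else 0.0

n = 100_000                      # number of midpoint-rule slices
lo, hi = a - 1.0, b + 1.0        # grid wide enough to cover [a, b]
dx = (hi - lo) / n
total = sum(f(lo + (i + 0.5) * dx) for i in range(n)) * dx
print(total)                     # ≈ 1.0
```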
f_X(x) = 1/(b − a)  if x ∈ [a, b],  0 else        X ∼ Unif(a, b)

E[X] = ∫_{−∞}^{+∞} f_X(x) ⋅ x dx
     = 1/(b − a) ∫_a^b x dx
     = 1/(b − a) [x²/2]_a^b
     = (b² − a²)/(2(b − a)) = (b − a)(a + b)/(2(b − a)) = (a + b)/2
f_X(x) = 1/(b − a)  if x ∈ [a, b],  0 else        X ∼ Unif(a, b)

E[X²] = ∫_{−∞}^{+∞} f_X(x) ⋅ x² dx
      = 1/(b − a) ∫_a^b x² dx
      = 1/(b − a) [x³/3]_a^b
      = (b³ − a³)/(3(b − a)) = (b − a)(b² + ab + a²)/(3(b − a)) = (b² + ab + a²)/3
X ∼ Unif(a, b)        E[X] = (a + b)/2        E[X²] = (b² + ab + a²)/3

Var(X) = E[X²] − E[X]²
       = (b² + ab + a²)/3 − (a² + 2ab + b²)/4
       = (4b² + 4ab + 4a²)/12 − (3a² + 6ab + 3b²)/12
       = (b² − 2ab + a²)/12 = (b − a)²/12
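Both moment formulas are easy to check by Monte Carlo. A Python sketch, with hypothetical endpoints a = 2, b = 10:

```python
import random

random.seed(0)
a, b = 2.0, 10.0                 # hypothetical endpoints for illustration

# Draw many Unif(a, b) samples; compare empirical moments to the formulas.
samples = [random.uniform(a, b) for _ in range(200_000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)

print(mean, (a + b) / 2)         # empirical mean vs. (a + b)/2 = 6
print(var, (b - a) ** 2 / 12)    # empirical variance vs. (b − a)²/12 ≈ 5.33
```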
Assume the expected # of occurrences of an event per unit of time is λ.

Number of occurrences of the event: Poisson distribution (discrete).

How long to wait until the next event? Exponential density! Let’s define it and then derive it!
X follows the exponential density

    f_X(x) = λe^{−λx}   x ≥ 0
             0          x < 0

CDF: For y ≥ 0,

    F_X(y) = ∫_0^y λe^{−λx} dx = λ ⋅ [(−1/λ)e^{−λx}]_0^y = 1 − e^{−λy}

We write X ∼ Exp(λ) and say that X follows the exponential distribution.
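The closed-form CDF can be checked against simulated exponential draws. A Python sketch using `random.expovariate`, with hypothetical values λ = 1.5 and y = 0.8:

```python
import math
import random

random.seed(1)
lam, y = 1.5, 0.8                # hypothetical rate λ and evaluation point y

# Empirical P(X ≤ y) from simulated Exp(λ) draws vs. F(y) = 1 − e^{−λy}.
n = 200_000
empirical = sum(random.expovariate(lam) <= y for _ in range(n)) / n
closed_form = 1 - math.exp(-lam * y)

print(empirical, closed_form)    # both ≈ 0.699
```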
[Plot: exponential densities for λ = 0.5, 1, 1.5, 2; all densities are 0 on the negative reals]
Discretize: In each interval of length 1/n, a car arrives with probability p = λ/n.
X_n = time until first arrival, in the discretized model with intervals of length 1/n (infinitely many intervals); X_n is a (scaled) geometric random variable.

    lim_{n→∞} ℙ(X_n > y) = lim_{n→∞} (1 − λ/n)^{yn} = e^{−λy} = 1 − F_X(y)
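The limit can be watched converge numerically. A Python sketch with hypothetical values λ = 2 and y = 1:

```python
import math

lam, y = 2.0, 1.0                # hypothetical λ and waiting time y

# P(X_n > y) in the discretized model: no arrival in any of the first
# y·n intervals, each of which has arrival probability λ/n.
for n in (10, 100, 10_000):
    print(n, (1 - lam / n) ** int(y * n))

limit = math.exp(-lam * y)       # limiting value e^{−λy}
print(limit)
```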
f_X(x) = λe^{−λx}  if x ≥ 0,  0 if x < 0

E[X] = ∫_{−∞}^{+∞} f_X(x) ⋅ x dx
     = ∫_0^{+∞} λe^{−λx} ⋅ x dx
     = lim_{t→∞} [−(x + 1/λ)e^{−λx}]_0^t = 1/λ

E[X] = 1/λ        Var(X) = 1/λ²
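Both moments can be checked by truncated numerical integration. A Python sketch with a hypothetical rate λ = 0.5:

```python
import math

lam = 0.5                        # hypothetical rate parameter λ

# Midpoint Riemann sums for E[X] and E[X²], truncated at x = 60
# (the tail beyond that point is negligible for λ = 0.5).
n, hi = 200_000, 60.0
dx = hi / n
xs = [(i + 0.5) * dx for i in range(n)]
mean = sum(lam * math.exp(-lam * x) * x for x in xs) * dx
second = sum(lam * math.exp(-lam * x) * x ** 2 for x in xs) * dx
var = second - mean ** 2

print(mean, 1 / lam)             # ≈ 2.0 = 1/λ
print(var, 1 / lam ** 2)         # ≈ 4.0 = 1/λ²
```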
ℙ(X > s + t | X > s) = ℙ(X > t).
Assuming the exponential distribution, if you’ve waited s minutes, the probability of waiting t more is exactly the same as if s = 0.
Proof.

    ℙ(X > s + t | X > s) = ℙ(X > s + t ∩ X > s) / ℙ(X > s)
                         = ℙ(X > s + t) / ℙ(X > s)
                         = e^{−λ(s+t)} / e^{−λs}
                         = e^{−λt} = ℙ(X > t)
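Memorylessness is also easy to see empirically. A Python simulation sketch with hypothetical values λ = 1, s = 0.5, t = 1:

```python
import random

random.seed(2)
lam, s, t = 1.0, 0.5, 1.0        # hypothetical λ, s, t

draws = [random.expovariate(lam) for _ in range(300_000)]
survivors = [x for x in draws if x > s]

# Conditional P(X > s+t | X > s) should match unconditional P(X > t).
cond = sum(x > s + t for x in survivors) / len(survivors)
uncond = sum(x > t for x in draws) / len(draws)
print(cond, uncond)              # both ≈ e^{−λt} ≈ 0.368
```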
A normal random variable with parameters μ ∈ ℝ and σ > 0 has density

    f_X(x) = 1/(√(2π) σ) ⋅ e^{−(x−μ)²/(2σ²)}

(We say that X follows the Normal Distribution, and write X ∼ N(μ, σ²))
Carl Friedrich Gauss
We will see next time why the normal distribution is (in some sense) the most important distribution.
[Plot: normal densities for (μ = 0, σ² = 3), (μ = 0, σ² = 8), (μ = −7, σ² = 6), (μ = 7, σ² = 1)]

Aka a “Bell Curve” (imprecise name)
Proof is easy because the density curve is symmetric, and x ⋅ f_X(μ − x) = x ⋅ f_X(μ + x).
How do we prove this? (Likely) see next week for a simple proof.
Standard (unit) normal: Z ∼ N(0, 1)

    Φ(z) = ℙ(Z ≤ z) = 1/√(2π) ∫_{−∞}^{z} e^{−x²/2} dx

Note: Φ(z) has no closed form – generally given via tables.

If X ∼ N(μ, σ²), then

    F_X(z) = ℙ(X ≤ z) = ℙ((X − μ)/σ ≤ (z − μ)/σ) = Φ((z − μ)/σ)
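In code, Φ is usually computed via the error function rather than tables. A Python sketch of the standardization step, with hypothetical parameters μ = 3, σ = 2:

```python
import math

def phi(z):
    """Standard normal CDF: Φ(z) = (1 + erf(z/√2)) / 2."""
    return (1 + math.erf(z / math.sqrt(2))) / 2

mu, sigma = 3.0, 2.0             # hypothetical parameters of X ~ N(μ, σ²)
z = 5.0

# F_X(z) = Φ((z − μ)/σ); here (z − μ)/σ = 1, so F_X(5) = Φ(1) ≈ 0.8413.
print(phi((z - mu) / sigma))
print(phi(0.0))                  # Φ(0) = 1/2 by symmetry
```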