Foundations of Computing II Lecture 20: Continuous Random Variables - PowerPoint PPT Presentation



SLIDE 1

CSE 312

Foundations of Computing II

Lecture 20: Continuous Random Variables

Stefano Tessaro

tessaro@cs.washington.edu

SLIDE 2

Review – Continuous RVs

Probability Density Function (PDF). $f : \mathbb{R} \to \mathbb{R}$ s.t.

  • $f(x) \ge 0$ for all $x \in \mathbb{R}$

  • $\int_{-\infty}^{+\infty} f(x) \, dx = 1$

Cumulative Distribution Function (CDF). $F(y) = \int_{-\infty}^{y} f(x) \, dx$

  • Theorem. $f(x) = \frac{dF(x)}{dx}$
SLIDE 3

Review – Continuous RVs

[Figure: density $f(x)$ with the area between $a$ and $b$ shaded]

$\mathbb{P}(X \in [a, b]) = \int_{a}^{b} f(x) \, dx = F(b) - F(a)$
SLIDE 4

Uniform Distribution

$f_X(x) = \begin{cases} \frac{1}{b-a} & x \in [a, b] \\ 0 & \text{else} \end{cases}$

Check: $\int_{-\infty}^{+\infty} f_X(x) \, dx = (b - a) \cdot \frac{1}{b-a} = 1$

$X \sim \text{Unif}(a, b)$

[Figure: constant density of height $\frac{1}{b-a}$ on the interval $[a, b]$]

We also say that $X$ follows the uniform distribution / is uniformly distributed.
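As a quick sanity check (not from the slides), a short simulation can confirm that for $X \sim \text{Unif}(a, b)$, the probability of landing in a sub-interval $[c, d] \subseteq [a, b]$ is $(d - c)/(b - a)$. The parameter values below are arbitrary illustrations.

```python
import random

random.seed(0)
a, b = 2.0, 5.0  # illustrative interval
c, d = 3.0, 4.5  # sub-interval of [a, b]

# Empirical check: for X ~ Unif(a, b), P(X in [c, d]) = (d - c) / (b - a).
samples = [random.uniform(a, b) for _ in range(200_000)]
empirical = sum(c <= x <= d for x in samples) / len(samples)
exact = (d - c) / (b - a)
print(abs(empirical - exact) < 0.01)  # True (up to sampling error)
```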

SLIDE 5

Uniform Density – Expectation

$f_X(x) = \begin{cases} \frac{1}{b-a} & x \in [a, b] \\ 0 & \text{else} \end{cases} \qquad X \sim \text{Unif}(a, b)$

$E[X] = \int_{-\infty}^{+\infty} f_X(x) \cdot x \, dx = \frac{1}{b-a} \int_{a}^{b} x \, dx = \frac{1}{b-a} \left[ \frac{x^2}{2} \right]_a^b = \frac{b^2 - a^2}{2(b-a)} = \frac{(b-a)(a+b)}{2(b-a)} = \frac{a+b}{2}$
SLIDE 6

Uniform Density – Variance

$f_X(x) = \begin{cases} \frac{1}{b-a} & x \in [a, b] \\ 0 & \text{else} \end{cases} \qquad X \sim \text{Unif}(a, b)$

$E[X^2] = \int_{-\infty}^{+\infty} f_X(x) \cdot x^2 \, dx = \frac{1}{b-a} \int_{a}^{b} x^2 \, dx = \frac{1}{b-a} \left[ \frac{x^3}{3} \right]_a^b = \frac{b^3 - a^3}{3(b-a)} = \frac{(b-a)(b^2 + ab + a^2)}{3(b-a)} = \frac{b^2 + ab + a^2}{3}$
SLIDE 7

Uniform Density – Variance

$X \sim \text{Unif}(a, b)$

$E[X^2] = \frac{b^2 + ab + a^2}{3} \qquad E[X] = \frac{a+b}{2}$

$\text{Var}(X) = E[X^2] - E[X]^2 = \frac{b^2 + ab + a^2}{3} - \frac{a^2 + 2ab + b^2}{4} = \frac{4b^2 + 4ab + 4a^2}{12} - \frac{3a^2 + 6ab + 3b^2}{12} = \frac{b^2 - 2ab + a^2}{12} = \frac{(b-a)^2}{12}$
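A minimal Monte Carlo sketch (parameters chosen arbitrarily for illustration) can confirm the two closed forms, $E[X] = \frac{a+b}{2}$ and $\text{Var}(X) = \frac{(b-a)^2}{12}$:

```python
import random

random.seed(1)
a, b = 0.0, 6.0  # illustrative parameters
n = 500_000
xs = [random.uniform(a, b) for _ in range(n)]

mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n

# Compare with the closed forms derived above.
print(abs(mean - (a + b) / 2) < 0.05)        # True: E[X] = (a+b)/2 = 3
print(abs(var - (b - a) ** 2 / 12) < 0.1)    # True: Var(X) = (b-a)^2/12 = 3
```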

SLIDE 8

Exponential Density

8

Assume expected # of occurrences of an event per unit of time is Z

  • Cars going through intersection
  • Number of lightning strikes
  • Requests to web server
  • Patients admitted to ER

Numbers of occurrences of event: Poisson distribution

ℙ < = [ = \*] Z^ [!

(Discrete)

How long to wait until next event? Exponential density! Let’s define it and then derive it!

slide-9
SLIDE 9

Exponential Distribution

  • Definition. An exponential random variable $X$ with parameter $\lambda \ge 0$ follows the exponential density

$f_X(x) = \begin{cases} \lambda e^{-\lambda x} & x \ge 0 \\ 0 & x < 0 \end{cases}$

CDF: For $y \ge 0$,

$F_X(y) = \int_{0}^{y} \lambda e^{-\lambda x} \, dx = \lambda \left[ -\frac{1}{\lambda} e^{-\lambda x} \right]_0^y = 1 - e^{-\lambda y}$

We write $X \sim \text{Exp}(\lambda)$ and say that $X$ follows the exponential distribution.
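The closed-form CDF can be checked numerically: a midpoint Riemann sum of the density over $[0, y]$ should match $1 - e^{-\lambda y}$. The values of $\lambda$ and $y$ below are arbitrary illustrations.

```python
import math

lam, y = 1.5, 2.0  # illustrative values
n = 200_000
dx = y / n

# Midpoint Riemann sum of the density λe^{-λx} over [0, y] ...
riemann = sum(lam * math.exp(-lam * (i + 0.5) * dx) * dx for i in range(n))
# ... should match the closed-form CDF 1 - e^{-λy}.
closed = 1 - math.exp(-lam * y)
print(abs(riemann - closed) < 1e-6)  # True
```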

SLIDE 10

Exponential Distribution

[Figure: exponential densities for $\lambda = 0.5$, $\lambda = 1$, $\lambda = 1.5$, and $\lambda = 2$; all densities are 0 on the negative reals]
SLIDE 11


Derivation – Number of Cars

Discretize: in each interval of length $1/n$, the probability of a car arriving is $p = \frac{\lambda}{n}$.

$X_n$ = time until the first arrival (infinitely many intervals)

$\mathbb{P}(X_n > y) = \left( 1 - \frac{\lambda}{n} \right)^{yn}$ (scaled geometric)

$\lim_{n \to \infty} \mathbb{P}(X_n > y) = \lim_{n \to \infty} \left( 1 - \frac{\lambda}{n} \right)^{yn} = e^{-\lambda y} = 1 - F_X(y)$
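The limit in this derivation can be observed numerically: as the discretization gets finer, $(1 - \lambda/n)^{yn}$ approaches $e^{-\lambda y}$. The values of $\lambda$ and $y$ are arbitrary illustrations.

```python
import math

# Numerically observe lim_{n→∞} (1 - λ/n)^{yn} = e^{-λy}
# (λ and y are arbitrary illustrative values).
lam, y = 2.0, 1.0
for n in [10, 100, 10_000, 1_000_000]:
    discrete = (1 - lam / n) ** (y * n)
    print(n, discrete)

print("limit:", math.exp(-lam * y))  # the discrete values approach e^{-2} ≈ 0.1353
```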

SLIDE 12

Expectation

$f_X(x) = \begin{cases} \lambda e^{-\lambda x} & x \ge 0 \\ 0 & x < 0 \end{cases}$

$E[X] = \int_{-\infty}^{+\infty} f_X(x) \cdot x \, dx = \int_{0}^{+\infty} \lambda e^{-\lambda x} \cdot x \, dx = \lim_{y \to +\infty} \left[ -\left( x + \frac{1}{\lambda} \right) e^{-\lambda x} \right]_0^y = \frac{1}{\lambda}$

$E[X] = \frac{1}{\lambda} \qquad \text{Var}(X) = \frac{1}{\lambda^2}$
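Both moments can be sanity-checked by simulation with the standard library's exponential sampler (the rate $\lambda$ below is an arbitrary illustration):

```python
import random

random.seed(4)
lam = 2.0  # illustrative rate
n = 500_000
xs = [random.expovariate(lam) for _ in range(n)]

mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n
print(abs(mean - 1 / lam) < 0.01)       # True: E[X] = 1/λ = 0.5
print(abs(var - 1 / lam ** 2) < 0.02)   # True: Var(X) = 1/λ² = 0.25
```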

SLIDE 13

Memorylessness

  • Definition. A random variable $X$ is memoryless if for all $s, t > 0$,

$\mathbb{P}(X > s + t \mid X > s) = \mathbb{P}(X > t).$

  • Fact. $X \sim \text{Exp}(\lambda)$ is memoryless.

Assuming an exponential distribution: if you have already waited $s$ minutes, the probability of waiting $t$ more minutes is exactly the same as if $s = 0$.
SLIDE 14

Memorylessness of Exponential

  • Fact. $X \sim \text{Exp}(\lambda)$ is memoryless.

Proof.

$\mathbb{P}(X > s + t \mid X > s) = \frac{\mathbb{P}(X > s + t,\, X > s)}{\mathbb{P}(X > s)} = \frac{\mathbb{P}(X > s + t)}{\mathbb{P}(X > s)} = \frac{e^{-\lambda(s+t)}}{e^{-\lambda s}} = e^{-\lambda t} = \mathbb{P}(X > t)$

Assuming an exponential distribution: if you have already waited $s$ minutes, the probability of waiting $t$ more minutes is exactly the same as if $s = 0$.
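Memorylessness is easy to see empirically: among samples with $X > s$, the fraction exceeding $s + t$ matches the unconditional fraction exceeding $t$. The values of $\lambda$, $s$, and $t$ below are arbitrary illustrations.

```python
import random

random.seed(2)
lam, s, t = 1.0, 0.5, 1.0  # illustrative values
xs = [random.expovariate(lam) for _ in range(500_000)]

survived_s = [x for x in xs if x > s]
# Conditional probability P(X > s+t | X > s), estimated empirically ...
cond = sum(x > s + t for x in survived_s) / len(survived_s)
# ... versus the unconditional P(X > t).
uncond = sum(x > t for x in xs) / len(xs)
print(abs(cond - uncond) < 0.01)  # True (memorylessness, up to sampling error)
```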

SLIDE 15

The Normal Distribution

15

  • Definition. A Gaussian (or normal) random variable with

parameters v ∈ ℝ and w ≥ 0 has density !

C % = x Pyz \* {|} ~

~~

(We say that < follows the Normal Distribution, and write < ∼ Ä(v, wP))

Carl Friedrich Gauss

We will see next time why the normal distribution is (in some sense) the most important distribution.

slide-16
SLIDE 16

The Normal Distribution

[Figure: normal densities for $\mu = 0, \sigma^2 = 3$; $\mu = 0, \sigma^2 = 8$; $\mu = -7, \sigma^2 = 6$; and $\mu = 7, \sigma^2 = 1$]

Aka a "bell curve" (imprecise name)
SLIDE 17

Two Facts

17

  • Fact. If < ∼ Ä v, wP , then M < = v

Proof is easy because density curve is symmetric, and % ⋅ !

C v − % = % ⋅ ! C(v + %)

  • Fact. If < ∼ Ä v, wP , then Var < = wP
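Both facts can be sanity-checked by simulation. Note that Python's `random.gauss` takes the standard deviation $\sigma$, not the variance; the parameters below are an arbitrary illustration (one of the densities plotted on the previous slide, $\mu = -7$, $\sigma^2 = 6$).

```python
import random

random.seed(3)
mu, sigma = -7.0, 6.0 ** 0.5  # μ = -7, σ² = 6 (illustrative)
n = 500_000
xs = [random.gauss(mu, sigma) for _ in range(n)]

mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n
print(abs(mean - mu) < 0.05)        # True: E[X] = μ
print(abs(var - sigma ** 2) < 0.2)  # True: Var(X) = σ²
```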
SLIDE 18

Shifting and Scaling

  • Fact. If $X \sim \mathcal{N}(\mu, \sigma^2)$, then $Y = aX + b \sim \mathcal{N}(a\mu + b, a^2\sigma^2)$.

How do we prove this? (Likely) see next week for a simple proof.

Standard (unit) normal = $\mathcal{N}(0, 1)$

  • Definition. $\Phi(z) = \mathbb{P}(Z \le z) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{z} e^{-x^2/2} \, dx$ for $Z \sim \mathcal{N}(0, 1)$

Note: $\Phi(z)$ has no closed form – generally given via tables.

If $X \sim \mathcal{N}(\mu, \sigma^2)$, then

$F_X(z) = \mathbb{P}(X \le z) = \mathbb{P}\left( \frac{X - \mu}{\sigma} \le \frac{z - \mu}{\sigma} \right) = \Phi\left( \frac{z - \mu}{\sigma} \right)$
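Although $\Phi$ has no elementary closed form, it can be evaluated via the error function, $\Phi(z) = \frac{1}{2}\left(1 + \operatorname{erf}(z/\sqrt{2})\right)$, which Python's `math.erf` provides; this reproduces standard table values such as $\Phi(1.00) \approx 0.8413$. The parameters $\mu$, $\sigma$, $z$ below are arbitrary illustrations of the standardization identity.

```python
import math

def Phi(z):
    # Standard normal CDF expressed via the error function:
    # Φ(z) = (1 + erf(z / √2)) / 2
    return (1 + math.erf(z / math.sqrt(2))) / 2

print(round(Phi(0.0), 4))  # 0.5
print(round(Phi(1.0), 4))  # 0.8413, matching the standard table

# Standardization: for X ~ N(μ, σ²), P(X ≤ z) = Φ((z - μ)/σ).
# μ, σ, z below are arbitrary illustrative values.
mu, sigma, z = 7.0, 1.0, 8.0
print(round(Phi((z - mu) / sigma), 4))  # 0.8413 (z is one σ above μ)
```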

SLIDE 19

Table of the Standard Cumulative Normal Distribution
