Expectation Continued: Tail Sum, Coupon Collector, and Functions of RVs
CS 70, Summer 2019 Lecture 20, 7/29/19
Last Time...

- Expectation describes the weighted average of a RV.
- For more complicated RVs, use linearity.

Today:

- Proof of linearity of expectation
- The tail sum formula
- Expectations of Geometric and Poisson
- Expectation of a function of an RV
Sanity Check

Let X be a RV that takes on values in A. Let Y be a RV that takes on values in B. Let c ∈ R be a constant. Both c · X and X + Y are also RVs!

c · X takes values {ca : a ∈ A}, with probabilities P[cX = ca] = P[X = a].
X + Y takes values {a + b : a ∈ A, b ∈ B}; its probabilities come from the joint distribution: P[X + Y = s] is the sum of P[X = a, Y = b] over all pairs with a + b = s.
Proof of Linearity of Expectation I

Recall linearity of expectation:
E[X1 + . . . + Xn] = E[X1] + . . . + E[Xn]
For constant c, E[cXi] = c · E[Xi].

First, we show E[cXi] = c · E[Xi]. Suppose Xi takes values in A:
E[cXi] = Σ_{a ∈ A} (ca) · P[Xi = a] = c · Σ_{a ∈ A} a · P[Xi = a] = c · E[Xi].

Proof of Linearity of Expectation II
Next, we show E[X + Y] = E[X] + E[Y]. Suppose X takes values in A and Y takes values in B. Then:
E[X + Y] = Σ_{a ∈ A} Σ_{b ∈ B} (a + b) · P[X = a, Y = b]
= Σ_{a ∈ A} Σ_{b ∈ B} a · P[X = a, Y = b] + Σ_{b ∈ B} Σ_{a ∈ A} b · P[X = a, Y = b]
= Σ_{a ∈ A} a · P[X = a] + Σ_{b ∈ B} b · P[Y = b]
= E[X] + E[Y].
Two variables to n variables? Induction.
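As a quick sanity check (not part of the lecture), both linearity facts can be verified by enumerating a small joint distribution; the joint pmf below is made up purely for illustration:

```python
from fractions import Fraction as F

# A made-up joint distribution P[X = a, Y = b] over small supports.
joint = {
    (0, 1): F(1, 4), (0, 2): F(1, 8),
    (1, 1): F(1, 8), (1, 2): F(1, 2),
}

def expectation(pmf):
    """E[Z] = sum over values z of z * P[Z = z]."""
    return sum(z * p for z, p in pmf.items())

# Marginals of X and Y, and the distribution of X + Y, from the joint.
px, py, psum = {}, {}, {}
for (a, b), p in joint.items():
    px[a] = px.get(a, 0) + p
    py[b] = py.get(b, 0) + p
    psum[a + b] = psum.get(a + b, 0) + p

# E[X + Y] = E[X] + E[Y] (note: no independence needed)
assert expectation(psum) == expectation(px) + expectation(py)

# E[cX] = c * E[X] for a constant c
c = 3
pcx = {c * a: p for a, p in px.items()}
assert expectation(pcx) == c * expectation(px)
```

Note the proof (and the check) only uses the joint distribution, never independence.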
The Tail Sum Formula

Let X be a RV with values in {0, 1, 2, . . . , n}. We use “tail” to describe P[X ≥ i]. What does Σ_{i=1}^∞ P[X ≥ i] look like?
Small example: X only takes values {0, 1, 2}:
P[X ≥ 1] + P[X ≥ 2] = (P[X = 1] + P[X = 2]) + P[X = 2]
= P[X = 1] + 2 · P[X = 2]
= 0 · P[X = 0] + 1 · P[X = 1] + 2 · P[X = 2] = E[X].
The Tail Sum Formula

The tail sum formula states that:
E[X] = Σ_{i=1}^∞ P[X ≥ i]
Proof: Let pi = P[X = i]. Then:
Σ_{i=1}^∞ P[X ≥ i] = P[X ≥ 1] + P[X ≥ 2] + P[X ≥ 3] + . . .
= (p1 + p2 + p3 + . . .) + (p2 + p3 + . . .) + (p3 + . . .) + . . .
= 1 · p1 + 2 · p2 + 3 · p3 + . . . = E[X].
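The formula is easy to check numerically; here is a minimal sketch (the pmf below is an arbitrary choice, not from the lecture):

```python
from fractions import Fraction as F

# A small non-negative integer-valued RV (pmf made up for illustration).
pmf = {0: F(1, 6), 1: F(1, 3), 2: F(1, 3), 3: F(1, 6)}

# Definition: E[X] = sum_i i * P[X = i]
mean = sum(i * p for i, p in pmf.items())

# Tail sum: E[X] = sum_{i >= 1} P[X >= i]
n = max(pmf)
tail_sum = sum(sum(pmf.get(j, F(0)) for j in range(i, n + 1))
               for i in range(1, n + 1))

assert mean == tail_sum
```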
Expectation of a Geometric I

Let X ∼ Geometric(p).
P[X ≥ i] = (1 − p)^(i−1), since X ≥ i exactly when the first i − 1 trials all fail.
Apply the tail sum formula:
E[X] = Σ_{i=1}^∞ P[X ≥ i] = Σ_{i=1}^∞ (1 − p)^(i−1) = 1 / (1 − (1 − p)) = 1/p,
a geometric series with first term 1 and common ratio r = 1 − p.
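A two-line numeric check of the geometric-series step (p = 0.2 is an arbitrary choice, not from the lecture):

```python
# Partial sums of sum_{i=1}^{N} (1 - p)^(i - 1) approach 1/p as N grows.
p = 0.2
partial = sum((1 - p) ** (i - 1) for i in range(1, 200))
assert abs(partial - 1 / p) < 1e-9   # close to 1/p = 5
```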
Expectation of a Geometric II

Use memorylessness: the fact that the geometric RV “resets” after each trial. Two Cases:

- Win the 1st trial (probability p): X = 1.
- Fail the 1st trial (probability 1 − p): one trial is spent, and then the process resets, so day 2 looks identical to day 1 and the expected number of additional trials is again E[X].

Therefore:
E[X] = p · (1) + (1 − p)(1 + E[X])
Solve for E[X]:
E[X] = 1 + (1 − p) E[X], so p · E[X] = 1, giving E[X] = 1/p.
Expectation of a Geometric III

Lastly, an intuitive but non-rigorous idea. Let Xi be an indicator variable for success in a single trial. Recall trials are i.i.d., with Xi ∼ Bernoulli(p). Use linearity of expectation:
E[X1 + X2 + . . . + Xk] = E[X1] + E[X2] + . . . + E[Xk] = k · E[X1] = kp.
For the expected number of successes E[X1 + . . . + Xk] to equal 1, we need k = 1/p trials.
Coupon Collector I
(Note 19.) I’m out collecting trading cards. There are n types total. I get a random trading card every time I buy a cereal box. What is the expected number of boxes I need to buy in order to get all n trading cards? High level picture:
[Timeline sketch: marks the times at which we get the 1st, 2nd, 3rd, . . . card.]
Coupon Collector II

Let Xi = the number of boxes (“time”) between getting the (i − 1)st card and the ith card.
What is the dist. of X1? X1 = 1 always.
What is the dist. of X2? X2 ∼ Geom((n − 1)/n).
What is the dist. of X3? X3 ∼ Geom((n − 2)/n).
In general, Xi ∼ Geom((n − i + 1)/n).
Coupon Collector III

Let X = the total number of boxes needed to get all n cards, so X = X1 + X2 + . . . + Xn.
E[X] = E[X1 + X2 + . . . + Xn] = E[X1] + E[X2] + . . . + E[Xn]   (linearity)
Since Xi ∼ Geom((n − i + 1)/n), we have E[Xi] = n/(n − i + 1), so
E[X] = n/n + n/(n − 1) + . . . + n/1 = n (1 + 1/2 + . . . + 1/n) = n Σ_{k=1}^n 1/k.
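A quick simulation (not part of the lecture; n = 20 and the trial count are arbitrary choices) matches the exact answer n · (1 + 1/2 + . . . + 1/n):

```python
import random

def boxes_to_collect_all(n, rng):
    """Buy boxes until all n card types have been seen; return number bought."""
    seen, boxes = set(), 0
    while len(seen) < n:
        seen.add(rng.randrange(n))   # each box holds a uniformly random card
        boxes += 1
    return boxes

n = 20
exact = n * sum(1 / k for k in range(1, n + 1))   # n * H_n

rng = random.Random(0)   # fixed seed for reproducibility
trials = 20000
avg = sum(boxes_to_collect_all(n, rng) for _ in range(trials)) / trials

# The empirical average should sit close to the exact expectation.
assert abs(avg - exact) / exact < 0.05
```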
Aside: (Partial) Harmonic Series

Harmonic Series: Σ_{k=1}^∞ 1/k. This series diverges.
Approximation for Σ_{k=1}^n 1/k in terms of n? Compare the sum to an integral:
Σ_{k=1}^n 1/k ≈ ∫_1^n (1/x) dx = ln x |_1^n = ln n − ln 1 = ln n.
So for the coupon collector: E[X] ≈ n ln n.
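The integral approximation can be seen numerically; the gap H_n − ln n stays bounded (it approaches the Euler-Mascheroni constant, about 0.5772, a fact beyond what the slide needs):

```python
import math

# Compare the partial harmonic sum H_n to ln n for growing n.
for n in (10, 1000, 100000):
    h_n = sum(1 / k for k in range(1, n + 1))
    print(n, h_n, math.log(n), h_n - math.log(n))
```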
Break
A Bad Harmonic Series Joke... A countably infinite number of mathematicians walk into a bar. The first one orders a pint of beer, the second one orders a half pint, the third
a fourth of a pint, and so on. The bartender says ...
Expectation of a Poisson I

Recall the Poisson distribution: X takes values 0, 1, 2, . . ., with P[X = i] = (λ^i / i!) e^(−λ). We can use the definition to find E[X]!
E[X] = Σ_{i=0}^∞ i · (λ^i / i!) e^(−λ) = λ e^(−λ) Σ_{i=1}^∞ λ^(i−1)/(i − 1)! = λ e^(−λ) · e^λ = λ,
using the Taylor series for e^λ.
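The definition-based sum can be checked numerically by accumulating i · P[X = i] term by term (λ = 4 is an arbitrary choice; the recurrence P[X = i] = P[X = i − 1] · λ/i avoids computing large factorials):

```python
import math

lam = 4.0
p_i = math.exp(-lam)      # P[X = 0] = e^(-lam)
mean = 0.0
for i in range(1, 200):   # the pmf decays fast, so 200 terms are plenty
    p_i *= lam / i        # P[X = i] = P[X = i-1] * lam / i
    mean += i * p_i       # accumulate sum_i i * P[X = i]

assert abs(mean - lam) < 1e-9
```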
Expectation of a Poisson II

Optional but intuitive / non-rigorous approach: Think of a Poisson(λ) as a Bin(n, λ/n) distribution, taken as n → ∞. Let X ∼ Bin(n, λ/n). Then
E[X] = n · (λ/n) = λ.
Rest of Today: Functions of RVs!

Recall X from Lecture 19:
X = 1 wp 0.4; 1/2 wp 0.25; −1/2 wp 0.35.
Refresh your memory: What is X^2? Since (1/2)^2 = (−1/2)^2 = 1/4:
X^2 = 1 wp 0.4; 1/4 wp 0.25 + 0.35 = 0.6.
Example: Functions of RVs

X^2 = 1 wp 0.4; 1/4 wp 0.6.
What is E[X^2]?
E[X^2] = 1 · 0.4 + (1/4) · 0.6 = 0.55.
What is E[3X^2 − 5]? By linearity of expectation:
E[3X^2 − 5] = 3 · E[X^2] − 5 = 3(0.55) − 5 = −3.35.
In General: Functions of RVs

Let X be a RV with values in A.
Distribution of f(X): f(X) takes the value f(a) with probability P[X = a], for each a ∈ A (summing the probabilities of all a with the same value f(a)).
E[f(X)] = Σ_{a ∈ A} f(a) · P[X = a].
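This recipe is mechanical enough to code directly; here is a minimal sketch using the X from the lecture (1 wp 0.4, 1/2 wp 0.25, −1/2 wp 0.35):

```python
from fractions import Fraction as F

# The X from Lecture 19: 0.4 = 2/5, 0.25 = 1/4, 0.35 = 7/20.
pmf_x = {F(1): F(2, 5), F(1, 2): F(1, 4), F(-1, 2): F(7, 20)}

def f_pmf(pmf, f):
    """Distribution of f(X): merge values a that share the same f(a)."""
    out = {}
    for a, p in pmf.items():
        out[f(a)] = out.get(f(a), F(0)) + p
    return out

def expectation(pmf):
    return sum(v * p for v, p in pmf.items())

pmf_x2 = f_pmf(pmf_x, lambda a: a * a)
# Matches the slide: X^2 = 1 wp 0.4, 1/4 wp 0.6, and E[X^2] = 0.55.
assert pmf_x2 == {F(1): F(2, 5), F(1, 4): F(3, 5)}
assert expectation(pmf_x2) == F(11, 20)
```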
Square of a Bernoulli

Let X ∼ Bernoulli(p). Write out the distribution of X:
X = 1 wp p; 0 wp 1 − p.
What is X^2? Since 1^2 = 1 and 0^2 = 0, X^2 = X.
E[X^2] = E[X] = p.
Product of RVs

Let X be a RV with values in A. Let Y be a RV with values in B. XY is also a RV! What is its distribution? (Use the joint distribution!)
Exercise.
Product of Two Bernoullis
Let X ∼ Bernoulli(p1), and Y ∼ Bernoulli(p2). X and Y are independent. What is the distribution of XY ? What is E[XY ]?
Exercise.
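One way to sanity-check your answer to the exercise (values of p1, p2 are arbitrary; by independence the joint pmf factors):

```python
from itertools import product
from fractions import Fraction as F

# Independent Bernoullis: P[X = a, Y = b] = P[X = a] * P[Y = b].
p1, p2 = F(1, 3), F(1, 4)
px = {1: p1, 0: 1 - p1}
py = {1: p2, 0: 1 - p2}

# Build the distribution of XY from the joint distribution.
pmf_xy = {}
for (a, pa), (b, pb) in product(px.items(), py.items()):
    pmf_xy[a * b] = pmf_xy.get(a * b, F(0)) + pa * pb

# XY = 1 only when both are 1, so XY ~ Bernoulli(p1 * p2).
assert pmf_xy == {1: p1 * p2, 0: 1 - p1 * p2}
assert sum(v * p for v, p in pmf_xy.items()) == p1 * p2   # E[XY]
```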
Square of a Binomial I

Let X ∼ Bin(n, p). Decompose into indicators Xi ∼ Bernoulli(p):
X = X1 + X2 + . . . + Xn.
E[X] = E[X1 + X2 + . . . + Xn] = E[X1] + E[X2] + . . . + E[Xn] = np.
Square of a Binomial II

Recall E[Xi^2] = p (since Xi^2 = Xi), and E[XiXj] = p^2 for i ≠ j (by independence).
E[X^2] = E[(X1 + X2 + . . . + Xn)^2]
= E[(X1^2 + X2^2 + . . . + Xn^2) + (X1X2 + X1X3 + . . .)]
There are n terms of the form Xi^2, and n(n − 1) cross terms XiXj with i ≠ j, so:
E[X^2] = n · E[X1^2] + n(n − 1) · E[X1X2] = np + n(n − 1)p^2.
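The expansion can be verified against the Bin(n, p) pmf directly (n = 10, p = 1/3 are arbitrary choices; exact arithmetic via fractions avoids rounding):

```python
from math import comb
from fractions import Fraction as F

n, p = 10, F(1, 3)

# Exact E[X^2] computed straight from the Bin(n, p) pmf.
e_x2 = sum(k * k * comb(n, k) * p ** k * (1 - p) ** (n - k)
           for k in range(n + 1))

# The formula derived on the slide: E[X^2] = np + n(n - 1)p^2.
assert e_x2 == n * p + n * (n - 1) * p ** 2
```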
Summary

Today:

- Proof of linearity of expectation: did not use independence, but did use the joint distribution.
- Tail sum formula for non-negative int.-valued RVs!
- Coupon Collector: break the problem down into a sum of geometrics.
- Expectation of a function of an RV: can apply the definition and linearity of expectation (after expanding) as well!