Probability and Statistics for Computer Science (PowerPoint PPT Presentation)



SLIDE 1

Probability and Statistics for Computer Science

Who discovered this?

e = lim_{n→∞} (1 + 1/n)^n

Hongye Liu, Teaching Assistant Prof, CS361, UIUC, 09.24.2020. Credit: wikipedia

SLIDE 2

What is the number a_{n,k} in

(p + q)^n = Σ_{k=0}^{n} a_{n,k} p^k q^{n−k} ?

The coefficient is a_{n,k} = C(n, k), the binomial coefficient. When p + q = 1, this gives

Σ_{k=0}^{n} C(n, k) p^k (1 − p)^{n−k} = 1
SLIDE 3

How many paths lead from A to B?

(Hand-drawn grid sketch: counting lattice paths from A to B.)

SLIDE 4

Last time

• Random variable, R
• Review of expectations
• Markov's inequality
• Chebyshev's inequality
• The weak law of large numbers

SLIDE 5

Proof of the weak law of large numbers

Apply Chebyshev's inequality to the sample mean X̄ and substitute E[X̄] = E[X] and var[X̄] = var[X]/N:

P(|X̄ − E[X]| ≥ ε) ≤ var[X̄]/ε² = var[X]/(Nε²)

Therefore

lim_{N→∞} P(|X̄ − E[X]| ≥ ε) = 0
SLIDE 6

Applications of the weak law of large numbers

The law of large numbers justifies using simulations (instead of calculation) to estimate the expected values of random variables.

The law of large numbers also justifies using histograms of large random samples to approximate the probability distribution function P(x); see the proof on pg. 353 of the textbook by DeGroot et al.

lim_{N→∞} P(|X̄ − E[X]| ≥ ε) = 0
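As a concrete illustration of the first point, here is a minimal Python sketch (added for illustration, not part of the original slides; the function name is mine): averaging many simulated rolls of a fair die estimates E[X], and by the weak law the estimate concentrates near 3.5 as the sample size grows.

```python
import random

def simulate_mean(n_samples, seed=0):
    """Estimate E[X] for a fair six-sided die by averaging n_samples rolls."""
    rng = random.Random(seed)
    return sum(rng.randint(1, 6) for _ in range(n_samples)) / n_samples

# By the weak law of large numbers, the estimate approaches E[X] = 3.5.
estimate = simulate_mean(100_000)
```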

SLIDE 7

Histogram of large random IID samples approximates the probability distribution

The law of large numbers justifies using histograms to approximate the probability distribution. Given N IID random variables X₁, …, X_N, define the indicator Yᵢ = 1 if c₁ ≤ Xᵢ < c₂ and Yᵢ = 0 otherwise. As we know for an indicator function,

E[Yᵢ] = P(c₁ ≤ Xᵢ < c₂) = P(c₁ ≤ X < c₂)

According to the law of large numbers, the histogram bin height

Ȳ = (1/N) Σ_{i=1}^{N} Yᵢ → E[Yᵢ]  as N → ∞
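The histogram claim can be checked with a short Python sketch (illustrative, not from the slides): the fraction of die rolls landing in a bin, i.e. the sample mean of the indicator Yᵢ, approaches the bin's probability.

```python
import random

def bin_fraction(c1, c2, n_samples, seed=1):
    """Mean of the indicator Y_i = 1{c1 <= X_i < c2} over n_samples fair-die
    rolls; by the law of large numbers this approaches P(c1 <= X < c2)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_samples) if c1 <= rng.randint(1, 6) < c2)
    return hits / n_samples

# P(1 <= X < 3) = P(X in {1, 2}) = 1/3 for a fair die.
frac = bin_fraction(1, 3, 200_000)
```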
SLIDE 8

Objectives

• Bernoulli distribution
• Binomial distribution (Bernoulli trials)
• Geometric distribution
• Discrete uniform distribution
• Continuous random variables

SLIDE 9

Random variables

A random variable maps all outcomes ω to numbers: X(ω). It's a function!

Example (Bernoulli): X(ω) = 1 when ω is head, X(ω) = 0 when ω is tail.

SLIDE 10

Bernoulli Random Variable

X(ω) = 1 if ω = event A (head); X(ω) = 0 otherwise (tail)

P(A) = ?    What is P(X = k)?

SLIDE 11

Bernoulli Distribution

E[X] = ?    var[X] = ?

E[X] = Σₓ x·p(x) = 1·p + 0·(1 − p) = p

var[X] = E[X²] − E[X]² = 1²·p + 0²·(1 − p) − p² = p − p² = p(1 − p)
SLIDE 12

Bernoulli Distribution

E[X] = p    var[X] = p(1 − p)
SLIDE 13

Bernoulli distribution

A random variable X is Bernoulli if it takes on two values, 0 and 1, such that P(X = 1) = p and P(X = 0) = 1 − p.

E[X] = p
var[X] = p(1 − p)

Jacob Bernoulli (1654-1705). Credit: wikipedia
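These two formulas can be sanity-checked empirically with a small Python sketch (added for illustration; the function name is mine): the sample mean and sample variance of many Bernoulli(p) draws approach p and p(1 − p).

```python
import random

def bernoulli_stats(p, n_samples, seed=2):
    """Draw n_samples Bernoulli(p) values; return (sample mean, sample variance)."""
    rng = random.Random(seed)
    xs = [1 if rng.random() < p else 0 for _ in range(n_samples)]
    mean = sum(xs) / n_samples
    var = sum((x - mean) ** 2 for x in xs) / n_samples
    return mean, var

# For p = 0.3 the slide's formulas give E[X] = 0.3 and var[X] = 0.3 * 0.7 = 0.21.
m, v = bernoulli_stats(0.3, 100_000)
```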
SLIDE 14

Bernoulli distribution

Examples:
• Tossing a biased (or fair) coin
• Making a free throw
• Rolling a six-sided die and checking if it shows 6
• Any indicator function of a random variable:

I_A = 1 if event A happens, 0 otherwise

E[I_A] = 1·P(A) + 0·(1 − P(A)) = P(Event A)
SLIDE 15

Binomial Distribution

A binomial RV X_S is the sum of N independent Bernoulli RVs:

X_S = Σ_{i=1}^{N} Xᵢ, where Xᵢ(ω) = 1 if ω is head and 0 otherwise.

What is the range of X_S? (It is {0, 1, …, N}.)
SLIDE 16

Binomial Distribution

A binomial RV X_S is the sum of N independent Bernoulli RVs. Toss a biased coin N times: how many heads?

P(X_S = k) = C(N, k) · p^k (1 − p)^{N−k},  for integer k ∈ [0, N]
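This pmf is easy to compute directly; the Python sketch below (illustrative, not from the slides) uses the standard-library binomial coefficient and checks that the pmf sums to 1, which is exactly the binomial-theorem identity from Slide 2.

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): C(n, k) ways to place the k successes,
    each such sequence having probability p**k * (1 - p)**(n - k)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# The pmf sums to 1 over k = 0..n (the binomial theorem with p + q = 1).
total = sum(binomial_pmf(k, 10, 0.4) for k in range(11))
```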
SLIDE 17

(Hand-drawn sketch: among N positions, choose which k are heads; there are C(N, k) ways.)
SLIDE 18

Expectations of the Binomial distribution

A discrete random variable X is binomial if

P(X = k) = C(N, k) p^k (1 − p)^{N−k}  for integer 0 ≤ k ≤ N

E[X] = Np and var[X] = Np(1 − p)

E[X] = Σₖ k·P(X = k); when N = 1 this reduces to the Bernoulli distribution.
SLIDE 19

X_S = Σ_{i=1}^{N} Xᵢ

E[X_S] = E[Σ_{i=1}^{N} Xᵢ] = Σ_{i=1}^{N} E[Xᵢ]    (linearity of expectation)

Each Xᵢ is a Bernoulli RV with E[Xᵢ] = p, so

E[X_S] = Σ_{i=1}^{N} p = Np
SLIDE 20

Recall that var[X + Y] = var[X] + var[Y] for independent X, Y.

var[X_S] = E[(X_S − E[X_S])²] = var[Σ_{i=1}^{N} Xᵢ]

The Xᵢ are IID RVs (independent implies uncorrelated), so the variances add:

var[X_S] = Σ_{i=1}^{N} var[Xᵢ] = N·p(1 − p)

since each identical Bernoulli Xᵢ has var[Xᵢ] = p(1 − p).

SLIDE 21

Binomial distribution

(Plot of the binomial pmf for p = 0.5. Credit: Prof. Grinstead)

(p + q)^N = Σ_{k=0}^{N} C(N, k) p^k q^{N−k}

If p + q = 1, then Σ_{k=0}^{N} C(N, k) p^k (1 − p)^{N−k} = 1.
SLIDE 22

Binomial distribution: die example

Let X be the number of sixes in 36 rolls of a fair six-sided die. What is P(X = k) for k = 5, 6, 7? Calculate E[X] and var[X].

N = 36, p = 1/6

P(X = k) = C(36, k) · (1/6)^k (5/6)^{36−k}

E[X] = Np = 36 · (1/6) = 6
var[X] = Np(1 − p) = 36 · (1/6) · (5/6) = 5
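The remaining part of the example, P(X = k) for k = 5, 6, 7, can be evaluated with a short Python sketch (illustrative; helper name is mine). It also confirms E[X] = 6 and var[X] = 5.

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n, p = 36, 1 / 6
probs = {k: binomial_pmf(k, n, p) for k in (5, 6, 7)}
mean = n * p                  # E[X] = Np = 6
variance = n * p * (1 - p)    # var[X] = Np(1 - p) = 5
```

Note that k = 6 is the mode here, consistent with the mean being 6.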
SLIDE 23

Geometric Distribution

Toss a coin until the first head (H) appears:

P(H the 1st time) = p
P(H the 2nd time) = (1 − p) · p    (T, H)
P(H the 3rd time) = (1 − p)² · p    (T, T, H)
…
P(first H at the k-th time) = (1 − p)^{k−1} · p,  k ≥ 1
SLIDE 24

Geometric distribution

A discrete random variable X is geometric if

P(X = k) = (1 − p)^{k−1} p,  k ≥ 1

Expected value and variance:

E[X] = 1/p and var[X] = (1 − p)/p²

Outcome sequences: H, TH, TTH, TTTH, TTTTH, TTTTTH, …
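A geometric variable is straightforward to simulate by running Bernoulli trials until the first success; the Python sketch below (illustrative, names are mine) does that and checks the sample mean against E[X] = 1/p.

```python
import random

def geometric_pmf(k, p):
    """P(X = k) = (1 - p)**(k - 1) * p: k - 1 failures, then the first success."""
    return (1 - p) ** (k - 1) * p

def sample_geometric(p, rng):
    """Run Bernoulli(p) trials until the first success; return the trial count."""
    k = 1
    while rng.random() >= p:
        k += 1
    return k

rng = random.Random(3)
draws = [sample_geometric(0.5, rng) for _ in range(100_000)]
sample_mean = sum(draws) / len(draws)   # should approach E[X] = 1/p = 2
```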

SLIDE 25

Geometric distribution

P(X = k) = (1 − p)^{k−1} p,  k ≥ 1

(Plots of the geometric pmf for p = 0.5 and p = 0.2. Credit: Prof. Grinstead)
SLIDE 26

Geometric distribution

Examples:
• How many rolls of a six-sided die will it take to see the first 6?
• How many Bernoulli trials must be done before the first 1?
• How many experiments are needed to have the first success?

It plays an important role in the theory of queues.
SLIDE 27

Derivation of the geometric expected value

E[X] = Σ_{k=1}^{∞} k·P(X = k) = Σ_{k=1}^{∞} k(1 − p)^{k−1} p = (p/(1 − p)) Σ_{k=1}^{∞} k(1 − p)^k = 1/p
SLIDE 28

SLIDE 29

SLIDE 30

(Slides 28-30 repeat the derivation of E[X] from Slide 27 step by step, introducing the power series used on Slide 31.)
SLIDE 31

Derivation of the geometric expected value

✺ For |x| < 1 we have this power series:

Σ_{n=1}^{∞} n·x^n = x/(1 − x)²

Substituting x = 1 − p into E[X] = (p/(1 − p)) Σ_{k=1}^{∞} k(1 − p)^k gives

E[X] = (p/(1 − p)) · (1 − p)/p² = 1/p
SLIDE 32

Derivation of the power series

Σ_{n=1}^{∞} n·x^n = x/(1 − x)²,  |x| < 1

Proof: Start from the geometric series S(x) = Σ_{n=0}^{∞} x^n = 1/(1 − x) for |x| < 1. Differentiating term by term gives S′(x) = Σ_{n=1}^{∞} n·x^{n−1} = 1/(1 − x)², and multiplying by x yields Σ_{n=1}^{∞} n·x^n = x/(1 − x)².
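As a quick numeric sanity check of this identity (a Python sketch added for illustration, not part of the slides), a long partial sum of the series agrees with the closed form; x = 5/6 corresponds to 1 − p for the die example with p = 1/6.

```python
def power_series_sum(x, terms=10_000):
    """Partial sum of sum_{n>=1} n * x**n, which converges for |x| < 1."""
    return sum(n * x**n for n in range(1, terms + 1))

x = 5 / 6                        # e.g. x = 1 - p with p = 1/6
closed_form = x / (1 - x) ** 2   # the identity gives 30 here
partial = power_series_sum(x)
```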
SLIDE 33

Geometric distribution: die example

Let X be the number of rolls of a fair six-sided die needed to see the first 6. What is P(X = k) for k = 1, 2? Calculate E[X] and var[X].

P(X = k) = (1 − p)^{k−1} p,  k ≥ 1, with p = 1/6

P(X = 1) = p = 1/6
P(X = 2) = (1 − p)·p = (5/6)·(1/6) = 5/36

E[X] = 1/p = 6
var[X] = (1 − p)/p² = (5/6)/(1/36) = 30
SLIDE 34

Betting brainteaser

What would you rather bet on?
• How many rolls of a fair six-sided die will it take to see the first 6?
• How many sixes will appear in 36 rolls of a fair six-sided die?

Why?
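A simulation makes the comparison concrete (a Python sketch added for illustration, not part of the slides): both quantities have mean 6, but by the formulas on Slides 22 and 33 the geometric count has variance 30 while the binomial count has variance 5, so the first bet is far riskier.

```python
import random

def rolls_until_six(rng):
    """Rolls of a fair die until the first 6: geometric with p = 1/6."""
    k = 1
    while rng.randint(1, 6) != 6:
        k += 1
    return k

def sixes_in_36(rng):
    """Number of 6s in 36 rolls: binomial with N = 36, p = 1/6."""
    return sum(1 for _ in range(36) if rng.randint(1, 6) == 6)

def mean_var(xs):
    m = sum(xs) / len(xs)
    return m, sum((x - m) ** 2 for x in xs) / len(xs)

rng = random.Random(4)
geo_mean, geo_var = mean_var([rolls_until_six(rng) for _ in range(50_000)])
bin_mean, bin_var = mean_var([sixes_in_36(rng) for _ in range(50_000)])
# Both means are near 6, but geo_var is near 30 while bin_var is near 5.
```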
SLIDE 35

Multinomial distribution

A discrete random variable X is multinomial for the event of throwing a k-sided die N times and getting value 1 a total of n₁ times (X₁ = n₁), value 2 n₂ times, …, value k n_k times:

P(X₁ = n₁, X₂ = n₂, …, X_k = n_k) = (N! / (n₁! n₂! … n_k!)) · p₁^{n₁} p₂^{n₂} … p_k^{n_k}

where N = n₁ + n₂ + … + n_k
SLIDE 36

Multinomial distribution

P(X₁ = n₁, X₂ = n₂, …, X_k = n_k) = (N! / (n₁! n₂! … n_k!)) · p₁^{n₁} p₂^{n₂} … p_k^{n_k},  where N = n₁ + … + n_k

Example: the number of distinct arrangements of the letters of ILLINOIS (8 letters: 3 I's, 2 L's, 1 N, 1 O, 1 S) uses the same coefficient:

8! / (3!·2!·1!·1!·1!)
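The multinomial coefficient is simple to compute with the standard library; the sketch below (illustrative, names are mine) evaluates the ILLINOIS count from the slide.

```python
from math import factorial

def multinomial_coeff(counts):
    """N! / (n1! n2! ... nk!) with N = sum(counts)."""
    result = factorial(sum(counts))
    for c in counts:
        result //= factorial(c)
    return result

# Letters of ILLINOIS: 3 I's, 2 L's, 1 N, 1 O, 1 S (8 letters total).
arrangements = multinomial_coeff([3, 2, 1, 1, 1])   # 8!/(3!2!1!1!1!)
```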
SLIDE 37

Multinomial distribution

Examples:
• If we roll a six-sided die N times, how many of each value will we see?
• What are the counts of N independent and identically distributed trials?

This is very widely used in genetics.
SLIDE 38

Multinomial distribution: die example

What is the probability of seeing 1 one, 2 twos, 3 threes, 4 fours, 5 fives and 0 sixes in 15 rolls of a fair six-sided die?
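The example above plugs directly into the multinomial pmf; this Python sketch (illustrative, not part of the slides) evaluates it with all pᵢ = 1/6.

```python
from math import factorial

def multinomial_prob(counts, probs):
    """Multinomial pmf: N!/(n1!...nk!) * p1**n1 * ... * pk**nk."""
    coeff = factorial(sum(counts))
    for c in counts:
        coeff //= factorial(c)
    result = float(coeff)
    for c, p in zip(counts, probs):
        result *= p**c
    return result

# 1 one, 2 twos, 3 threes, 4 fours, 5 fives, 0 sixes in 15 rolls of a fair die.
prob = multinomial_prob([1, 2, 3, 4, 5, 0], [1 / 6] * 6)
```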
SLIDE 39

Discrete uniform distribution

A discrete random variable X is uniform if it takes k different values x₁, …, x_k and

P(X = xᵢ) = 1/k  for all xᵢ that X can take

For example: rolling a fair k-sided die, or tossing a fair coin (k = 2).
SLIDE 40

Discrete uniform distribution

Expectation of a discrete random variable X that takes k different values uniformly:

E[X] = (1/k) Σ_{i=1}^{k} xᵢ

Variance of a uniformly distributed random variable X:

var[X] = (1/k) Σ_{i=1}^{k} (xᵢ − E[X])²
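Both formulas are one-liners in Python; the sketch below (illustrative, names are mine) applies them to a fair six-sided die, giving E[X] = 3.5 and var[X] = 35/12.

```python
def uniform_mean_var(values):
    """E[X] and var[X] for a discrete uniform RV over the given k values."""
    values = list(values)
    k = len(values)
    mean = sum(values) / k
    var = sum((x - mean) ** 2 for x in values) / k
    return mean, var

# Fair six-sided die: E[X] = 3.5, var[X] = 35/12.
mean, var = uniform_mean_var(range(1, 7))
```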

SLIDE 41

SLIDE 42

Example of a continuous random variable

The spinner: the sample space of all outcomes θ ∈ (0, 2π] is not countable.

SLIDE 43

• What is the probability P(θ = θ₀), where θ₀ is a constant in (0, 2π]? It is essentially zero.

• What is the probability P(θ ≤ θ₀)? It is θ₀/(2π).
SLIDE 44

Probability density function (pdf)

For a continuous random variable X, the probability that X = x is essentially zero for all (or most) x, so we can't define P(X = x).

Instead, we define the probability density function (pdf) over an infinitesimally small interval dx:

p(x)dx = P(X ∈ [x, x + dx])

For a < b:

∫_a^b p(x)dx = P(X ∈ [a, b])

SLIDE 45

Properties of the probability density function

p(x) resembles the probability function of discrete random variables in that p(x) ≥ 0 for all x, and the probability of X taking all possible values is 1:

∫_{−∞}^{∞} p(x)dx = 1

SLIDE 46

Properties of the probability density function

p(x) differs from the probability distribution function for a discrete random variable in that p(x) is not the probability that X = x; in fact, p(x) can exceed 1.

SLIDE 47

Probability density function: spinner

Suppose the spinner has an equal chance of stopping at any position. What's the pdf of the angle θ of the spin position?

p(θ) = c if θ ∈ (0, 2π], and 0 otherwise

For this function to be a pdf, ∫_{−∞}^{∞} p(θ)dθ = 1. Then c = 1/(2π).

SLIDE 48

Probability density function: spinner

What is the probability that the spin angle θ is within [π/12, π/7]?
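Using the uniform density p(θ) = 1/(2π) from the previous slide, the answer is just the interval length divided by 2π; a small Python sketch (illustrative, not from the slides) computes it.

```python
from math import pi

def spinner_prob(a, b):
    """P(a <= theta <= b) for the uniform spinner pdf p(theta) = 1/(2*pi)
    on (0, 2*pi]: the constant density integrated over [a, b]."""
    return (b - a) / (2 * pi)

# P(theta in [pi/12, pi/7]) = (1/7 - 1/12)/2 = 5/168.
prob = spinner_prob(pi / 12, pi / 7)
```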
SLIDE 49

Q: Probability density function: spinner

What is the constant c, given that the spin angle θ has the following pdf?

(Plot of p(θ): residue of the figure; the support appears to be (0, π] with maximum height c.)

• A. 1
• B. 1/π
• C. 2/π
• D. 4/π
• E. 1/2π
SLIDE 50

Expectation of continuous variables

Expected value of a continuous random variable X:

E[X] = ∫_{−∞}^{∞} x·p(x)dx

Expected value of a function Y = f(X) of a continuous random variable:

E[Y] = E[f(X)] = ∫_{−∞}^{∞} f(x)·p(x)dx

(Here p(x) acts as the weight on each value x.)
SLIDE 51

Probability density function: spinner

Given the probability density of the spin angle θ:

p(θ) = 1/(2π) if θ ∈ (0, 2π], and 0 otherwise

The expected value of the spin angle is

E[θ] = ∫_{−∞}^{∞} θ·p(θ)dθ
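Evaluating the integral gives E[θ] = (1/(2π))·(2π)²/2 = π; a small numeric-integration sketch in Python (illustrative, not part of the slides) confirms this with a midpoint Riemann sum.

```python
from math import pi

def expected_angle(n_steps=100_000):
    """Midpoint-rule approximation of E[theta], the integral over (0, 2*pi]
    of theta * (1/(2*pi)) d(theta); the exact value is pi."""
    d = 2 * pi / n_steps
    density = 1 / (2 * pi)
    return sum((i + 0.5) * d * density * d for i in range(n_steps))

e_theta = expected_angle()
```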

SLIDE 52

Properties of expectation of continuous random variables

The linearity of expected value holds for continuous random variables. The other properties that we derived for variance and covariance also hold for continuous random variables.

SLIDE 53

Q. Suppose a continuous variable has pdf

p(x) = 2(1 − x) if x ∈ [0, 1], and 0 otherwise

What is E[X] = ∫_{−∞}^{∞} x·p(x)dx?

• A. 1/2
• B. 1/3
• C. 1/4
• D. 1
• E. 2/3

SLIDE 54

Variance of a continuous variable

SLIDE 55

Assignments

Read Chapter 5 of the textbook. Next time: more classic known probability distributions.

SLIDE 56

Additional References

Charles M. Grinstead and J. Laurie Snell, "Introduction to Probability"

Morris H. DeGroot and Mark J. Schervish, "Probability and Statistics"

SLIDE 57

See you next time!