SLIDE 1
Probability and Statistics for Computer Science

“I have now used each of the terms mean, variance, covariance and standard deviation in two slightly different ways.” ---Prof. Forsythe

Hongye Liu, Teaching Assistant Prof, CS361, UIUC, 9.17.2020 Credit: wikipedia
SLIDE 2

No pairs in a hand of 5 from 52 cards

The probability of drawing hands of 5 cards that have no pairs (drawing without replacement). Consider that order doesn't matter.

|E| = ? Ordered draws with no repeated rank: 52 × 48 × 44 × 40 × 36, then divide by 5! since order doesn't matter. Equivalently, C(13,5) · 4^5.

|Ω| = ?

SLIDE 3

No pairs in a hand of 5 from 52 cards

The probability of drawing hands of 5 cards that have no pairs (drawing without replacement). Consider that order doesn't matter.

|E| = ? C(13,5) · 4^5: choose the 5 ranks (numbers) out of 13, then decide the suit for each of the 5 cards.

|Ω| = ? C(52,5).
SLIDE 4

P(E) = |E| / |Ω|

Make sure |E| and |Ω| are counted in the same sample space (both ordered or both unordered).
SLIDE 5
(Board work: i.e., for the one card that happens to be a heart there are 13 choices to pick a heart, then 12 choices to pick the next card, ... out of the 4 suits.)
SLIDE 6

Last time

* Random variable X(ω): ω → x
* Definition, e.g. X(ω) = 1 if ω = head, X(ω) = 0 if ω = tail
* Probability distribution P(X = x); PDF, CDF
* Conditional probability distribution P(X|Y): P({ω : X(ω) = x} | Y = y)
SLIDE 7

The bet: win 50 dollars if ω = head, lose 50 dollars if ω = tail.

X(ω) = 50 if ω = head
X(ω) = −50 if ω = tail

P(X): P(X = 50) = 1/2, P(X = −50) = 1/2 (two bars, at x = −50 and x = 50).

Summarize!!

SLIDE 8

Objectives

* Random variable (R.V.)
* Expected value: definition, properties, E[f(X)]
* Variance & covariance
* Markov's Inequality

SLIDE 9

Expected value

The expected value (or expectation) of a random variable X is a weighted sum of the values X can take:

E[X] = Σ_x x P(x)    (discrete case)

where P(x) = P(X = x) and the sum runs over all x.

SLIDE 10

Expected value

The expected value of a random variable X is a weighted sum of the values X can take:

E[X] = Σ_x x P(x)

Note: each P(x) ≤ 1 and Σ_x P(x) = 1; the sum runs over all values x that X can take.
SLIDE 11

Expected value: profit

A company has a project that has probability p of earning 10 million and probability 1 − p of losing 10 million. Let X be the return of the project.

P(X = 10) = p, P(X = −10) = 1 − p

E[X] = 10 · p + (−10) · (1 − p) = 20p − 10

(Plot of E[X] against p: a line from −10 at p = 0 to 10 at p = 1; E[X] > 0 when p > 1/2.)
SLIDE 12

Cookies

Solve at home. Items: cookies at $1 each and $2 each.

A) Randomly draw 1 from the 4 items. Expected value = ?

B) Randomly draw twice with replacement (the draws are independent); if the two draws are the same, you get the prize. Expected value = ?
SLIDE 13

Linearity of Expectation

For random variables X and Y and constants k, c:

Additivity:       E[X + Y] = E[X] + E[Y]
Scaling property: E[kX] = kE[X]
And:              E[kX + c] = kE[X] + c
SLIDE 14

Linearity of Expectation

Proof of the additive property E[X + Y] = E[X] + E[Y]. Let S = X + Y:

E[X + Y] = E[S] = Σ_s s P(s)
         = Σ_s s Σ_{(x,y): x+y=s} P(x, y)
         = Σ_{x,y} (x + y) P(x, y)
SLIDE 15

E[X + Y] = E[S] = Σ_s s P(s)
         = Σ_s s Σ_{(x,y): x+y=s} P(x, y)
         = Σ_{x,y} (x + y) P(x, y)
         = Σ_{x,y} x P(x, y) + Σ_{x,y} y P(x, y)
         = Σ_x x Σ_y P(x, y) + Σ_y y Σ_x P(x, y)
         = Σ_x x P(x) + Σ_y y P(y)
         = E[X] + E[Y]

SLIDE 16
Q. What's the value?

What is E[E[X]+1]?

* A. E[X]+1  B. 1  C. 0

Answer: A. E[X] is a constant, so E[E[X] + 1] = E[E[X]] + 1 = E[X] + 1 (for a constant c, E[c] = Σ_x c P(x) = c).

SLIDE 17

Expected value of a function of X

If f is a function of a random variable X, then Y = f(X) is a random variable too.

The expected value of Y = f(X) is

E[Y] = E[f(X)] = Σ_x f(x) P(x)

SLIDE 18

The exchange of variable theorem

f(X) = Y

* If each x → one distinct y (one-to-one): P(y) = P(X = x), so
  E[Y] = Σ_y y P(y) = Σ_x f(x) P(x)
* If multiple x's → one y:
  E[Y] = Σ_y y P(y) = Σ_y y Σ_{x: f(x)=y} P(x) = Σ_x f(x) P(x)
SLIDE 19

Expected time of cat

A cat moves with random constant speed V, either 5 mile/hr or 20 mile/hr with equal probability. What's the expected time for it to travel 50 miles?

T = 50/V = f(V)

E[T] = (50/5) · P(V = 5) + (50/20) · P(V = 20) = 10 · 1/2 + 2.5 · 1/2 = 6.25 hr
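The computation can be reproduced directly from E[f(V)] = Σ_v f(v) P(v); the sketch below (names mine) also shows that E[50/V] is not the same as 50/E[V], which is the usual trap:

```python
P_V = {5: 0.5, 20: 0.5}          # speed distribution, miles per hour

E_T = sum((50 / v) * p for v, p in P_V.items())   # E[T] = E[50/V]
E_V = sum(v * p for v, p in P_V.items())

print(E_T)       # 6.25 hours
print(50 / E_V)  # 4.0 hours -- not the same quantity!
assert E_T != 50 / E_V
```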

SLIDE 20

Q: Is this statement true?

If there exists a constant a such that P(X ≥ a) = 1, then E[X] ≥ a. It is:

* A. True
* B. False

Answer: A.
E[X] = Σ_x x P(x) = Σ_{x ≥ a} x P(x) ≥ Σ_{x ≥ a} a P(x) = a Σ_{x ≥ a} P(x) = a · 1 = a
SLIDE 21

Variance and standard deviation

The variance of a random variable X is

var[X] = E[(X − E[X])^2]

The standard deviation of a random variable X is

std[X] = √var[X]

(Here f(X) = (X − E[X])^2; for a data set {x_i} generated by X, var({x_i}) = (1/N) Σ_i (x_i − mean({x_i}))^2 = σ^2.)
SLIDE 22

Properties of variance

For random variable X and constant k:

var[X] ≥ 0
var[kX] = k^2 var[X]

SLIDE 23

A neater expression for variance

Variance of random variable X is defined as:

var[X] = E[(X − E[X])^2]

It's the same as:

var[X] = E[X^2] − E[X]^2

(Let Y = f(X) = (X − E[X])^2; then var[X] = E[f(X)] = E[Y] = Σ_y y P(y) = Σ_x (x − E[X])^2 P(x).)

SLIDE 24

(Board work: the variance of the coin-bet example, X(ω) = 50 if ω = head and −50 if ω = tail, computed as E[(X − E[X])^2].)
SLIDE 25

A neater expression for variance

var[X] = E[(X − E[X])^2]

SLIDE 26

A neater expression for variance

var[X] = E[(X − E[X])^2]
var[X] = E[(X − µ)^2] where µ = E[X] (µ is a number)
       = E[X^2 − 2µX + µ^2]
       = E[X^2] − 2µ E[X] + E[µ^2]
       = E[X^2] − 2(E[X])^2 + (E[X])^2
       = E[X^2] − (E[X])^2

SLIDE 27

A neater expression for variance

var[X] = E[(X − E[X])^2]
var[X] = E[(X − µ)^2] where µ = E[X]
       = E[X^2 − 2Xµ + µ^2]
       = E[X^2] − E[X]^2
SLIDE 28

Variance: the profit example

For the profit example, what is the variance of the return? We know E[X] = 20p − 10.

var[X] = E[X^2] − (E[X])^2
       = (10^2 · p + (−10)^2 · (1 − p)) − (20p − 10)^2
       = 100 − (400p^2 − 400p + 100)
       = 400p(1 − p)

where E[X^2] = Σ_x x^2 P(x) = 100p + 100(1 − p) = 100.
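The closed form 400p(1 − p) can be checked against a direct E[X^2] − (E[X])^2 computation; a sketch (the function name is mine):

```python
def var_return(p):
    # direct computation for X = +10 w.p. p, -10 w.p. 1-p
    E_X  = 10 * p + (-10) * (1 - p)
    E_X2 = 10**2 * p + (-10)**2 * (1 - p)
    return E_X2 - E_X**2

for p in (0.0, 0.25, 0.5, 1.0):
    assert abs(var_return(p) - 400 * p * (1 - p)) < 1e-9

print(var_return(0.5))  # 100.0 -- uncertainty is largest at p = 1/2
```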
SLIDE 29

Motivation for covariance

Study the relationship between random variables.

Note that it's the un-normalized correlation.

Applications include the fire control of radar and communicating in the presence of noise.

SLIDE 30

Covariance

The covariance of random variables X and Y is

cov(X, Y) = E[(X − E[X])(Y − E[Y])]

Note that

cov(X, X) = E[(X − E[X])^2] = var[X]
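A sketch computing covariance from a small joint distribution (the values are arbitrary, chosen for illustration) and checking that cov(X, X) = var[X]:

```python
# joint distribution P(x, y); illustrative values only
P = {(0, 0): 0.4, (1, 1): 0.4, (0, 1): 0.1, (1, 0): 0.1}

E_X = sum(x * p for (x, y), p in P.items())
E_Y = sum(y * p for (x, y), p in P.items())

cov_XY = sum((x - E_X) * (y - E_Y) * p for (x, y), p in P.items())
var_X  = sum((x - E_X) ** 2 * p for (x, y), p in P.items())
cov_XX = sum((x - E_X) * (x - E_X) * p for (x, y), p in P.items())

assert abs(cov_XX - var_X) < 1e-12   # cov(X, X) = var[X]
print(cov_XY)                        # positive: X and Y tend to move together
```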

SLIDE 31

A neater form for covariance

A neater expression for covariance (similar derivation as for variance):

cov(X, Y) = E[XY] − E[X]E[Y]

(Expanded: cov(X, Y) = Σ_{x,y} (x − E[X])(y − E[Y]) P(x, y).)

SLIDE 32

Correlation coefficient is normalized covariance

The correlation coefficient is

corr(X, Y) = cov(X, Y) / (σ_X σ_Y)

When X, Y take on values with equal probability to generate data sets {(x, y)}, the correlation coefficient will be as seen in Chapter 2.

SLIDE 33

Correlation coefficient is normalized covariance

The correlation coefficient can also be written as:

corr(X, Y) = (E[XY] − E[X]E[Y]) / (σ_X σ_Y)

SLIDE 34

Covariance seen from scatter plots

(Scatter plots: Positive Covariance, Negative Covariance, Zero Covariance.)

Credit: Prof. Forsyth
SLIDE 35

When correlation coefficient or covariance is zero

The covariance is 0! That is:

E[XY] − E[X]E[Y] = 0, i.e. E[XY] = E[X]E[Y]

This is a necessary property of independence of random variables, but it is not equal to independence: cov(X, Y) = 0 is necessary but not sufficient.

SLIDE 36

Variance of the sum of two random variables

var[X + Y] = var[X] + var[Y] + 2cov(X, Y)

(Proof in HW.)
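The identity can be verified on a small joint distribution; a sketch with arbitrary illustrative values (distribution and helper are mine):

```python
# joint distribution with correlated X and Y; illustrative values only
P = {(0, 0): 0.4, (1, 1): 0.4, (0, 1): 0.1, (1, 0): 0.1}

def E(f):
    # expectation of f(x, y) under the joint distribution P
    return sum(f(x, y) * p for (x, y), p in P.items())

E_X, E_Y = E(lambda x, y: x), E(lambda x, y: y)
var_X   = E(lambda x, y: (x - E_X) ** 2)
var_Y   = E(lambda x, y: (y - E_Y) ** 2)
cov_XY  = E(lambda x, y: (x - E_X) * (y - E_Y))
var_sum = E(lambda x, y: (x + y - E_X - E_Y) ** 2)

assert abs(var_sum - (var_X + var_Y + 2 * cov_XY)) < 1e-9
```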

SLIDE 37

These are equivalent:

(I)   cov(X, Y) = 0; corr(X, Y) = 0
(II)  E[XY] = E[X]E[Y]
(III) var[X + Y] = var[X] + var[Y]

uncorrelated!!

SLIDE 38

Properties of independence in terms of expectations

* E[XY] = E[X]E[Y]
SLIDE 39

If X, Y are independent, then:

cov(X, Y) = 0, corr(X, Y) = 0
E[XY] = E[X]E[Y]
var[X + Y] = var[X] + var[Y]

SLIDE 40

Q: What is this expectation?

We toss two identical coins A & B independently, three times and four times respectively. For each head we earn $1. Define X as the earning from A and Y as the earning from B. What is E(XY)?

* A. $2  B. $3  C. $4
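Because the two coins are tossed independently, E[XY] = E[X]E[Y]; a sketch computing the answer exactly from the binomial distributions (helper name is mine):

```python
from math import comb

def binom_pmf(n, k, q=0.5):
    # P(k heads in n fair tosses)
    return comb(n, k) * q**k * (1 - q)**(n - k)

E_X = sum(k * binom_pmf(3, k) for k in range(4))   # earnings from A
E_Y = sum(k * binom_pmf(4, k) for k in range(5))   # earnings from B

# independence => E[XY] = E[X]E[Y]
E_XY = E_X * E_Y
print(E_X, E_Y, E_XY)  # 1.5 2.0 3.0 -> answer B
```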
SLIDE 41

Uncorrelated vs Independent

If two random variables are uncorrelated, does this mean they are independent? Investigate the case where X takes −1, 0, 1 with equal probability and Y = X^2.

SLIDE 42

X takes −1, 0, 1 each with probability 1/3, and Y = X^2.

P(Y = 0) = 1/3, P(Y = 1) = 2/3

E[X] = 0
E[XY] = E[X^3] = (−1) · 1/3 + 0 · 1/3 + 1 · 1/3 = 0
cov(X, Y) = E[XY] − E[X]E[Y] = 0, so X and Y are uncorrelated.

But X and Y are not independent, e.g.:
P(X = 1, Y = 1) = 1/3, while P(X = 1) P(Y = 1) = (1/3)(2/3) = 2/9
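The X ∈ {−1, 0, 1}, Y = X^2 case can be checked directly; a sketch (variable names are mine):

```python
P_X = {-1: 1/3, 0: 1/3, 1: 1/3}

E_X  = sum(x * p for x, p in P_X.items())
E_Y  = sum(x**2 * p for x, p in P_X.items())       # Y = X^2
E_XY = sum(x * x**2 * p for x, p in P_X.items())   # E[XY] = E[X^3]

cov = E_XY - E_X * E_Y
assert abs(cov) < 1e-12            # uncorrelated

# ...but not independent:
P_X1_Y1 = P_X[1]                   # P(X=1, Y=1): Y=1 exactly when X=+-1
P_Y1 = P_X[-1] + P_X[1]            # P(Y=1) = 2/3
assert abs(P_X1_Y1 - P_X[1] * P_Y1) > 0.1   # 1/3 != 2/9
```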

SLIDE 43

Assignments

Finish Chapter 4 of the textbook.

Next time: proof of the Chebyshev inequality & the weak law of large numbers; continuous random variables.

SLIDE 44

Additional References

Charles M. Grinstead and J. Laurie Snell, "Introduction to Probability"

Morris H. Degroot and Mark J. Schervish, "Probability and Statistics"

SLIDE 45

See you next time!