BAYES' FORMULA: a two-stage experiment, Xingru Chen (PowerPoint PPT Presentation)



SLIDE 1

BAYES’ FORMULA

a two-stage experiment Xingru Chen xingru.chen.gr@dartmouth.edu

XC 2020

SLIDE 2

P(B|A) = P(B) P(A|B) / P(A)

Simplest Bayes’ Formula

Bayes probabilities

Bayes' theorem links the degree of belief in a proposition before and after accounting for evidence.

SLIDE 3

P(B|A) = P(B) P(A|B) / P(A)

Simplest Bayes’ Formula

By the definition of conditional probability:
P(B|A) = P(B∩A) / P(A)
P(A|B) = P(A∩B) / P(B)
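The rearrangement above can be checked numerically. A minimal Python sketch (the function name and probability values are illustrative, not from the slides):

```python
# Simplest Bayes' formula: P(B|A) = P(B) * P(A|B) / P(A),
# which follows from P(B|A) = P(A∩B)/P(A) and P(A|B) = P(A∩B)/P(B).
def bayes(p_b, p_a_given_b, p_a):
    return p_b * p_a_given_b / p_a

# Illustrative numbers: P(A∩B) = 0.3, P(A) = 0.5, P(B) = 0.6.
p_ab, p_a, p_b = 0.3, 0.5, 0.6
direct = p_ab / p_a                        # P(B|A) from the joint directly
via_bayes = bayes(p_b, p_ab / p_b, p_a)    # P(B|A) via Bayes' formula
assert abs(direct - via_bayes) < 1e-12
```

Both routes give the same conditional probability, since the formula is just the two definitions combined.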

SLIDE 4

Weather Forecast

Event 1: rain. P(rain) = 0.6
Event 2: windy & cloudy. P(windy & cloudy) = 0.48

Prior probability
The prior probability of an event (often simply called the prior) is its probability obtained from some prior information.

Evidence
The evidence term in Bayes' theorem refers to the overall probability of this new piece of information.

SLIDE 5

Event 1: rain. Prior probability P(rain) = 0.6
Event 2: windy & cloudy. Evidence P(windy & cloudy) = 0.48

P(rain | windy & cloudy) = P(rain) P(windy & cloudy | rain) / P(windy & cloudy)

SLIDE 6

Event 1: rain. Prior probability P(rain) = 0.6
Event 2: windy & cloudy. Evidence P(windy & cloudy) = 0.48

P(rain | windy & cloudy) = P(rain) P(windy & cloudy | rain) / P(windy & cloudy)

windy & cloudy | rain: P(windy & cloudy | rain) = 0.64

Likelihood
The likelihood represents a conditional probability. It is the degree to which the first event is consistent with the second event.

SLIDE 7

Event 1: rain. Prior probability P(rain) = 0.6
Event 2: windy & cloudy. Evidence P(windy & cloudy) = 0.48

P(rain | windy & cloudy) = P(rain) P(windy & cloudy | rain) / P(windy & cloudy)

windy & cloudy | rain: Likelihood P(windy & cloudy | rain) = 0.64
rain | windy & cloudy: Posterior probability P(rain | windy & cloudy) = 0.6 × 0.64 / 0.48 = 0.8
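The weather computation can be sketched as a one-line function (the function and argument names are illustrative):

```python
def posterior(prior, likelihood, evidence):
    """Bayes' formula: P(H | E) = P(H) * P(E | H) / P(E)."""
    return prior * likelihood / evidence

# Weather example from the slides: prior 0.6, likelihood 0.64, evidence 0.48.
p = posterior(prior=0.6, likelihood=0.64, evidence=0.48)
print(round(p, 10))  # 0.8
```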

SLIDE 8

Posterior probability = Prior probability × Likelihood / Evidence

Prior probability (rain)
The prior probability of an event (often simply called the prior) is its probability obtained from some prior information.

Evidence (windy & cloudy)
The evidence term in Bayes' theorem refers to the overall probability of this new piece of information.

Likelihood (windy & cloudy | rain)
The likelihood represents a conditional probability. It is the degree to which the first event is consistent with the second event.

Posterior probability (rain | windy & cloudy)
The posterior probability represents the updated prior probability after taking into account some new piece of information.

SLIDE 9

Bayes probabilities are particularly appropriate for medical diagnosis.

Bayes probabilities

SLIDE 10

Solving Inverse Problems Using Bayes Probabilities

Β§ A doctor is trying to decide if a patient has one of three diseases e1, e2, or e3. Two tests are to be carried out, each of which results in a positive (+) or a negative (βˆ’) outcome. There are four possible test patterns ++, +βˆ’, βˆ’+, and βˆ’βˆ’.
Β§ National records have indicated that, for 10,000 people having one of these three diseases, the distribution of diseases and test results is as in the table below.

Disease   Number with disease   ++     +βˆ’     βˆ’+     βˆ’βˆ’
e1        3215                  2110   301    704    100
e2        2125                  396    132    1187   410
e3        4660                  510    3568   73     509
Total     10000                 3016   4001   1964   1019

SLIDE 11

Use Bayes' formula to compute various posterior probabilities.

[Table: diseases e1, e2, e3 × test patterns ++, +βˆ’, βˆ’+, βˆ’βˆ’]

SLIDE 12

Use Bayes’ formula to compute various posterior probabilities

π’†πŸ π’†πŸ‘ π’†πŸ’ + + + + + βˆ’ βˆ’ βˆ’ + βˆ’ βˆ’ βˆ’

Po Posterior pr probabili lity: 𝒆𝒋 | | + + 𝐐 𝒆𝒋|+ |+ + = 𝑸 + +|𝒆𝒋 𝑸(𝒆𝒋) 𝑸(+ +) Po Posterior pr probabili lity: 𝒆𝒋 | | + βˆ’ 𝐐 𝒆𝒋|+ |+ βˆ’ = 𝑸 + βˆ’|𝒆𝒋 𝑸(𝒆𝒋) 𝑸(+ βˆ’) Po Posterior pr probabili lity: 𝒆𝒋 | βˆ’ βˆ’ + 𝐐 𝒆𝒋|βˆ’ + = 𝑸 βˆ’ +|𝒆𝒋 𝑸(𝒆𝒋) 𝑸(βˆ’ +) Po Posterior pr probabili lity: 𝒆𝒋 | βˆ’ βˆ’ βˆ’ 𝐐 𝒆𝒋|βˆ’ βˆ’ = 𝑸 βˆ’ βˆ’|𝒆𝒋 𝑸(𝒆𝒋) 𝑸(βˆ’ βˆ’)

SLIDE 13

Prior Probability

(Disease table as on Slide 10.)

Prior probability, disease 1: P(e1) = 3215/10000 = 0.3215
Prior probability, disease 2: P(e2) = 2125/10000 = 0.2125
Prior probability, disease 3: P(e3) = 4660/10000 = 0.466

SLIDE 14

Evidence

(Disease table as on Slide 10.)

Evidence ++: P(++) = 3016/10000 = 0.3016
Evidence +βˆ’: P(+βˆ’) = 4001/10000 = 0.4001
Evidence βˆ’+: P(βˆ’+) = 1964/10000 = 0.1964
Evidence βˆ’βˆ’: P(βˆ’βˆ’) = 1019/10000 = 0.1019

SLIDE 15

Likelihood

(Disease table as on Slide 10.)

Likelihood ++ | e1: P(++ | e1) = 2110/3215
Likelihood +βˆ’ | e1: P(+βˆ’ | e1) = 301/3215
Likelihood βˆ’+ | e1: P(βˆ’+ | e1) = 704/3215
Likelihood βˆ’βˆ’ | e1: P(βˆ’βˆ’ | e1) = 100/3215

SLIDE 16

Posterior Probability

(Disease table as on Slide 10.)

Likelihood ++ | e1: P(++ | e1) = 2110/3215
Prior probability, disease 1: P(e1) = 3215/10000 = 0.3215
Evidence ++: P(++) = 3016/10000 = 0.3016

Posterior probability e1 | ++:
P(e1 | ++) = P(++ | e1) P(e1) / P(++) = (2110/3215)(3215/10000) / (3016/10000) = 2110/3016 ≈ 0.700
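Because the priors, likelihoods, and evidence are all estimated from the same table of counts, each posterior reduces to a ratio of counts. A short sketch computing the full posterior table (variable names are illustrative):

```python
# Counts from the disease table: rows e1..e3, columns ++, +-, -+, --.
counts = {
    "e1": {"++": 2110, "+-": 301,  "-+": 704,  "--": 100},
    "e2": {"++": 396,  "+-": 132,  "-+": 1187, "--": 410},
    "e3": {"++": 510,  "+-": 3568, "-+": 73,   "--": 509},
}

patterns = ["++", "+-", "-+", "--"]
total_per_pattern = {p: sum(counts[d][p] for d in counts) for p in patterns}

# P(disease | pattern) = count(disease, pattern) / count(pattern):
# Bayes' formula collapses to this ratio since the 1/10000 factors cancel.
posteriors = {
    d: {p: counts[d][p] / total_per_pattern[p] for p in patterns}
    for d in counts
}
print(round(posteriors["e1"]["++"], 3))  # 0.7
```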

SLIDE 17

Posterior Probability

π’†πŸ π’†πŸ‘ π’†πŸ’ + + +

0.700 0.131 0.169

+ + βˆ’

0.075 0.033 0.892

βˆ’ βˆ’ +

0.358 0.604 0.038

βˆ’ βˆ’ βˆ’

0.098 0.403 0.499 Po Posterior pr probabili lity: 𝒆𝒋 | | + + 𝐐 𝒆𝒋|+ |+ + = 𝑸 + +|𝒆𝒋 𝑸(𝒆𝒋) 𝑸(+ +) Po Posterior pr probabili lity: 𝒆𝒋 | | + βˆ’ 𝐐 𝒆𝒋|+ |+ βˆ’ = 𝑸 + βˆ’|𝒆𝒋 𝑸(𝒆𝒋) 𝑸(+ βˆ’) Po Posterior pr probabili lity: 𝒆𝒋 | βˆ’ βˆ’ + 𝐐 𝒆𝒋|βˆ’ + = 𝑸 βˆ’ +|𝒆𝒋 𝑸(𝒆𝒋) 𝑸(βˆ’ +) Po Posterior pr probabili lity: 𝒆𝒋 | βˆ’ βˆ’ βˆ’ 𝐐 𝒆𝒋|βˆ’ βˆ’ = 𝑸 βˆ’ βˆ’|𝒆𝒋 𝑸(𝒆𝒋) 𝑸(βˆ’ βˆ’)

SLIDE 18

HOW TO GET THE EVIDENCE?

Sometimes the evidence is not directly given to us…

SLIDE 19

Bayes’ Formula

Β§ Bayes probabilities: given the outcome of the second stage of a two-stage experiment, the probability for an outcome at the first stage.
Β§ Suppose we have a set of hypotheses H1, H2, β‹―, Hm, which are pairwise disjoint and such that Ξ© = H1 βˆͺ H2 βˆͺ β‹― βˆͺ Hm. We have a set of prior probabilities P(H1), P(H2), β‹―, P(Hm) for the hypotheses.

[Diagram: the sample space Ξ© partitioned into H1, β‹―, Hm]

SLIDE 20

Bayes’ Formula

Β§ Bayes probabilities: given the outcome of the second stage of a two-stage experiment, the probability for an outcome at the first stage.
Β§ Suppose we have a set of hypotheses H1, H2, β‹―, Hm, which are pairwise disjoint and such that Ξ© = H1 βˆͺ H2 βˆͺ β‹― βˆͺ Hm. We have a set of prior probabilities P(H1), P(H2), β‹―, P(Hm) for the hypotheses.
Β§ We also have an event (the evidence) E, which can tell us further information about which hypothesis is correct. Suppose we know P(E|Hj) for all j: if we know the correct hypothesis, we know the probability for the evidence E. The conditional probability P(Hj|E) is the probability for the hypothesis given the evidence E, and is called the posterior probability.

SLIDE 21

Bayes' Formula

P(Hj | E) = P(Hj) P(E | Hj) / P(E)

Important Properties
β€’ If H1, β‹―, Hm are pairwise disjoint subsets of Ξ© (i.e., no two of the Hi have an element in common), then P(H1 βˆͺ β‹― βˆͺ Hm) = Ξ£_{i=1}^{m} P(Hi).
β€’ If H1, β‹―, Hm are pairwise disjoint subsets with Ξ© = H1 βˆͺ β‹― βˆͺ Hm, and E is any event, then P(E) = Ξ£_{i=1}^{m} P(E ∩ Hi).

By the definition of conditional probability, P(E | Hj) = P(E ∩ Hj) / P(Hj).

SLIDE 22

Bayes' Formula

P(Hj | E) = P(Hj) P(E | Hj) / P(E)

From the law of total probability, P(E) = Ξ£_{i=1}^{m} P(E ∩ Hi), and since P(E ∩ Hi) = P(E | Hi) P(Hi),

P(E) = Ξ£_{i=1}^{m} P(E | Hi) P(Hi).

Bayes' formula (expanded form):

P(Hj | E) = P(Hj) P(E | Hj) / Ξ£_{i=1}^{m} P(E | Hi) P(Hi)
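The expanded form translates directly into code: compute the evidence by the law of total probability, then normalize. A minimal sketch (the function name is illustrative):

```python
# Bayes' formula over a set of pairwise-disjoint hypotheses.
# priors[i] is P(H_i); likelihoods[i] is P(E | H_i).
def bayes_posteriors(priors, likelihoods):
    # Law of total probability: P(E) = sum_i P(E | H_i) P(H_i)
    evidence = sum(p * l for p, l in zip(priors, likelihoods))
    # P(H_j | E) = P(H_j) P(E | H_j) / P(E)
    return [p * l / evidence for p, l in zip(priors, likelihoods)]

# Disease example: priors for e1, e2, e3 and likelihoods of the ++ pattern.
post = bayes_posteriors([0.3215, 0.2125, 0.466],
                        [2110/3215, 396/2125, 510/4660])
print([round(x, 3) for x in post])  # [0.7, 0.131, 0.169]
```

Note that the posteriors sum to 1 by construction, since the evidence term is exactly the normalizing constant.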

SLIDE 23

Snow in Hanover

Β§ Four months later... xue hua piao piao
Β§ You are about to get on the Dartmouth Coach and head back to school. You want to know if you should wear a pair of snow boots.
Β§ You ask both the driver and me independently if it is snowing in Hanover. Both of us have a 3/4 chance of telling you the truth and a 1/4 chance of messing with you by lying. Both of us tell you that "YES" it is snowing.
Β§ Here in Hanover, the probability that it snows on any given day in November is 1/2. What is the probability that it is actually snowing?

SLIDE 24

Posterior probability = Prior probability × Likelihood / Evidence

Prior probability
The prior probability of an event (often simply called the prior) is its probability obtained from some prior information.

Evidence
The evidence term in Bayes' theorem refers to the overall probability of this new piece of information.

Likelihood
The likelihood represents a conditional probability. It is the degree to which the first event is consistent with the second event.

Posterior probability
The posterior probability represents the updated prior probability after taking into account some new piece of information.

SLIDE 25

Event 1: snow. Prior probability P(snow) = 1/2
Event 2: yes & yes. Evidence P(yes & yes) = β‹―
yes & yes | snow: Likelihood P(yes & yes | snow) = β‹―
snow | yes & yes: Posterior probability P(snow | yes & yes) = β‹―

P(snow | yes & yes) = P(snow) P(yes & yes | snow) / P(yes & yes)

Bayes' formula: P(Hj | E) = P(Hj) P(E | Hj) / Ξ£_{i=1}^{m} P(E | Hi) P(Hi)
Hj: snow, not snow. E: yes & yes.

SLIDE 26

Snow in Hanover

P(snow | yes & yes) = P(snow) P(yes & yes | snow) / P(yes & yes)

Bayes' formula: P(Hj | E) = P(Hj) P(E | Hj) / Ξ£_{i=1}^{m} P(E | Hi) P(Hi)
Hj: snow, not snow. E: yes & yes.

P(yes & yes) = P(snow) P(yes & yes | snow) + P(not snow) P(yes & yes | not snow) = (1/2)(3/4)² + (1/2)(1/4)²

SLIDE 27

P(yes & yes) = P(snow) P(yes & yes | snow) + P(not snow) P(yes & yes | not snow) = (1/2)(3/4)² + (1/2)(1/4)²

P(snow | yes & yes) = P(snow) P(yes & yes | snow) / P(yes & yes) = (1/2)(3/4)² / [(1/2)(3/4)² + (1/2)(1/4)²] = 9/10
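The snow computation can be verified exactly with rational arithmetic (variable names are illustrative):

```python
from fractions import Fraction

# Snow-in-Hanover example: two independent "YES" reports,
# each truthful with probability 3/4, prior P(snow) = 1/2.
p_snow = Fraction(1, 2)
p_yy_given_snow = Fraction(3, 4) ** 2      # both told the truth
p_yy_given_no_snow = Fraction(1, 4) ** 2   # both lied

# Law of total probability for the evidence, then Bayes' formula.
evidence = p_snow * p_yy_given_snow + (1 - p_snow) * p_yy_given_no_snow
posterior = p_snow * p_yy_given_snow / evidence
print(posterior)  # 9/10
```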

SLIDE 28

Chocolate Box

Β§ Forrest Gump has a box of assorted chocolate. It includes 10 pieces of different flavors.
Β§ A piece of chocolate is taken out at random. Let Y be the chosen piece.
Β§ If Jenny does not buy him a new box of chocolate, the second piece of chocolate Z is chosen from the original box at random.
Β§ Consider the distributions of Y and Z. Are the two distributions the same?

SLIDE 29

Random variable π‘Œ 𝑛 𝑦 = 1 10 Random variable 𝑍 𝑄 𝑍 = 1 = 𝑄 𝑍 = 1 π‘Œ = 1 𝑄 π‘Œ = 1 + 𝑄 𝑍 = 1 π‘Œ β‰  1 𝑄 π‘Œ β‰  1 = 0Γ— 1 10 + 1 9 Γ— 9 10 = 1 10 Chocolate: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10

SLIDE 30

Open book. Scope: Chapters 1, 2, 3, and 4
Β§ discrete probability distributions
Β§ continuous probability densities
Β§ permutations and combinations
Β§ conditional probability
Materials: Slides, homework, quizzes, textbook
Date & Time: July 20, 3 hours, 24 hours
Office hours: July 20, July 21

[Calendar: July 2020, Midterm 1]