4. Conditional Probability
BT 1.3, 1.4
CSE 312, Spring 2015, W.L. Ruzzo
conditional probability - intuition

Roll one fair die. What is the probability that the outcome is 5? 1/6 (5 is one of 6 equally likely outcomes). What is the probability that the outcome is 5, given that the outcome is odd?
S = {1, 2, 3, 4, 5, 6} (sample space)
E = {5} (event that roll is 5)
F = {1, 3, 5} (event that roll is odd)

P(E | F) = |EF| / |F| = |E| / |F| = 1/3

All outcomes are equally likely. Knowing F occurred doesn't distort the relative likelihoods of outcomes within F, so they remain equally likely. There are only 3 of them, one being E, so P(E | F) = 1/3.

G = {2, 4, 6} (event that roll is even)
Way 1 (counting): P(E | G) = |EG| / |G| = |∅| / |G| = 0/3 = 0
Way 2 (probabilities): P(E | G) = P(EG) / P(G) = P(∅) / P(G) = 0 / (1/2) = 0
Way 3 (restricted sample space): Outcomes are equally likely. Knowing G occurred doesn't distort the relative likelihoods of outcomes within G; they remain equally likely. There are 3 of them, none being E, so P(E | G) = 0/3 = 0.
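All three ways can be checked by brute-force enumeration; here is a minimal sketch in Python (the helper name cond_prob is mine):

    # Conditional probability by counting, valid because outcomes are equiprobable.
    from fractions import Fraction

    S = {1, 2, 3, 4, 5, 6}   # sample space
    E = {5}                  # roll is 5
    F = {1, 3, 5}            # roll is odd
    G = {2, 4, 6}            # roll is even

    def cond_prob(A, B):
        return Fraction(len(A & B), len(B))   # |AB| / |B|

    print(cond_prob(E, F))   # 1/3
    print(cond_prob(E, G))   # 0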
[and do you expect it to be larger than P(E), or smaller?]
BT p. 24
Shuffle a standard deck and deal it into 4 piles of 13 cards each. What is the probability that each pile contains exactly one ace? Let Ei be the event that the i-th of the aces A♥, A♠, A♦, A♣ lands in a pile containing none of the earlier aces. Then:

P(E1) = 52/52 = 1 (A♥ can go anywhere)
P(E2|E1) = 39/51 (39 of 51 slots not in A♥ pile)
P(E3|E1E2) = 26/50 (26 not in A♥, A♠ piles)
P(E4|E1E2E3) = 13/49 (13 not in A♥, A♠, A♦ piles)

By the chain rule, P(E1E2E3E4) = 1 · (39/51) · (26/50) · (13/49) ≈ 0.105.
A conceptual trick: what's randomized?
(a) Randomize the cards, then deal sequentially into 4 piles.
(b) Sort the cards, aces first, then deal them randomly into empty slots among the 4 piles.
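A short Monte Carlo check agrees with the exact chain-rule product (a sketch; representing the aces as cards 0-3 is an arbitrary labeling):

    import random
    from fractions import Fraction

    # Exact answer via the chain rule: 1 * 39/51 * 26/50 * 13/49
    exact = Fraction(39, 51) * Fraction(26, 50) * Fraction(13, 49)
    print(float(exact))                       # ~0.1055

    # Simulation: shuffle 52 cards, split into 4 piles of 13; aces are 0..3.
    trials, hits = 100_000, 0
    aces = {0, 1, 2, 3}
    for _ in range(trials):
        deck = list(range(52))
        random.shuffle(deck)
        piles = (deck[0:13], deck[13:26], deck[26:39], deck[39:52])
        if all(len(aces & set(p)) == 1 for p in piles):
            hits += 1
    print(hits / trials)                      # close to 0.1055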
BT p. 19
EF = { (n+m)-bit strings | 1st bit = 0 and (k−1) 0's in the next (r−1) bits }
One of the many binomial identities.
Above equations, plus the same binomial identity twice. A generally useful trick: reversing conditioning (more to come).
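Spelled out, "reversing conditioning" is just the definition of conditional probability applied in both directions (this is Bayes' rule, developed later in these slides):

    P(F \mid E) \;=\; \frac{P(EF)}{P(E)} \;=\; \frac{P(E \mid F)\,P(F)}{P(E)}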
BT p. 28
Note that conditional probability was a means to an end in this example, not the goal itself. One reason conditional probability is important is that this scenario is common.
The law of total probability: P(E) = P(E|F) P(F) + P(E|Fc) P(Fc), a weighted average, conditioned on event F happening or not. (Analogous to reasoning by cases; both are very handy.)

More generally, if events F1, ..., Fn partition the sample space, then P(E) = P(E|F1) P(F1) + ... + P(E|Fn) P(Fn), a weighted average, conditioned on which event Fi happened.

BT p. 28
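A tiny numeric illustration of the partition form (the events and probabilities below are made up for the sketch):

    # Law of total probability over a made-up partition F1, F2, F3.
    p_F = [0.5, 0.3, 0.2]            # P(F_i); must sum to 1
    p_E_given_F = [0.9, 0.5, 0.1]    # P(E | F_i)

    p_E = sum(pe * pf for pe, pf in zip(p_E_given_F, p_F))
    print(p_E)   # 0.9*0.5 + 0.5*0.3 + 0.1*0.2 = 0.62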
A nice example of the utility of conditioning: the future is decomposed into two crisp cases instead of being a blurred superposition thereof.
aka the "Drunkard's Walk": a walker starts at position i on the line 0, 1, ..., N and repeatedly steps one unit left or right; pi is the probability of reaching N before 0.
BT p. 63. How does pi vary with i?
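For the fair walk, conditioning on the first step gives the recurrence pi = (pi−1 + pi+1)/2 with p0 = 0 and pN = 1, whose solution is pi = i/N; a quick simulation is consistent with this (a sketch; the fair 50/50 step is my assumption here — BT also treats the biased case):

    import random

    def p_reach_N_first(i, N, trials=50_000):
        """Estimate p_i = P(walk from i hits N before 0), fair +/-1 steps."""
        wins = 0
        for _ in range(trials):
            pos = i
            while 0 < pos < N:
                pos += random.choice((-1, 1))
            wins += (pos == N)
        return wins / trials

    N = 10
    for i in (2, 5, 8):
        print(i, p_reach_N_first(i, N))   # close to i/N: 0.2, 0.5, 0.8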
[Figure: w = ??, r = ??; revealed as w = 3, r = 3]
BT §1.4
The Theory That Would Not Die, Yale University Press, 2011. ISBN-13: 978-0300188226. http://www.amazon.com/Theory-That-Would-Not-Die/dp/0300188226/
Los Angeles Times (October 28, 1996) By Leslie Helm, Times Staff Writer
source: http://www.ar-tiste.com/latimes_oct-96.html
[Figure: w = ??, r = ??]
I.e., the given data about the child raise the probability that M1 is the father.
Exercises: What if M2 were (A,A)? What if the child were (A,A)? E.g., 1/2 → 2/3.
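A hedged sketch of the Bayes computation behind "1/2 → 2/3" (the likelihoods P(child's genotype | father = M1) = 1/2 and P(child's genotype | father = M2) = 1/4 are assumptions chosen to reproduce that posterior; the slide's actual genotypes determine them):

    # Posterior P(M1 is father | child's genotype) via Bayes' rule.
    prior_M1, prior_M2 = 0.5, 0.5    # the two candidates equally likely a priori
    like_M1 = 0.5                    # assumed P(child's genotype | father = M1)
    like_M2 = 0.25                   # assumed P(child's genotype | father = M2)

    posterior_M1 = (like_M1 * prior_M1) / (like_M1 * prior_M1 + like_M2 * prior_M2)
    print(posterior_M1)              # 2/3: the data raise P(M1) from 1/2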
P(E) ≈ 1.5%. Note the difference between conditional and joint probability: P(F|E) ≈ 33%, while P(FE) ≈ 0.49%.
There's nothing new here versus prior results, but the simple form and the simple interpretation are convenient.
Table (E = positive test, F = HIV+):

                 HIV+ (F)          HIV- (Fc)
    Test + (E)   0.98 = P(E|F)     0.01 = P(E|Fc)
    Test - (Ec)  0.02 = P(Ec|F)    0.99 = P(Ec|Fc)
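Plugging the table into the law of total probability and Bayes' rule reproduces the numbers above (a sketch; the 0.5% prevalence is inferred from P(FE) ≈ 0.49%, not stated in the table):

    # E = positive test, F = HIV+.
    p_F = 0.005                                # prevalence, inferred assumption
    p_E_given_F, p_E_given_Fc = 0.98, 0.01     # from the table

    p_E = p_E_given_F * p_F + p_E_given_Fc * (1 - p_F)   # total probability
    p_F_given_E = p_E_given_F * p_F / p_E                # Bayes' rule
    print(p_E, p_F_given_E)                    # ~0.0149 and ~0.33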
P(E|F): conditional probability that E occurs given that F has occurred (requires P(F) > 0).
Reduce event/sample space to points consistent with F (E ∩ F; S ∩ F), if equiprobable outcomes.
P(EF) = P(E|F) P(F) ("the chain rule")
"P( - | F )" is a probability law, i.e., satisfies the 3 axioms.
P(E) = P(E|F) P(F) + P(E|Fc) (1 - P(F)) ("the law of total probability")
Vocabulary: prior, posterior, odds, prior odds, posterior odds, Bayes factor.
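In that vocabulary, Bayes' rule has a compact odds form (standard, though not written out here):

    \underbrace{\frac{P(F \mid E)}{P(F^c \mid E)}}_{\text{posterior odds}}
    \;=\;
    \underbrace{\frac{P(E \mid F)}{P(E \mid F^c)}}_{\text{Bayes factor}}
    \times
    \underbrace{\frac{P(F)}{P(F^c)}}_{\text{prior odds}}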