Probability
Reading: EC 6.1–6.3
Peter J. Haas, INFO 150, Fall Semester 2019
Lecture 14
Outline
- Introduction
- Definition of Probability
- Rules for Computing Probabilities
- Conditional Probability
- Bernoulli Trials
Introduction

Probability: The rules of chance
- Most things are uncertain! (Origins of the theory circa 1650, in gambling)
- Foundational to machine learning, game theory, statistical analysis, ...

Goals
- Compute the probability of a complex event from probabilities of simpler events
- Conquer your terrible intuition about uncertainty (see Kahneman and Tversky)
Probability Can be Non-Intuitive

- Which is more likely if you pick a word at random from a dictionary: a "k" in the first position or a "k" in the third position?
- How likely is it that two people in this room share the same birthday?
- "Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations." Rank the given statements in order of relative likelihood.
- Which sequence of coin flips is more likely: HHTHTT or HHHHHH?

We will learn some mathematical tools for getting the answers right.
[Handwritten annotations: a "k" in the third position is more likely (an example of bias in intuition); for the birthday question, 34 people gives 79.5% and 28 people gives 65.4%; in the Linda problem, a single statement has to be at least as likely as its conjunction with another; the two coin-flip sequences are equally likely.]
Basic Terminology

Setting: The outcomes of an experiment
- Sample space S: the set of possible outcomes
- Initially we'll focus on experiments where outcomes are equally likely

Example: Roll two dice and add up the numbers
- Attempt 1: sample space is S = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12} (Demo)
- Attempt 2: sample space is S = {double 1, double 2, double 3, double 4, double 5, double 6, {1, 2}, {1, 3}, {1, 4}, {1, 5}, {1, 6}, {2, 3}, {2, 4}, {2, 5}, {2, 6}, {3, 4}, {3, 5}, {3, 6}, {4, 5}, {4, 6}, {5, 6}} (Demo)
- Attempt 3: ordered-pair representation (see table)

We are interested in the proportion of outcomes in which some event occurs
- Ex: sum on two dice equals 10 (proportion of outcomes is 3/36)
- An outcome is successful if the event occurs
- Technically, an event is a specified set E of outcomes with E ⊆ S

        Green 1  Green 2  Green 3  Green 4  Green 5  Green 6
Red 1   (1,1)    (1,2)    (1,3)    (1,4)    (1,5)    (1,6)
Red 2   (2,1)    (2,2)    (2,3)    (2,4)    (2,5)    (2,6)
Red 3   (3,1)    (3,2)    (3,3)    (3,4)    (3,5)    (3,6)
Red 4   (4,1)    (4,2)    (4,3)    (4,4)    (4,5)    (4,6)
Red 5   (5,1)    (5,2)    (5,3)    (5,4)    (5,5)    (5,6)
Red 6   (6,1)    (6,2)    (6,3)    (6,4)    (6,5)    (6,6)
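As a quick sanity check (not part of the slides), the ordered-pair sample space and the "sum equals 10" event can be enumerated directly in Python:

```python
from itertools import product

# Ordered-pair sample space for two distinguishable dice (red, green):
# all 36 outcomes are equally likely.
S = list(product(range(1, 7), repeat=2))

# Event E: the sum of the two dice equals 10.
E = [(r, g) for (r, g) in S if r + g == 10]

print(len(E), len(S))    # 3 successful outcomes out of 36
print(len(E) / len(S))   # proportion 3/36
```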
Definition of Probability (Equally Likely Outcomes)

Definition: Given an experiment with a sample space S of equally likely outcomes and an event E, the probability of the event, denoted Prob(E), is the ratio of the number of successful outcomes to the total number of outcomes:

    Prob(E) = n(E) / n(S)

Example 1: E = "sum of two dice equals 10"
- E = {(4, 6), (5, 5), (6, 4)}, so Prob(E) = 3/36 = 1/12 ≈ 0.083

Example 2: E = "two cards drawn randomly from a 52-card deck have same value"
- S = {(AS, 2H), (3C, KD), ...}, so n(S) = 52 · 51
- E = {(AS, AC), (3D, 3S), ...}, so n(E) = 52 · 3
- Hence Prob(E) = (52 · 3) / (52 · 51) = 1/17
Example 3: E = "last two tosses have same value when tossing 5 coins" (worked in the handwritten annotation)
- S = set of all length-5 lists of H's and T's (e.g., HTHTT), so n(S) = 2^5 = 32
- E = set of such lists ending in HH or TT, so n(E) = 2^4 = 16
- Prob(E) = 16/32 = 1/2
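Both card and coin examples above are small enough to verify by brute-force enumeration; this sketch (not from the slides) does exactly that:

```python
from itertools import permutations, product

# Example 2: ordered draws of 2 cards from a 52-card deck, card = (value, suit).
deck = [(v, s) for v in range(13) for s in range(4)]
draws = list(permutations(deck, 2))               # n(S) = 52 * 51
same_value = [d for d in draws if d[0][0] == d[1][0]]
print(len(same_value), len(draws))                # 156 out of 2652, i.e., 1/17

# Example 3: 5 coin tosses; event = last two tosses have the same value.
tosses = list(product("HT", repeat=5))            # n(S) = 2^5 = 32
E = [t for t in tosses if t[3] == t[4]]           # n(E) = 2^4 = 16
print(len(E) / len(tosses))                       # 16/32 = 1/2
```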
Probability of the Complement of an Event

Definition: The complement Ē of an event E is the set of outcomes not in E: Ē = S − E.

Complement Rule: For any event E, Prob(E) + Prob(Ē) = 1.

Example 1: What is the probability that the same result occurs more than once in 3 die rolls?
- E = {(1, 6, 1), (2, 2, 4), (3, 3, 3), ...}
- Ē = set of rolls that are all different
- n(Ē) = P(6, 3) = 120 and Prob(Ē) = 120/6^3 = 120/216
- Hence Prob(E) = 1 − Prob(Ē) = 1 − 120/216 ≈ 0.44

Example 2: What is the probability that, in a group of 6, at least 2 people were born in the same month?
- E = {(1, 2, 8, 8, 8, 3), (3, 1, 4, 4, 5, 5), (2, 9, 10, 3, 4, 10), ...}
- Ē = set of distinct birth-month assignments
- n(Ē) = P(12, 6)
- Hence Prob(E) = 1 − Prob(Ē) = 1 − P(12, 6)/12^6 = 1 − 665,280/2,985,984 ≈ 0.78
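The complement-rule arithmetic in both examples can be checked in a few lines (an illustrative sketch, not part of the slides):

```python
from itertools import product
from math import perm

# Example 1: same result more than once in 3 die rolls, via the complement.
rolls = list(product(range(1, 7), repeat=3))
all_diff = [r for r in rolls if len(set(r)) == 3]
print(len(all_diff))                     # P(6,3) = 120
print(1 - len(all_diff) / len(rolls))    # 1 - 120/216 ≈ 0.444

# Example 2: at least 2 of 6 people born in the same month.
print(1 - perm(12, 6) / 12**6)           # 1 - 665,280/2,985,984 ≈ 0.777
```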
The Birthday Problem
What is the probability that in a room of n people, at least two share the same birthday?
- Assume 365 days in a year (ignore leap years)
- Assume that every birth day is equally likely (ignore birth seasonality)
- Compute via the complement of the event:

    Prob(at least two identical birthdays) = 1 − Prob(all birthdays different) = 1 − P(365, n)/365^n
[Plot: Prob(shared birthdays) as a function of the number of people (x-axis 20 to 100, y-axis 0.0 to 1.0).]
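The plotted curve comes from the formula above; a small helper (mine, not the slides') reproduces it, and matches the 79.5% figure annotated earlier for 34 people:

```python
from math import perm

def shared_birthday_prob(n: int) -> float:
    """Probability that at least two of n people share a birthday
    (365 equally likely days, no leap years, no birth seasonality)."""
    return 1 - perm(365, n) / 365**n

print(shared_birthday_prob(23))   # first group size where the probability exceeds 1/2
print(shared_birthday_prob(34))   # about 0.795
```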
Disjoint Events

Definition: Two events E1 and E2 are disjoint (or mutually exclusive) if they cannot occur simultaneously in the experiment. Formally, E1 ∩ E2 = ∅.

Examples
- Die roll: E1 = "getting a 3" and E2 = "getting a 4" [disjoint]
- E1 = "die is even" and E2 = "die is odd" [disjoint]
- E1 = "3 on first roll" and E2 = "4 on second roll" [not disjoint, e.g., (3, 4)]

Problem: Which of these pairs of events are disjoint?
- In 4 coin tosses, E1 = "exactly 3 heads" and E2 = "exactly 2 tails"
- When choosing 4 cards, E1 = "cards have same value" and E2 = "cards have same suit"
- When choosing a committee of 3 from 8 men and 12 women, E1 = "committee has a woman" and E2 = "committee has a man"
[Handwritten annotations: the coin-toss events are disjoint (exactly 3 heads means one of THHH, HTHH, HHTH, HHHT, each with exactly 1 tail); the card events are disjoint; the committee events are not disjoint, e.g., {M, W, W}.]
The Sum Rule for Probability

Sum Rule: If E1 and E2 are disjoint events, then Prob(E1 or E2) = Prob(E1) + Prob(E2).

Example: For a roll of two dice, E1 = "at least one 5" and E2 = "sum equals 5"
- E1 = {(1, 5), (2, 5), (3, 5), (4, 5), (6, 5), (5, 1), (5, 2), (5, 3), (5, 4), (5, 6), (5, 5)}, so n(E1) = 11
- E2 = {(1, 4), (2, 3), (4, 1), (3, 2)}, so n(E2) = 4
- Or n(E1) = 6^2 − 5^2 = 11 and n(E2) = C(4, 3) = 4 (stars and bars)
- Hence Prob(E1 or E2) = Prob(E1) + Prob(E2) = 11/36 + 4/36 = 15/36 = 5/12
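A brute-force check of the sum-rule example (a sketch of my own, not from the slides); note the two events really are disjoint, since any roll containing a 5 has sum at least 6:

```python
from itertools import product

rolls = list(product(range(1, 7), repeat=2))
E1 = [r for r in rolls if 5 in r]          # at least one 5
E2 = [r for r in rolls if sum(r) == 5]     # sum equals 5

print(len(E1), len(E2))                    # 11 and 4
print((len(E1) + len(E2)) / 36)            # 15/36 = 5/12
```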
The Generalized Sum Rule

Generalized Sum Rule: If E1 and E2 are any events, then Prob(E1 or E2) = Prob(E1) + Prob(E2) − Prob(E1 and E2).

Example: When choosing 3 cards, what is the probability of either 3 face cards (E1) or 3 cards of the same suit (E2) (or both)?
- C(52, 3) ways of selecting 3 cards; 13 cards per suit, 4 suits, 3 face cards per suit (jack, queen, king)
- Prob(E1 or E2) = C(12, 3)/C(52, 3) + 4·C(13, 3)/C(52, 3) − 4·1/C(52, 3) = 1,360/22,100 ≈ 0.0615
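The three counts in the card example (all face cards, all one suit, and the overlap) can be verified by enumerating every 3-card hand; this is an illustrative check, not part of the slides:

```python
from itertools import combinations

# Deck as (value, suit); values 11, 12, 13 are the face cards (J, Q, K).
deck = [(v, s) for v in range(1, 14) for s in range(4)]
hands = list(combinations(deck, 3))                  # C(52,3) = 22,100 hands

face = sum(1 for h in hands if all(v >= 11 for v, _ in h))
suit = sum(1 for h in hands if len({s for _, s in h}) == 1)
both = sum(1 for h in hands
           if all(v >= 11 for v, _ in h) and len({s for _, s in h}) == 1)

print(face, suit, both)                    # C(12,3)=220, 4*C(13,3)=1144, 4
print((face + suit - both) / len(hands))   # 1360/22100 ≈ 0.0615
```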
Independent Events

Definition: Two events E1 and E2 are independent if the occurrence of one is not influenced by the occurrence of the other.

Examples
- Toss a coin twice: E1 = "heads on first toss" and E2 = "tails on second toss" [independent]
- Choosing 2 cards from a deck: E1 = "first card is an ace" and E2 = "second card is an ace" [not independent]
Independent Events, Continued

Definition: Two events E1 and E2 are independent if the occurrence of one is not influenced by the occurrence of the other.

Problem: Which of these pairs of events are independent?
- In 4 die rolls, E1 = "first 2 rolls sum to 7" and E2 = "last 2 rolls sum to 10"
- When choosing a committee of 3 from 8 men and 12 women, E1 = "committee has a woman" and E2 = "committee has a man"
- In a household with 4 children, E1 = "first child is male" and E2 = "at least half the children are female"
[Handwritten annotations: the die-roll events are independent; the committee and child events are not independent, since in each case, if E1 occurs, E2 becomes less likely.]

The Product Rule for Probability
Product Rule: If E1 and E2 are independent events, then Prob(E1 and E2) = Prob(E1) · Prob(E2).

Example: For two rolls of a "loaded" die with Prob(6) = 1/2 and Prob(i) = 1/10 for i ≠ 6, E1 = "5 on first roll" and E2 = "6 on second roll"
- Prob(E1 and E2) = Prob(E1) · Prob(E2) = 1/10 · 1/2 = 1/20
The Product Rule, Continued

Product Rule: If E1 and E2 are independent events, then Prob(E1 and E2) = Prob(E1) · Prob(E2).

Problem: For each example, compute Prob(E1 and E2) and Prob(E1) · Prob(E2)
- In 4 die rolls, E1 = "first 2 rolls sum to 7" and E2 = "last 2 rolls sum to 10"
- When choosing a committee of 3 from 8 men and 12 women, E1 = "committee has a woman" and E2 = "committee has a man"
- In a household with 4 children, E1 = "first child is male" and E2 = "at least half the children are female"
[Handwritten worked answers, largely illegible: for each pair, Prob(E1 and E2) is compared with Prob(E1) · Prob(E2); the two agree only for the die-roll events, which are independent.]
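For the household example, the comparison between Prob(E1 and E2) and Prob(E1) · Prob(E2) is easy to make exact by enumerating all 16 equally likely sex assignments (my own check, assuming each child is independently M or F with probability 1/2):

```python
from itertools import product
from fractions import Fraction

# Household with 4 children, each outcome MFMF... equally likely.
kids = list(product("MF", repeat=4))
E1 = {k for k in kids if k[0] == "M"}              # first child male
E2 = {k for k in kids if k.count("F") >= 2}        # at least half female

p1 = Fraction(len(E1), 16)
p2 = Fraction(len(E2), 16)
p12 = Fraction(len(E1 & E2), 16)
print(p1, p2, p12)       # 1/2, 11/16, 1/4
print(p12 == p1 * p2)    # False: the events are not independent
```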
Conditional Probability

Definition: Given events E1 and E2, the conditional probability of E1 given E2, denoted by Prob(E1|E2), is the probability that E1 happens given that E2 occurs. If E1 and E2 are independent, then Prob(E1|E2) = Prob(E1).

Example: Choose a committee of 3 from 8 men and 12 women, let W = "committee has a woman" and M = "committee has a man", and compute Prob(W|M)
- Reduced sample space S′ is the set of (equally likely) committees that have a man: n(M) = n(S) − n(M̄) = C(20, 3) − C(12, 3) = 920
- Those that also contain a woman are of the form {M, W, W} or {M, M, W}
- Hence Prob(W|M) = [C(8, 1)·C(12, 2) + C(8, 2)·C(12, 1)]/920 = 864/920 = 108/115

Observation
- We calculated Prob(W|M) as n(W ∩ M)/n(M)
- Dividing top and bottom by n(S), the total number of possible committees, we have Prob(W|M) = [n(W ∩ M)/n(S)] / [n(M)/n(S)] = Prob(W and M)/Prob(M)
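The committee calculation can be confirmed by listing all C(20, 3) = 1140 committees and conditioning on the reduced sample space (an illustrative sketch; the "M"/"W" labels are mine):

```python
from itertools import combinations
from fractions import Fraction

# 8 men ("M0".."M7") and 12 women ("W0".."W11"); choose a committee of 3.
people = [f"M{i}" for i in range(8)] + [f"W{i}" for i in range(12)]
committees = list(combinations(people, 3))        # C(20,3) = 1140

has_man = [c for c in committees if any(p[0] == "M" for p in c)]
both = [c for c in has_man if any(p[0] == "W" for p in c)]

print(len(has_man))                        # C(20,3) - C(12,3) = 920
print(Fraction(len(both), len(has_man)))   # 864/920 = 108/115
```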
General Product Rule

General Product Rule: If E1 and E2 are any events, then Prob(E1 and E2) = Prob(E1) · Prob(E2|E1) = Prob(E2) · Prob(E1|E2).

Example: Draw 2 marbles from a bag with 3 red, 5 white, 8 green
- Prob(R1 and R2) = Prob(R1) · Prob(R2|R1) = 3/16 · 2/15 = 1/40
- For one white and one green (in either order), apply the sum rule and then the product rule:
  Prob(W1 and G2) + Prob(G1 and W2) = Prob(W1) · Prob(G2|W1) + Prob(G1) · Prob(W2|G1) = 5/16 · 8/15 + 8/16 · 5/15 = 1/3
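Exact rational arithmetic keeps these chained conditional probabilities honest; a minimal sketch (not part of the slides):

```python
from fractions import Fraction

# Bag: 3 red, 5 white, 8 green; draw 2 marbles without replacement.
# Prob(R1 and R2) = Prob(R1) * Prob(R2 | R1)
p_rr = Fraction(3, 16) * Fraction(2, 15)
print(p_rr)    # 1/40

# One white and one green, in either order: sum rule + product rule.
p_wg = Fraction(5, 16) * Fraction(8, 15) + Fraction(8, 16) * Fraction(5, 15)
print(p_wg)    # 1/3
```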
Drug Testing for Athletes

Properties of a drug test
- For steroid users, the test is correct with probability 0.995 and wrong with probability 0.005
- For non-steroid users, the test is correct with probability 0.98 and wrong with probability 0.02
- 2% of athletes use steroids

Analysis of the test
- P = "test is positive for steroid use" and S = "athlete has used steroids"
- What is Prob(P and S)?
- What is Prob(P and S̄)?
- What is Prob(P)?
- What is Prob(S|P)?

Recall: Prob(E1 and E2) = Prob(E1) · Prob(E2|E1) = Prob(E2) · Prob(E1|E2)
[Handwritten worked answers:
- Prob(P and S) = Prob(S) · Prob(P|S) = 0.02 · 0.995 = 0.0199
- Prob(P and S̄) = Prob(S̄) · Prob(P|S̄) = 0.98 · 0.02 = 0.0196
- Prob(P) = Prob(P and S) + Prob(P and S̄) = 0.0199 + 0.0196 = 0.0395
- Prob(S|P) = Prob(P and S)/Prob(P) = 0.0199/0.0395 ≈ 0.504, practically a coin flip!]
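The drug-test analysis is a direct application of the product and sum rules; this short script (mine, using the numbers from the slide) reproduces the posterior:

```python
# Prior and test characteristics from the slide.
p_s = 0.02            # Prob(S): athlete uses steroids
p_pos_s = 0.995       # Prob(P | S): positive given steroid use
p_pos_not_s = 0.02    # Prob(P | not S): false-positive rate

p_p_and_s = p_s * p_pos_s                  # 0.0199
p_p_and_not_s = (1 - p_s) * p_pos_not_s    # 0.0196
p_p = p_p_and_s + p_p_and_not_s            # 0.0395
print(p_p_and_s / p_p)                     # Prob(S | P) ≈ 0.504
```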
Bayes' Rule

Bayes' Rule: Prob(E2|E1) = Prob(E1|E2) · Prob(E2) / Prob(E1), which is often written in the form Prob(E2|E1) ∝ Prob(E1|E2) · Prob(E2) (read "∝" as "proportional to")
- Prob(E2) is called the prior probability, Prob(E1|E2) is called the likelihood, and Prob(E2|E1) is called the posterior probability
- This is the fundamental equation of statistical machine learning

Recall: Prob(E1 and E2) = Prob(E1) · Prob(E2|E1) = Prob(E2) · Prob(E1|E2)
Bernoulli Trials

We will now look at outcomes that are not equally likely
- Look at an experiment based on repetitions of a simple experiment ("trials")

Example: A baseball player gets a hit with probability 1/3 every time he steps to the plate. What is the probability E of exactly one hit in 4 times at bat?
- Assume that tries are independent
- Define H = "gets a hit" and N = "does not get a hit"
- Outcome NNNN is more likely than HHHH
- Can use the product rule for any given sequence: Prob(HNNN) = Prob(H) · Prob(N) · Prob(N) · Prob(N) = (1/3) · (2/3) · (2/3) · (2/3) = (1/3)(2/3)^3
- Final solution: Prob(E) = Prob(HNNN or NHNN or NNHN or NNNH) = (1/3)(2/3)^3 + (1/3)(2/3)^3 + (1/3)(2/3)^3 + (1/3)(2/3)^3 = 4 · (1/3)(2/3)^3
- Q: What is the probability of at least one hit in 4 times at bat? [Annotated answer: 1 − Prob(no hits) = 1 − (2/3)^4]
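Summing the product-rule probability over sequences, as the example does by hand, can be automated with exact fractions (an illustrative sketch, not from the slides):

```python
from itertools import product
from fractions import Fraction

# 4 independent at-bats; H with probability 1/3, N with probability 2/3.
p = {"H": Fraction(1, 3), "N": Fraction(2, 3)}

def seq_prob(seq):
    """Product rule: multiply the per-trial probabilities along a sequence."""
    prob = Fraction(1)
    for outcome in seq:
        prob *= p[outcome]
    return prob

seqs = list(product("HN", repeat=4))
exactly_one = sum(seq_prob(s) for s in seqs if s.count("H") == 1)
at_least_one = sum(seq_prob(s) for s in seqs if "H" in s)
print(exactly_one)     # 4 * (1/3) * (2/3)^3 = 32/81
print(at_least_one)    # 1 - (2/3)^4 = 65/81
```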
Bernoulli Trials: General Case

Theorem (Bernoulli Trials): For a sequence of n Bernoulli trials with success probability p, the probability of exactly k successes is C(n, k) · p^k · (1 − p)^(n−k).

Proof: Any particular sequence with k successes and n − k failures has probability p^k · (1 − p)^(n−k) by the product rule, and there are C(n, k) such sequences; the sum rule gives the result.

Example 1: Probability of exactly five 6's in 10 rolls of a die
- C(10, 5) = 252 ordered lists of length 10 having five S's and five F's
- Probability of each such list, e.g., SSSSSFFFFF, is (1/6)^5 (5/6)^5
- So the answer is C(10, 5) (1/6)^5 (5/6)^5

Example 2: Given a biased coin with Prob(heads) = 4/7, what is the probability of at least 8 heads in 10 tosses?
- Disjoint cases: 8 heads, 9 heads, 10 heads
- Using the Theorem and the sum rule, we get C(10, 8) (4/7)^8 (3/7)^2 + C(10, 9) (4/7)^9 (3/7)^1 + C(10, 10) (4/7)^10 (3/7)^0 ≈ 0.1255
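The theorem translates into a one-line function; this sketch (mine, not the slides') evaluates both examples:

```python
from math import comb

def bernoulli_prob(n: int, k: int, p: float) -> float:
    """Probability of exactly k successes in n Bernoulli trials,
    each succeeding with probability p: C(n,k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Example 1: exactly five 6's in 10 rolls of a die.
print(bernoulli_prob(10, 5, 1/6))                      # C(10,5)(1/6)^5(5/6)^5

# Example 2: at least 8 heads in 10 tosses, Prob(heads) = 4/7.
ans = sum(bernoulli_prob(10, k, 4/7) for k in (8, 9, 10))
print(ans)                                             # ≈ 0.1255
```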