Chapter 2
Discrete Random Variables
Peng-Hua Wang
Graduate Institute of Communication Engineering National Taipei University
Chapter Contents
2.1 Basic Concepts
2.2 Probability Mass Functions
2.3 Functions of Random Variables
2.4 Expectation, Mean, and Variance
Peng-Hua Wang, April 3, 2012 Probability, Chap 2 - p. 2/58
■ For an experiment, a random variable assigns a particular real number to each outcome.
■ "Mathematically, a random variable is a real-valued function of the outcome of the experiment."
■ We can assign probabilities to the values of a random variable.
■ When do we use random variables?
◆ Outcomes are numerical: dice roll, stock prices, ...
◆ Outcomes are not numerical, but associated with numerical values of interest.
■ Example: a sequence of 3 tosses of a coin.
■ The outcomes are sequences of heads (H) and tails (T): HHH, HHT, HTH, HTT, THH, THT, TTH, TTT.
■ The number of heads in the sequence is a random variable.
◆ Let X be the number of heads. We have X(HHH) = 3, X(HHT) = 2, ..., X(TTT) = 0.
■ A deterministic function of a random variable is also a random variable.
◆ Let Y = X² be the square of X.
◆ P(Y = 4) = P(X = 2).
■ Random variables are real-valued functions of the outcome of an experiment.
■ Deterministic functions of a random variable are also random variables.
■ Each random variable can be associated with certain "averages" of interest, such as the mean and the variance.
■ A random variable can be conditioned on an event or on another random variable.
■ We can define independence between random variables.
■ If the range of a random variable (all values that it can take) is finite or countably infinite, the random variable is called discrete.
■ If the range of a random variable is uncountably infinite, it is not discrete. For example:
◆ Select a number a from the interval [0, 1].
◆ The random variable X = a² is not discrete.
◆ The random variable that equals 1 if a is rational and 0 otherwise is discrete.
■ We focus on discrete random variables in this chapter.
■ For a discrete random variable X, the probability mass function (PMF) of X is defined by p_X(x) = P(X = x).
■ For example, toss two fair coins and let X be the number of heads. Then p_X(0) = 1/4, p_X(1) = 1/2, p_X(2) = 1/4.
■ Let S be the set of all possible values of a random variable X. Then ∑_{x∈S} p_X(x) = 1.
■ Let A be a set of some values of a random variable X. Then P(X ∈ A) = ∑_{x∈A} p_X(x).
■ For example, toss two fair coins and let X be the number of heads. Then P(X ≥ 1) = p_X(1) + p_X(2) = 3/4.
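The two formulas above can be checked with a short sketch (the fraction values are the ones from the two-coin example):

```python
from fractions import Fraction

# PMF of X = number of heads in two fair coin tosses (from the example above).
p_X = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

# The PMF sums to 1 over the whole range S = {0, 1, 2}.
total = sum(p_X.values())

# P(X in A) is the sum of the PMF over A; here A = {x : x >= 1}.
p_at_least_one_head = sum(p for x, p in p_X.items() if x >= 1)
```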
■ A Bernoulli random variable X, taking the values X = 0 or 1, is defined by p_X(1) = p and p_X(0) = 1 − p.
■ We can use a Bernoulli rv to model a coin toss: p is the probability of a head.
■ A binomial random variable X, taking the values X = 0, 1, ..., n, is defined by p_X(k) = (n choose k) p^k (1 − p)^{n−k}, k = 0, 1, ..., n.
■ We can use a binomial rv to model the number of heads in n independent tosses of a coin.
■ By the binomial theorem, ∑_{k=0}^{n} (n choose k) p^k (1 − p)^{n−k} = 1.
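A quick numerical check of the binomial PMF and its normalization (a sketch; the values n = 10 and p = 0.3 are arbitrary illustration choices):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for a binomial rv with parameters n and p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# By the binomial theorem, the PMF sums to 1 over k = 0, ..., n.
n, p = 10, 0.3
total = sum(binomial_pmf(k, n, p) for k in range(n + 1))
```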
■ Toss a coin repeatedly. Let X be the number of tosses needed for a head to come up for the first time. Then p_X(k) = (1 − p)^{k−1} p, k = 1, 2, ....
■ X is called a geometric rv.
■ ∑_{k=1}^{∞} (1 − p)^{k−1} p = 1 (a geometric series).
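The normalization can be illustrated numerically (a sketch; p = 0.25 and the cutoff K are illustration choices, and the partial sum equals 1 − (1 − p)^K exactly):

```python
def geometric_pmf(k, p):
    """P(X = k): first head on toss k, each toss a head with probability p."""
    return (1 - p)**(k - 1) * p

# The infinite sum is 1; the partial sum up to K equals 1 - (1 - p)**K,
# which approaches 1 as K grows.
p, K = 0.25, 200
partial = sum(geometric_pmf(k, p) for k in range(1, K + 1))
```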
■ Let λ be the average number of typos per n words (the "typo rate").
■ If n is large but λ remains fixed (i.e., p = λ/n is very small), we can approximate the binomial PMF by p_X(k) = e^{−λ} λ^k / k!, k = 0, 1, 2, ....
■ This is called the Poisson random variable.
■ ∑_{k=0}^{∞} e^{−λ} λ^k / k! = 1 (the Taylor series of e^{λ}).
■ We can use a Poisson rv for modeling:
◆ The number of misspelled words in a document.
◆ The number of cars involved in accidents in a city on a given day.
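The Poisson normalization can be checked the same way (a sketch; λ = 4 is an illustration value, and a truncated series suffices numerically):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) = e^(-lam) * lam^k / k! for a Poisson rv with parameter lam."""
    return exp(-lam) * lam**k / factorial(k)

# The PMF sums to 1: it is e^(-lam) times the Taylor series of e^lam.
lam = 4.0
total = sum(poisson_pmf(k, lam) for k in range(100))
```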
■ Let X be a random variable. We can generate another random variable by applying a function g to X.
■ If X is a random variable, then Y = g(X) is also a random variable, with PMF p_Y(y) = ∑_{x: g(x)=y} p_X(x).
■ Let X be a uniform random variable on X = −4, −3, ..., 3, 4, so p_X(x) = 1/9 for each value.
■ Let Y = |X|. Find the PMF of Y.
■ Let Z = X². Find the PMF of Z.
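Both PMFs can be computed by grouping the values x with the same g(x), as a sketch:

```python
from collections import defaultdict
from fractions import Fraction

# X is uniform on {-4, ..., 4}: nine equally likely values.
p_X = {x: Fraction(1, 9) for x in range(-4, 5)}

def pmf_of_function(p_X, g):
    """p_Y(y) = sum of p_X(x) over all x with g(x) = y."""
    p_Y = defaultdict(Fraction)
    for x, p in p_X.items():
        p_Y[g(x)] += p
    return dict(p_Y)

p_Y = pmf_of_function(p_X, abs)              # Y = |X|
p_Z = pmf_of_function(p_X, lambda x: x * x)  # Z = X^2
```

For Y = |X|, each y in {1, 2, 3, 4} collects two values of x and so gets probability 2/9, while y = 0 gets 1/9; Z behaves the same way on {1, 4, 9, 16}.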
■ The PMF of a random variable X provides us with all the probabilistic information about X.
■ If we want to obtain a one-number summary of X, we can use the expected value (mean) E[X] = ∑_x x p_X(x).
■ The expected value is a weighted average of all possible values of X, weighted by their probabilities.
■ The means of some random variables do not exist, or are infinite.
■ The mean is well-defined if ∑_x |x| p_X(x) < ∞.
■ Example: p_X(2^k) = 2^{−k}, k = 1, 2, .... Then E[X] = ∑_k 2^k · 2^{−k} = ∞.
■ Example: p_X(2^k) = p_X(−2^k) = 2^{−k}, k = 2, 3, .... This mean is undefined, since the positive and negative parts of the sum both diverge.
■ Two independent coin tosses, each with a 3/4 probability of a head.
■ Let X be the number of heads obtained.
■ X is a binomial random variable with parameters n = 2 and p = 3/4.
■ Find the PMF of X.
■ Find E[X].
■ Let Y = g(X), where X and Y are random variables.
■ E[X] = ∑_x x p_X(x).
■ E[Y] = E[g(X)] = ∑_x g(x) p_X(x).
■ We do not need to calculate the PMF of Y.
■ Example:
◆ Let X be a uniform random variable on X = −4, −3, ..., 3, 4.
◆ Let Y = |X|. Find E[Y].
◆ Let Z = X². Find E[Z].
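The expected value rule can be applied directly, without ever forming the PMF of Y or Z; a sketch:

```python
from fractions import Fraction

# X uniform on {-4, ..., 4}, as in the example above.
p_X = {x: Fraction(1, 9) for x in range(-4, 5)}

def expect(p_X, g=lambda x: x):
    """Expected value rule: E[g(X)] = sum over x of g(x) * p_X(x)."""
    return sum(g(x) * p for x, p in p_X.items())

E_X = expect(p_X)                   # 0, by symmetry
E_Y = expect(p_X, abs)              # E[|X|], no PMF of Y needed
E_Z = expect(p_X, lambda x: x * x)  # E[X^2]
```

Here E[Y] = (2(1+2+3+4))/9 = 20/9 and E[Z] = (2(1+4+9+16))/9 = 20/3.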
■ Let X be a random variable with PMF p_X(x).
■ We want to find a single number c to summarize X; that is, the value of X is estimated by the constant c.
■ We can use the squared difference between c and the values of X to measure the quality of the estimate.
■ That is, we should find a constant c that minimizes E[(X − c)²].
■ The answer is c = E[X]. (Proof: E[(X − c)²] = var(X) + (E[X] − c)², which is smallest when c = E[X].)
■ The corresponding minimized error E[(X − E[X])²] is the variance of X.
■ That is, E[X] is the minimum-mean-squared-error estimate of X.
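A small numerical check of this fact (a sketch; the two-fair-coin PMF and the candidate values of c are illustration choices):

```python
# Numerical check that c = E[X] minimizes the mean squared error E[(X - c)^2].
p_X = {0: 0.25, 1: 0.5, 2: 0.25}  # e.g., number of heads in two fair tosses

def mse(c):
    return sum((x - c) ** 2 * p for x, p in p_X.items())

mean = sum(x * p for x, p in p_X.items())  # E[X] = 1
errors = {c: mse(c) for c in [0.0, 0.5, mean, 1.5, 2.0]}
```

The smallest entry of `errors` occurs at c = E[X], and its value is the variance of X.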
■ Definition. The variance of X is var(X) = E[(X − E[X])²].
■ The standard deviation is σ_X = √var(X).
■ In general, E[X^n] is called the nth moment of X. The mean is the first moment of X.
■ Let X be a uniform random variable on X = −4, −3, ..., 3, 4. Find var(X).
■ If Y = aX + b, then E[Y] = aE[X] + b and var(Y) = a²var(X).
■ var(X) = E[X²] − (E[X])².
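Both properties can be verified on a fair die (a sketch; the values a = 2, b = 5 are illustration choices):

```python
# Fair six-sided die: check var(X) = E[X^2] - (E[X])^2 and the scaling rules.
p_X = {x: 1 / 6 for x in range(1, 7)}

def E(p, g=lambda v: v):
    return sum(g(v) * pr for v, pr in p.items())

mean = E(p_X)                            # 7/2
var = E(p_X, lambda x: x * x) - mean**2  # 35/12

# Y = aX + b: the mean shifts and scales, the variance scales by a^2.
a, b = 2, 5
p_Y = {a * x + b: pr for x, pr in p_X.items()}
mean_Y = E(p_Y)
var_Y = E(p_Y, lambda y: y * y) - mean_Y**2
```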
■ If the weather is good (with probability 0.6), Alice walks to class; otherwise she rides her motorcycle. The distance is 2 miles, and V denotes her speed.
■ What is the mean of the time T to get to class?
■ By the expected value rule, E[T] = E[2/V] = ∑_v (2/v) p_V(v). Note that E[2/V] ≠ 2/E[V] in general.
■ Toss a coin that comes up a head with probability p. Let X = 1 if the toss is a head and X = 0 otherwise (a Bernoulli rv).
■ Find E[X], E[X²], and var(X). (Answers: E[X] = p, E[X²] = p, var(X) = p(1 − p).)
■ What are the mean and variance associated with a roll of a fair six-sided die? (E[X] = 7/2 and var(X) = 35/12.)
■ Find E[X] of a Poisson rv with PMF p_X(k) = e^{−λ} λ^k / k!, k = 0, 1, 2, .... (The answer is E[X] = λ.)
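The result E[X] = λ can be confirmed numerically by summing k · p_X(k) over a truncated range (a sketch; λ = 3 is an illustration value):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# E[X] = sum over k of k * p_X(k); truncating the series at k = 60 is
# numerically exact for moderate lam, since the tail is negligible.
lam = 3.0
mean = sum(k * poisson_pmf(k, lam) for k in range(60))
```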
■ Two discrete random variables X and Y associated with the same experiment can be described jointly.
■ If (x, y) is a pair of possible values of X and Y, the joint PMF of X and Y is p_{X,Y}(x, y) = P(X = x, Y = y).
■ We can calculate the PMFs of X and Y by p_X(x) = ∑_y p_{X,Y}(x, y) and p_Y(y) = ∑_x p_{X,Y}(x, y).
■ We sometimes refer to p_X(x) and p_Y(y) as the marginal PMFs.
■ Functions of Multiple RVs
■ Mean of Functions of Multiple RVs
■ Linearity
■ Joint PMF
■ Linearity
■ Y is a binomial RV with parameters (n, p). Then Y is the sum of n independent Bernoulli rvs X_1, ..., X_n, each with parameter p.
■ The mean of Y is E[Y] = E[X_1] + · · · + E[X_n] = np, by linearity.
■ n people throw their hats in a box and then each picks one hat at random.
■ Each hat can be picked by only one person, and each pick is equally likely.
■ What is the expected value of X, the number of people who pick their own hat?
■ Let a random variable X_i = 1 if the ith person selects his/her own hat, and X_i = 0 otherwise. Then E[X_i] = 1/n.
■ X = X_1 + X_2 + · · · + X_n, so E[X] = n · (1/n) = 1 by linearity.
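The answer E[X] = 1 can be verified exactly for small n by brute-force enumeration of all hat assignments (a sketch):

```python
from fractions import Fraction
from itertools import permutations

def expected_matches(n):
    """Exact E[X] by enumerating all n! equally likely hat assignments."""
    perms = list(permutations(range(n)))
    matches = sum(sum(1 for i, hat in enumerate(q) if hat == i) for q in perms)
    return Fraction(matches, len(perms))

# Linearity of expectation predicts E[X] = n * (1/n) = 1 for every n.
results = [expected_matches(n) for n in range(1, 7)]
```

Note that the X_i are not independent, yet linearity of expectation still applies.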
■ The conditional PMF of a random variable X, given an event A with P(A) > 0, is p_{X|A}(x) = P(X = x | A) = P({X = x} ∩ A) / P(A).
■ The events {X = x} ∩ A are disjoint for distinct x, and their union is A, so ∑_x p_{X|A}(x) = 1.
■ Let X be the roll of a fair six-sided die and let A be the event that the roll is even. Then p_{X|A}(x) = 1/3 for x = 2, 4, 6, and 0 otherwise.
■ Let X and Y be two random variables associated with the same experiment. The conditional PMF of X given Y = y is p_{X|Y}(x|y) = p_{X,Y}(x, y) / p_Y(y), for y with p_Y(y) > 0.
■ The conditional expectation of X given an event A with P(A) > 0 is E[X | A] = ∑_x x p_{X|A}(x).
■ For a function g(X), E[g(X) | A] = ∑_x g(x) p_{X|A}(x).
■ The conditional expectation of X given a value y of Y is E[X | Y = y] = ∑_x x p_{X|Y}(x|y).
■ If A_1, · · · , A_n form a partition of the sample space with P(A_i) > 0 for all i, then E[X] = ∑_{i=1}^{n} P(A_i) E[X | A_i].
■ This is the "total expectation theorem." Similarly, E[X] = ∑_y p_Y(y) E[X | Y = y].
■ The random variable X is independent of the event A if P(X = x, A) = P(X = x)P(A) = p_X(x)P(A) for all x.
■ Since P(X = x, A) = p_{X|A}(x)P(A), the definition of independence is equivalent to p_{X|A}(x) = p_X(x) for all x, whenever P(A) > 0.
■ Two random variables X and Y are independent if p_{X,Y}(x, y) = p_X(x)p_Y(y) for all x and y.
■ Since p_{X,Y}(x, y) = p_{X|Y}(x|y)p_Y(y), the definition of independence is equivalent to p_{X|Y}(x|y) = p_X(x) for all x and all y with p_Y(y) > 0.
■ If X and Y are independent, then
◆ E[XY] = E[X]E[Y]
◆ E[g(X)h(Y)] = E[g(X)]E[h(Y)]
◆ var(X + Y) = var(X) + var(Y)
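These consequences of independence can be checked exactly on a small example (a sketch; the die and Bernoulli(1/3) marginals are illustration choices, and the joint PMF is built as the product of the marginals, which is exactly what independence means):

```python
from collections import defaultdict
from fractions import Fraction

# Two independent rvs: X a fair die, Y a Bernoulli rv with p = 1/3.
p_X = {x: Fraction(1, 6) for x in range(1, 7)}
p_Y = {0: Fraction(2, 3), 1: Fraction(1, 3)}

# Independence: the joint PMF is the product of the marginals.
joint = {(x, y): px * py for x, px in p_X.items() for y, py in p_Y.items()}

def E(p, g=lambda v: v):
    return sum(g(v) * pr for v, pr in p.items())

def var(p):
    return E(p, lambda v: v * v) - E(p) ** 2

E_XY = sum(x * y * pr for (x, y), pr in joint.items())

# PMF of the sum S = X + Y, to check var(X + Y) = var(X) + var(Y).
p_S = defaultdict(Fraction)
for (x, y), pr in joint.items():
    p_S[x + y] += pr
```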
■ Three random variables X, Y, and Z are independent if p_{X,Y,Z}(x, y, z) = p_X(x)p_Y(y)p_Z(z) for all x, y, z. In this case:
◆ f(X), g(Y), and h(Z) are independent.
◆ g(X, Y) and h(Z) are independent.
◆ In general, g(X, Y) and h(Y, Z) are NOT independent.
■ If X_1, X_2, . . . , X_n are independent random variables, then var(X_1 + · · · + X_n) = var(X_1) + · · · + var(X_n).
■ If Y is a binomial rv with parameters (n, p), then Y = X_1 + · · · + X_n, where the X_k are independent Bernoulli rvs with parameter p.
■ E[X_k] = p and E[X_k²] = p, so var(X_k) = p − p² = p(1 − p).
■ var(Y) = np(1 − p), by independence.
■ If Y is a binomial rv with parameters (n, p), then E[Y] = np and var(Y) = np(1 − p).
■ The Poisson rv is the limiting case of n → ∞, p → 0, with np = λ fixed.
■ For this limiting case, we have E[Y] → λ and var(Y) = np(1 − p) → λ.
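The limiting behavior can be seen numerically by comparing the two PMFs (a sketch; λ = 2 and n = 10000 are illustration values):

```python
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# With n large and p = lam/n small, the binomial PMF approaches the
# Poisson PMF pointwise; the gap shrinks roughly like lam^2 / n.
lam, n = 2.0, 10000
max_gap = max(abs(binomial_pmf(k, n, lam / n) - poisson_pmf(k, lam))
              for k in range(20))
```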