

SLIDE 1

Generating Functions

Saravanan Vijayakumaran sarva@ee.iitb.ac.in

Department of Electrical Engineering Indian Institute of Technology Bombay

March 14, 2014


SLIDE 2

Generating Functions

Definition

The generating function of a sequence of real numbers $\{a_i : i = 0, 1, 2, \ldots\}$ is defined by
$$G(s) = \sum_{i=0}^{\infty} a_i s^i$$
for those $s \in \mathbb{R}$ for which the sum converges.

Example

Consider the sequence $a_i = 2^{-i}$, $i = 0, 1, 2, \ldots$. Then
$$G(s) = \sum_{i=0}^{\infty} \left(\frac{s}{2}\right)^i = \frac{1}{1 - \frac{s}{2}} \quad \text{for } |s| < 2.$$

Definition

Suppose $X$ is a discrete random variable taking non-negative integer values $\{0, 1, 2, \ldots\}$. The generating function of $X$ is the generating function of its probability mass function,
$$G(s) = \sum_{i=0}^{\infty} P[X = i]\, s^i = E[s^X].$$
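The definition translates directly into code. Below is a minimal sketch (the slides themselves contain no code; the function name `pgf` is my own) that evaluates $G(s) = E[s^X]$ for a finite-support PMF by direct summation.

```python
def pgf(pmf, s):
    """Evaluate G(s) = sum_i P[X = i] * s**i for a finite-support PMF,
    where pmf[i] = P[X = i]."""
    return sum(p_i * s**i for i, p_i in enumerate(pmf))

# Sanity check: any PMF satisfies G(1) = sum_i P[X = i] = 1.
die = [1/6] * 6            # uniform on {0, 1, ..., 5}
print(pgf(die, 1.0))       # -> 1.0 (up to floating point)
print(pgf(die, 0.5))       # G(1/2) = (1/6) * sum_{i=0}^{5} (1/2)^i
```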


SLIDE 3

Examples of Generating Functions

  • Constant RV: Suppose $P(X = c) = 1$ for some fixed $c \in \mathbb{Z}_+$. Then
    $$G(s) = E(s^X) = s^c$$
  • Bernoulli RV: $P(X = 1) = p$ and $P(X = 0) = 1 - p$. Then
    $$G(s) = 1 - p + ps$$
  • Geometric RV: $P(X = k) = p(1 - p)^{k-1}$ for $k \ge 1$. Then
    $$G(s) = \sum_{k=1}^{\infty} s^k p(1 - p)^{k-1} = \frac{ps}{1 - s(1 - p)}$$
  • Poisson RV: $P[X = k] = \dfrac{e^{-\lambda}\lambda^k}{k!}$ for $k \ge 0$. Then
    $$G(s) = \sum_{k=0}^{\infty} s^k\, \frac{e^{-\lambda}\lambda^k}{k!} = e^{\lambda(s-1)}$$
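As a check on the closed forms above, here is a short sketch using sympy (an assumption on my part; the slides are code-free) that expands each generating function in powers of $s$ and compares the coefficients against the stated PMFs.

```python
import sympy as sp

s, p, lam = sp.symbols('s p lambda', positive=True)

# Geometric: coefficient of s^k in ps / (1 - s(1-p)) should be p(1-p)^(k-1).
G_geom = p*s / (1 - s*(1 - p))
geom_series = G_geom.series(s, 0, 5).removeO()
for k in range(1, 5):
    assert sp.simplify(geom_series.coeff(s, k) - p*(1 - p)**(k - 1)) == 0

# Poisson: coefficient of s^k in e^(lam(s-1)) should be e^(-lam) lam^k / k!.
G_pois = sp.exp(lam*(s - 1))
pois_series = G_pois.series(s, 0, 5).removeO()
for k in range(5):
    assert sp.simplify(pois_series.coeff(s, k)
                       - sp.exp(-lam)*lam**k/sp.factorial(k)) == 0

print("coefficients match the PMFs")
```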


SLIDE 4

Moments from the Generating Function

Theorem

If $X$ has generating function $G(s)$, then

  • $E[X] = G^{(1)}(1)$
  • $E[X(X-1)\cdots(X-k+1)] = G^{(k)}(1)$

where $G^{(k)}$ is the $k$th derivative of $G(s)$.

Result

$$\operatorname{var}(X) = G^{(2)}(1) + G^{(1)}(1) - \big(G^{(1)}(1)\big)^2$$

Example (Geometric RV)

A geometric RV $X$ has generating function $G(s) = \frac{ps}{1 - s(1-p)}$. What is $\operatorname{var}(X)$?
$$G^{(1)}(1) = \frac{\partial}{\partial s} \left. \frac{ps}{1 - s(1-p)} \right|_{s=1} = \frac{1}{p}$$
$$G^{(2)}(1) = \frac{\partial^2}{\partial s^2} \left. \frac{ps}{1 - s(1-p)} \right|_{s=1} = \frac{2(1-p)}{p} + \frac{2(1-p)^2}{p^2}$$
$$\operatorname{var}(X) = \frac{1-p}{p^2}$$
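The variance computation above can be reproduced mechanically. The following sketch (again assuming sympy, which the slides do not use) differentiates the geometric generating function at $s = 1$.

```python
import sympy as sp

s, p = sp.symbols('s p', positive=True)
G = p*s / (1 - s*(1 - p))            # geometric PGF from the previous slide

G1 = sp.diff(G, s).subs(s, 1)        # G^(1)(1) = E[X]
G2 = sp.diff(G, s, 2).subs(s, 1)     # G^(2)(1) = E[X(X-1)]

print(sp.simplify(G1))               # -> 1/p
print(sp.simplify(G2 + G1 - G1**2))  # var(X) -> (1 - p)/p**2, possibly in an equivalent form
```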


SLIDE 5

Generating Function of a Sum of Independent RVs

Theorem

If $X$ and $Y$ are independent, then $G_{X+Y}(s) = G_X(s)\, G_Y(s)$.

Example (Binomial RV)

Using the above theorem, how can we find the generating function of a binomial random variable? A binomial random variable with parameters $n$ and $p$ is a sum of $n$ independent Bernoulli random variables, $S = X_1 + X_2 + \cdots + X_{n-1} + X_n$, where each $X_i$ has generating function $G(s) = 1 - p + ps = q + ps$. Hence $G_S(s) = [G(s)]^n = [q + ps]^n$.

Example (Sum of independent Poisson RVs)

Let $X$ and $Y$ be independent Poisson random variables with parameters $\lambda$ and $\mu$ respectively. What is the distribution of $X + Y$? Answer: Poisson with parameter $\lambda + \mu$, since $G_{X+Y}(s) = e^{\lambda(s-1)} e^{\mu(s-1)} = e^{(\lambda+\mu)(s-1)}$.
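Both examples on this slide can be verified symbolically. The sketch below (sympy again, my assumption) multiplies three Bernoulli generating functions and checks the binomial coefficients, then multiplies the two Poisson generating functions.

```python
import sympy as sp

s, p, lam, mu = sp.symbols('s p lambda mu', positive=True)
q = 1 - p

# Binomial with n = 3: coefficient of s^k in (q + ps)^3 is C(3,k) p^k q^(3-k).
GS = sp.expand((q + p*s)**3)
for k in range(4):
    assert sp.simplify(GS.coeff(s, k) - sp.binomial(3, k)*p**k*q**(3 - k)) == 0

# Independent Poissons: the product of the PGFs is the Poisson(lam + mu) PGF.
GX, GY = sp.exp(lam*(s - 1)), sp.exp(mu*(s - 1))
assert sp.simplify(GX*GY - sp.exp((lam + mu)*(s - 1))) == 0
print("both checks pass")
```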


SLIDE 6

Sum of a Random Number of Independent RVs

Theorem

Let $X_1, X_2, \ldots$ be a sequence of independent identically distributed (iid) random variables with common generating function $G_X(s)$. Let $N$ be a random variable which is independent of the $X_i$'s and has generating function $G_N(s)$. Then $S = X_1 + X_2 + \cdots + X_N$ has generating function given by $G_S(s) = G_N(G_X(s))$.

Example

A group of hens lay $N$ eggs where $N$ has a Poisson distribution with parameter $\lambda$. Each egg results in a healthy chick with probability $p$ independently of the other eggs. Let $K$ be the number of healthy chicks. Find the distribution of $K$. Solution: Poisson with parameter $\lambda p$.
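The hens-and-chicks answer follows from one substitution, shown in this sketch (sympy assumed): plugging the Bernoulli generating function into the Poisson one yields the Poisson($\lambda p$) generating function.

```python
import sympy as sp

s, p, lam = sp.symbols('s p lambda', positive=True)

GN = sp.exp(lam*(s - 1))       # N ~ Poisson(lam): number of eggs
GX = 1 - p + p*s               # each egg hatches ~ Bernoulli(p)
GK = GN.subs(s, GX)            # G_K(s) = G_N(G_X(s))

assert sp.simplify(GK - sp.exp(lam*p*(s - 1))) == 0
print("K ~ Poisson(lam*p)")
```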


SLIDE 7

Joint Generating Function

Definition

The joint generating function of random variables $X$ and $Y$ taking values in the non-negative integers is defined by
$$G_{X,Y}(s_1, s_2) = \sum_{i=0}^{\infty} \sum_{j=0}^{\infty} P[X = i, Y = j]\, s_1^i s_2^j = E[s_1^X s_2^Y].$$

Theorem

Random variables $X$ and $Y$ are independent if and only if $G_{X,Y}(s_1, s_2) = G_X(s_1)\, G_Y(s_2)$ for all $s_1$ and $s_2$.
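The factorization criterion is easy to test numerically for finite-support variables. Here is a small sketch (numpy assumed; the two PMFs are arbitrary illustrative choices of mine):

```python
import numpy as np

px = np.array([0.2, 0.5, 0.3])            # PMF of X on {0, 1, 2}
py = np.array([0.6, 0.4])                 # PMF of Y on {0, 1}
joint = np.outer(px, py)                  # independence: P[X=i, Y=j] = px[i] py[j]

s1, s2 = 0.7, 0.9                         # arbitrary evaluation points
G_joint = sum(joint[i, j] * s1**i * s2**j
              for i in range(len(px)) for j in range(len(py)))
G_X = sum(px[i] * s1**i for i in range(len(px)))
G_Y = sum(py[j] * s2**j for j in range(len(py)))

print(np.isclose(G_joint, G_X * G_Y))     # -> True
```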


SLIDE 8

Application: Coin Toss Game

A biased coin which shows heads with probability p is tossed repeatedly. Player A wins if m heads appear before n tails, and player B wins otherwise. What is the probability of A winning?

  • Let $p_{m,n}$ be the probability that A wins
  • Let $q = 1 - p$. We have the following recurrence relation
    $$p_{m,n} = p\, p_{m-1,n} + q\, p_{m,n-1}, \quad \text{for } m, n \ge 1$$
  • For $m, n > 0$, we have $p_{m,0} = 0$ and $p_{0,n} = 1$. Let $p_{0,0} = 0$.
  • Consider the generating function
    $$G(x, y) = \sum_{m=0}^{\infty} \sum_{n=0}^{\infty} p_{m,n}\, x^m y^n$$
  • Multiplying the recurrence relation by $x^m y^n$ and summing over $m, n \ge 1$ gives
    $$\sum_{m=1}^{\infty} \sum_{n=1}^{\infty} p_{m,n}\, x^m y^n = \sum_{m=1}^{\infty} \sum_{n=1}^{\infty} p\, p_{m-1,n}\, x^m y^n + \sum_{m=1}^{\infty} \sum_{n=1}^{\infty} q\, p_{m,n-1}\, x^m y^n$$


SLIDE 9

Coin Toss Game

  • Accounting for the terms corresponding to $m = 0$ and $n = 0$, this becomes
    $$G(x, y) - \sum_{m=1}^{\infty} p_{m,0}\, x^m - \sum_{n=1}^{\infty} p_{0,n}\, y^n = px \sum_{m=1}^{\infty} \sum_{n=1}^{\infty} p_{m-1,n}\, x^{m-1} y^n + qy \sum_{m=1}^{\infty} \sum_{n=1}^{\infty} p_{m,n-1}\, x^m y^{n-1}$$
  • Using the boundary conditions we have
    $$G(x, y) - \frac{y}{1 - y} = px\, G(x, y) + qy \left( G(x, y) - \frac{y}{1 - y} \right)$$
    $$\Rightarrow G(x, y) = \frac{y(1 - qy)}{(1 - y)(1 - px - qy)}$$

  • The coefficient of $x^m y^n$ in $G(x, y)$ gives $p_{m,n}$
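To make the result concrete, the sketch below (sympy assumed; not part of the slides) computes $p_{2,3}$ for a fair coin two ways: by iterating the recurrence, and by extracting the coefficient of $x^2 y^3$ from the closed form for $G(x, y)$.

```python
import sympy as sp

x, y = sp.symbols('x y')
p = sp.Rational(1, 2)                     # fair coin
q = 1 - p
m, n = 2, 3                               # A needs 2 heads before 3 tails

# Recurrence: p_{m,n} = p p_{m-1,n} + q p_{m,n-1}, p_{m,0} = 0, p_{0,n} = 1.
P = {}
for i in range(m + 1):
    for j in range(n + 1):
        if i == 0:
            P[i, j] = 1 if j > 0 else 0   # p_{0,n} = 1 for n > 0, p_{0,0} = 0
        elif j == 0:
            P[i, j] = 0                   # p_{m,0} = 0
        else:
            P[i, j] = p*P[i - 1, j] + q*P[i, j - 1]

# Coefficient of x^m y^n in G(x, y) = y(1 - qy) / ((1 - y)(1 - px - qy)).
G = y*(1 - q*y) / ((1 - y)*(1 - p*x - q*y))
in_x = G.series(x, 0, m + 1).removeO().coeff(x, m)
coeff = in_x.series(y, 0, n + 1).removeO().coeff(y, n)

print(P[m, n], coeff)                     # both -> 11/16
```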


SLIDE 10

Application: Random Walk

  • Let $X_1, X_2, \ldots$ be independent random variables taking value $1$ with probability $p$ and value $-1$ with probability $1 - p$
  • The sequence $S_n = \sum_{i=1}^{n} X_i$ is a random walk starting at the origin
  • What is the probability that the walker ever returns to the origin?
  • Let $f_0(n) = \Pr(S_1 \neq 0, \ldots, S_{n-1} \neq 0, S_n = 0)$ be the probability that the first return to the origin occurs after $n$ steps
  • Let $p_0(n) = \Pr(S_n = 0)$ be the probability of being at the origin after $n$ steps
  • Consider the following generating functions
    $$P_0(s) = \sum_{n=0}^{\infty} p_0(n)\, s^n, \qquad F_0(s) = \sum_{n=0}^{\infty} f_0(n)\, s^n.$$
  • $P_0(s) = 1 + P_0(s) F_0(s)$
  • $P_0(s) = (1 - 4pqs^2)^{-\frac{1}{2}}$
  • $F_0(s) = 1 - (1 - 4pqs^2)^{\frac{1}{2}}$
  • The probability of ever returning to the origin is $\sum_{n=1}^{\infty} f_0(n) = F_0(1) = 1 - |p - q|$
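As a sanity check on $F_0(1) = 1 - |p - q|$, here is a Monte Carlo sketch (my construction, not from the slides). The step cap means very long excursions are counted as non-returns, which slightly biases the estimate downward.

```python
import random

def ever_returns(p, max_steps=10_000):
    """Simulate one walk from 0; report whether it revisits 0 within max_steps."""
    pos = 0
    for _ in range(max_steps):
        pos += 1 if random.random() < p else -1
        if pos == 0:
            return True
    return False   # approximation: a long-drifting walk is treated as never returning

p, trials = 0.6, 20_000
estimate = sum(ever_returns(p) for _ in range(trials)) / trials
exact = 1 - abs(p - (1 - p))   # F_0(1) = 1 - |p - q|
print(estimate, exact)         # estimate should be close to 0.8
```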

SLIDE 11

Reference

  • Chapter 5, Probability and Random Processes, Grimmett and Stirzaker, Third Edition, 2001.
