

SLIDE 1

Moments and independence

Theorem: If X_1, ..., X_p are independent and each X_i is integrable, then X = X_1 ··· X_p is integrable and

E(X_1 ··· X_p) = E(X_1) ··· E(X_p).

Moment Generating Functions

Defn: The moment generating function of a real-valued X is

M_X(t) = E(e^{tX}),

defined for those real t for which the expected value is finite.

Defn: The moment generating function of X ∈ R^p is

M_X(u) = E(e^{u^T X}),

defined for those vectors u for which the expected value is finite.
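The product rule for independent expectations can be sanity-checked by simulation. A minimal sketch, assuming numpy; the two distributions, seed, and sample size are arbitrary choices of mine, not from the slides:

```python
import numpy as np

# Monte Carlo check (not a proof): for independent integrable X1, X2,
# the sample mean of X1*X2 should match the product of the sample means.
rng = np.random.default_rng(0)
n = 1_000_000
x1 = rng.exponential(scale=2.0, size=n)  # E(X1) = 2
x2 = rng.uniform(0.0, 1.0, size=n)       # E(X2) = 1/2
lhs = (x1 * x2).mean()                   # estimate of E(X1 X2)
rhs = x1.mean() * x2.mean()              # estimate of E(X1) E(X2)
print(lhs, rhs)                          # both near E(X1)E(X2) = 1
```

With a million draws both estimates agree to a couple of decimal places.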


SLIDE 2

Formal connection to moments:

M_X(t) = Σ_{k=0}^∞ E[(tX)^k]/k! = Σ_{k=0}^∞ μ′_k t^k/k!.

Sometimes we can find a power series expansion of M_X and read off the moments of X from the coefficients of t^k/k!.

Example: X has density f(u) = u^{α−1} e^{−u} 1(u > 0)/Γ(α). The MGF is

M_X(t) = ∫_0^∞ e^{tu} u^{α−1} e^{−u} du/Γ(α).

Substitute v = u(1 − t) to get M_X(t) = (1 − t)^{−α} for t < 1.
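The closed form (1 − t)^{−α} can be checked against a direct Monte Carlo estimate of E(e^{tX}). A sketch assuming numpy; α, t, the seed, and the sample size are my own choices (with t < 1 so the mgf is finite):

```python
import numpy as np

# Monte Carlo check of the Gamma(alpha) mgf M_X(t) = (1 - t)^(-alpha).
rng = np.random.default_rng(1)
alpha, t = 3.0, 0.4
x = rng.gamma(shape=alpha, scale=1.0, size=1_000_000)
mc = np.exp(t * x).mean()        # estimate of E(e^{tX})
closed = (1.0 - t) ** (-alpha)   # (1 - 0.4)^{-3} = 1/0.216
print(mc, closed)
```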


SLIDE 3

For α = 1 we get the exponential distribution, with power series expansion

M_X(t) = 1/(1 − t) = Σ_{k=0}^∞ t^k for |t| < 1.

Write t^k = k! t^k/k!; the coefficient of t^k/k! is E(X^k) = k!.

Example: Z ∼ N(0, 1). Then

M_Z(t) = ∫_{−∞}^∞ e^{tz − z^2/2} dz/√(2π) = ∫_{−∞}^∞ e^{−(z−t)^2/2 + t^2/2} dz/√(2π) = e^{t^2/2},

because the rest of the integrand is the N(t, 1) density.
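Both facts on this slide lend themselves to quick Monte Carlo checks. A sketch assuming numpy; the moment order k = 3, the value t = 0.7, the seed, and the sample sizes are my own choices:

```python
import math
import numpy as np

rng = np.random.default_rng(2)

# Exponential (alpha = 1): reading coefficients of t^k/k! gives E(X^k) = k!.
x = rng.exponential(size=2_000_000)
third_moment = (x ** 3).mean()       # should be near 3! = 6

# Standard normal: M_Z(t) = e^{t^2/2}.
z = rng.standard_normal(2_000_000)
t = 0.7
mgf_mc = np.exp(t * z).mean()        # estimate of E(e^{tZ})
mgf_exact = math.exp(t * t / 2.0)
print(third_moment, mgf_mc, mgf_exact)
```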


SLIDE 4

Theorem: If M is finite for all t ∈ [−ε, ε] for some ε > 0, then

1. Every moment of X is finite.
2. M is C^∞ (in fact M is analytic).
3. μ′_k = (d^k/dt^k) M_X(0).

Note: C^∞ means M has continuous derivatives of all orders. Analytic means M has a convergent power series expansion in a neighbourhood of each t ∈ (−ε, ε).

Theorem: Suppose X and Y have mgfs M_X and M_Y which are finite for all t ∈ [−ε, ε]. If M_X(t) = M_Y(t) for all t ∈ [−ε, ε], then X and Y have the same distribution.

The proofs, and many other facts about mgfs, rely on techniques of complex variables.
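Item 3, that derivatives of M at 0 recover the moments, can be illustrated numerically with central finite differences on a known mgf. A sketch using the exponential mgf M(t) = 1/(1 − t) from the previous slide; the step size h is my own choice:

```python
# Moments from derivatives of the mgf at 0, via central finite differences.
def mgf(t):
    return 1.0 / (1.0 - t)   # exponential (alpha = 1) mgf, valid for t < 1

h = 1e-4
first = (mgf(h) - mgf(-h)) / (2.0 * h)                # mu'_1 = E(X) = 1
second = (mgf(h) - 2.0 * mgf(0.0) + mgf(-h)) / h**2   # mu'_2 = E(X^2) = 2
print(first, second)
```

The exact derivatives are M′(0) = 1 and M″(0) = 2, matching E(X^k) = k!.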


SLIDE 5

MGFs and Sums

If X_1, ..., X_p are independent and Y = Σ_i X_i, then the moment generating function of Y is the product of those of the individual X_i:

E(e^{tY}) = Π_i E(e^{tX_i}), or M_Y = Π_i M_{X_i}.

Note: this is also true for multivariate X_i.

Example: If X_i ∼ N(μ_i, σ_i^2) then

M_{X_i}(t) = E[e^{t(σ_i Z_i + μ_i)}] = e^{tμ_i} M_{Z_i}(tσ_i) = e^{tμ_i} e^{t^2 σ_i^2/2} = e^{σ_i^2 t^2/2 + tμ_i}.

The moment generating function for Y = Σ_i X_i is

M_Y(t) = Π_i exp{σ_i^2 t^2/2 + tμ_i} = exp{Σ_i σ_i^2 t^2/2 + Σ_i μ_i t},

which is the mgf of N(Σ_i μ_i, Σ_i σ_i^2).
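The conclusion, that a sum of independent normals is again normal with the means and variances added, can be checked by comparing a Monte Carlo mgf of the sum against the closed form. A sketch assuming numpy; the two component distributions, t, the seed, and the sample size are my own choices:

```python
import math
import numpy as np

# Y = X1 + X2 with X1 ~ N(1, 4), X2 ~ N(-2, 9) should be N(-1, 13).
rng = np.random.default_rng(3)
n = 1_000_000
x1 = rng.normal(1.0, 2.0, size=n)
x2 = rng.normal(-2.0, 3.0, size=n)
y = x1 + x2

t = 0.3
mc = np.exp(t * y).mean()                         # estimate of E(e^{tY})
closed = math.exp(13.0 * t**2 / 2.0 - 1.0 * t)    # N(-1, 13) mgf at t
print(mc, closed)
```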


SLIDE 6

Example: Now suppose Z_1, ..., Z_ν are independent N(0, 1) rvs. By definition, S_ν = Σ_{i=1}^ν Z_i^2 has the χ^2_ν distribution.

It is easy to check that S_1 = Z_1^2 has density

(u/2)^{−1/2} e^{−u/2}/(2√π),

and then the mgf of S_1 is (1 − 2t)^{−1/2} for t < 1/2. It follows that

M_{S_ν}(t) = (1 − 2t)^{−ν/2},

which is the mgf of a Gamma(ν/2, 2) rv. So the χ^2_ν distribution has the Gamma(ν/2, 2) density:

(u/2)^{(ν−2)/2} e^{−u/2}/(2Γ(ν/2)).
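The mgf identity M_{S_ν}(t) = (1 − 2t)^{−ν/2} can be checked by simulating S_ν directly as a sum of squared normals. A sketch assuming numpy; ν, t (< 1/2), the seed, and the sample size are my own choices:

```python
import numpy as np

# S_nu = sum of nu independent squared N(0,1) draws (chi^2_nu by definition).
rng = np.random.default_rng(4)
nu, n = 5, 1_000_000
s = (rng.standard_normal((n, nu)) ** 2).sum(axis=1)

t = 0.2
mc = np.exp(t * s).mean()                  # estimate of E(e^{tS})
closed = (1.0 - 2.0 * t) ** (-nu / 2.0)    # Gamma(nu/2, 2) mgf
print(mc, closed)
```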


SLIDE 7

Example: The Cauchy density is

1/(π(1 + x^2));

the corresponding moment generating function is

M(t) = ∫_{−∞}^∞ e^{tx}/(π(1 + x^2)) dx,

which is +∞ except for t = 0, where we get 1. Every t distribution has exactly the same mgf, so we can't use the mgf to distinguish such distributions. Problem: these distributions do not have infinitely many finite moments. So in STAT 801 we develop a substitute for the mgf which is defined for every distribution, namely the characteristic function.
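The divergence can be seen numerically: truncated integrals of e^{tx}/(π(1 + x^2)) over [0, b] blow up as b grows, because e^{tx} swamps the polynomial tail. A sketch assuming numpy; t, the cutoffs b, and the grid sizes (simple Riemann sums) are my own choices:

```python
import numpy as np

# Truncated integrals of the Cauchy mgf integrand for t = 1: they grow
# without bound as the upper limit grows, so M(t) = +infinity for t != 0.
t = 1.0
vals = []
for b in (10.0, 50.0, 250.0):
    x = np.linspace(0.0, b, 200_001)
    dx = x[1] - x[0]
    vals.append((np.exp(t * x) / (np.pi * (1.0 + x**2))).sum() * dx)
print(vals)
```

Each larger cutoff multiplies the truncated integral by an enormous factor, consistent with divergence.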
