

  1. Statistics I – Chapters 5 and 6 Supplements, Fall 2012. Statistics I – Supplements for Chapters 5 and 6: Moment Generating Functions. Ling-Chieh Kung, Department of Information Management, National Taiwan University. October 31, 2012.

  2. Introduction ◮ Today we will study an important mathematical tool for probability and statistics: the moment generating function. ◮ It is useful for deriving means and variances. ◮ It is useful for finding the distribution of a random variable. ◮ It is required to understand the materials in Chapters 7 to 9. ◮ You do not need it to memorize those results. ◮ You do need it to know why they are true. ◮ But it may be hard...

  3. Moment generating functions (MGF) – Road map ◮ Moment generating functions (MGF). ◮ MGF for distributions. ◮ MGF for independent sums.

  4. Moment generating functions (MGF) – Moments ◮ For a random variable, we typically use its mean and variance to describe it. ◮ In general, we may use moments: Definition 1 (Moments) The kth moment of a random variable X is defined as µ′_k ≡ E[X^k].

  5. Moment generating functions (MGF) – Moments: an example ◮ Consider the uniform distribution Uni(0, 1): ◮ f(x) = 1. ◮ µ′_1 = E[X] = ∫₀¹ x dx = 1/2. ◮ µ′_2 = E[X²] = ∫₀¹ x² dx = 1/3. ◮ µ′_3 = E[X³] = ∫₀¹ x³ dx = 1/4. ◮ In general, µ′_k = 1/(k + 1).
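The pattern µ′_k = 1/(k + 1) can be sanity-checked numerically. A minimal sketch (not from the slides), approximating E[X^k] for Uni(0, 1) with a midpoint-rule integral using only the standard library:

```python
# Approximate E[X^k] = integral of x^k * f(x) dx on [0, 1], with f(x) = 1,
# by a midpoint rule with n subintervals, and compare against 1 / (k + 1).
n = 100_000
for k in range(1, 5):
    moment = sum(((i + 0.5) / n) ** k for i in range(n)) / n
    assert abs(moment - 1 / (k + 1)) < 1e-6
```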

  6. Moment generating functions (MGF) – Moments: the general case ◮ The first moment: µ′_1 ≡ E[X¹] = E[X] = µ. ◮ The second moment: µ′_2 ≡ E[X²]. ◮ Moreover, σ² = E[X²] − E[X]² = µ′_2 − µ². ◮ For most practical random variables, there are infinitely many moments.

  7. Moment generating functions (MGF) – Moments and distributions ◮ We may use moments to describe distributions: ◮ When two RVs have the same mean and variance (and thus the same second moment), they may still follow different distributions. ◮ When their first, second, and third moments are all the same, it is more likely that they are the same. ◮ When their first four moments are all the same... ◮ If all moments are the same: Proposition 1 (Moments and distributions) If two random variables have all their moments identical, they must follow exactly the same distribution. Proof. Beyond the scope of this course.

  8. Moment generating functions (MGF) – Moment generating functions ◮ The proposition is attractive but hard to use. ◮ It would be a nightmare to calculate all the (infinitely many) moments of a random variable. ◮ Fortunately, statisticians have found an easier way through moment generating functions (MGF). Definition 2 The moment generating function m(t) for a random variable X is defined as m(t) ≡ E[e^{tX}].

  9. Moment generating functions (MGF) – Moment generating functions ◮ m(t) ≡ E[e^{tX}] is called the moment generating function because it generates moments. Why? ◮ Recall that you may do a Taylor expansion on e^{tx}: e^{tx} = 1 + tx + (tx)²/2! + (tx)³/3! + ⋯.

  10. Moment generating functions (MGF) – Moment generating functions ◮ With this, the MGF (assuming X is discrete) satisfies E[e^{tX}] = Σ_{x∈S} e^{tx} Pr(x) = Σ_{x∈S} [1 + tx + (tx)²/2! + (tx)³/3! + ⋯] Pr(x) = Σ_{x∈S} Pr(x) + t Σ_{x∈S} x Pr(x) + (t²/2!) Σ_{x∈S} x² Pr(x) + (t³/3!) Σ_{x∈S} x³ Pr(x) + ⋯ = 1 + tµ′_1 + (t²/2!)µ′_2 + (t³/3!)µ′_3 + ⋯.

  11. Moment generating functions (MGF) – Moment generating functions ◮ Now consider the first-order derivative of m(t): d/dt m(t) = µ′_1 + (t/1!)µ′_2 + (t²/2!)µ′_3 + ⋯. ◮ If we plug t = 0 into the above equation, we get d/dt m(t)|_{t=0} = µ′_1, which is the first moment.

  12. Moment generating functions (MGF) – Moment generating functions ◮ Now consider the second-order derivative of m(t): d²/dt² m(t) = µ′_2 + (t/1!)µ′_3 + ⋯. ◮ If we plug t = 0 into the above equation, we get d²/dt² m(t)|_{t=0} = µ′_2, which is the second moment. ◮ In general, the kth-order derivative generates the kth moment: d^k/dt^k m(t)|_{t=0} = µ′_k.
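The derivative rule d^k/dt^k m(t)|_{t=0} = µ′_k can be checked symbolically. A sketch assuming the sympy library is available, using a fair six-sided die as a hypothetical discrete random variable (not an example from the slides):

```python
# Build m(t) = E[e^{tX}] for a fair die, differentiate k times, evaluate
# at t = 0, and compare with the directly computed moment E[X^k].
import sympy as sp

t = sp.symbols('t')
outcomes = range(1, 7)
p = sp.Rational(1, 6)                 # uniform probability on each face

m = sum(sp.exp(t * x) * p for x in outcomes)

for k in range(1, 4):
    deriv_at_0 = sp.diff(m, t, k).subs(t, 0)              # k-th derivative at 0
    moment = sum(sp.Integer(x) ** k * p for x in outcomes)  # direct E[X^k]
    assert sp.simplify(deriv_at_0 - moment) == 0
```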

  13. Moment generating functions (MGF) – MGF of the Poisson distribution ◮ As our first example, we derive the MGF of a Poisson RV: Proposition 2 (MGF of the Poisson distribution) The moment generating function for X ∼ Poi(λ) is m(t) = e^{λ(e^t − 1)}. Proof. First, we have m(t) = E[e^{tX}] = Σ_{x=0}^∞ e^{tx} λ^x e^{−λ}/x! = e^{−λ} Σ_{x=0}^∞ (λe^t)^x/x!.

  14. Moment generating functions (MGF) – MGF of the Poisson distribution Proof (cont'd). Now, note that the summation is another Taylor expansion: Σ_{x=0}^∞ (λe^t)^x/x! = e^{λe^t}. Therefore, we have m(t) = e^{−λ} e^{λe^t} = e^{λ(e^t − 1)}, and the proof is complete.
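The Taylor-expansion step Σ_{x=0}^∞ y^x/x! = e^y (here with y = λe^t) is easy to confirm numerically. A sketch with hypothetical values λ = 2 and t = 0.5, truncating the series at 50 terms:

```python
# Truncated check that sum_{x>=0} y^x / x! converges to e^y for y = lam*e^t.
import math

lam, t = 2.0, 0.5                     # hypothetical parameter choices
y = lam * math.exp(t)
series = sum(y ** x / math.factorial(x) for x in range(50))

assert abs(series - math.exp(y)) < 1e-9
```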

  15. Moment generating functions (MGF) – MGF of the Poisson distribution ◮ Let's apply the MGF of the Poisson distribution: Proposition 3 Let X ∼ Poi(λ); then E[X] = Var(X) = λ. Proof. We have m′(t) = d/dt e^{λ(e^t − 1)} = λe^t · e^{λ(e^t − 1)}, and thus m′(0) = E[X] = λ.

  16. Moment generating functions (MGF) – MGF of the Poisson distribution Proof (cont'd). Moreover, we have m″(t) = d/dt [λe^t · e^{λ(e^t − 1)}] = λe^t · e^{λ(e^t − 1)} + (λe^t)² · e^{λ(e^t − 1)} = λe^t · e^{λ(e^t − 1)} (1 + λe^t), and thus m″(0) = E[X²] = λ(1 + λ) = λ + λ². It then follows that Var(X) = E[X²] − E[X]² = λ.
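The two derivative evaluations above can be cross-checked with finite differences instead of calculus. A sketch with a hypothetical rate λ = 2, estimating m′(0) and m″(0) of the Poisson MGF numerically:

```python
# Finite-difference estimates of m'(0) and m''(0) for m(t) = exp(lam*(e^t - 1)),
# compared with E[X] = lam and E[X^2] = lam + lam^2 from the proof.
import math

lam = 2.0                                   # hypothetical rate parameter
m = lambda t: math.exp(lam * (math.exp(t) - 1))

h = 1e-5
first = (m(h) - m(-h)) / (2 * h)            # central difference ~= m'(0)
second = (m(h) - 2 * m(0) + m(-h)) / h**2   # second difference ~= m''(0)

assert abs(first - lam) < 1e-4
assert abs(second - (lam + lam**2)) < 1e-3
assert abs((second - first**2) - lam) < 1e-3   # Var(X) = lam
```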

  17. Moment generating functions (MGF) – MGF of the Bernoulli distribution ◮ So with the MGF, it can (sometimes) be much easier to find the mean and variance of a given random variable. ◮ As another example, let's consider the Bernoulli distribution. Proposition 4 Let X ∼ Ber(p); then E[X] = p and Var(X) = p(1 − p). Proof. The MGF is m(t) = E[e^{tX}] = p · e^t + (1 − p) · 1. Then we have m′(t) = pe^t and m′(0) = E[X] = p. Moreover, we have m″(t) = pe^t and m″(0) = E[X²] = p. Then Var(X) = E[X²] − E[X]² = p − p² = p(1 − p).
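Since the MGF is itself an expectation, m(t) = E[e^{tX}] can also be estimated by simulation. A Monte Carlo sketch (not from the slides) with hypothetical values p = 0.3 and t = 0.7, comparing the empirical average of e^{tX} against the closed form p·e^t + (1 − p):

```python
# Draw Bernoulli(p) samples, average e^{t*x} over the sample, and compare
# with the closed-form MGF p*e^t + (1 - p) derived in the proof.
import math
import random

random.seed(0)
p, t = 0.3, 0.7                             # hypothetical parameter choices
draws = [1 if random.random() < p else 0 for _ in range(200_000)]

empirical = sum(math.exp(t * x) for x in draws) / len(draws)
closed_form = p * math.exp(t) + (1 - p)

assert abs(empirical - closed_form) < 0.01
```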

  18. Moment generating functions (MGF) – Summary ◮ You may treat the MGF as a pure mathematical tool. ◮ It is an expectation and thus not a random variable. ◮ It generates moments through differentiation. ◮ It can be used to find means and variances.

  19. MGF for distributions – Road map ◮ Moment generating functions (MGF). ◮ MGF for distributions. ◮ MGF for independent sums.

  20. MGF for distributions – Two properties of MGFs ◮ There are two very important properties of MGFs: Proposition 5 (Uniqueness of MGF) For any random variable, its MGF is unique. Proof. Beyond the scope of this course. Proposition 6 (MGF and distributions) If two random variables have the same MGF, then they follow the same distribution. Proof. Having identical MGFs means having all moments identical, which means the distributions are identical.

  21. MGF for distributions – MGFs for distributions ◮ How may we apply the above propositions to derive the distribution of a random variable? ◮ As an example, suppose for a random variable X we find that its MGF is e^{4(e^t − 1)}. ◮ We also know the MGF of Poi(λ) is e^{λ(e^t − 1)}. ◮ Then we may conclude that X ∼ Poi(4). ◮ In other words, we need to first find the MGFs of those well-known distributions (binomial, Poisson, exponential, normal, etc.) before we use this method.

  22. MGF for distributions – MGFs for distributions

     Distribution    MGF m(t)          Distribution    MGF m(t)
     Ber(p)          pe^t + (1 − p)    Uni(a, b)       ?
     Bi(n, p)        ?                 Exp(λ)          ?
     HG(N, A, n)     ?                 ND(µ, σ)        ?
     Poi(λ)          e^{λ(e^t − 1)}    Gamma(α, β)     ?
                                       χ²(n)           ?
