

  1. 10. Joint Moments and Joint Characteristic Functions

Following section 6, in this section we shall introduce various parameters to compactly represent the information contained in the joint p.d.f of two r.vs. Given two r.vs $X$ and $Y$ and a function $g(x, y)$, define the r.v

$$Z = g(X, Y). \tag{10-1}$$

Using (6-2), we can define the mean of $Z$ to be

$$\mu_Z = E(Z) = \int_{-\infty}^{+\infty} z\, f_Z(z)\, dz. \tag{10-2}$$

  2. However, the situation here is similar to that in (6-13), and it is possible to express the mean of $Z = g(X, Y)$ in terms of $f_{XY}(x, y)$ without computing $f_Z(z)$. To see this, recall from (5-26) and (7-10) that

$$P(z < Z \le z + \Delta z) = f_Z(z)\,\Delta z = P\big(z < g(X, Y) \le z + \Delta z\big) = \sum\sum_{(x, y) \in D_{\Delta z}} f_{XY}(x, y)\,\Delta x\,\Delta y, \tag{10-3}$$

where $D_{\Delta z}$ is the region in the $xy$ plane satisfying the above inequality. From (10-3), we get

$$z\, f_Z(z)\,\Delta z = \sum\sum_{(x, y) \in D_{\Delta z}} g(x, y)\, f_{XY}(x, y)\,\Delta x\,\Delta y. \tag{10-4}$$

As $\Delta z$ covers the entire $z$ axis, the corresponding regions $D_{\Delta z}$ are nonoverlapping, and they cover the entire $xy$ plane.
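
This limiting argument is easy to check numerically. The following sketch (not part of the lecture; the uniform joint density and the choice $g(x, y) = xy$ are illustrative assumptions) accumulates $g(x, y) f_{XY}(x, y)\,\Delta x\,\Delta y$ over a fine grid, i.e. (10-4) summed over every strip $D_{\Delta z}$, and recovers the exact mean:

```python
import numpy as np

# Illustrative case: X, Y independent U(0,1), so f_XY(x, y) = 1 on the
# unit square, and g(x, y) = x*y with exact mean E(XY) = 1/4.
dx = dy = 1e-3
x = np.arange(0.0, 1.0, dx) + dx / 2      # cell midpoints
y = np.arange(0.0, 1.0, dy) + dy / 2
X, Y = np.meshgrid(x, y)
f_xy = np.ones_like(X)                    # joint p.d.f. on the unit square

# (10-4) accumulated over all cells -- the Riemann-sum form of (10-5):
E_Z = np.sum(X * Y * f_xy * dx * dy)
print(E_Z)                                # ~0.25, matching E(XY) = 1/4
```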

  3. By integrating (10-4), we obtain the useful formula

$$E(Z) = \int_{-\infty}^{+\infty} z\, f_Z(z)\, dz = \int_{-\infty}^{+\infty}\!\int_{-\infty}^{+\infty} g(x, y)\, f_{XY}(x, y)\, dx\, dy, \tag{10-5}$$

or

$$E[g(X, Y)] = \int_{-\infty}^{+\infty}\!\int_{-\infty}^{+\infty} g(x, y)\, f_{XY}(x, y)\, dx\, dy. \tag{10-6}$$

If $X$ and $Y$ are discrete-type r.vs, then

$$E[g(X, Y)] = \sum_i \sum_j g(x_i, y_j)\, P(X = x_i, Y = y_j). \tag{10-7}$$

Since expectation is a linear operator, we also get

$$E\left[\sum_k a_k\, g_k(X, Y)\right] = \sum_k a_k\, E[g_k(X, Y)]. \tag{10-8}$$
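
As a concrete instance of the discrete formula (10-7) and the linearity property (10-8), here is a minimal sketch; the joint p.m.f. and the functions g, g1, g2 are made-up examples:

```python
import numpy as np

# Hypothetical joint p.m.f. P(X = x_i, Y = y_j) on a 2x3 grid:
xs = np.array([0.0, 1.0])
ys = np.array([-1.0, 0.0, 2.0])
P = np.array([[0.10, 0.20, 0.10],
              [0.25, 0.15, 0.20]])       # rows: x_i, columns: y_j; sums to 1

g = lambda x, y: (x + y) ** 2            # any function of (X, Y)
Xg, Yg = np.meshgrid(xs, ys, indexing='ij')

# (10-7): E[g(X,Y)] = sum_i sum_j g(x_i, y_j) P(X = x_i, Y = y_j)
print(np.sum(g(Xg, Yg) * P))

# (10-8): linearity of expectation
g1 = lambda x, y: x * y
g2 = lambda x, y: x - y
lhs = np.sum((3 * g1(Xg, Yg) + 2 * g2(Xg, Yg)) * P)
rhs = 3 * np.sum(g1(Xg, Yg) * P) + 2 * np.sum(g2(Xg, Yg) * P)
print(np.isclose(lhs, rhs))              # True
```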

  4. If $X$ and $Y$ are independent r.vs, it is easy to see that $Z = g(X)$ and $W = h(Y)$ are always independent of each other. In that case, using (10-6), we get the interesting result

$$E[g(X)\, h(Y)] = \int_{-\infty}^{+\infty}\!\int_{-\infty}^{+\infty} g(x)\, h(y)\, f_X(x)\, f_Y(y)\, dx\, dy = \int_{-\infty}^{+\infty} g(x)\, f_X(x)\, dx \int_{-\infty}^{+\infty} h(y)\, f_Y(y)\, dy = E[g(X)]\, E[h(Y)]. \tag{10-9}$$

However, (10-9) is in general not true if $X$ and $Y$ are not independent. In the case of one random variable (section 6), we defined the parameters mean and variance to represent its average behavior. How does one parametrically represent similar cross-behavior between two random variables? Towards this, we can generalize the variance definition given in (6-16) as shown below.
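
The factorization (10-9) is easy to verify by simulation. A minimal sketch, with arbitrary (assumed) independent distributions and functions g, h:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
X = rng.exponential(1.0, n)              # any independent choices will do
Y = rng.normal(2.0, 1.0, n)

g = lambda x: np.sin(x)
h = lambda y: y ** 2

print(np.mean(g(X) * h(Y)))              # E[g(X) h(Y)]
print(np.mean(g(X)) * np.mean(h(Y)))     # E[g(X)] E[h(Y)]; agrees to ~1e-3
```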

  5. Covariance: Given any two r.vs $X$ and $Y$, define

$$\mathrm{Cov}(X, Y) = E\big[(X - \mu_X)(Y - \mu_Y)\big]. \tag{10-10}$$

By expanding and simplifying the right side of (10-10), we also get

$$\mathrm{Cov}(X, Y) = E(XY) - \mu_X \mu_Y = E(XY) - E(X)E(Y) = \overline{XY} - \overline{X}\,\overline{Y}. \tag{10-11}$$

It is easy to see that

$$|\mathrm{Cov}(X, Y)| \le \sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}. \tag{10-12}$$

To see (10-12), let $U = aX + Y$, so that

$$\mathrm{Var}(U) = E\left[\big\{a(X - \mu_X) + (Y - \mu_Y)\big\}^2\right] = a^2\,\mathrm{Var}(X) + 2a\,\mathrm{Cov}(X, Y) + \mathrm{Var}(Y) \ge 0. \tag{10-13}$$
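
A quick numerical check of (10-11) and the bound (10-12), using an assumed pair of correlated r.vs built from Gaussian samples:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
X = rng.normal(0, 2, n)
Y = 0.6 * X + rng.normal(0, 1, n)        # correlated by construction

cov = np.mean(X * Y) - np.mean(X) * np.mean(Y)   # (10-11)
bound = np.sqrt(np.var(X) * np.var(Y))           # right side of (10-12)
print(cov, bound, abs(cov) <= bound)             # ~2.4, ~3.12, True
```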

  6. The right side of (10-13) represents a quadratic in the variable $a$ that has no distinct real roots (Fig. 10.1 shows $\mathrm{Var}(U)$ as a parabola in $a$ lying on or above the $a$ axis). Thus the roots are imaginary (or double), and hence the discriminant

$$[\mathrm{Cov}(X, Y)]^2 - \mathrm{Var}(X)\,\mathrm{Var}(Y)$$

must be non-positive, and that gives (10-12). Using (10-12), we may define the normalized parameter

$$\rho_{XY} = \frac{\mathrm{Cov}(X, Y)}{\sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}} = \frac{\mathrm{Cov}(X, Y)}{\sigma_X \sigma_Y}, \qquad -1 \le \rho_{XY} \le 1, \tag{10-14}$$

or

$$\mathrm{Cov}(X, Y) = \rho_{XY}\, \sigma_X \sigma_Y, \tag{10-15}$$

and it represents the correlation coefficient between $X$ and $Y$.
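
The discriminant argument can also be seen empirically: $\mathrm{Var}(aX + Y)$ traced over a range of $a$ stays nonnegative, and the ratio in (10-14) lands in $[-1, 1]$. A sketch with assumed sample data:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(0, 2, 200_000)
Y = -0.8 * X + rng.normal(0, 1, 200_000)

rho = np.cov(X, Y)[0, 1] / (np.std(X) * np.std(Y))   # (10-14)
print(rho)                                           # within [-1, 1]

# Var(aX + Y) is a nonnegative quadratic in a -- the argument behind (10-12):
for a in np.linspace(-3, 3, 7):
    print(a, np.var(a * X + Y))                      # all values >= 0
```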

  7. Uncorrelated r.vs: If $\rho_{XY} = 0$, then $X$ and $Y$ are said to be uncorrelated r.vs. From (10-11), if $X$ and $Y$ are uncorrelated, then

$$E(XY) = E(X)E(Y). \tag{10-16}$$

Orthogonality: $X$ and $Y$ are said to be orthogonal if

$$E(XY) = 0. \tag{10-17}$$

From (10-16)-(10-17), if either $X$ or $Y$ has zero mean, then orthogonality implies uncorrelatedness and vice-versa.

Suppose $X$ and $Y$ are independent r.vs. Then from (10-9) with $g(X) = X$, $h(Y) = Y$, we get $E(XY) = E(X)E(Y)$, and together with (10-16) we conclude that the random variables are uncorrelated, consistent with the definition in (10-10). Thus independence implies uncorrelatedness.
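
To illustrate the zero-mean case where the two notions coincide, consider the pair $X \sim N(0, 1)$ and $Y = X^2 - 1$ (an assumed example, not from the lecture): both have zero mean and $E(XY) = E(X^3) - E(X) = 0$, so they are simultaneously orthogonal and uncorrelated, yet $Y$ is a deterministic function of $X$:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(0, 1, 1_000_000)
Y = X ** 2 - 1                 # zero mean, but a deterministic function of X

print(np.mean(X * Y))                             # ~0: orthogonal, (10-17)
print(np.mean(X * Y) - np.mean(X) * np.mean(Y))   # ~0: uncorrelated, (10-16)
```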

  8. Naturally, if two random variables are statistically independent, then there cannot be any correlation between them ($\rho_{XY} = 0$). However, the converse is in general not true. As the next example shows, random variables can be uncorrelated without being independent.

Example 10.1: Let $X \sim U(0, 1)$, $Y \sim U(0, 1)$. Suppose $X$ and $Y$ are independent. Define $Z = X + Y$, $W = X - Y$. Show that $Z$ and $W$ are dependent, but uncorrelated r.vs.

Solution: $z = x + y$, $w = x - y$ gives the only solution set to be

$$x = \frac{z + w}{2}, \qquad y = \frac{z - w}{2}.$$

Moreover $0 < z < 2$, $-1 < w < 1$, $z + w \le 2$, $z - w \le 2$, $z > |w|$, and $|J(z, w)| = 1/2$.

  9. Thus (see the shaded region in Fig. 10.2, the diamond in the $(z, w)$ plane with vertices $(0, 0)$, $(1, 1)$, $(2, 0)$ and $(1, -1)$)

$$f_{ZW}(z, w) = \begin{cases} 1/2, & 0 < z < 2,\; -1 < w < 1,\; z + w \le 2,\; z - w \le 2,\; |w| < z, \\ 0, & \text{otherwise}, \end{cases} \tag{10-18}$$

and hence

$$f_Z(z) = \int_{-\infty}^{+\infty} f_{ZW}(z, w)\, dw = \begin{cases} \displaystyle\int_{-z}^{z} \tfrac{1}{2}\, dw = z, & 0 < z < 1, \\[4pt] \displaystyle\int_{z-2}^{2-z} \tfrac{1}{2}\, dw = 2 - z, & 1 < z < 2, \end{cases}$$

or by direct computation ($Z = X + Y$),

  10. $$f_Z(z) = f_X(z) \otimes f_Y(z) = \begin{cases} z, & 0 < z < 1, \\ 2 - z, & 1 < z < 2, \\ 0, & \text{otherwise}, \end{cases} \tag{10-19}$$

and

$$f_W(w) = \int_{-\infty}^{+\infty} f_{ZW}(z, w)\, dz = \int_{|w|}^{2 - |w|} \tfrac{1}{2}\, dz = \begin{cases} 1 - |w|, & -1 < w < 1, \\ 0, & \text{otherwise}. \end{cases} \tag{10-20}$$

Clearly $f_{ZW}(z, w) \ne f_Z(z)\, f_W(w)$; thus $Z$ and $W$ are not independent. However,

$$E(ZW) = E\big[(X + Y)(X - Y)\big] = E(X^2) - E(Y^2) = 0, \tag{10-21}$$

and $E(W) = E(X - Y) = 0$, and hence

$$\mathrm{Cov}(Z, W) = E(ZW) - E(Z)E(W) = 0, \tag{10-22}$$

implying that $Z$ and $W$ are uncorrelated random variables.
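
Example 10.1 can be confirmed by simulation: the sample covariance of $Z$ and $W$ is essentially zero, while conditioning on $Z$ shows the support of $W$ shrinking, exactly the dependence visible in Fig. 10.2 (the conditioning threshold below is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000
X = rng.uniform(0, 1, n)
Y = rng.uniform(0, 1, n)
Z, W = X + Y, X - Y

# Uncorrelated, as in (10-21)-(10-22):
print(np.mean(Z * W) - np.mean(Z) * np.mean(W))   # ~0

# ...but clearly dependent: given Z = z, W is confined to |w| < min(z, 2 - z).
mask = np.abs(Z - 0.2) < 0.01                     # condition on Z near 0.2
print(np.abs(W[mask]).max())                      # < ~0.21, far from 1
```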

  11. Example 10.2: Let $Z = aX + bY$. Determine the variance of $Z$ in terms of $\sigma_X$, $\sigma_Y$ and $\rho_{XY}$.

Solution:

$$\mu_Z = E(Z) = E(aX + bY) = a\mu_X + b\mu_Y,$$

and using (10-15),

$$\sigma_Z^2 = \mathrm{Var}(Z) = E\big[(Z - \mu_Z)^2\big] = E\left[\big(a(X - \mu_X) + b(Y - \mu_Y)\big)^2\right]$$
$$= a^2 E\big[(X - \mu_X)^2\big] + 2ab\, E\big[(X - \mu_X)(Y - \mu_Y)\big] + b^2 E\big[(Y - \mu_Y)^2\big]$$
$$= a^2 \sigma_X^2 + 2ab\, \rho_{XY}\, \sigma_X \sigma_Y + b^2 \sigma_Y^2. \tag{10-23}$$

In particular, if $X$ and $Y$ are independent, then $\rho_{XY} = 0$, and (10-23) reduces to

$$\sigma_Z^2 = a^2 \sigma_X^2 + b^2 \sigma_Y^2. \tag{10-24}$$

Thus the variance of the sum of independent r.vs is the sum of their variances ($a = b = 1$).
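
A sketch verifying (10-23) against a sample variance; the parameter values and the Gaussian construction used to impose the correlation $\rho_{XY}$ are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000
rho, sx, sy = 0.5, 2.0, 3.0
a, b = 1.5, -2.0

# Generate (X, Y) with the prescribed standard deviations and correlation:
X = sx * rng.normal(size=n)
Y = sy * (rho * X / sx + np.sqrt(1 - rho**2) * rng.normal(size=n))

Z = a * X + b * Y
print(np.var(Z))                                       # empirical, ~27
print(a**2*sx**2 + 2*a*b*rho*sx*sy + b**2*sy**2)       # (10-23): 27.0
```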

  12. Moments:

$$E[X^k Y^m] = \int_{-\infty}^{+\infty}\!\int_{-\infty}^{+\infty} x^k y^m f_{XY}(x, y)\, dx\, dy \tag{10-25}$$

represents the joint moment of order $(k, m)$ for $X$ and $Y$. Following the one random variable case, we can define the joint characteristic function between two random variables, which will turn out to be useful for moment calculations.

Joint characteristic functions: The joint characteristic function between $X$ and $Y$ is defined as

$$\Phi_{XY}(u, v) = E\left(e^{j(Xu + Yv)}\right) = \int_{-\infty}^{+\infty}\!\int_{-\infty}^{+\infty} e^{j(xu + yv)} f_{XY}(x, y)\, dx\, dy. \tag{10-26}$$

Note that $|\Phi_{XY}(u, v)| \le \Phi_{XY}(0, 0) = 1$.
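
The empirical counterpart of (10-26) is just a sample mean of complex exponentials, which also makes the bound $|\Phi_{XY}(u, v)| \le \Phi_{XY}(0, 0) = 1$ easy to observe (the distributions below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.uniform(0, 1, 500_000)
Y = rng.exponential(1.0, 500_000)

def phi(u, v):
    # Empirical version of (10-26): E[e^{j(Xu + Yv)}]
    return np.mean(np.exp(1j * (X * u + Y * v)))

print(phi(0, 0))                 # exactly 1, per the note above
print(abs(phi(1.3, -0.7)))       # <= 1 for every (u, v)
```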

  13. It is easy to show that

$$E(XY) = \frac{1}{j^2}\left.\frac{\partial^2 \Phi_{XY}(u, v)}{\partial u\, \partial v}\right|_{u = 0,\, v = 0}. \tag{10-27}$$

If $X$ and $Y$ are independent r.vs, then from (10-26), we obtain

$$\Phi_{XY}(u, v) = E\left(e^{juX}\right) E\left(e^{jvY}\right) = \Phi_X(u)\, \Phi_Y(v). \tag{10-28}$$

Also

$$\Phi_X(u) = \Phi_{XY}(u, 0), \qquad \Phi_Y(v) = \Phi_{XY}(0, v). \tag{10-29}$$

More on Gaussian r.vs: From Lecture 7, $X$ and $Y$ are said to be jointly Gaussian as $N(\mu_X, \mu_Y, \sigma_X^2, \sigma_Y^2, \rho)$ if their joint p.d.f has the form in (7-23). In that case, by direct substitution and simplification, we obtain the joint characteristic function of two jointly Gaussian r.vs to be
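
A symbolic check of (10-27) and (10-29), assuming the standard zero-mean jointly Gaussian characteristic function (a special case of the closed form the lecture derives next):

```python
import sympy as sp

u, v = sp.symbols('u v', real=True)
sx, sy = sp.symbols('sigma_x sigma_y', positive=True)
r = sp.symbols('rho', real=True)

# Assumed closed form: characteristic function of zero-mean jointly
# Gaussian r.vs with std devs sigma_x, sigma_y and correlation rho.
Phi = sp.exp(-(sx**2*u**2 + 2*r*sx*sy*u*v + sy**2*v**2) / 2)

# (10-27): E(XY) = (1/j^2) d^2 Phi / (du dv) evaluated at u = v = 0
EXY = sp.diff(Phi, u, v).subs({u: 0, v: 0}) / sp.I**2
print(sp.simplify(EXY))          # rho*sigma_x*sigma_y, i.e. Cov(X, Y) here

# (10-29): the marginals drop out by zeroing one argument
print(Phi.subs(v, 0))            # exp(-sigma_x**2*u**2/2) = Phi_X(u)
```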
