

  1. Lecture 19: More Than Two Random Variables 0/ 17

  2. Definition. If X_1, X_2, ..., X_n are discrete random variables defined on the same sample space, then their joint pmf is the function

     p_{X_1,...,X_n}(x_1, ..., x_n) = P(X_1 = x_1, ..., X_n = x_n).

     If X_1, X_2, ..., X_n are continuous, then their joint pdf is the function f_{X_1,...,X_n}(x_1, ..., x_n) such that

  3. Definition (cont.). For any region A in R^n,

     P((X_1, X_2, ..., X_n) ∈ A) = ∫ ··· ∫_A f_{X_1,...,X_n}(x_1, ..., x_n) dx_1 ··· dx_n,

     an n-fold multiple integral.

     Definition. The discrete random variables X_1, X_2, ..., X_n are independent if

     p_{X_1,...,X_n}(x_1, ..., x_n) = p_{X_1}(x_1) p_{X_2}(x_2) ··· p_{X_n}(x_n),

     or equivalently

     P(X_1 = x_1, ..., X_n = x_n) = P(X_1 = x_1) ··· P(X_n = x_n).
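The discrete independence criterion above can be checked mechanically: compute the marginals from the joint pmf and test the product factorization at every point. A minimal sketch in Python, using a hypothetical joint pmf of two independent fair coin flips (the dictionary `joint` and the tolerance are my choices, not from the lecture):

```python
from itertools import product

# Hypothetical joint pmf of two independent fair coin flips (X1, X2):
# p(x1, x2) = 1/4 for each of the four outcome pairs.
joint = {(x1, x2): 0.25 for x1, x2 in product([0, 1], repeat=2)}

# Marginals: sum the joint pmf over the other coordinate.
pX1 = {x1: sum(joint[(x1, x2)] for x2 in [0, 1]) for x1 in [0, 1]}
pX2 = {x2: sum(joint[(x1, x2)] for x1 in [0, 1]) for x2 in [0, 1]}

# Independence: the joint pmf equals the product of marginals everywhere.
independent = all(
    abs(joint[(x1, x2)] - pX1[x1] * pX2[x2]) < 1e-12
    for x1, x2 in joint
)
print(independent)  # True
```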

  4. The continuous random variables X_1, X_2, ..., X_n are independent if

     f_{X_1,...,X_n}(x_1, x_2, ..., x_n) = f_{X_1}(x_1) f_{X_2}(x_2) ··· f_{X_n}(x_n).

     Definition. X_1, X_2, ..., X_n are pairwise independent if each pair X_i, X_j (i ≠ j) is independent.

     We will now see that independence of random variables implies pairwise independence, but pairwise independence does NOT imply independence.

  5. First we will prove the easy direction.

     Theorem. If X_1, X_2, ..., X_n are independent, then X_1, X_2, ..., X_n are pairwise independent.

     From now on we will restrict to the case n = 3, so we have THREE random variables X, Y, Z.

  6. How do we get p_{X,Y}(x, y) from p_{X,Y,Z}(x, y, z)? Answer (left to you to prove):

     p_{X,Y}(x, y) = Σ_{all z} p_{X,Y,Z}(x, y, z)     (#)

     Now we can prove: X, Y, Z independent ⇒ X, Y independent.
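Equation (#) — marginalizing out z — is just a sum over the discarded coordinate. A small sketch, assuming a hypothetical joint pmf of three independent fair coin flips (my example, not the lecture's):

```python
from itertools import product

# Hypothetical joint pmf of (X, Y, Z), each uniform on {0, 1} and
# independent, so every triple has probability 1/8.
joint_xyz = {(x, y, z): 0.125 for x, y, z in product([0, 1], repeat=3)}

# Equation (#): p_{X,Y}(x, y) = sum over all z of p_{X,Y,Z}(x, y, z).
joint_xy = {}
for (x, y, z), p in joint_xyz.items():
    joint_xy[(x, y)] = joint_xy.get((x, y), 0.0) + p

print(joint_xy[(0, 1)])  # 0.25
```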

  7. Since X, Y, Z are independent we have

     p_{X,Y,Z}(x, y, z) = p_X(x) p_Y(y) p_Z(z)     (##)

     Now plug the RHS of (##) into the RHS of (#):

     p_{X,Y}(x, y) = Σ_z p_X(x) p_Y(y) p_Z(z) = p_X(x) p_Y(y) Σ_z p_Z(z) = p_X(x) p_Y(y),

     since Σ_z p_Z(z) = 1.

  8. This proves that X and Y are independent. Identical proofs show that the pairs X, Z and Y, Z are independent.

     Now we construct X, Y, Z (actually X_A, X_B, X_C) so that each pair is independent but the triple X, Y, Z is not independent.

  9. The multinomial coefficient. The multinomial coefficient is defined by

     (n choose k_1, k_2, ..., k_r) = n! / (k_1! k_2! ··· k_r!).

     Suppose an experiment has r outcomes denoted 1, 2, 3, ..., r with probabilities p_1, p_2, ..., p_r respectively. Repeat the experiment n times and assume the trials are independent.
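The formula above computes directly. A minimal sketch (the function name `multinomial` and the MISSISSIPPI example are my own illustration):

```python
from math import factorial

def multinomial(n, ks):
    """Multinomial coefficient n! / (k_1! k_2! ... k_r!); requires sum(ks) == n."""
    assert sum(ks) == n
    result = factorial(n)
    for k in ks:
        result //= factorial(k)
    return result

# Example: arrangements of the letters of "MISSISSIPPI" --
# 11 letters with counts M:1, I:4, S:4, P:2.
print(multinomial(11, [1, 4, 4, 2]))  # 34650
```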

  10. A Variation on the Cool Counterexample. Let's go back to the "cool counterexample" (Lecture 16, page 18) of three events A, B, C which are pairwise independent but not independent, so

      P(A ∩ B ∩ C) ≠ P(A) P(B) P(C).

      The idea is to convert the three events to random variables X_A, X_B, X_C so that X_A = 1 on A and 0 on A′, etc.

  11. In fact we won't need the corner points (−1, −1), (−1, 1), (1, −1) and (1, 1). We put

      s_1 = (0, 1),  s_2 = (−1, 0),  s_3 = (0, −1),  s_4 = (1, 0)

      and retain their probabilities, so P({s_j}) = 1/4, 1 ≤ j ≤ 4.

  12. We define

      A = {s_1, s_2},  B = {s_1, s_3},  C = {s_1, s_4}.

      We define X_A, X_B, X_C on S by

      X_A(s_j) = 1 if s_j ∈ A,  0 if s_j ∉ A.

  13. Likewise

      X_B(s_j) = 1 if s_j ∈ B,  0 if s_j ∉ B,
      X_C(s_j) = 1 if s_j ∈ C,  0 if s_j ∉ C.

      So

      P(X_A = 1) = P({s_1, s_2}) = 1/2,  P(X_A = 0) = P({s_3, s_4}) = 1/2,

      and similarly for X_B and X_C. So X_A, X_B and X_C are Bernoulli random variables.
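The construction so far can be sketched in a few lines of Python (the sample points are represented by labels; the helper `indicator` is my own name, not from the lecture):

```python
# Sketch of the construction: sample space S = {s1, s2, s3, s4}, each with
# probability 1/4, and events A = {s1, s2}, B = {s1, s3}, C = {s1, s4}.
S = ["s1", "s2", "s3", "s4"]
prob = {s: 0.25 for s in S}
A, B, C = {"s1", "s2"}, {"s1", "s3"}, {"s1", "s4"}

# Indicator random variables X_A, X_B, X_C: 1 on the event, 0 off it.
def indicator(event):
    return {s: (1 if s in event else 0) for s in S}

XA, XB, XC = indicator(A), indicator(B), indicator(C)

# Each is Bernoulli(1/2): P(X_A = 1) = P({s1, s2}) = 1/2.
p1 = sum(prob[s] for s in S if XA[s] == 1)
print(p1)  # 0.5
```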

  14. Let's compute the joint pmf of X_A and X_B. We know the margins:

              X_B = 0   X_B = 1
      X_A = 0    ?         ?     | 1/2
      X_A = 1    ?         ?     | 1/2
                1/2       1/2

      The subset where X_A = 1 is the subset {s_1, s_2}, so we write an equality of events: (X_A = 1) = {s_1, s_2}. Similarly

      (X_A = 0) = {s_3, s_4},
      (X_B = 1) = {s_1, s_3},  (X_B = 0) = {s_2, s_4},
      (X_C = 1) = {s_1, s_4},  (X_C = 0) = {s_2, s_3}.

  15. Hence

      (X_A = 0) ∩ (X_B = 0) = {s_4},  so P(X_A = 0, X_B = 0) = 1/4,
      (X_A = 0) ∩ (X_B = 1) = {s_3},  so P(X_A = 0, X_B = 1) = 1/4,
      (X_A = 1) ∩ (X_B = 0) = {s_2},  so P(X_A = 1, X_B = 0) = 1/4,
      (X_A = 1) ∩ (X_B = 1) = {s_1},  so P(X_A = 1, X_B = 1) = P({s_1}) = 1/4,

      etc.

  16. So the joint pmf of X_A and X_B is

              X_B = 0   X_B = 1
      X_A = 0   1/4       1/4   | 1/2
      X_A = 1   1/4       1/4   | 1/2
                1/2       1/2

      Each entry is the product of its row and column margins, so X_A and X_B are independent. The same is true for X_A and X_C, and for X_B and X_C.

      Now we show the triple X_A, X_B and X_C is NOT independent.
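The pairwise check in the table can be verified exhaustively in code. A sketch, rebuilding the same sample space as above (helper names `joint2` and `marg` are my own):

```python
from itertools import product

# Same construction: four equally likely points and three events.
S = ["s1", "s2", "s3", "s4"]
prob = {s: 0.25 for s in S}
A, B, C = {"s1", "s2"}, {"s1", "s3"}, {"s1", "s4"}
X = {name: {s: int(s in ev) for s in S}
     for name, ev in [("A", A), ("B", B), ("C", C)]}

def joint2(U, V, u, v):
    # P(U = u, V = v): total probability of sample points in both events.
    return sum(prob[s] for s in S if U[s] == u and V[s] == v)

def marg(U, u):
    # P(U = u): marginal probability.
    return sum(prob[s] for s in S if U[s] == u)

# Every pair factors: P(U = u, V = v) = P(U = u) P(V = v).
pairwise = all(
    abs(joint2(X[n1], X[n2], u, v) - marg(X[n1], u) * marg(X[n2], v)) < 1e-12
    for n1, n2 in [("A", "B"), ("A", "C"), ("B", "C")]
    for u, v in product([0, 1], repeat=2)
)
print(pairwise)  # True
```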

  17. We will show

      P(X_A = 1, X_B = 1, X_C = 1) ≠ P(X_A = 1) P(X_B = 1) P(X_C = 1).

      The RHS = (1/2)(1/2)(1/2) = 1/8. The left-hand side is the probability of the event

      (X_A = 1) ∩ (X_B = 1) ∩ (X_C = 1) = {s_1, s_2} ∩ {s_1, s_3} ∩ {s_1, s_4} = {s_1}.

  18. So P(X_A = 1, X_B = 1, X_C = 1) = P({s_1}) = 1/4, so

      LHS = 1/4 ≠ 1/8 = RHS.

      Remark. This counterexample is more or less the same as the "cool counterexample". We just replaced (more or less) A, B, C by their "characteristic functions" (indicator functions).
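The failure of triple independence can also be confirmed numerically with the same sketch of the construction:

```python
# Same construction: the triple (X_A, X_B, X_C) is not independent.
S = ["s1", "s2", "s3", "s4"]
prob = {s: 0.25 for s in S}
A, B, C = {"s1", "s2"}, {"s1", "s3"}, {"s1", "s4"}

# LHS: P(X_A = 1, X_B = 1, X_C = 1) = P(A ∩ B ∩ C) = P({s1}) = 1/4.
lhs = sum(prob[s] for s in S if s in A and s in B and s in C)

# RHS: P(X_A = 1) P(X_B = 1) P(X_C = 1) = (1/2)^3 = 1/8.
rhs = 1.0
for ev in (A, B, C):
    rhs *= sum(prob[s] for s in S if s in ev)

print(lhs, rhs)  # 0.25 0.125
```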
