1. ST 380 Probability and Statistics for the Physical Sciences

Joint Probability Distributions: Introduction

In many experiments, two or more random variables have values that are determined by the outcome of the experiment. For example, the binomial experiment is a sequence of trials, each of which results in success or failure. If

    X_i = 1 if the i-th trial is a success, and X_i = 0 otherwise,

then X_1, X_2, ..., X_n are all random variables defined on the whole experiment.

2. Introduction (continued)

To calculate probabilities involving two random variables X and Y, such as P(X > 0 and Y ≤ 0), we need the joint distribution of X and Y. The way we represent the joint distribution depends on whether the random variables are discrete or continuous.

3. Two Discrete Random Variables

If X and Y are discrete, with ranges R_X and R_Y respectively, the joint probability mass function is

    p(x, y) = P(X = x and Y = y),  x ∈ R_X, y ∈ R_Y.

Then a probability like P(X > 0 and Y ≤ 0) is just

    ∑_{x ∈ R_X : x > 0} ∑_{y ∈ R_Y : y ≤ 0} p(x, y).
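The double sum above is easy to carry out on a tabulated joint pmf. A minimal sketch, using a hypothetical pmf table (not from the slides) with X and Y each ranging over {−1, 0, 1}:

```python
import numpy as np

# Hypothetical joint pmf p(x, y) on the grid x, y in {-1, 0, 1};
# rows index x, columns index y, and the entries sum to 1.
xs = np.array([-1, 0, 1])
ys = np.array([-1, 0, 1])
p = np.array([
    [0.10, 0.05, 0.05],
    [0.10, 0.20, 0.10],
    [0.05, 0.15, 0.20],
])

# P(X > 0 and Y <= 0): sum p(x, y) over the cells with x > 0 and y <= 0.
mask = (xs[:, None] > 0) & (ys[None, :] <= 0)
prob = p[mask].sum()
print(prob)  # 0.05 + 0.15 = 0.20
```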

4. Marginal Distributions

To find the probability of an event defined only by X, we need the marginal pmf of X:

    p_X(x) = P(X = x) = ∑_{y ∈ R_Y} p(x, y),  x ∈ R_X.

Similarly, the marginal pmf of Y is

    p_Y(y) = P(Y = y) = ∑_{x ∈ R_X} p(x, y),  y ∈ R_Y.
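On a tabulated joint pmf, the marginals are just row and column sums. A sketch with the same kind of made-up 3×3 table (rows index x, columns index y):

```python
import numpy as np

# Hypothetical joint pmf table: rows index x, columns index y.
p = np.array([
    [0.10, 0.05, 0.05],
    [0.10, 0.20, 0.10],
    [0.05, 0.15, 0.20],
])

p_X = p.sum(axis=1)  # marginal pmf of X: sum over y for each x
p_Y = p.sum(axis=0)  # marginal pmf of Y: sum over x for each y
print(p_X, p_Y)      # each marginal sums to 1
```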

5. Two Continuous Random Variables

If X and Y are continuous, the joint probability density function is a function f(x, y) that produces probabilities:

    P[(X, Y) ∈ A] = ∬_A f(x, y) dy dx.

Then a probability like P(X > 0 and Y ≤ 0) is just

    ∫_0^∞ ∫_{−∞}^0 f(x, y) dy dx.
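This double integral can be evaluated numerically. A sketch with an assumed joint pdf (not from the slides): X and Y independent standard normals, so f(x, y) is the product of the two marginal densities and, by symmetry, the answer should be 1/4:

```python
from math import inf
import numpy as np
from scipy.integrate import dblquad

# Assumed joint pdf: X and Y independent standard normals.
def f(y, x):  # dblquad treats its first argument (y) as the inner variable
    return (1.0 / (2.0 * np.pi)) * np.exp(-0.5 * (x * x + y * y))

# P(X > 0 and Y <= 0) = integral over x in (0, inf), y in (-inf, 0)
prob, err = dblquad(f, 0, inf, -inf, 0)
print(prob)  # ≈ 0.25 by symmetry
```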

6. Marginal Distributions (continuous case)

To find the probability of an event defined only by X, we need the marginal pdf of X:

    f_X(x) = ∫_{−∞}^∞ f(x, y) dy,  −∞ < x < ∞.

Similarly, the marginal pdf of Y is

    f_Y(y) = ∫_{−∞}^∞ f(x, y) dx,  −∞ < y < ∞.
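A marginal pdf can likewise be obtained by one-dimensional numerical integration of the joint pdf over the other variable. A sketch under the same assumption as above (independent standard normals), where the marginal of X at 0 should equal the standard normal density 1/√(2π):

```python
from math import inf, pi, sqrt, exp
from scipy.integrate import quad

# Assumed joint pdf: independent standard normals.
def f(x, y):
    return (1.0 / (2.0 * pi)) * exp(-0.5 * (x * x + y * y))

def f_X(x):
    # marginal pdf of X: integrate the joint pdf over all y
    value, _ = quad(lambda y: f(x, y), -inf, inf)
    return value

print(f_X(0.0))  # ≈ 1/sqrt(2*pi) ≈ 0.3989
```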

7. Independent Random Variables

Independent Events

Recall that events A and B are independent if P(A and B) = P(A) P(B). Also, events may be defined by random variables, such as A = {X ≥ 0} = {s ∈ S : X(s) ≥ 0}. We say that random variables X and Y are independent if any event defined by X is independent of every event defined by Y.

8. Independent Discrete Random Variables

Two discrete random variables are independent if their joint pmf satisfies

    p(x, y) = p_X(x) p_Y(y),  x ∈ R_X, y ∈ R_Y.

Independent Continuous Random Variables

Two continuous random variables are independent if their joint pdf satisfies

    f(x, y) = f_X(x) f_Y(y),  −∞ < x < ∞, −∞ < y < ∞.

Random variables that are not independent are said to be dependent.
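For a tabulated joint pmf, the discrete criterion says the joint table must equal the outer product of its marginals. A sketch checking this for a hypothetical table (which turns out to be dependent):

```python
import numpy as np

# Hypothetical joint pmf table: rows index x, columns index y.
p = np.array([
    [0.10, 0.05, 0.05],
    [0.10, 0.20, 0.10],
    [0.05, 0.15, 0.20],
])

p_X = p.sum(axis=1)
p_Y = p.sum(axis=0)

# X and Y are independent iff p(x, y) = p_X(x) p_Y(y) for every cell,
# i.e. the joint table equals the outer product of the marginals.
independent = np.allclose(p, np.outer(p_X, p_Y))
print(independent)  # False for this table
```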

9. More Than Two Random Variables

Suppose that random variables X_1, X_2, ..., X_n are defined for some experiment. If they are all discrete, they have a joint pmf:

    p(x_1, x_2, ..., x_n) = P(X_1 = x_1, X_2 = x_2, ..., X_n = x_n).

If they are all continuous, they have a joint pdf:

    P(a_1 < X_1 ≤ b_1, ..., a_n < X_n ≤ b_n) = ∫_{a_1}^{b_1} ... ∫_{a_n}^{b_n} f(x_1, ..., x_n) dx_n ... dx_1.

10. Full Independence

The random variables X_1, X_2, ..., X_n are independent if their joint pmf or pdf is the product of the marginal pmfs or pdfs.

Pairwise Independence

Note that if X_1, X_2, ..., X_n are independent, then every pair X_i and X_j are also independent. The converse is not true: pairwise independence does not, in general, imply full independence.
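A standard counterexample (not from the slides) makes the last point concrete: let X and Y be independent fair ±1 coins and Z = XY. Each pair is independent, yet the triple is not, since Z is determined by X and Y. A sketch that verifies this by enumerating the four equally likely outcomes:

```python
from itertools import product

# X, Y independent fair +/-1 coins; Z = X * Y. Four equally likely outcomes.
outcomes = [(x, y, x * y) for x, y in product([-1, 1], repeat=2)]

def prob(event):
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

# Pairwise independence: P(X=1, Z=1) = P(X=1) P(Z=1)  (1/4 = 1/2 * 1/2)
pairwise = prob(lambda o: o[0] == 1 and o[2] == 1) == \
    prob(lambda o: o[0] == 1) * prob(lambda o: o[2] == 1)

# Full independence fails: P(X=1, Y=1, Z=-1) = 0, but the product of
# the three marginal probabilities is 1/8.
full = prob(lambda o: o == (1, 1, -1)) == \
    prob(lambda o: o[0] == 1) * prob(lambda o: o[1] == 1) * prob(lambda o: o[2] == -1)

print(pairwise, full)  # True False
```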

11. Conditional Distributions

If X and Y are discrete random variables, then

    P(Y = y | X = x) = P(X = x and Y = y) / P(X = x) = p(x, y) / p_X(x).

We write this as p_{Y|X}(y | x).

If X and Y are continuous random variables, we still need to define the distribution of Y given X = x. But P(X = x) = 0, so the definition is not obvious; however,

    f_{Y|X}(y | x) = f(x, y) / f_X(x)

may be shown to have the appropriate properties.
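In the discrete case, conditioning on X = x just rescales row x of the joint table by p_X(x), so each conditional pmf sums to 1. A sketch with a hypothetical table:

```python
import numpy as np

# Hypothetical joint pmf table: rows index x, columns index y.
p = np.array([
    [0.10, 0.05, 0.05],
    [0.10, 0.20, 0.10],
    [0.05, 0.15, 0.20],
])

p_X = p.sum(axis=1)

# Conditional pmf of Y given X = x: divide row x of the joint table by p_X(x).
p_Y_given_X = p / p_X[:, None]
print(p_Y_given_X.sum(axis=1))  # each row is a pmf, so each sum is 1
```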

12. Expected Value, Covariance, and Correlation

As you might expect, for a function of several discrete random variables:

    E[h(X_1, ..., X_n)] = ∑_{x_1} ... ∑_{x_n} h(x_1, ..., x_n) p(x_1, ..., x_n).

For a function of several continuous random variables:

    E[h(X_1, ..., X_n)] = ∫_{−∞}^∞ ... ∫_{−∞}^∞ h(x_1, ..., x_n) f(x_1, ..., x_n) dx_n ... dx_1.
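With n = 2 and a tabulated pmf, the discrete formula is a weighted sum of h over the grid. A sketch computing E[XY] (i.e. h(x, y) = xy) for a hypothetical table:

```python
import numpy as np

# Hypothetical joint pmf on the grid x, y in {-1, 0, 1}; rows index x.
xs = np.array([-1, 0, 1])
ys = np.array([-1, 0, 1])
p = np.array([
    [0.10, 0.05, 0.05],
    [0.10, 0.20, 0.10],
    [0.05, 0.15, 0.20],
])

# E[h(X, Y)] with h(x, y) = x * y: sum h over the grid, weighted by p(x, y).
h = xs[:, None] * ys[None, :]
e_xy = (h * p).sum()
print(e_xy)  # 0.20 for this table
```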

13. Covariance

Recall the variance of X:

    V(X) = E[(X − μ_X)²].

The covariance of two random variables X and Y is

    Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)].

Note: the covariance of X with itself is its variance.

14. Correlation

Just as the units of V(X) are the square of the units of X, so the units of Cov(X, Y) are the product of the units of X and Y. The corresponding dimensionless quantity is the correlation:

    Corr(X, Y) = ρ_{X,Y} = Cov(X, Y) / √(V(X) V(Y)) = Cov(X, Y) / (σ_X σ_Y).
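The two definitions combine directly on a tabulated joint pmf. A sketch that computes Cov(X, Y) and Corr(X, Y) for a hypothetical table, following the formulas on the last two slides:

```python
import numpy as np

# Hypothetical joint pmf on the grid x, y in {-1, 0, 1}; rows index x.
xs = np.array([-1.0, 0.0, 1.0])
ys = np.array([-1.0, 0.0, 1.0])
p = np.array([
    [0.10, 0.05, 0.05],
    [0.10, 0.20, 0.10],
    [0.05, 0.15, 0.20],
])
p_X, p_Y = p.sum(axis=1), p.sum(axis=0)

mu_X, mu_Y = (xs * p_X).sum(), (ys * p_Y).sum()
var_X = ((xs - mu_X) ** 2 * p_X).sum()
var_Y = ((ys - mu_Y) ** 2 * p_Y).sum()

# Cov(X, Y) = E[(X - mu_X)(Y - mu_Y)], summed over the grid with weights p(x, y)
cov = ((xs - mu_X)[:, None] * (ys - mu_Y)[None, :] * p).sum()
corr = cov / np.sqrt(var_X * var_Y)
print(cov, corr)
```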

15. Properties of Correlation

If ac > 0, then

    Corr(aX + b, cY + d) = Corr(X, Y).

For any X and Y,

    −1 ≤ Corr(X, Y) ≤ 1.

If X and Y are independent, then Corr(X, Y) = 0, but not conversely. That is, Corr(X, Y) = 0 does not in general mean that X and Y are independent.

If Corr(X, Y) = ±1, then X and Y are exactly linearly related: Y = aX + b for some a ≠ 0.
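The invariance under linear maps with ac > 0 can be illustrated numerically with sample correlation on simulated data (the property holds exactly for the population quantity; the simulation here is just an illustration, not from the slides):

```python
import numpy as np

# Simulated data: y is built to be correlated with x by construction.
rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = 0.5 * x + rng.normal(size=10_000)

r = np.corrcoef(x, y)[0, 1]
# a = 2, b = 3, c = 4, d = -1, so ac > 0 and correlation is unchanged
r_transformed = np.corrcoef(2.0 * x + 3.0, 4.0 * y - 1.0)[0, 1]

print(np.isclose(r, r_transformed))  # linear maps with ac > 0 preserve correlation
```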
