

  1. 219323 Probability and Statistics for Software and Knowledge Engineers
     Lecture 3: Random Variables II
     Monchai Sopitkamon, Ph.D.

  2. Outline
     - Discrete Random Variables (2.1)
     - Continuous Random Variables (2.2)
     - Expectation of a Random Variable (2.3)
     - The Variance of a Random Variable (2.4)
     - Jointly Distributed Random Variables (2.5)
     - Functions and Combinations of Random Variables (2.6)

  3. The Variance of a Random Variable (2.4)
     - Definition and Interpretation (2.4.1)
     - Examples of Variance Calculations (2.4.2)
     - Chebyshev's Inequality (2.4.3)
     - Quantiles of Random Variables (2.4.4)

  4. The Variance of a Random Variable: Definition and Interpretation I (2.4.1)
     - Measures the spread or variability in the values of the RV, i.e., the deviation of the RV around its expected (mean) value:
       Var(X) = σ² = E((X − E(X))²) = E(X²) − (E(X))²
     - The variance is always non-negative
     - The greater the variance, the more the distribution is spread out around its mean
     - Standard deviation: σ = √Var(X)
     - The SD has the same units as the RV, while Var(X) has the square of those units

  5. The Variance of a Random Variable: Definition and Interpretation II (2.4.1)
     - Figure: two distributions with different mean values but identical variances
     - Figure: two distributions with identical mean values but different variances
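
A minimal Python sketch of the variance formula from slide 4, computed for a small hypothetical discrete pmf (the values and probabilities below are made up purely for illustration):

```python
# Variance of a discrete RV from its pmf (hypothetical pmf, for illustration only).
pmf = {0: 0.2, 1: 0.5, 2: 0.3}            # P(X = x) for each value x

mean = sum(x * p for x, p in pmf.items())  # E(X)
ex2  = sum(x**2 * p for x, p in pmf.items())  # E(X^2)
var  = ex2 - mean**2                       # Var(X) = E(X^2) - (E(X))^2
sd   = var ** 0.5                          # sigma = sqrt(Var(X))

print(mean, var, sd)
```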

  6. The Variance of a Random Variable: Examples of Variance Calculations I (2.4.2)
     - Do Examples 1 (pg. 103) and 14 (pg. 104)

  7. The Variance of a Random Variable: Examples of Variance Calculations II (2.4.2)
     Ex. The quiz scores for a particular student are given below:
     22, 25, 20, 18, 12, 20, 24, 20, 20, 25, 24, 25, 18
     Find the variance and standard deviation.

     Value       | 12  | 18  | 20  | 22  | 24  | 25
     Frequency   |  1  |  2  |  4  |  1  |  2  |  3
     Probability | .08 | .15 | .31 | .08 | .15 | .23

     μ = 21
     V(X) = p1(x1 − μ)² + p2(x2 − μ)² + … + pn(xn − μ)²
     σ = √V(X)

  8. The Variance of a Random Variable: Examples of Variance Calculations III (2.4.2)
     V(X) = .08(12 − 21)² + .15(18 − 21)² + .31(20 − 21)²
            + .08(22 − 21)² + .15(24 − 21)² + .23(25 − 21)²
     V(X) = 13.25
     σ = √13.25 ≈ 3.64
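
As a check on slides 7 and 8, the short sketch below recomputes the quiz-score variance two ways: from the exact relative frequencies and from the rounded probabilities shown in the table. The rounded probabilities reproduce the slide's 13.25; the exact frequencies give roughly 13.08.

```python
# Quiz-score variance, computed two ways.
scores = [22, 25, 20, 18, 12, 20, 24, 20, 20, 25, 24, 25, 18]

# (a) Exact relative frequencies.
n = len(scores)
mu = sum(scores) / n                                   # 21.0
var_exact = sum((x - mu) ** 2 for x in scores) / n     # ~13.08
print(mu, var_exact, var_exact ** 0.5)                 # sd ~3.62

# (b) The rounded probabilities from the slide's table (they sum to 1.00).
pmf = {12: .08, 18: .15, 20: .31, 22: .08, 24: .15, 25: .23}
var_rounded = sum(p * (x - 21) ** 2 for x, p in pmf.items())
print(var_rounded, var_rounded ** 0.5)                 # 13.25, ~3.64
```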

  9. The Variance of a Random Variable: Chebyshev's Inequality I (2.4.3)
     - Provides a bound on the prob. that a RV takes a value more than a given number of SDs away from its expected value.
     - If a RV has mean μ and variance σ², then
       P(μ − cσ ≤ X ≤ μ + cσ) ≥ 1 − 1/c²   for c ≥ 1
     - Refer to the next slide for c = 2 and c = 3
     - Do Example 14 (pg. 108)

  10. The Variance of a Random Variable: Chebyshev’s Inequality II (2.4.3) Illustration of Chebyshev’s inequality
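
A small sketch comparing the actual probability mass within c standard deviations of the mean against Chebyshev's lower bound 1 − 1/c², for a hypothetical pmf (any distribution with finite variance would do):

```python
# Chebyshev's bound checked on a hypothetical discrete pmf.
pmf = {1: 0.1, 2: 0.2, 3: 0.4, 4: 0.2, 5: 0.1}

mu    = sum(x * p for x, p in pmf.items())
var   = sum((x - mu) ** 2 * p for x, p in pmf.items())
sigma = var ** 0.5

for c in (2, 3):
    # Actual probability that X falls within c standard deviations of the mean.
    prob = sum(p for x, p in pmf.items() if mu - c * sigma <= x <= mu + c * sigma)
    print(c, prob, 1 - 1 / c**2)   # actual probability vs. Chebyshev lower bound
```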

  11. The Variance of a Random Variable: Quantiles of Random Variables I (2.4.4)
      - The pth quantile of a RV X with cdf F(x) is the value of x for which F(x) = p, i.e., the p × 100th percentile
      - There is a probability of p that the RV takes a value less than the pth quantile
      - The spread of a distribution can also be described by its quartiles:
        - The upper quartile = 75th percentile
        - The lower quartile = 25th percentile
        - The interquartile range = the distance between the lower and upper quartiles
      - Do Example 14 (pg. 110)

  12. The Variance of a Random Variable: Quantiles of Random Variables II (2.4.4)
      - Figure: illustration of the 70th percentile
      - Figure: illustration of quartiles and the median

  13. The Variance of a Random Variable: Quantiles of Random Variables III (2.4.4)
      - Figure: interquartile range for metal cylinder diameters
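
As one possible illustration of slide 11, the sketch below inverts an assumed cdf F(x) = x² on [0, 1] by bisection to obtain quantiles; the `quantile` helper and the chosen cdf are not from the textbook, just a simple stand-in.

```python
# p-th quantile of a continuous RV as the x solving F(x) = p.
def quantile(F, p, lo=0.0, hi=1.0, tol=1e-9):
    """Invert a monotone cdf F on [lo, hi] by bisection."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if F(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

F = lambda x: x ** 2                         # assumed example cdf on [0, 1]
lower_q = quantile(F, 0.25)                  # lower quartile (25th percentile)
upper_q = quantile(F, 0.75)                  # upper quartile (75th percentile)
print(lower_q, upper_q, upper_q - lower_q)   # interquartile range
```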

  14. Outline
      - The Variance of a Random Variable (2.4)
      - Jointly Distributed Random Variables (2.5)
      - Functions and Combinations of Random Variables (2.6)

  15. Jointly Distributed Random Variables (2.5)
      - Joint Probability Distributions (2.5.1)
      - Marginal Probability Distributions (2.5.2)
      - Independence and Covariance (2.5.4)

  16. Jointly Distributed Random Variables: Joint Probability Distributions I (2.5.1)
      - Consider two RVs X and Y and their joint prob. distribution.
      - The joint prob. distribution of two discrete RVs X and Y is a set of prob. values P(X = x_i, Y = y_j) = p_ij; for continuous RVs it is a joint pdf f(x, y).
      - The joint pmf must satisfy Σ_i Σ_j p_ij = 1
      - The joint pdf must satisfy ∫∫_statespace f(x, y) dx dy = 1

  17. Jointly Distributed Random Variables: Joint Probability Distributions II (2.5.1)
      - The prob. that a ≤ X ≤ b and c ≤ Y ≤ d is obtained from the joint pdf as
        ∫_{x=a}^{b} ∫_{y=c}^{d} f(x, y) dy dx
      - The joint cdf F(x, y) = P(X ≤ x, Y ≤ y)
        For discrete RVs:   F(x, y) = Σ_{i: x_i ≤ x} Σ_{j: y_j ≤ y} p_ij
        For continuous RVs: F(x, y) = ∫_{w=−∞}^{x} ∫_{z=−∞}^{y} f(w, z) dz dw
      - See Example 19 (pg. 114)

  18. Jointly Distributed Random Variables: Joint Probability Distributions III (2.5.1)
      Ex. 19: Joint probability mass function for the air conditioner maintenance example
      Σ_i Σ_j p_ij = 0.12 + 0.08 + … + 0.07 = 1.00

  19. Jointly Distributed Random Variables: Joint Probability Distributions IV (2.5.1)
      Joint cumulative distribution function for the air conditioner maintenance example:
      F(x, y) = P(X ≤ x, Y ≤ y) = Σ_{i=1}^{x} Σ_{j=1}^{y} p_ij
      F(2, 2) = p_11 + p_12 + p_21 + p_22
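
A sketch of the joint pmf and joint cdf computations from slides 16 to 19, using a hypothetical 2 × 2 joint pmf (the full air-conditioner table from Example 19 is not reproduced on these slides, so the numbers here are made up):

```python
# Hypothetical joint pmf p_ij, keyed by (i, j).
joint = {(1, 1): 0.12, (1, 2): 0.08, (2, 1): 0.30, (2, 2): 0.50}

total = sum(joint.values())               # a valid joint pmf must sum to 1
print(total)

def F(x, y):
    """Joint cdf F(x, y) = P(X <= x, Y <= y) for a discrete joint pmf."""
    return sum(p for (i, j), p in joint.items() if i <= x and j <= y)

print(F(2, 2))                            # p_11 + p_12 + p_21 + p_22 = 1.0 here
```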

  20. Jointly Distributed Random Variables: Marginal Probability Distributions I (2.5.2)
      - For two discrete RVs X and Y, the prob. values of the marginal distribution of X are:
        P(X = x_i) = p_i+ = Σ_j p_ij
      - For two continuous RVs, the prob. density function of the marginal distribution of X is:
        f_X(x) = ∫_{−∞}^{∞} f(x, y) dy

  21. Jointly Distributed Random Variables: Marginal Probability Distributions II (2.5.2)
      P(X = 1) = p_1+ = Σ_{j=1}^{3} p_1j = 0.12 + 0.08 + 0.01 = 0.21
      E(X) = Σ_{i=1}^{4} i · P(X = i) = (1 × 0.21) + … + (4 × 0.25) = 2.59

  22. Jointly Distributed Random Variables: Marginal Probability Distributions III (2.5.2)
      Marginal probability mass function of service time:
      E(X) = Σ_{i=1}^{4} i · P(X = i) = (1 × 0.21) + … + (4 × 0.25) = 2.59
      E(X²) = Σ_{i=1}^{4} i² · P(X = i) = (1 × 0.21) + … + (16 × 0.25) = 7.87
      Var(X) = E(X²) − (E(X))² = 7.87 − 2.59² = 1.162
      SD(σ) = √1.162 ≈ 1.08

  23. Jointly Distributed Random Variables: Marginal Probability Distributions IV (2.5.2)
      Marginal probability mass function of the number of air conditioner units
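
The sketch below mirrors the marginal-distribution calculations on slides 20 to 22 for a hypothetical joint pmf; only the first row (0.12 + 0.08 + 0.01 = 0.21) matches the slide, the remaining entries are invented so the table sums to 1.

```python
# Hypothetical joint pmf p_ij, keyed by (i, j).
joint = {(1, 1): 0.12, (1, 2): 0.08, (1, 3): 0.01,
         (2, 1): 0.20, (2, 2): 0.30, (2, 3): 0.29}

# Marginal pmf of X: p_i+ = sum over j of p_ij.
xs = sorted({i for (i, _) in joint})
marg_x = {i: sum(p for (a, _), p in joint.items() if a == i) for i in xs}

ex  = sum(i * p for i, p in marg_x.items())        # E(X)
ex2 = sum(i**2 * p for i, p in marg_x.items())     # E(X^2)
var = ex2 - ex**2                                  # Var(X)
print(marg_x, ex, var, var ** 0.5)
```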

  24. Jointly Distributed Random Variables: Independence and Covariance I (2.5.4)
      - Two RVs X and Y are independent if the value taken by one RV is “unrelated” to the value taken by the other RV.
      - Two RVs are independent if their joint pmf or pdf is the product of their two marginal distributions:
        - If the RVs are discrete, they are independent if p_ij = p_i+ · p_+j for all values of x_i and y_j
        - If the RVs are continuous, they are independent if f(x, y) = f_X(x) f_Y(y) for all values of x and y

  25. Jointly Distributed Random Variables: Independence and Covariance II (2.5.4)
      - See the independence example on pg. 122
      - The covariance of two RVs X and Y:
        Cov(X, Y) = E[(X − E(X))(Y − E(Y))] = E(XY) − E(X)E(Y)
      - The covariance can be any positive or negative number, and independent RVs have covariance = 0
      - The correlation between two RVs X and Y:
        Corr(X, Y) = Cov(X, Y) / √(Var(X) Var(Y))
      - The correlation takes values between −1 and 1, and independent RVs have correlation = 0
      - See the examples on pg. 124, 125
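
A minimal sketch of the covariance and correlation formulas on slide 25, evaluated on a hypothetical 2 × 2 joint pmf:

```python
# Hypothetical joint pmf p_ij, keyed by (i, j).
joint = {(1, 1): 0.3, (1, 2): 0.2, (2, 1): 0.2, (2, 2): 0.3}

ex  = sum(i * p for (i, _), p in joint.items())        # E(X)
ey  = sum(j * p for (_, j), p in joint.items())        # E(Y)
exy = sum(i * j * p for (i, j), p in joint.items())    # E(XY)
cov = exy - ex * ey                                    # Cov(X,Y) = E(XY) - E(X)E(Y)

var_x = sum((i - ex) ** 2 * p for (i, _), p in joint.items())
var_y = sum((j - ey) ** 2 * p for (_, j), p in joint.items())
corr = cov / (var_x * var_y) ** 0.5                    # Corr(X,Y), always in [-1, 1]
print(cov, corr)
```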

  26. Outline
      - The Variance of a Random Variable (2.4)
      - Jointly Distributed Random Variables (2.5)
      - Functions and Combinations of Random Variables (2.6)

  27. Functions and Combinations of Random Variables (2.6)
      - Linear Functions of a Random Variable (2.6.1)
      - Linear Combinations of Random Variables (2.6.2)
      - Nonlinear Functions of a Random Variable (2.6.3)

  28. Functions and Combinations of Random Variables: Linear Functions of a Random Variable (2.6.1)
      - If X is a RV and Y = aX + b for some numbers a, b ∈ R, then
        E(Y) = aE(X) + b   and   Var(Y) = a² Var(X)
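
A quick numerical check of E(aX + b) = aE(X) + b and Var(aX + b) = a²Var(X), using an assumed small pmf with a = 3 and b = 2:

```python
# Hypothetical pmf for X, chosen only to illustrate the linear-function rules.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}
a, b = 3, 2

ex  = sum(x * p for x, p in pmf.items())                       # E(X)
var = sum((x - ex) ** 2 * p for x, p in pmf.items())           # Var(X)

ey    = sum((a * x + b) * p for x, p in pmf.items())           # E(Y) computed directly
var_y = sum((a * x + b - ey) ** 2 * p for x, p in pmf.items()) # Var(Y) computed directly

print(ey, a * ex + b)          # both 5.0
print(var_y, a**2 * var)       # both 4.5
```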

  29. Functions and Combinations of Random Variables: Linear Combinations of Random Variables I (2.6.2)
      - Sums of RVs
        - If X1 and X2 are two RVs, then
          E(X1 + X2) = E(X1) + E(X2)   and
          Var(X1 + X2) = Var(X1) + Var(X2) + 2Cov(X1, X2)
        - If X1 and X2 are two independent RVs, so that Cov(X1, X2) = 0, then
          Var(X1 + X2) = Var(X1) + Var(X2)

  30. Functions and Combinations of Random Variables: Linear Combinations of Random Variables II (2.6.2)
      - Linear Combination of RVs
        - If X1, …, Xn is a sequence of RVs and a1, …, an and b are constants, then
          E(a1X1 + … + anXn + b) = a1E(X1) + … + anE(Xn) + b
        - If the RVs are independent, then
          Var(a1X1 + … + anXn + b) = a1² Var(X1) + … + an² Var(Xn)
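
A simulation-style sketch of the independent case on slides 29 and 30, Var(a1X1 + a2X2 + b) = a1²Var(X1) + a2²Var(X2), using two hypothetical normal RVs; the sample variance of the combination should land close to the theoretical value.

```python
# Check the variance rule for a linear combination of independent RVs by simulation.
import random

random.seed(0)
a1, a2, b = 2.0, -3.0, 5.0
n = 100_000

x1 = [random.gauss(1.0, 2.0) for _ in range(n)]   # Var(X1) = 4
x2 = [random.gauss(0.0, 1.0) for _ in range(n)]   # Var(X2) = 1
y  = [a1 * u + a2 * v + b for u, v in zip(x1, x2)]

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

print(var(y), a1**2 * 4 + a2**2 * 1)   # both close to 25
```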

  31. Functions and Combinations of Random Variables: Linear Combinations of Random Variables III (2.6.2)
      - Averaging Independent RVs
        - Suppose that X1, …, Xn is a sequence of independent RVs, each with expectation μ and variance σ², and with average
          X̄ = (X1 + … + Xn) / n
        - Then E(X̄) = μ and Var(X̄) = σ²/n
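
A simulation-style sketch of slide 31: averaging n independent draws (here from an assumed normal distribution with μ = 10, σ = 3, n = 25) and checking that the variance of the sample mean is close to σ²/n.

```python
# The sample mean of n independent RVs has E = mu and Var = sigma^2 / n.
import random

random.seed(1)
mu, sigma, n, trials = 10.0, 3.0, 25, 50_000

# Each entry is one realization of the sample mean (X-bar) over n draws.
means = [sum(random.gauss(mu, sigma) for _ in range(n)) / n for _ in range(trials)]

avg = sum(means) / trials
var = sum((m - avg) ** 2 for m in means) / trials
print(avg, var, sigma**2 / n)   # avg ~ 10, var ~ 9/25 = 0.36
```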
