variance

The variance of a random variable X with mean E[X] = μ is Var[X] = E[(X − μ)²], often denoted σ². The standard deviation of X is σ = √Var[X].



1. properties of expectation

Linearity, II: Let X and Y be two random variables derived from outcomes of a single experiment. Then

E[X + Y] = E[X] + E[Y]

Can extend by induction to say that

E(X₁ + X₂ + ... + Xₙ) = E(X₁) + E(X₂) + ... + E(Xₙ)

expectation of sum = sum of expectations

One more linearity of expectation practice problem: Given a DNA sequence of length n, e.g. AAATGAATGAATCC..., where each position is A with probability p_A, T with probability p_T, G with probability p_G, and C with probability p_C, what is the expected number of occurrences of the substring AATGAAT?

variance

Definitions: The variance of a random variable X with mean E[X] = μ is

Var[X] = E[(X − μ)²], often denoted σ².

The standard deviation of X is σ = √Var[X].
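The practice problem is a direct application of linearity: write the count as a sum of indicators, one per start position, and note that linearity needs no independence between overlapping occurrences. A minimal sketch (the helper function and the uniform letter probabilities are my own illustration, not from the slides):

```python
def expected_occurrences(pattern, n, p):
    """Expected number of occurrences of `pattern` in a random string of
    length n whose positions are drawn independently with letter
    probabilities p (a dict). By linearity of expectation, this is the
    number of start positions times the per-position match probability;
    overlapping occurrences need no independence argument."""
    # Probability that the pattern matches at one fixed start position.
    p_match = 1.0
    for ch in pattern:
        p_match *= p[ch]
    # n - len(pattern) + 1 possible start positions.
    return (n - len(pattern) + 1) * p_match

# Slide's example shape: AATGAAT in a length-14 sequence, uniform letters.
p = {'A': 0.25, 'T': 0.25, 'G': 0.25, 'C': 0.25}
print(expected_occurrences("AATGAAT", 14, p))  # 8 positions * 0.25**7
```

Each of the 8 start positions contributes 0.25⁷ in expectation, even though the corresponding occurrence events overlap and are dependent.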

2. properties of variance

NOT linear: Var[aX + b] = a²·Var[X]
insensitive to location (b), quadratic in scale (a)

Ex: E[X] = 0, Var[X] = 1. Let Y = 1000·X. Then
E[Y] = E[1000·X] = 1000·E[X] = 0
Var[Y] = Var[10³·X] = 10⁶·Var[X] = 10⁶
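A quick exact check of Var[aX + b] = a²Var[X], computed from a finite pmf (the shift b = 7 is my addition to the slide's Y = 1000X, to show location-insensitivity too):

```python
def mean_var(pmf):
    """Exact mean and variance of a finite pmf given as {value: prob}."""
    mu = sum(x * p for x, p in pmf.items())
    var = sum((x - mu) ** 2 * p for x, p in pmf.items())
    return mu, var

# X = ±1 with equal probability: E[X] = 0, Var[X] = 1.
X = {-1: 0.5, 1: 0.5}
# Y = 1000*X + 7: the shift (b = 7) is ignored, the scale enters squared.
Y = {1000 * x + 7: p for x, p in X.items()}
mu_x, var_x = mean_var(X)
mu_y, var_y = mean_var(Y)
print(var_x, var_y)  # 1.0 1000000.0, i.e. Var[aX+b] = a^2 Var[X]
```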

3. properties of variance

Example: What is Var[X] when X is the outcome of one fair die?

Var(X) = E[(X − μ)²]
       = E[X² − 2μX + μ²]
       = E[X²] − 2μE[X] + μ²
       = E[X²] − 2μ² + μ²
       = E[X²] − μ²
       = E[X²] − (E[X])²

E[X] = 7/2, so Var[X] = E[X²] − (7/2)² = 91/6 − 49/4 = 35/12.

In general: Var[X+Y] ≠ Var[X] + Var[Y] — NOT linear.
Ex 1: Let X = ±1 based on 1 coin flip; Y = −X.
Ex 2: As another example, is Var[X+X] = 2Var[X]?
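The die calculation can be reproduced in exact rational arithmetic, following the same E[X²] − (E[X])² route as the derivation above:

```python
from fractions import Fraction

# Outcome of one fair die: uniform pmf on 1..6.
faces = range(1, 7)
p = Fraction(1, 6)
mu = sum(i * p for i in faces)        # E[X]   = 7/2
ex2 = sum(i * i * p for i in faces)   # E[X^2] = 91/6
var = ex2 - mu ** 2                   # E[X^2] - (E[X])^2 = 35/12
print(mu, ex2, var)  # 7/2 91/6 35/12
```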

4. more variance examples

In general: Var[X+Y] ≠ Var[X] + Var[Y] — NOT linear.

Ex 1: Let X = ±1 based on 1 coin flip.
As shown above, E[X] = 0, Var[X] = 1.
Let Y = −X; then Var[Y] = (−1)²·Var[X] = 1.
But X + Y = 0, always, so Var[X+Y] = 0.

Ex 2: As another example, is Var[X+X] = 2Var[X]?
No: Var[X+X] = Var[2X] = 4·Var[X].

[Plots: four example distributions with σ² = 5.83, σ² = 10, σ² = 15, and σ² = 19.7.]
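Both counterexamples are easy to see in a small simulation (the sample sizes and seed are arbitrary choices of mine):

```python
import random

random.seed(0)

def sample_var(samples):
    """Plain (population-form) sample variance of a list of numbers."""
    n = len(samples)
    mu = sum(samples) / n
    return sum((s - mu) ** 2 for s in samples) / n

xs = [random.choice([-1, 1]) for _ in range(100_000)]
ys = [-x for x in xs]  # Y = -X, so Var[Y] = Var[X] = 1

var_x = sample_var(xs)
var_sum = sample_var([x + y for x, y in zip(xs, ys)])  # X+Y = 0 always
var_2x = sample_var([x + x for x in xs])               # Var[2X] = 4 Var[X]
print(var_x, var_sum, var_2x)  # ~1, exactly 0, ~4
```

Neither 0 nor 4 equals Var[X] + Var[Y] = 2, illustrating that variance is not additive without independence.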

5. r.v.s and independence

Defn: r.v. X and event E are independent if the event E is independent of the event {X=x} (for any fixed x), i.e.

∀x  P({X=x} & E) = P({X=x}) • P(E)

Defn: Two r.v.s X and Y are independent if the events {X=x} and {Y=y} are independent (for any fixed x, y), i.e.

∀x, y  P({X=x} & {Y=y}) = P({X=x}) • P({Y=y})

Intuition as before: knowing X doesn't help you guess Y (or E) and vice versa.

Ex: Let X be the number of heads in the first n of 2n coin flips, Y the number in the last n flips, and let Z be the total. X and Y are independent.
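For a small n, the independence of X and Y in the coin-flip example can be verified exhaustively against the definition: check P({X=x} & {Y=y}) = P({X=x})•P({Y=y}) for every pair. A sketch (n = 3 is my choice so all 2n flips can be enumerated):

```python
from itertools import product
from fractions import Fraction
from collections import defaultdict

n = 3  # small n so all 2n fair-coin flips can be enumerated exactly
half = Fraction(1, 2)

# Joint pmf of (X, Y) = (heads in first n flips, heads in last n flips).
joint = defaultdict(Fraction)
for flips in product([0, 1], repeat=2 * n):
    x, y = sum(flips[:n]), sum(flips[n:])
    joint[(x, y)] += half ** (2 * n)

# Marginal pmfs of X and Y.
px, py = defaultdict(Fraction), defaultdict(Fraction)
for (x, y), pr in joint.items():
    px[x] += pr
    py[y] += pr

# Definition of independence, checked for every (x, y) pair.
independent = all(joint[(x, y)] == px[x] * py[y] for x in px for y in py)
print("X and Y independent for n =", n, ":", independent)
```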

6. r.v.s and independence (continued)

Ex (continued): But X and Z are not independent, since, e.g., knowing that X = 0 precludes Z > n. E.g., P(X = 0) and P(Z = n+1) are both positive, but P(X = 0 & Z = n+1) = 0.

products of independent r.v.s

Theorem: If X & Y are independent, then E[X•Y] = E[X]•E[Y]

Proof:
E[X•Y] = Σₓ Σ_y x•y•P({X=x} & {Y=y})
       = Σₓ Σ_y x•y•P({X=x})•P({Y=y})     (independence)
       = (Σₓ x•P({X=x})) • (Σ_y y•P({Y=y}))
       = E[X]•E[Y]

Note: NOT true in general; see earlier example E[X²] ≠ E[X]²
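A small exact check of both halves of this slide: the product rule holds for an independent pair, and fails in the dependent extreme Y = X, where E[X•Y] = E[X²] ≠ E[X]². The particular pmfs below are my own examples:

```python
from fractions import Fraction

def expect(pmf, f=lambda v: v):
    """E[f(V)] for a finite pmf {value: prob}."""
    return sum(f(v) * p for v, p in pmf.items())

# Independent X, Y: joint pmf is the product of the marginals.
X = {1: Fraction(1, 2), 3: Fraction(1, 2)}   # E[X] = 2
Y = {0: Fraction(1, 3), 6: Fraction(2, 3)}   # E[Y] = 4
e_xy = sum(x * y * px * py for x, px in X.items() for y, py in Y.items())
print(e_xy == expect(X) * expect(Y))  # True: E[XY] = E[X]E[Y] = 8

# But with Y = X (maximally dependent): E[X*X] = E[X^2] != E[X]^2.
e_x2 = expect(X, lambda v: v * v)     # (1 + 9)/2 = 5
print(e_x2, expect(X) ** 2)           # 5 vs 4
```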

7. variance of independent r.v.s is additive (Bienaymé, 1853)

Recall that in general Var[X+Y] ≠ Var[X] + Var[Y]: variance is NOT linear.

Theorem: If X & Y are independent, then Var[X+Y] = Var[X] + Var[Y]

Proof:
Var[X+Y] = E[(X+Y)²] − (E[X+Y])²
         = E[X² + 2XY + Y²] − (E[X] + E[Y])²
         = E[X²] + 2E[XY] + E[Y²] − (E[X])² − 2E[X]E[Y] − (E[Y])²
         = Var[X] + Var[Y] + 2(E[XY] − E[X]E[Y])
         = Var[X] + Var[Y],

since E[XY] = E[X]E[Y] when X and Y are independent.
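A numerical sanity check of the additivity theorem in exact rational arithmetic (the two-dice example is my own, not from the slides):

```python
from fractions import Fraction
from collections import defaultdict

def var(pmf):
    """Exact variance of a finite pmf {value: prob}."""
    mu = sum(v * p for v, p in pmf.items())
    return sum((v - mu) ** 2 * p for v, p in pmf.items())

# Two independent fair dice: joint pmf is the product of the marginals,
# so the pmf of the total is the convolution of the two die pmfs.
die = {i: Fraction(1, 6) for i in range(1, 7)}
total = defaultdict(Fraction)
for x, px in die.items():
    for y, py in die.items():
        total[x + y] += px * py

print(var(total), var(die) + var(die))  # both 35/6 (Bienaymé: 35/12 + 35/12)
```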
a zoo of (discrete) random variables

8. discrete uniform random variables

A discrete random variable X equally likely to take any (integer) value between integers a and b, inclusive, is uniform.

Notation: X ~ Unif(a, b)
Probability: P(X = i) = 1/(b − a + 1) for a ≤ i ≤ b
Mean, Variance: E[X] = (a + b)/2; Var(X) = E[X²] − (E[X])² = ((b − a + 1)² − 1)/12

Example: the value shown on one roll of a fair die is Unif(1, 6):
P(X = i) = 1/6
E[X] = 7/2
Var[X] = 35/12

[Plot: pmf of Unif(1, 6) — P(X = i) = 1/6 for i = 1, ..., 6.]

Bernoulli random variables

An experiment results in "Success" or "Failure."
X is an indicator random variable (1 = success, 0 = failure):
P(X=1) = p and P(X=0) = 1 − p
X is called a Bernoulli random variable: X ~ Ber(p)
E[X] = E[X²] = p
Var(X) = E[X²] − (E[X])² = p − p² = p(1 − p)

Examples: a coin flip; a random binary digit; whether a disk drive crashed.

Jacob (aka James, Jacques) Bernoulli, 1654 – 1705

binomial random variables

Consider n independent random variables Yᵢ ~ Ber(p).
X = Σᵢ Yᵢ is the number of successes in n trials.
X is a Binomial random variable: X ~ Bin(n, p)
P(X=k) = ?  E[X] = ?  Var(X) = ?
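Both families above can be checked exactly from their pmfs: the Unif(1,6) die matches the closed forms, and the Bernoulli mean and variance fall out of the two-point pmf (the particular p = 1/4 is my arbitrary choice):

```python
from fractions import Fraction

def mean_var(pmf):
    """Exact mean and variance of a finite pmf {value: prob}."""
    mu = sum(v * p for v, p in pmf.items())
    return mu, sum((v - mu) ** 2 * p for v, p in pmf.items())

# Unif(a, b): equally likely on the b - a + 1 integers in [a, b].
def unif_pmf(a, b):
    n = b - a + 1
    return {i: Fraction(1, n) for i in range(a, b + 1)}

mu, v = mean_var(unif_pmf(1, 6))
print(mu, v)  # 7/2 35/12, matching (a+b)/2 and ((b-a+1)^2 - 1)/12

# Ber(p): indicator with P(X=1) = p; E[X] = p, Var[X] = p(1-p).
p = Fraction(1, 4)
mu_b, v_b = mean_var({1: p, 0: 1 - p})
print(mu_b, v_b)  # 1/4 3/16
```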

9. binomial random variables

For X ~ Bin(n, p), i.e. X = Σᵢ Yᵢ with the Yᵢ independent Ber(p):

P(X = k) = C(n, k)·pᵏ·(1 − p)ⁿ⁻ᵏ   (by the Binomial theorem, these sum to 1)
E[X] = pn
Var(X) = p(1 − p)n

Examples:
# of heads in n coin flips
# of 1's in a randomly generated length-n bit string
# of disk drive crashes in a 1000-computer cluster

binomial pmfs
[Plots: PMFs for X ~ Bin(10, 0.5), Bin(10, 0.25), Bin(30, 0.5), and Bin(30, 0.1), each with μ ± σ marked.]

mean, variance of the binomial (II)
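The three binomial facts can be confirmed exactly from the pmf itself (the parameters n = 10, p = 1/4, matching one of the plotted pmfs, are illustrative):

```python
from fractions import Fraction
from math import comb

def binom_pmf(n, p):
    """Exact Bin(n, p) pmf: P(X=k) = C(n,k) p^k (1-p)^(n-k)."""
    return {k: comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)}

n, p = 10, Fraction(1, 4)
pmf = binom_pmf(n, p)
print(sum(pmf.values()))  # 1 -- the pmf sums to 1 by the Binomial theorem

mu = sum(k * q for k, q in pmf.items())
var = sum((k - mu) ** 2 * q for k, q in pmf.items())
print(mu, var)  # 5/2 15/8, i.e. E[X] = pn and Var[X] = p(1-p)n
```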
