Probability Review
Gonzalo Mateos
Dept. of ECE and Goergen Institute for Data Science
University of Rochester
gmateosb@ece.rochester.edu
http://www.ece.rochester.edu/~gmateosb/
August 30, 2020
Introduction to Random Processes Probability Review 1
◮ An event is something that happens ◮ A random event has an uncertain outcome
◮ I’ve written a student’s name on a piece of paper. Who is she/he? ◮ Event: Student x’s name is written on the paper ◮ Probability: P(x) measures how likely it is that x’s name was written ◮ Probability is a measurement tool
◮ Given a sample space or universe S
◮ Ex: All students in the class S = {x1, x2, . . . , xN} (xn denote names)
◮ Def: An outcome is an element or point in S, e.g., x3 ◮ Def: An event E is a subset of S
◮ Ex: {x1}, student with name x1 ◮ Ex: Also {x1, x4}, students with names x1 and x4
◮ Def: A sigma-algebra F is a collection of events E ⊆ S such that
   (i) S ∈ F; (ii) if E ∈ F then E^c ∈ F; and (iii) F is closed under
   countable unions, i.e., if E1, E2, . . . ∈ F then ∪_{i=1}^∞ Ei ∈ F
◮ F is a set of sets
◮ No student and all students, i.e., F0 := {∅, S}
◮ Empty set, women, men, everyone, i.e., F1 := {∅, Women, Men, S}
◮ F2 including the empty set ∅ plus
◮ Define a function P(E) from a sigma-algebra F to the real numbers
◮ P(E) qualifies as a probability if
   A1) P(E) ≥ 0 for all E ∈ F
   A2) P(S) = 1
   A3) For disjoint events E1, E2, . . . (i.e., Ei ∩ Ej = ∅ for i ≠ j)
       P(∪_{i=1}^∞ Ei) = Σ_{i=1}^∞ P(Ei)
◮ Triplet (S, F, P(·)) is called a probability space
◮ Implications of the axioms A1)-A3)
   ⇒ P(∅) = 0 and P(E^c) = 1 − P(E)
   ⇒ For finitely many disjoint events E1, . . . , EN
      P(∪_{i=1}^N Ei) = Σ_{i=1}^N P(Ei)
◮ Let’s construct a probability space for our running example ◮ Universe of all students in the class S = {x1, x2, . . . , xN} ◮ Sigma-algebra with all combinations of students, i.e., F = 2S ◮ Suppose names are equiprobable ⇒ P({xn}) = 1/N for all n
◮ For any event E, define P(E) = |E| / |S|
◮ Q: Is this function a probability?
◮ A1): P(E) = |E| / |S| ≥ 0 ⇒ A2): P(S) = |S| / |S| = 1
◮ A3): For disjoint events E1, . . . , EN
       P(∪_{i=1}^N Ei) = |∪_{i=1}^N Ei| / |S| = (Σ_{i=1}^N |Ei|) / |S| = Σ_{i=1}^N P(Ei)
◮ The P(·) just defined is called uniform probability distribution
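The uniform distribution above can be checked numerically. A minimal sketch (the 5-name roster and the events are made up for illustration):

```python
from fractions import Fraction

# Hypothetical 5-student roster standing in for S = {x1, ..., xN}
S = {"x1", "x2", "x3", "x4", "x5"}

def P(E):
    """Uniform probability P(E) = |E| / |S|, computed exactly."""
    return Fraction(len(E & S), len(S))

E1, E2 = {"x1"}, {"x2", "x4"}           # two disjoint events
axiom_A1 = P(E1) >= 0                    # A1) nonnegativity
axiom_A2 = P(S) == 1                     # A2) normalization
axiom_A3 = P(E1 | E2) == P(E1) + P(E2)   # A3) additivity for disjoint events
```

Using `Fraction` keeps the arithmetic exact, so the axioms hold with equality rather than up to floating-point error.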
◮ Consider events E and F, and suppose we know F occurred ◮ Q: What does this information imply about the probability of E? ◮ Def: Conditional probability of E given F is (need P(F) > 0)
   P(E|F) := P(E ∩ F) / P(F)
◮ Renormalize probabilities to the set F
◮ Discard a piece of S ◮ May discard a piece of E as well
◮ For given F with P(F) > 0, P(·|F) satisfies the axioms of probability
◮ The name I wrote is male. What is the probability of name xn?
◮ Assume male names are F = {x1, . . . , xM} ⇒ P(F) = M/N
◮ If name xn is male, xn ∈ F and we have for event E = {xn}
   P(E|F) = P(E ∩ F) / P(F) = (1/N) / (M/N) = 1/M
◮ If name xn is female, xn ∉ F so E ∩ F = ∅ and P(E|F) = 0
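The renormalization in this example can be sketched in a few lines (roster and male/female split are made up, with N = 5 and M = 3):

```python
from fractions import Fraction

# Toy roster with N = 5 equiprobable names; first M = 3 are male, as in
# F = {x1, ..., xM}
S = {"x1", "x2", "x3", "x4", "x5"}
F = {"x1", "x2", "x3"}

def P(E):
    return Fraction(len(E & S), len(S))

def P_given(E, C):
    # P(E | C) = P(E ∩ C) / P(C), i.e., renormalize probabilities to C
    return P(E & C) / P(C)

p_male_name = P_given({"x2"}, F)     # male name: expect 1/M = 1/3
p_female_name = P_given({"x5"}, F)   # female name: expect 0
```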
◮ Consider an event E and the pair of events F and F^c
◮ F and F^c form a partition of the space S (F ∪ F^c = S, F ∩ F^c = ∅)
◮ Because F ∪ F^c = S covers the space S, can write the set E as
   E = [E ∩ F] ∪ [E ∩ F^c]
◮ Because F ∩ F^c = ∅ are disjoint, so are [E ∩ F] ∩ [E ∩ F^c] = ∅. Thus
   P(E) = P(E ∩ F) + P(E ∩ F^c)
◮ Use the definition of conditional probability
   ⇒ P(E) = P(E|F)P(F) + P(E|F^c)P(F^c)
◮ Translates conditional information P(E|F), P(E|F^c) into unconditional P(E)
◮ In general, consider a (possibly infinite) partition F1, F2, . . . of S
◮ Sets are disjoint ⇒ Fi ∩ Fj = ∅ for i ≠ j ◮ Sets cover the space ⇒ ∪_{i=1}^∞ Fi = S
◮ As before, because ∪_{i=1}^∞ Fi = S covers the space, can write the set E as
   E = ∪_{i=1}^∞ [E ∩ Fi]
◮ Because Fi ∩ Fj = ∅ are disjoint, so are [E ∩ Fi] ∩ [E ∩ Fj] = ∅. Thus
   P(E) = Σ_{i=1}^∞ P(E ∩ Fi) = Σ_{i=1}^∞ P(E|Fi)P(Fi)
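The law of total probability is just a weighted sum, which a short sketch makes concrete (the partition probabilities and conditionals below are made-up numbers):

```python
from fractions import Fraction

# A partition {F1, F2, F3} of S with made-up probabilities P(Fi),
# plus conditional probabilities P(E | Fi) for some event E
P_F = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]
P_E_given_F = [Fraction(1, 4), Fraction(1, 2), Fraction(3, 4)]

# Law of total probability: P(E) = sum_i P(E | Fi) P(Fi)
P_E = sum(pe * pf for pe, pf in zip(P_E_given_F, P_F))
```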
◮ Consider a probability class in some university
◮ Q: What is the probability of the exchange student scoring an A? ◮ Let A = “exchange student gets an A,” S denote senior, and J junior
◮ From the definition of conditional probability
   ⇒ P(E|F)P(F) = P(E ∩ F)
◮ Likewise, for F conditioned on E we have
   ⇒ P(F|E)P(E) = P(F ∩ E)
◮ Quantities above are equal, giving Bayes’ rule
   P(F|E)P(E) = P(E|F)P(F) ⇒ P(E|F) = P(F|E)P(E) / P(F)
◮ Bayes’ rule allows time reversal. If F (future) comes after E (past),
   we can recover the probability of the past E given the future F
◮ Models often describe the future given the past; Bayes’ rule inverts this
◮ Consider the following partition of my email
◮ Let F = “an email contains the word free”
◮ I got an email containing “free”. What is the probability that it is spam? ◮ Apply Bayes’ rule, with the total probability of F in the denominator
   P(spam|F) = P(F|spam)P(spam) / Σ_i P(F|Ei)P(Ei)
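A minimal numerical sketch of this spam filter (the prior and the word frequencies are made-up numbers, not data from the slides):

```python
# Made-up numbers: 20% of email is spam; "free" appears in 50% of spam
# and in 5% of legitimate email ("ham")
prior = {"spam": 0.20, "ham": 0.80}
p_free_given = {"spam": 0.50, "ham": 0.05}

# Denominator via total probability: P(F) = sum_i P(F | Ei) P(Ei)
p_free = sum(p_free_given[c] * prior[c] for c in prior)

# Bayes' rule: P(spam | F) = P(F | spam) P(spam) / P(F)
p_spam_given_free = p_free_given["spam"] * prior["spam"] / p_free
```

Even though most email is ham here, seeing “free” pushes the posterior for spam above 70% with these numbers.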
◮ Def: Events E and F are independent if P(E ∩ F) = P(E)P(F)
◮ According to the definition of conditional probability, independence implies
   P(E|F) = P(E ∩ F) / P(F) = P(E)P(F) / P(F) = P(E)
◮ Whether E and F are independent relies strongly on P(·) ◮ Avoid confusing with disjoint events, meaning E ∩ F = ∅ ◮ Q: Can disjoint events with P(E) > 0, P(F) > 0 be independent?
   No, because then P(E ∩ F) = 0 < P(E)P(F)
◮ Wrote one name, asked a friend to write another (possibly the same) ◮ Probability space (S, F, P(·)) for this experiment
   ⇒ S contains all ordered pairs of names, F = 2^S, and
     P({s}) = 1/|S| is the uniform probability distribution
◮ Consider the events E1 = ‘I wrote x1’ and E2 = ‘My friend wrote x2’
◮ Dependent events: E1 =‘I wrote x1’ and E3 =‘Both names are male’
◮ Def: Events Ei, i = 1, 2, . . . are called mutually independent if for
   every finite collection of indices i1, . . . , ik
   P(∩_{j=1}^k Eij) = Π_{j=1}^k P(Eij)
◮ Ex: Events E1, E2, and E3 are mutually independent if all the following hold
   P(E1 ∩ E2) = P(E1)P(E2), P(E1 ∩ E3) = P(E1)P(E3),
   P(E2 ∩ E3) = P(E2)P(E3), and P(E1 ∩ E2 ∩ E3) = P(E1)P(E2)P(E3)
◮ If P(Ei ∩ Ej) = P(Ei)P(Ej) for all pairs (i, j), the Ei are pairwise
   independent; pairwise independence does not imply mutual independence
◮ Def: RV X(s) is a function that assigns a value to an outcome s ∈ S
◮ Throw a ball inside a 1m × 1m square. Interested in ball position ◮ Uncertain outcome is the place s ∈ [0, 1]2 where the ball falls ◮ Random variables are X(s) and Y (s) position coordinates ◮ RV probabilities inferred from probabilities of underlying outcomes
◮ X(s) is the random variable and x a particular value of X(s)
◮ Throw a coin for heads (H) or tails (T). Coin is fair, P(H) = P(T) = 1/2
◮ Pay $1 for H, charge $1 for T. Earnings?
◮ Possible outcomes are H and T ◮ To measure earnings define the RV X with values
   X(H) = 1, X(T) = −1
◮ Probabilities of the RV are
   P(X = 1) = P(H) = 1/2, P(X = −1) = P(T) = 1/2
◮ Throw 2 coins. Pay $1 for each H, charge $1 for each T. Earnings? ◮ Now the possible outcomes are HH, HT, TH, and TT ◮ To measure earnings define the RV Y with values
   Y(HH) = 2, Y(HT) = Y(TH) = 0, Y(TT) = −2
◮ Probabilities of the RV are
   P(Y = 2) = 1/4, P(Y = 0) = 1/2, P(Y = −2) = 1/4
◮ RVs are easier to manipulate than events ◮ Let s1 ∈ {H, T} be outcome of coin 1 and s2 ∈ {H, T} of coin 2
◮ Throw N coins. Earnings? Enumeration becomes cumbersome ◮ Alternatively, let sn ∈ {H, T} be the outcome of the n-th toss and define
   Xn(H) = 1, Xn(T) = −1 ⇒ total earnings X = Σ_{n=1}^N Xn
◮ Throw a coin until it lands heads for the first time, with P(H) = p ◮ Number of throws until the first head? ◮ Outcomes are H, TH, TTH, TTTH, . . . Note that |S| = ∞
◮ Let N be a RV counting the number of throws until the first H
   ⇒ P(N = n) = (1 − p)^(n−1) p, since the first n − 1 throws land T
◮ From A2) we should have P(S) = Σ_{n=1}^∞ P(N = n) = 1
◮ Holds because Σ_{n=1}^∞ (1 − p)^(n−1) is a geometric series
   Σ_{n=1}^∞ (1 − p)^(n−1) = 1 / (1 − (1 − p)) = 1/p
◮ Plug the sum of the geometric series in the expression for P(S)
   P(S) = Σ_{n=1}^∞ (1 − p)^(n−1) p = p × (1/p) = 1
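The geometric-series argument is easy to confirm numerically. A sketch with p = 0.3, truncating the infinite sum where the tail is negligible:

```python
p = 0.3

# Partial sum of P(N = n) = (1 - p)^(n - 1) * p over n = 1, ..., 200;
# the neglected tail is (1 - p)^200, which is astronomically small
total = sum((1 - p) ** (n - 1) * p for n in range(1, 201))
```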
◮ The indicator function of an event is a random variable ◮ Let s ∈ S be an outcome, and E ⊂ S be an event. Define the indicator RV
   I_E(s) = 1 if s ∈ E, and I_E(s) = 0 otherwise
◮ Ex: Number of throws N until the first H. Interested in N exceeding N0
   ⇒ define I_N0 := I{N > N0}
◮ Probability P(I_N0 = 1) = P(N > N0) = (1 − p)^N0
◮ A discrete RV takes on, at most, a countable number of values ◮ Probability mass function (pmf) pX(x) = P(X = x)
◮ If the RV is clear from context, just write pX(x) = p(x)
◮ If X is supported in {x1, x2, . . .}, the pmf satisfies
   p(xi) ≥ 0 and Σ_{i=1}^∞ p(xi) = 1
◮ Pmf for “throw to first heads” (p = 0.3)
◮ Cumulative distribution function (cdf) FX(x) = P(X ≤ x)
◮ Cdf for “throw to first heads” (p = 0.3)
◮ A trial/experiment/bet can succeed w.p. p or fail w.p. q := 1 − p
◮ Bernoulli X can be 0 or 1. Pmf is p(x) = p^x q^(1−x) for x ∈ {0, 1}
◮ Cdf is F(x) = 0 for x < 0, F(x) = q for 0 ≤ x < 1, and F(x) = 1 for x ≥ 1
◮ Count number of Bernoulli trials needed to register first success
◮ Number of trials X until success is geometric with parameter p. Pmf is
   p(x) = p (1 − p)^(x−1), x = 1, 2, . . .
◮ One success after x − 1 failures; trials are independent
◮ Cdf is F(x) = 1 − (1 − p)^x
◮ Recall P(X > x) = (1 − p)^x; or just sum the geometric series
◮ Count number of successes X in n Bernoulli trials
◮ Number of successes X is binomial with parameters (n, p). Pmf is
   p(x) = C(n, x) p^x (1 − p)^(n−x), x = 0, 1, . . . , n
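The binomial pmf is simple to evaluate directly. A small sketch with illustrative parameters n = 10, p = 0.3:

```python
from math import comb

def binom_pmf(x, n, p):
    # P(X = x) = C(n, x) p^x (1 - p)^(n - x)
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

n, p = 10, 0.3
pmf = [binom_pmf(x, n, p) for x in range(n + 1)]
total = sum(pmf)               # should be 1: the pmf is properly normalized
mode_ok = max(pmf) == pmf[3]   # mode sits near np = 3 for these parameters
```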
◮ Let Yi, i = 1, . . . , n be independent Bernoulli RVs with parameter p
◮ Can write a binomial X with parameters (n, p) as ⇒ X = Σ_{i=1}^n Yi
◮ Consider independent binomials Y and Z with parameters (nY, p) and (nZ, p)
◮ Write Y = Σ_{i=1}^{nY} Yi and Z = Σ_{i=1}^{nZ} Zi, thus
   Y + Z = Σ_{i=1}^{nY} Yi + Σ_{i=1}^{nZ} Zi
   ⇒ a sum of nY + nZ independent Bernoulli RVs with parameter p
◮ Hence, Y + Z is binomial with parameters (nY + nZ, p)
◮ Counts of rare events (radioactive decay, packet arrivals, accidents) ◮ Usually modeled as Poisson with parameter λ > 0 and pmf
   p(x) = e^{−λ} λ^x / x!, x = 0, 1, 2, . . .
◮ Q: Is this a properly defined pmf? Yes
◮ Taylor’s expansion of e^x = 1 + x + x^2/2 + . . . + x^i/i! + . . . Then
   Σ_{x=0}^∞ p(x) = e^{−λ} Σ_{x=0}^∞ λ^x / x! = e^{−λ} e^λ = 1
◮ X is binomial with parameters (n, p) ◮ Let n → ∞ while maintaining a constant product np = λ
◮ If we just let n → ∞ number of successes diverges. Boring
◮ Compare with Poisson distribution with parameter λ
◮ λ = 5, n = 6, 8, 10, 15, 20, 50
◮ This is, in fact, the motivation for the definition of a Poisson RV
◮ Substituting p = λ/n in the pmf of a binomial RV, and reordering terms
   pn(x) = C(n, x) (λ/n)^x (1 − λ/n)^{n−x}
         = [n(n − 1) · · · (n − x + 1) / n^x] × [λ^x / x!] × (1 − λ/n)^n / (1 − λ/n)^x
◮ In the limit, the middle factor is lim_{n→∞} (1 − λ/n)^n = e^{−λ}
◮ The remaining n-dependent factors converge to 1. From both observations
   lim_{n→∞} pn(x) = e^{−λ} λ^x / x!
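This limit can be watched happening numerically. A sketch fixing λ = 5 and x = 3 (illustrative values) while n grows with p = λ/n:

```python
from math import comb, exp, factorial

def binom_pmf(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

lam, x = 5.0, 3
poisson = exp(-lam) * lam ** x / factorial(x)

# With p = lam / n held so that np = lam, the binomial pmf at x should
# approach the Poisson pmf as n grows
errors = [abs(binom_pmf(x, n, lam / n) - poisson) for n in (10, 100, 10_000)]
```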
◮ Binomial distribution is motivated by counting successes ◮ The Poisson is an approximation for a large number of trials n
◮ Sometimes called “law of rare events”
◮ Individual events (successes) happen with small probability p = λ/n ◮ Aggregate event (number of successes), though, need not be rare
◮ Notice that all four RVs seen so far are related to “coin tosses”
◮ Random variables are mappings X(s) : S → R
◮ Let’s construct a probability space for a Bernoulli RV
◮ Let S = [0, 1], F the Borel sigma-algebra, and P([a, b]) = b − a for a ≤ b
◮ Fix a parameter p ∈ [0, 1] and define
   X(s) = 1 if s ≤ p, and X(s) = 0 otherwise ⇒ P(X = 1) = P([0, p]) = p
◮ Can do a similar construction for all distributions considered so far
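This construction is exactly how one simulates a Bernoulli RV in practice. A sketch (p = 0.4 is an arbitrary illustrative choice):

```python
import random

# Inverse-transform construction: draw s uniformly from S = [0, 1] and
# set X(s) = 1 if s <= p, else 0, so that P(X = 1) = P([0, p]) = p
p = 0.4

def bernoulli(s, p):
    return 1 if s <= p else 0

random.seed(0)
draws = [bernoulli(random.random(), p) for _ in range(100_000)]
freq = sum(draws) / len(draws)   # empirical frequency, should be close to p
```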
◮ Possible values for continuous RV X form a dense subset X ⊆ R
◮ Probability density function (pdf) fX(x) ≥ 0, with ∫_{−∞}^∞ fX(x) dx = 1
◮ Cdf defined as before and related to the pdf by
   FX(x) = P(X ≤ x) = ∫_{−∞}^x fX(u) du, with lim_{x→∞} FX(x) = 1
◮ When the set X = [a, b] is an interval of R
   ⇒ P(a ≤ X ≤ b) = FX(b) − FX(a)
◮ In terms of the pdf it can be written as
   P(a ≤ X ≤ b) = ∫_a^b fX(x) dx
◮ For a small interval [x0, x0 + δ], in particular
   P(x0 ≤ X ≤ x0 + δ) = ∫_{x0}^{x0+δ} fX(x) dx ≈ fX(x0) δ
◮ Another relationship between pdf and cdf is ⇒ fX(x) = ∂FX(x)/∂x
◮ Model problems with equal probability of landing on an interval [a, b]
◮ Pdf of a uniform RV is f(x) = 1/(b − a) for x ∈ [a, b], f(x) = 0 outside
◮ Cdf is F(x) = (x − a)/(b − a) in the interval [a, b] (0 before, 1 after)
◮ Prob. of interval [α, β] ⊆ [a, b] is
   P(α ≤ X ≤ β) = ∫_α^β f(x) dx = (β − α)/(b − a)
◮ Model duration of phone calls, lifetime of electronic components
◮ Pdf of an exponential RV with parameter λ > 0 is
   f(x) = λ e^{−λx} for x ≥ 0, f(x) = 0 for x < 0
◮ Cdf obtained by integrating the pdf
   F(x) = ∫_{−∞}^x f(u) du = 1 − e^{−λx} for x ≥ 0
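The exponential cdf can be checked against simulation. A sketch using inverse-transform sampling, with λ = 2 as an arbitrary illustrative parameter:

```python
import random
from math import exp, log

lam = 2.0

def F(x):
    """Cdf of an exponential RV: F(x) = 1 - e^(-lam*x) for x >= 0."""
    return 1.0 - exp(-lam * x) if x >= 0 else 0.0

# Inverse-transform sampling: if U is uniform on [0, 1], then
# X = -ln(1 - U) / lam is exponential with parameter lam
random.seed(2)
samples = [-log(1.0 - random.random()) / lam for _ in range(100_000)]
empirical = sum(x <= 1.0 for x in samples) / len(samples)  # estimate of F(1)
```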
◮ Model randomness arising from a large number of random effects
◮ Pdf of a normal RV with mean μ and variance σ^2 is
   f(x) = (1 / √(2πσ^2)) e^{−(x−μ)^2 / (2σ^2)}
◮ Cdf F(x) cannot be expressed in terms of elementary functions
◮ We are asked to summarize information about a RV in a single value
◮ If we are allowed a description with a few values
◮ Expected (mean) values are convenient answers to these questions ◮ Beware: Expectations are condensed descriptions
◮ Discrete RV X taking on values xi, i = 1, 2, . . . with pmf p(x) ◮ Def: The expected value of the discrete RV X is
   E[X] := Σ_{i=1}^∞ xi p(xi)
◮ Weighted average of possible values xi. Probabilities are the weights
◮ Reduces to the common average if X takes values xi, i = 1, . . . , N equiprobably
   E[X] = Σ_{i=1}^N xi (1/N) = (1/N) Σ_{i=1}^N xi
◮ For geometric X, write q := 1 − p. Note that ∂q^x/∂q = x q^(x−1) and that
   derivatives are linear operators, hence interchangeable with sums
   E[X] = Σ_{x=1}^∞ x p (1 − p)^(x−1) = p Σ_{x=1}^∞ ∂q^x/∂q
        = p ∂/∂q [Σ_{x=1}^∞ q^x] = p ∂/∂q [q/(1 − q)]
        = p / (1 − q)^2 = 1/p
◮ Time to first success is the inverse of the success probability. Reasonable
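The E[X] = 1/p result can be sanity-checked by simulation. A sketch with p = 0.3 (so the mean should be near 1/0.3 ≈ 3.33):

```python
import random

random.seed(1)
p = 0.3

def geometric(p):
    # Count Bernoulli(p) trials until (and including) the first success
    n = 1
    while random.random() > p:
        n += 1
    return n

samples = [geometric(p) for _ in range(200_000)]
sample_mean = sum(samples) / len(samples)   # should approach E[X] = 1/p
```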
◮ For Poisson X, the first summand (x = 0) in the definition of E[X] is 0;
   pull λ out, and use x/x! = 1/(x − 1)!
   E[X] = Σ_{x=0}^∞ x e^{−λ} λ^x / x! = λ e^{−λ} Σ_{x=1}^∞ λ^(x−1) / (x − 1)!
◮ Sum is Taylor’s expansion of e^λ = 1 + λ + λ^2/2! + . . . + λ^x/x! + . . ., so
   E[X] = λ e^{−λ} e^λ = λ
◮ Poisson is limit of binomial for large number of trials n, with λ = np
◮ Expected number of successes is λ = np
◮ Continuous RV X taking values on R with pdf f(x) ◮ Def: The expected value of the continuous RV X is
   E[X] := ∫_{−∞}^∞ x f(x) dx
◮ Compare with E[X] := Σ_{x : p(x)>0} x p(x) in the discrete RV case ◮ Note that the integral or sum is assumed to be well defined
◮ For normal X with parameters (μ, σ^2)
   E[X] = ∫_{−∞}^∞ x (1/√(2πσ^2)) e^{−(x−μ)^2/(2σ^2)} dx
◮ Write x = μ + (x − μ) and split the integral in two
   E[X] = μ ∫_{−∞}^∞ (1/√(2πσ^2)) e^{−(x−μ)^2/(2σ^2)} dx
          + ∫_{−∞}^∞ (x − μ) (1/√(2πσ^2)) e^{−(x−μ)^2/(2σ^2)} dx
◮ First integral is 1 because it integrates a pdf over all of R ◮ Second integral is 0 by symmetry. Both observations yield E[X] = μ
◮ The mean of a RV with a symmetric pdf is the point of symmetry
◮ For uniform X in [a, b]
   E[X] = ∫_a^b x / (b − a) dx = (b^2 − a^2) / (2(b − a)) = (a + b)/2
◮ Makes sense, since the pdf is symmetric around the midpoint (a + b)/2
◮ For exponential X with parameter λ, integrate by parts
   E[X] = ∫_0^∞ x λ e^{−λx} dx = [−x e^{−λx}]_0^∞ + ∫_0^∞ e^{−λx} dx
        = 0 + [−e^{−λx}/λ]_0^∞ = 1/λ
◮ Consider a function g(X) of a RV X. Expected value of g(X)? ◮ g(X) is also a RV, then it also has a pmf pg(X). Still, one can show
   E[g(X)] = Σ_{i=1}^∞ g(xi) p(xi)
◮ Weighted average of functional values. No need to find the pmf of g(X) ◮ The same can be proved for a continuous RV
   E[g(X)] = ∫_{−∞}^∞ g(x) f(x) dx
◮ Consider a linear function (actually affine) g(X) = aX + b. Then
   E[aX + b] = Σ_{i=1}^∞ (a xi + b) p(xi)
             = a Σ_{i=1}^∞ xi p(xi) + b Σ_{i=1}^∞ p(xi) = a E[X] + b
◮ Can interchange expectation with additive/multiplicative constants
◮ Let X be a RV and 𝒳 ⊆ R a set, and consider the indicator RV I{X ∈ 𝒳}
◮ Expected value of I{X ∈ 𝒳} in the discrete case
   E[I{X ∈ 𝒳}] = Σ_{xi ∈ 𝒳} p(xi) = P(X ∈ 𝒳)
◮ Likewise in the continuous case
   E[I{X ∈ 𝒳}] = ∫_{−∞}^∞ I{x ∈ 𝒳} f(x) dx = ∫_𝒳 f(x) dx = P(X ∈ 𝒳)
◮ Expected value of an indicator RV = probability of the indicated event
◮ Def: The n-th moment (n ≥ 0) of a RV X is
   E[X^n] = Σ_{i=1}^∞ xi^n p(xi)
◮ Def: The n-th central moment corrects for the mean, that is
   E[(X − E[X])^n] = Σ_{i=1}^∞ (xi − E[X])^n p(xi)
◮ 0-th order moment is E[X^0] = 1, and the 1-st moment is the mean E[X]
◮ 2-nd central moment is the variance var[X] := E[(X − E[X])^2].
   Measures the width of the pmf around the mean
◮ Expanding the square in the definition of the variance
   var[X] = E[(X − E[X])^2] = E[X^2 − 2X E[X] + E^2[X]]
          = E[X^2] − 2 E[X] E[X] + E^2[X]
          = E[X^2] − E^2[X]
◮ Used linearity of expectation and that E[X] is a constant
◮ Want to study problems with more than one RV. Say, e.g., X and Y ◮ Individual probability distributions of X and Y are not sufficient;
   need the joint cdf FXY(x, y) = P(X ≤ x, Y ≤ y)
◮ If X, Y clear from context omit the subindex to write FXY(x, y) = F(x, y) ◮ Can recover FX(x) by considering all possible values of Y
   ⇒ FX(x) = lim_{y→∞} FXY(x, y)
◮ Consider discrete RVs X and Y, taking values in sets 𝒳 and 𝒴
◮ Joint pmf of (X, Y) defined as
   pXY(x, y) = P(X = x, Y = y)
◮ Possible values (x, y) are elements of the Cartesian product 𝒳 × 𝒴
   ⇒ (x1, y1), (x1, y2), . . ., (x2, y1), (x2, y2), . . ., (x3, y1), (x3, y2), . . .
◮ Marginal pmf pX(x) obtained by summing over all values of Y
   pX(x) = Σ_{y ∈ 𝒴} pXY(x, y)
◮ Consider continuous RVs X, Y and an arbitrary set A ⊆ R^2
◮ Joint pdf is a function fXY(x, y) : R^2 → R+ such that
   P((X, Y) ∈ A) = ∫∫_A fXY(x, y) dx dy
◮ Marginalization. There are two ways of writing P(X ∈ 𝒳)
   P(X ∈ 𝒳) = ∫_𝒳 fX(x) dx = ∫_𝒳 ∫_{−∞}^∞ fXY(x, y) dy dx
◮ Lipstick on a pig (same thing written differently is still the same thing)
   ⇒ the marginal pdf is fX(x) = ∫_{−∞}^∞ fXY(x, y) dy
◮ Consider two Bernoulli RVs B1, B2, with the same parameter p
◮ The pmf of X is
◮ Likewise, the pmf of Y is
◮ The joint pmf of X and Y is
◮ For convenience often arrange RVs in a vector
◮ Consider, e.g., two RVs X and Y. The random vector is X = [X, Y]^T
◮ If X and Y are discrete, the vector variable X is discrete with pmf
   pX(x) = pXY(x, y) for x = [x, y]^T
◮ If X, Y continuous, X continuous with pdf
   fX(x) = fXY(x, y)
◮ Vector cdf is ⇒ FX(x) = FXY(x, y) = P(X ≤ x, Y ≤ y)
◮ In general, can define n-dimensional RVs X := [X1, X2, . . . , Xn]^T
◮ RVs X and Y and a function g(X, Y). The function g(X, Y) is also a RV
◮ Expected value of g(X, Y) when X and Y are discrete can be written as
   E[g(X, Y)] = Σ_x Σ_y g(x, y) pXY(x, y)
◮ When X and Y are continuous
   E[g(X, Y)] = ∫_{−∞}^∞ ∫_{−∞}^∞ g(x, y) fXY(x, y) dx dy
◮ Expected value of the sum of two continuous RVs, with g(X, Y) = X + Y
   E[X + Y] = ∫_{−∞}^∞ ∫_{−∞}^∞ (x + y) fXY(x, y) dx dy
            = ∫_{−∞}^∞ ∫_{−∞}^∞ x fXY(x, y) dy dx + ∫_{−∞}^∞ ∫_{−∞}^∞ y fXY(x, y) dx dy
◮ Remove x (y) from the innermost integral in the first (second) summand,
   and use the marginalization property of the joint pdf
   E[X + Y] = ∫_{−∞}^∞ x fX(x) dx + ∫_{−∞}^∞ y fY(y) dy = E[X] + E[Y]
◮ Expectation ↔ summation can be interchanged ⇒ E[Σ_i Xi] = Σ_i E[Xi]
◮ Combining with the earlier result E[aX + b] = a E[X] + b proves that
   E[Σ_i ai Xi + b] = Σ_i ai E[Xi] + b
◮ Better yet, using vector notation (with a ∈ R^n, X ∈ R^n, b a scalar)
   E[a^T X + b] = a^T E[X] + b
◮ Also, if A is an m × n matrix with rows a1^T, . . . , am^T and b ∈ R^m a vector
   E[AX + b] = E[[a1^T X + b1, . . . , am^T X + bm]^T]
             = [a1^T E[X] + b1, . . . , am^T E[X] + bm]^T = A E[X] + b
◮ Expected value operator can be interchanged with linear operations
◮ Events E and F are independent if P(E ∩ F) = P(E) P(F) ◮ Def: RVs X and Y are independent if the events X ≤ x and Y ≤ y are
   independent for all (x, y), i.e., FXY(x, y) = FX(x) FY(y)
◮ For discrete RVs equivalent to the analogous relation between pmfs
   pXY(x, y) = pX(x) pY(y)
◮ For continuous RVs the analogous relation is true for pdfs
   fXY(x, y) = fX(x) fY(y)
◮ Independence ⇔ Joint distribution factorizes into product of marginals
◮ Independent Poisson RVs X and Y with parameters λx and λy ◮ Q: Probability distribution of the sum RV Z := X + Y? ◮ Z = n only if X = k, Y = n − k for some 0 ≤ k ≤ n. By independence
   P(Z = n) = Σ_{k=0}^n P(X = k) P(Y = n − k)
            = Σ_{k=0}^n [e^{−λx} λx^k / k!] × [e^{−λy} λy^(n−k) / (n − k)!]
            = [e^{−(λx+λy)} / n!] Σ_{k=0}^n C(n, k) λx^k λy^(n−k)
            = e^{−(λx+λy)} (λx + λy)^n / n!
◮ Used the binomial theorem in the last equality
◮ Z is Poisson with parameter λz := λx + λy
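The convolution argument above can be verified numerically. A sketch with illustrative parameters λx = 2, λy = 3:

```python
from math import exp, factorial

def poisson_pmf(n, lam):
    return exp(-lam) * lam ** n / factorial(n)

lx, ly = 2.0, 3.0
# pmf of Z = X + Y by convolving the pmfs of X and Y, compared against
# the Poisson(lx + ly) pmf, over the first few values of n
max_err = max(
    abs(sum(poisson_pmf(k, lx) * poisson_pmf(n - k, ly) for k in range(n + 1))
        - poisson_pmf(n, lx + ly))
    for n in range(15)
)
```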
◮ Binomial RVs count the number of successes in n Bernoulli trials
◮ Can write a binomial X as the sum X = Σ_{i=1}^n Xi of n independent
   Bernoulli RVs Xi with parameter p. Linearity of expectation yields
   E[X] = Σ_{i=1}^n E[Xi] = np
◮ Expected nr. of successes = nr. of trials × prob. of individual success
◮ Same interpretation that we observed for Poisson RVs (λ = np)
◮ Now let Y = Σ_{i=1}^n Yi, where the Yi are Bernoulli with parameter p
   but not necessarily independent
◮ Expected nr. of successes is still E[Y] = np
◮ Linearity of expectation does not require independence ◮ Y is not binomial distributed
◮ If X and Y are independent, then for functions g(·) and h(·)
   E[g(X) h(Y)] = E[g(X)] E[h(Y)]
◮ Can show that g(X) and h(Y) are also independent. Intuitive
◮ Expectation and product can be interchanged if the RVs are independent ◮ Different from the interchange with linear operations (always possible)
◮ Suppose X and Y are continuous RVs. Use the definition of independence
   E[g(X) h(Y)] = ∫_{−∞}^∞ ∫_{−∞}^∞ g(x) h(y) fXY(x, y) dx dy
                = ∫_{−∞}^∞ ∫_{−∞}^∞ g(x) h(y) fX(x) fY(y) dx dy
◮ Integrand is the product of a function of x and a function of y
   E[g(X) h(Y)] = [∫_{−∞}^∞ g(x) fX(x) dx] × [∫_{−∞}^∞ h(y) fY(y) dy]
                = E[g(X)] E[h(Y)]
◮ Let Xn, n = 1, . . . , N be independent with E[Xn] = μn, var[Xn] = σn^2 ◮ Q: Variance of the sum X := Σ_{n=1}^N Xn? ◮ Notice that the mean of X is E[X] = Σ_{n=1}^N μn. Then
   var[X] = E[(X − E[X])^2] = E[(Σ_{n=1}^N (Xn − μn))^2]
◮ Expand the square and interchange summation and expectation
   var[X] = Σ_{n=1}^N Σ_{m=1}^N E[(Xn − μn)(Xm − μm)]
◮ Separate the terms with n = m. Then use independence and E[Xn − μn] = 0
   var[X] = Σ_{n=1}^N E[(Xn − μn)^2] + Σ_{n≠m} E[Xn − μn] E[Xm − μm]
          = Σ_{n=1}^N σn^2
◮ If RVs are independent ⇒ variance of the sum is the sum of the variances ◮ A slightly more general result holds for independent Xi, i = 1, . . . , n
   var[Σ_i ai Xi + b] = Σ_i ai^2 var[Xi]
◮ Write a binomial X with parameters (n, p) as X = Σ_{i=1}^n Xi, with the
   Xi independent Bernoulli RVs with parameter p and var[Xi] = p(1 − p)
◮ Variance of the binomial then ⇒ var[X] = Σ_{i=1}^n var[Xi] = np(1 − p)
◮ Let the Xi now be i.i.d. with mean μ and variance σ^2. The sample mean is
   X̄ = (1/n) Σ_{i=1}^n Xi
◮ Expected value ⇒ E[X̄] = (1/n) Σ_{i=1}^n E[Xi] = μ
◮ Variance ⇒ var[X̄] = (1/n^2) Σ_{i=1}^n var[Xi] = σ^2/n
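The σ²/n shrinkage of the sample mean is easy to see by simulation. A sketch using Bernoulli draws (n = 50, p = 0.3, and the trial count are illustrative choices):

```python
import random

random.seed(3)
n, p, trials = 50, 0.3, 20_000

# Each sample mean averages n i.i.d. Bernoulli(p) draws; over many trials
# its variance should be close to var[Xi] / n = p(1 - p) / n
means = [
    sum(random.random() < p for _ in range(n)) / n
    for _ in range(trials)
]
m = sum(means) / trials
var_of_mean = sum((x - m) ** 2 for x in means) / trials
```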
◮ Def: The covariance of X and Y (generalizes variance to pairs of RVs) is
   cov(X, Y) := E[(X − E[X])(Y − E[Y])] = E[XY] − E[X] E[Y]
◮ If cov(X, Y) = 0 the variables X and Y are said to be uncorrelated
◮ If X, Y independent then E[XY] = E[X] E[Y] and cov(X, Y) = 0
◮ The opposite is not true; may have cov(X, Y) = 0 for dependent X, Y
◮ Ex: X uniform in [−a, a] and Y = X^2. Then cov(X, Y) = E[X^3] = 0
   by symmetry, yet Y is completely determined by X
◮ If cov(X, Y) > 0 then X and Y tend to move in the same direction
◮ If cov(X, Y) < 0 then X and Y tend to move in opposite directions
◮ Let X be a zero-mean random signal and Z zero-mean noise, independent of X
◮ Consider the received signals Y1 = X + Z and Y2 = −X + Z
◮ Covariance of X and Y1 is
   cov(X, Y1) = E[XY1] = E[X(X + Z)] = E[X^2] + E[XZ]
◮ Second term is 0; by independence of X, Z, E[XZ] = E[X] E[Z] = 0. Hence
   cov(X, Y1) = E[X^2] = var[X] > 0
◮ Same computations ⇒ cov(X, Y2) = −E[X^2] = −var[X] < 0
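The three covariances in this example can be estimated by simulation. A sketch assuming unit-variance Gaussian signal and noise (an illustrative choice; the slides only require zero means and independence):

```python
import random

random.seed(4)
N = 200_000

# Zero-mean, unit-variance signal X and independent zero-mean noise Z
X = [random.gauss(0.0, 1.0) for _ in range(N)]
Z = [random.gauss(0.0, 1.0) for _ in range(N)]
Y1 = [x + z for x, z in zip(X, Z)]     # received signal Y1 = X + Z
Y2 = [-x + z for x, z in zip(X, Z)]    # received signal Y2 = -X + Z

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / len(a)

c_xy1 = cov(X, Y1)     # should be near var[X] = 1
c_xy2 = cov(X, Y2)     # should be near -var[X] = -1
c_y1y2 = cov(Y1, Y2)   # should be near var[Z] - var[X] = 0
```

Note that Y1 and Y2 come out uncorrelated here only because var[X] = var[Z]; the latent variables X and Z still drive both.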
◮ Correlation between X and Y1 or X and Y2 comes from causality ◮ Correlation between Y1 and Y2 does not; it comes from the latent variables X and Z that both signals share
◮ Sample space
◮ Outcome and event
◮ Sigma-algebra
◮ Countable union
◮ Axioms of probability
◮ Probability space
◮ Conditional probability
◮ Law of total probability
◮ Bayes’ rule
◮ Independent events
◮ Random variable (RV)
◮ Discrete RV
◮ Bernoulli, binomial, Poisson
◮ Continuous RV
◮ Uniform, normal, exponential
◮ Indicator RV
◮ Pmf, pdf and cdf
◮ Law of rare events
◮ Expected value
◮ Variance and standard deviation
◮ Joint probability distribution
◮ Marginal distribution
◮ Random vector
◮ Independent RVs
◮ Covariance
◮ Uncorrelated RVs