CS 473: Algorithms, Fall 2016
Basics of Discrete Probability
Chandra Chekuri and Ruta Mehta, University of Illinois, Urbana-Champaign


  1. CS 473: Algorithms. Chandra Chekuri and Ruta Mehta. University of Illinois, Urbana-Champaign. Fall 2016.

  2. CS 473: Algorithms, Fall 2016. Basics of Discrete Probability.

  3. Discrete Probability. We restrict attention to finite probability spaces.
     Definition: A discrete probability space is a pair (Ω, Pr) consisting of a finite set Ω of elementary events and a function Pr : Ω → [0, 1] that assigns a probability Pr[ω] to each ω ∈ Ω such that ∑_{ω ∈ Ω} Pr[ω] = 1.
     Example: An unbiased coin. Ω = {H, T} and Pr[H] = Pr[T] = 1/2.
     Example: A 6-sided unbiased die. Ω = {1, 2, 3, 4, 5, 6} and Pr[i] = 1/6 for 1 ≤ i ≤ 6.
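
A minimal sketch (not from the slides) of how one might represent such a space in code: a dict mapping each elementary event to its probability. The helper name `validate` is illustrative.

```python
from fractions import Fraction

def validate(space):
    """Check that each probability is in [0, 1] and that they sum to 1."""
    assert all(0 <= p <= 1 for p in space.values())
    assert sum(space.values()) == 1

coin = {"H": Fraction(1, 2), "T": Fraction(1, 2)}  # unbiased coin
die = {i: Fraction(1, 6) for i in range(1, 7)}     # unbiased 6-sided die

validate(coin)
validate(die)
```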

  4. Discrete Probability: more examples.
     Example: A biased coin. Ω = {H, T} and Pr[H] = 2/3, Pr[T] = 1/3.
     Example: Two independent unbiased coins. Ω = {HH, TT, HT, TH} and Pr[HH] = Pr[TT] = Pr[HT] = Pr[TH] = 1/4.
     Example: A pair of (highly) correlated dice. Ω = {(i, j) | 1 ≤ i ≤ 6, 1 ≤ j ≤ 6}. Pr[(i, i)] = 1/6 for 1 ≤ i ≤ 6 and Pr[(i, j)] = 0 if i ≠ j.

  5. Events.
     Definition: Given a probability space (Ω, Pr), an event is a subset of Ω. In other words, an event is a collection of elementary events. The probability of an event A, denoted by Pr[A], is ∑_{ω ∈ A} Pr[ω]. The complement of an event A ⊆ Ω is the event Ω \ A, frequently denoted by Ā.

  6. Events: examples.
     Example: A pair of independent dice. Ω = {(i, j) | 1 ≤ i ≤ 6, 1 ≤ j ≤ 6}.
     (1) Let A be the event that the sum of the two numbers on the dice is even. Then A = {(i, j) ∈ Ω | i + j is even} and Pr[A] = |A|/36 = 1/2.
     (2) Let B be the event that the first die shows 1. Then B = {(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6)} and Pr[B] = 6/36 = 1/6.
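
A sketch of the slide's dice example: the sample space is the 36 equally likely pairs, an event is just a subset, and Pr[A] is computed exactly as in the definition. The helper `prob` is an illustrative name.

```python
from fractions import Fraction

omega = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

def prob(event):
    """Pr[A] is the sum of Pr[w] over the elementary events w in A."""
    return sum(omega[w] for w in event)

A = {w for w in omega if (w[0] + w[1]) % 2 == 0}  # sum is even
B = {w for w in omega if w[0] == 1}               # first die shows 1

print(prob(A))  # 1/2
print(prob(B))  # 1/6
```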

  7. Independent Events.
     Definition: Given a probability space (Ω, Pr), two events A, B are independent if and only if Pr[A ∩ B] = Pr[A] · Pr[B]. Otherwise they are dependent. In other words, independence means that neither event affects the probability of the other.
     Example: Two coins. Ω = {HH, TT, HT, TH} and Pr[HH] = Pr[TT] = Pr[HT] = Pr[TH] = 1/4.
     (1) A is the event that the first coin is heads and B is the event that the second coin is tails. A, B are independent.
     (2) A is the event that the two coins are different and B is the event that the second coin is heads. A, B are independent.
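
The definition can be checked mechanically by enumeration. Below is a sketch over the two-coin space; `independent` is an illustrative helper, not part of the slides.

```python
from fractions import Fraction

omega = {c1 + c2: Fraction(1, 4) for c1 in "HT" for c2 in "HT"}

def prob(event):
    return sum(omega[w] for w in event)

def independent(A, B):
    """Test the definition: Pr[A ∩ B] = Pr[A] * Pr[B]."""
    return prob(A & B) == prob(A) * prob(B)

A = {w for w in omega if w[0] == "H"}   # first coin is heads
B = {w for w in omega if w[1] == "T"}   # second coin is tails
C = {w for w in omega if w[0] != w[1]}  # the two coins differ
D = {w for w in omega if w[1] == "H"}   # second coin is heads

print(independent(A, B), independent(C, D))  # True True
```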

  8. Independent Events: examples.
     Example: A is the event that the two coins are not both tails and B is the event that the second coin is heads. A, B are dependent.

  9. Dependent or independent? Consider two independent rolls of a die.
     (1) A = the event that the first roll is odd.
     (2) B = the event that the sum of the two rolls is odd.
     The events A and B are: (A) dependent. (B) independent.
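
A quick enumeration (not on the slide) settles the clicker question using the same `prob` pattern as above.

```python
from fractions import Fraction

omega = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

def prob(event):
    return sum(omega[w] for w in event)

A = {w for w in omega if w[0] % 2 == 1}           # first roll is odd
B = {w for w in omega if (w[0] + w[1]) % 2 == 1}  # sum is odd

# Pr[A ∩ B] = 1/4 = Pr[A] · Pr[B], so A and B are independent.
print(prob(A & B) == prob(A) * prob(B))  # True
```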

  10. Union Bound. The probability of the union of two events is no bigger than the sum of their probabilities.
      Lemma: For any two events E and F, Pr[E ∪ F] ≤ Pr[E] + Pr[F].
      Proof: View E and F as collections of elementary events (which they are). We have
      Pr[E ∪ F] = ∑_{x ∈ E ∪ F} Pr[x] ≤ ∑_{x ∈ E} Pr[x] + ∑_{x ∈ F} Pr[x] = Pr[E] + Pr[F].
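
An illustrative check of the bound on the two-dice space: when E and F overlap, Pr[E ∪ F] is strictly smaller than Pr[E] + Pr[F], since the overlapping outcomes are counted twice on the right-hand side.

```python
from fractions import Fraction

omega = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

def prob(event):
    return sum(omega[w] for w in event)

E = {w for w in omega if w[0] == 1}  # first die shows 1
F = {w for w in omega if w[1] == 1}  # second die shows 1

print(prob(E | F))        # 11/36 (the outcome (1, 1) is shared)
print(prob(E) + prob(F))  # 1/3, i.e. 12/36; the bound holds
```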

  11. Random Variables.
      Definition: Given a probability space (Ω, Pr), a (real-valued) random variable X over Ω is a function that maps each elementary event to a real number. In other words, X : Ω → ℝ.
      Example: A 6-sided unbiased die. Ω = {1, 2, 3, 4, 5, 6} and Pr[i] = 1/6 for 1 ≤ i ≤ 6.
      (1) X : Ω → ℝ where X(i) = i mod 2.
      (2) Y : Ω → ℝ where Y(i) = i².
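
In code, the two random variables from the slide are literally just functions from elementary events to numbers; a small sketch:

```python
# A random variable over the die space is a function Omega -> R.
omega = range(1, 7)

X = lambda i: i % 2   # parity of the roll
Y = lambda i: i * i   # square of the roll

print([X(i) for i in omega])  # [1, 0, 1, 0, 1, 0]
print([Y(i) for i in omega])  # [1, 4, 9, 16, 25, 36]
```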

  12. Expectation.
      Definition: For a random variable X over a probability space (Ω, Pr), the expectation of X is E[X] = ∑_{ω ∈ Ω} Pr[ω] · X(ω). In other words, the expectation is the average value of X according to the probabilities given by Pr[·].
      Example: A 6-sided unbiased die. Ω = {1, 2, 3, 4, 5, 6} and Pr[i] = 1/6 for 1 ≤ i ≤ 6.
      (1) X : Ω → ℝ where X(i) = i mod 2. Then E[X] = 1/2.
      (2) Y : Ω → ℝ where Y(i) = i². Then E[Y] = ∑_{i=1}^{6} (1/6) · i² = 91/6.
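
Both expectations on the slide follow directly from the definition; a sketch that computes them exactly (`expectation` is an illustrative helper):

```python
from fractions import Fraction

pr = {i: Fraction(1, 6) for i in range(1, 7)}

def expectation(X):
    """E[X] = sum over w of Pr[w] * X(w)."""
    return sum(p * X(w) for w, p in pr.items())

print(expectation(lambda i: i % 2))  # 1/2
print(expectation(lambda i: i * i))  # 91/6
```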

  13. Expected number of vertices? Let G = (V, E) be a graph with n vertices and m edges. Let H be the graph resulting from independently deleting every vertex of G with probability 1/2. The expected number of vertices in H is: (A) n/2. (B) n/4. (C) m/2. (D) m/4. (E) none of the above.
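
A Monte Carlo sketch (not from the slides) that estimates the answer empirically; the values of n and the trial count are arbitrary choices for illustration.

```python
import random

def surviving_vertices(n):
    """Each of n vertices survives independently with probability 1/2."""
    return sum(1 for _ in range(n) if random.random() < 0.5)

n, trials = 10, 100_000
estimate = sum(surviving_vertices(n) for _ in range(trials)) / trials
print(estimate)  # close to n/2 = 5
```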

  14. Expected number of edges? Let G = (V, E) be a graph with n vertices and m edges. Let H be the graph resulting from independently deleting every vertex of G with probability 1/2. The expected number of edges in H is: (A) n/2. (B) n/4. (C) m/2. (D) m/4. (E) none of the above.

  15. Indicator Random Variables.
      Definition: A binary random variable is one that takes on values in {0, 1}.
      Indicator variables are a special type of random variable that is quite useful.
      Definition: Given a probability space (Ω, Pr) and an event A ⊆ Ω, the indicator random variable X_A is a binary random variable where X_A(ω) = 1 if ω ∈ A and X_A(ω) = 0 if ω ∉ A.
      Example: A 6-sided unbiased die. Ω = {1, 2, 3, 4, 5, 6} and Pr[i] = 1/6 for 1 ≤ i ≤ 6. Let A be the event that i is divisible by 3. Then X_A(i) = 1 if i ∈ {3, 6} and X_A(i) = 0 otherwise.
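
A sketch of the slide's example: the indicator of "the roll is divisible by 3" on an unbiased die.

```python
from fractions import Fraction

pr = {i: Fraction(1, 6) for i in range(1, 7)}
A = {i for i in pr if i % 3 == 0}  # A = {3, 6}

def X_A(i):
    """Indicator of A: 1 on elementary events in A, 0 elsewhere."""
    return 1 if i in A else 0

print([X_A(i) for i in sorted(pr)])  # [0, 0, 1, 0, 0, 1]
```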

  16. Expectation of an Indicator.
      Proposition: For an indicator variable X_A, E[X_A] = Pr[A].
      Proof: E[X_A] = ∑_{y ∈ Ω} X_A(y) · Pr[y] = ∑_{y ∈ A} 1 · Pr[y] + ∑_{y ∈ Ω \ A} 0 · Pr[y] = ∑_{y ∈ A} Pr[y] = Pr[A].
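
Checking the proposition on the die example above: both sides come out to 2/6 = 1/3.

```python
from fractions import Fraction

pr = {i: Fraction(1, 6) for i in range(1, 7)}
A = {3, 6}  # the roll is divisible by 3

E_XA = sum(p * (1 if i in A else 0) for i, p in pr.items())  # E[X_A]
Pr_A = sum(pr[i] for i in A)                                 # Pr[A]
print(E_XA == Pr_A, Pr_A)  # True 1/3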

  17. Linearity of Expectation.
      Lemma: Let X, Y be two random variables (not necessarily independent) over a probability space (Ω, Pr). Then E[X + Y] = E[X] + E[Y].
      Proof: E[X + Y] = ∑_{ω ∈ Ω} Pr[ω] · (X(ω) + Y(ω)) = ∑_{ω ∈ Ω} Pr[ω] · X(ω) + ∑_{ω ∈ Ω} Pr[ω] · Y(ω) = E[X] + E[Y].
      Corollary: E[a_1 X_1 + a_2 X_2 + ⋯ + a_n X_n] = ∑_{i=1}^{n} a_i E[X_i].
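
A sketch showing that linearity needs no independence: X and Y below are both functions of the same die roll, so they are clearly dependent, and still E[X + Y] = E[X] + E[Y].

```python
from fractions import Fraction

pr = {i: Fraction(1, 6) for i in range(1, 7)}

def E(f):
    """Expectation of a random variable f over the die space."""
    return sum(p * f(w) for w, p in pr.items())

X = lambda i: i % 2   # parity of the roll
Y = lambda i: i * i   # square of the same roll
print(E(lambda i: X(i) + Y(i)) == E(X) + E(Y))  # True
```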

  18. Expected number of edges, revisited. Let G = (V, E) be a graph with n vertices and m edges. Let H be the graph resulting from independently deleting every vertex of G with probability 1/2. The expected number of edges in H is: (A) n/2. (B) n/4. (C) m/2. (D) m/4. (E) none of the above.
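
A sketch (not on the slide) combining indicators with linearity: an edge survives iff both endpoints survive, which happens with probability (1/2)² = 1/4, so the expected number of surviving edges is m/4. A small simulation on a hypothetical 4-cycle (n = 4, m = 4) agrees.

```python
import random

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]  # a 4-cycle: n = 4, m = 4
trials = 100_000

total = 0
for _ in range(trials):
    # Each vertex survives independently with probability 1/2.
    alive = {v for v in range(4) if random.random() < 0.5}
    # An edge survives iff both of its endpoints survive.
    total += sum(1 for u, v in edges if u in alive and v in alive)

print(total / trials)  # close to m/4 = 1.0
```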
