CSCI 2570 Introduction to Nanocomputing: Probability Theory (John E. Savage)


  1. CSCI 2570 Introduction to Nanocomputing: Probability Theory, John E. Savage

  2. The Role of Probability
     - The manufacture of devices with nanometer-scale dimensions will necessarily introduce randomness into these devices.
     - Some device dimensions are so small that device positions cannot be accurately controlled.
     - For this reason, probability theory will play a central role in this area.

  3. Sample Spaces
     - Probabilities estimate the frequency of outcomes of random experiments.
     - Outcomes can be drawn from a finite or countable sample space (set) Ω or be tuples over the reals R.
     - Coin toss: Ω = {H, T}
     - Packets to a URL per day: Ω = N (the natural numbers)
     - Rain in cm/month in Providence: Ω = R (the reals)
     - Rain and sunshine per month: Ω = R²

  4. Probability Space
     - Sample space: all possible outcomes.
     - Events: a family F of subsets of the sample space Ω.
     - E.g. Ω = {H,T}³, F₀ = {TTT, HHT, HTH, THH} (even number of H's), F₁ = {HTT, THT, TTH, HHH} (odd number of H's).
     - Events are mutually exclusive if they are disjoint, e.g. F₀ and F₁ above.
     - A probability distribution is a function P: F → [0, 1].
     - The probability distribution assigns a probability 0 ≤ P(E) ≤ 1 to each event E.

  5. Properties of Probability Function
     - For any event E in Ω, 0 ≤ P(E) ≤ 1.
     - P(Ω) = 1.
     - For any finite or countably infinite sequence of disjoint events E₁, E₂, …, P(E₁ ∪ E₂ ∪ …) = P(E₁) + P(E₂) + …

  6. Probability Distributions
     - If Ω = Rⁿ, a probability density p(x₁, …, xₙ) can be integrated over a volume V to give a probability, e.g. P(x ∈ V) = ∫_V p(x₁, …, xₙ) dx₁ ⋯ dxₙ.

  7. Sets of Events
     - Joint probability P(A ∩ B): the probability that both A and B occur.
     - Notation: P(A, B) = P(A ∩ B).
     - Probability of a union: P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
     - Complement of event A: Ā = Ω − A, so P(A ∪ Ā) = 1.

  8. Probabilities of Events
     - If events A and B are mutually exclusive, then P(A ∩ B) = 0 and P(A ∪ B) = P(A) + P(B).
     - Conditional probability of A given B: P(A|B) = P(A, B)/P(B), or P(A, B) = P(A|B) P(B).
     - Events A and B are statistically independent if P(A|B) = P(A), i.e., P(A, B) = P(A) P(B).
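
As a quick illustration (not from the slides), the following Python sketch evaluates a joint, union, and conditional probability and an independence check on a fair die; the events A and B below are illustrative choices.

```python
from fractions import Fraction

# Sample space: one roll of a fair die; each outcome has probability 1/6.
omega = {k: Fraction(1, 6) for k in range(1, 7)}

A = {2, 4, 6}   # event "roll is even"
B = {4, 5, 6}   # event "roll is at least 4"

def prob(event):
    return sum(omega[w] for w in event)

p_A, p_B = prob(A), prob(B)
p_AB = prob(A & B)                  # joint probability P(A, B) = P(A ∩ B)
p_union = p_A + p_B - p_AB          # P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
p_A_given_B = p_AB / p_B            # conditional probability P(A|B)

print(p_A, p_B, p_AB, p_union, p_A_given_B)   # 1/2 1/2 1/3 2/3 2/3
print(p_AB == p_A * p_B)                      # False: A and B are not independent
```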

  9. Marginal Probability
     - Given a sample space Ω = K² containing pairs of events (Aᵢ, Bⱼ) over K, the marginal probability of Aᵢ is P(Aᵢ) = ∑ⱼ P(Aᵢ, Bⱼ), where the Bⱼ are mutually exclusive.

  10. Principle of Inclusion/Exclusion
      - Let |A| denote the size of A.
      - |A ∪ B| = |A| + |B| − |A ∩ B|
      - |A ∪ B ∪ C| = |A| + |B| + |C| − |A ∩ B| − |A ∩ C| − |B ∩ C| + |A ∩ B ∩ C|
      (The slide's Venn diagram of A, B, and C is omitted.)
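
A minimal sketch (not from the slides) that checks the three-set formula on random subsets:

```python
import random

# Three random subsets of {0, ..., 99}; the sizes are arbitrary illustrative choices.
universe = list(range(100))
A = set(random.sample(universe, 40))
B = set(random.sample(universe, 40))
C = set(random.sample(universe, 40))

lhs = len(A | B | C)
rhs = (len(A) + len(B) + len(C)
       - len(A & B) - len(A & C) - len(B & C)
       + len(A & B & C))
print(lhs == rhs)   # True: the inclusion/exclusion formula holds
```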

  11. Principle of Inclusion/Exclusion: Proof
      - Use induction; assume the formula holds for n − 1 sets.
      - Write A₁ ∪ … ∪ Aₙ = X ∪ Aₙ with X = A₁ ∪ … ∪ Aₙ₋₁, apply |X ∪ Aₙ| = |X| + |Aₙ| − |X ∩ Aₙ|, and expand |X| and |X ∩ Aₙ| = |(A₁ ∩ Aₙ) ∪ … ∪ (Aₙ₋₁ ∩ Aₙ)| using the induction hypothesis.

  12. Application of Inclusion/Exclusion
      - For ℓ odd, (−1)^(ℓ+1) = 1.
      - For ℓ even, (−1)^(ℓ+1) = −1.

  13. Special Application of Inclusion/Exclusion

  14. Event Product Spaces
      - Important sample spaces consist of Cartesian products of spaces.
      - Ω = {(H,H), (H,T), (T,H), (T,T)} = {H,T}²
      - Ω = Aⁿ = {(e₁, e₂, …, eₙ) : eᵢ in A}
      - P₁,₂(H,H) = probability of the event (H,H).
      - E.g. P(H,H) = .04, P(H,T) = P(T,H) = .16, P(T,T) = .64.
      - Product spaces can model occurrences over time, over space, or both.

  15. Event Product Spaces
      - Given events A and B with joint probability P(A, B), P(A) is the marginal probability of A.
      - E.g. P₁(H) = P₁,₂(H,H) + P₁,₂(H,T) = .04 + .16 = .20
      - P₁(T) = P₁,₂(T,H) + P₁,₂(T,T) = .16 + .64 = .80
      - Consider events H and T on successive trials that are independent.
      - E.g. P₁,₂(H,T) = P₁(H) P₂(T) = .2 × .8 = .16
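
A short sketch (not from the slide) that recomputes these marginals from the joint table and checks the independence factorization:

```python
# Joint distribution over two coin flips, using the probabilities from the slide.
joint = {("H", "H"): 0.04, ("H", "T"): 0.16, ("T", "H"): 0.16, ("T", "T"): 0.64}

# Marginals: sum the joint probability over the other coordinate.
p1 = {e: sum(p for (a, b), p in joint.items() if a == e) for e in ("H", "T")}
p2 = {e: sum(p for (a, b), p in joint.items() if b == e) for e in ("H", "T")}
print(p1, p2)   # {'H': 0.2, 'T': 0.8} for both trials

# Independence: every joint entry factors as P1(a) * P2(b).
print(all(abs(joint[(a, b)] - p1[a] * p2[b]) < 1e-12 for a in p1 for b in p2))
```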

  16. Product Events
      - Events are identically distributed if they have the same probability distribution.
      - The outcomes in the earlier pair of H,T trials are identically distributed: P₁ = P₂, that is, P₁(e) = P₂(e) for all e in {H,T}.
      - Events are independent and identically distributed (i.i.d.) if they are statistically independent and identically distributed.

  17. Random Variables
      - A random variable v is a function v: Ω → R.
      - E.g. Ω = {H, T}, v(H) = 1, v(T) = 0.
      - The expectation (average value) of a r.v. v is E(v) = ∑_{ω ∈ Ω} v(ω) P(ω).
      - E.g. for the variable above, E(v) = 1·P(H) + 0·P(T) = P(H).
      - The expectation of a sum of random variables is the sum of their expectations.
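
A minimal sketch (not from the slide) of this indicator random variable and its expectation; the bias P(H) = 0.2 is taken from the product-space example above.

```python
# Indicator random variable on a biased coin: v(H) = 1, v(T) = 0.
P = {"H": 0.2, "T": 0.8}
v = {"H": 1, "T": 0}

E_v = sum(v[w] * P[w] for w in P)   # E(v) = 1*P(H) + 0*P(T) = P(H)
print(E_v)                          # 0.2

# Linearity of expectation: expected number of heads in two independent flips.
E_two = sum((v[a] + v[b]) * P[a] * P[b] for a in P for b in P)
print(abs(E_two - 2 * E_v) < 1e-12) # True: E of a sum equals the sum of E's
```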

  18. Geometric Random Variable
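
The slide's formulas were not captured in this transcript; as a sketch under the standard definition used in the coupon-collector slides later (Pr(X = n) = (1 − p)ⁿ⁻¹ p, so E(X) = 1/p), the following checks the expectation numerically. The success probability p = 0.25 is an illustrative choice.

```python
import random

p = 0.25   # success probability (illustrative)

# Truncated series for E(X) = sum over n of n*(1-p)^(n-1)*p, which equals 1/p.
series = sum(n * (1 - p) ** (n - 1) * p for n in range(1, 1000))
print(series, 1 / p)   # both are essentially 4

# Monte Carlo estimate: number of independent trials until the first success.
def trials_until_success():
    n = 1
    while random.random() >= p:
        n += 1
    return n

samples = [trials_until_success() for _ in range(100_000)]
print(sum(samples) / len(samples))   # close to 4
```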

  19. Moments of Random Variables
      - Second moment of a r.v.: E(X²)
      - kth moment of a r.v.: E(Xᵏ)
      - Variance: Var(X) = E(X²) − E(X)²
      - Standard deviation: σ(X) = √Var(X)

  20. Examples of Probability Distributions
      - Uniform: P(k) = 1/n for 1 ≤ k ≤ n.
      - Binomial: n i.i.d. trials, Ω = {H,T}ⁿ, P(H) = α and P(T) = β = 1 − α; P(k) = Pr(k H's occur) = C(n, k) αᵏ βⁿ⁻ᵏ.
      - Poisson: P(k) = e^(−λ) λᵏ / k!
      - The Poisson distribution is the limit of the binomial when n is large and λ = nα is held fixed.
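
A small sketch (not from the slide) comparing binomial and Poisson probabilities for large n with λ = nα held fixed; the values n = 1000 and λ = 2 are illustrative.

```python
from math import comb, exp, factorial

n, lam = 1000, 2.0
alpha = lam / n   # so that n*alpha = lambda

def binomial_pmf(k):
    return comb(n, k) * alpha**k * (1 - alpha)**(n - k)

def poisson_pmf(k):
    return exp(-lam) * lam**k / factorial(k)

for k in range(6):
    print(k, round(binomial_pmf(k), 6), round(poisson_pmf(k), 6))
# The two columns agree closely, illustrating the Poisson limit of the binomial.
```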

  21. Means and Variances of Probability Distributions
      - Uniform (1 ≤ k ≤ n): mean (n + 1)/2, variance (n² − 1)/12
      - Binomial: mean nα, variance nαβ
      - Poisson: mean λ, variance λ

  22. Markov's Inequality
      - Let X be a nonnegative r.v. Then for any a > 0, P(X ≥ a) ≤ E(X)/a.
      - Proof: Because E(X) = ∑ₓ x P(X = x) ≥ ∑_{x ≥ a} x P(X = x) ≥ a P(X ≥ a).
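
A quick numerical check (not from the slide) of Markov's inequality, using the geometric distribution from the sketch above:

```python
p = 0.25      # geometric r.v. with E(X) = 1/p = 4
E_X = 1 / p

for a in (4, 8, 16):
    tail = (1 - p) ** (a - 1)                  # P(X >= a) for a geometric r.v.
    print(a, tail, E_X / a, tail <= E_X / a)   # the Markov bound always holds
```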

  23. Chebyshev's Inequality
      - Let X be a r.v. with finite variance. Then for any a > 0, P(|X − E(X)| ≥ a) ≤ Var(X)/a².
      - Proof: Note that |X − E(X)| ≥ a exactly when (X − E(X))² ≥ a²; apply Markov's inequality to the nonnegative r.v. (X − E(X))².
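
A minimal check (not from the slide) of Chebyshev's inequality for a single roll of a fair die:

```python
from fractions import Fraction

# Fair die: E(X) = 7/2 and Var(X) = E(X^2) - E(X)^2 = 35/12.
P = {k: Fraction(1, 6) for k in range(1, 7)}
E_X = sum(k * P[k] for k in P)
Var_X = sum(k * k * P[k] for k in P) - E_X**2

for a in (1, 2, 3):
    tail = sum(P[k] for k in P if abs(k - E_X) >= a)   # P(|X - E(X)| >= a)
    print(a, float(tail), float(Var_X / a**2), tail <= Var_X / a**2)
```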

  24. Moment Generating Function
      - The moment generating function of a r.v. X is M_X(t) = E(e^(tX)).
      - It can be used to compute moments (the kth derivative at t = 0 gives E(Xᵏ)) and to obtain Chernoff bounds on the tails of probability distributions, i.e. on P(X ≥ a).
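
A small sketch (not from the slide) that approximates M'(0) and M''(0) by finite differences for a binomial r.v. and compares them with the first two moments; n = 10 and α = 0.3 are illustrative.

```python
from math import comb, exp

n, alpha = 10, 0.3
pmf = {k: comb(n, k) * alpha**k * (1 - alpha)**(n - k) for k in range(n + 1)}

def M(t):
    # Moment generating function M(t) = E(e^{tX}) for the binomial pmf above.
    return sum(p * exp(t * k) for k, p in pmf.items())

h = 1e-5
first = (M(h) - M(-h)) / (2 * h)            # central difference ~ M'(0) = E(X)
second = (M(h) - 2 * M(0) + M(-h)) / h**2   # ~ M''(0) = E(X^2)

E_X = sum(k * p for k, p in pmf.items())
E_X2 = sum(k * k * p for k, p in pmf.items())
print(first, E_X)     # both ~ 3.0
print(second, E_X2)   # both ~ 11.1
```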

  25. Moment Generating Functions
      - Uniform (1 ≤ k ≤ n): M(t) = (1/n) ∑_{k=1}^{n} e^(tk)
      - Binomial: M(t) = (β + α e^t)ⁿ
      - Poisson: M(t) = e^(λ(e^t − 1))

  26. Chernoff Bound
      - Let X be a r.v. Then for any a and any t > 0, P(X ≥ a) ≤ e^(−ta) E(e^(tX)).
      - Proof: Because X ≥ a exactly when e^(tX) ≥ e^(ta), apply Markov's inequality to the nonnegative r.v. e^(tX).

  27. Bounding Tails of a Binomial
      - Let X be binomial over n trials with P(H) = α, so E(X) = nα and Var(X) = nαβ. For a > nα:
      - Markov: P(X ≥ a) ≤ nα/a
      - Chebyshev: P(X ≥ a) ≤ P(|X − nα| ≥ a − nα) ≤ nαβ/(a − nα)²
      - Chernoff: P(X ≥ a) ≤ e^(−ta) (β + α e^t)ⁿ for any t > 0

  28. Chernoff Bound on Binomial Distribution
      - Choose t = t₀ to minimize the bound e^(−ta) (β + α e^t)ⁿ.
      - Note that its logarithm, −ta + n ln(β + α e^t), is convex in t because its second derivative is positive.
      - Thus, at t₀ the first derivative is zero. That is, −a + n α e^(t₀)/(β + α e^(t₀)) = 0, and hence e^(t₀) = aβ / ((n − a)α).
      - Substituting t₀ back into e^(−ta) (β + α e^t)ⁿ gives the optimized bound.

  29. Comparison of Bounds
      - n = 100, α = β = .5, a = 70, E(X) = 50, Var(X) = 25 (σ = 5).
      - Markov: P(X ≥ 70) ≤ E(X)/70 = 50/70 ≈ 0.71
      - Chebyshev: P(X ≥ 70) ≤ Var(X)/(70 − 50)² = 25/400 ≈ 0.063
      - Chernoff: the optimized bound gives P(X ≥ 70) ≤ (3/7)⁷⁰ (5/3)¹⁰⁰ ≈ 2.7 × 10⁻⁴
      - Exact: the true tail probability is smaller still (see the sketch below).
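
A sketch (not from the slide) that computes the three bounds and the exact tail for this example:

```python
from math import comb, exp, log

n, alpha, a = 100, 0.5, 70
beta = 1 - alpha
mean, var = n * alpha, n * alpha * beta

markov = mean / a
chebyshev = var / (a - mean) ** 2

# Chernoff: minimize exp(-t*a) * (beta + alpha*exp(t))^n over t > 0;
# the minimizer satisfies exp(t0) = a*beta / ((n - a)*alpha).
t0 = log(a * beta / ((n - a) * alpha))
chernoff = exp(-t0 * a) * (beta + alpha * exp(t0)) ** n

# Exact binomial tail P(X >= a).
exact = sum(comb(n, k) * alpha**k * beta**(n - k) for k in range(a, n + 1))

print(f"Markov    {markov:.3g}")
print(f"Chebyshev {chebyshev:.3g}")
print(f"Chernoff  {chernoff:.3g}")
print(f"Exact     {exact:.3g}")
```

The output shows Markov ≈ 0.71, Chebyshev ≈ 0.063, Chernoff ≈ 2.7 × 10⁻⁴, and an exact tail on the order of 10⁻⁵, so the Chernoff bound is by far the tightest of the three.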

  30. Birthday Problem
      - Each person is equally likely to have any day x as a birthday, 1 ≤ x ≤ 365.
      - In a group of n persons, what is the probability P_B that at least two have the same birthday?
      - 1 − P_B = 365 · (365 − 1) ⋯ (365 − n + 1) / 365ⁿ
      - P_B ≈ .5 when n ≈ 23.
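
A quick sketch (not from the slide) evaluating this product formula for several group sizes:

```python
def birthday_collision_prob(n, days=365):
    # 1 - P_B = (days/days) * ((days-1)/days) * ... * ((days-n+1)/days)
    no_collision = 1.0
    for i in range(n):
        no_collision *= (days - i) / days
    return 1 - no_collision

for n in (10, 22, 23, 30, 50):
    print(n, round(birthday_collision_prob(n), 4))
# n = 23 is the smallest group for which the probability exceeds 0.5.
```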

  31. Balls in Bins
      - m balls are thrown into n bins independently and uniformly at random.
      - How large should m be to ensure that all bins contain at least one ball with probability ≥ 1 − ε?
      - Coupon collector problem:
      - C coupon types; each box is equally likely to contain any coupon type.
      - How many boxes should be purchased to collect all coupons with probability at least 1 − ε?

  32. Coupon Collector Problem
      - C coupon types, one coupon per box, each type with probability 1/C.
      - What is E(X), where X = number of boxes needed to collect all coupons?
      - X = x₁ + … + x_C, where xᵢ = number of additional boxes needed to obtain the ith distinct coupon. The probability of a new coupon at that stage is pᵢ = 1 − (i − 1)/C.
      - xᵢ is a geometric r.v. with Pr(xᵢ = n) = (1 − pᵢ)ⁿ⁻¹ pᵢ.
      - E(xᵢ) = 1/pᵢ = C/(C − i + 1)
      - E(X) = E(x₁) + … + E(x_C) = C (1/C + 1/(C − 1) + … + 1/1) = C·H_C ≈ C ln C
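
A short sketch (not from the slide) comparing the harmonic-sum formula C·H_C with a direct simulation; C = 50 and the trial count are illustrative choices.

```python
import random

def expected_boxes(C):
    # E(X) = C * (1 + 1/2 + ... + 1/C) = C * H_C
    return C * sum(1 / i for i in range(1, C + 1))

def simulate(C):
    # Buy boxes uniformly at random until all C coupon types have been seen.
    seen, boxes = set(), 0
    while len(seen) < C:
        seen.add(random.randrange(C))
        boxes += 1
    return boxes

C, trials = 50, 2000
avg = sum(simulate(C) for _ in range(trials)) / trials
print(expected_boxes(C), avg)   # both are near C*H_C, about 225 for C = 50
```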

  33. Coupon Collector Problem with Failures
      - In this model the probability that no coupon is collected in a trial is 1 − p_s; the probability that a specific coupon is collected is p_s/C.
      - Theorem: The number of trials T needed to collect all C coupons with probability 1 − ε in the coupon collector problem with failures satisfies T ≥ (C/p_s) ln(C/ε).

  34. Special Application of Inclusion/Exclusion

  35. Coupon Collection with Failures
      - Proof: Let Eᵢ be the event that the ith coupon has not been collected after T trials, so Pr(Eᵢ) = (1 − p_s/C)^T.
      - The goal is to find T so that Pr(E₁ ∪ … ∪ E_C) ≤ ε.
      - Using inclusion/exclusion and keeping only the first (positive) terms gives Pr(E₁ ∪ … ∪ E_C) ≤ C (1 − p_s/C)^T.

  36. Coupon Collection with Failures
      - Then it suffices to choose T so that C (1 − p_s/C)^T ≤ ε.
      - Equivalently, (1 − p_s/C)^T ≤ ε/C, but since 1 − x ≤ e^(−x), this is implied by e^(−T p_s/C) ≤ ε/C.
      - Using T ≥ (C/p_s) ln(C/ε) gives the desired result.

  37. Conclusion
      - Methods of bounding the tails of probability distributions can be very useful.
