Probability and Statistics for Computer Science


  1. Probability and Statistics for Computer Science. Who discovered this? e = \lim_{n \to \infty} \left(1 + \frac{1}{n}\right)^n. Credit: wikipedia. Hongye Liu, Teaching Assistant Prof, CS361, UIUC, 09.24.2020

  2. What is the number a_k, the coefficient of p^k q^{N-k} in (p+q)^N? By the binomial theorem, (p+q)^N = \sum_{k=0}^{N} \binom{N}{k} p^k q^{N-k}, so a_k = \binom{N}{k}.

  3. How many paths lead from A to B? (A counting warm-up: choosing which steps go in each direction gives a binomial coefficient.)

  4. Last time: review of random variables and expectations; Markov's inequality; Chebyshev's inequality; the weak law of large numbers.

  5. Proof of the weak law of large numbers. Apply Chebyshev's inequality to the sample mean \bar{X}: P(|\bar{X} - E[\bar{X}]| \ge \epsilon) \le \frac{var[\bar{X}]}{\epsilon^2}. Substitute var[\bar{X}] = \frac{var[X]}{N} and E[\bar{X}] = E[X]: P(|\bar{X} - E[X]| \ge \epsilon) \le \frac{var[X]}{N \epsilon^2} \to 0 as N \to \infty. Hence \lim_{N \to \infty} P(|\bar{X} - E[X]| \ge \epsilon) = 0.

  6. Applications of the weak law of large numbers. The law of large numbers justifies using simulations (instead of calculation) to estimate the expected values of random variables, since \lim_{N \to \infty} P(|\bar{X} - E[X]| \ge \epsilon) = 0. The law of large numbers also justifies using the histogram of a large random sample to approximate the probability distribution function P(x); see the proof on pg. 353 of the textbook by DeGroot et al.
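
A minimal sketch of the simulation idea (Python with NumPy; the fair-die example is our own illustration, not from the slides):

import numpy as np

# Estimate E[X] for one roll of a fair six-sided die by simulation.
rng = np.random.default_rng(0)
N = 100_000
rolls = rng.integers(1, 7, size=N)  # N IID rolls, values 1..6
print(rolls.mean())                 # close to the true E[X] = 3.5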

  7. Histogram of large random IID samples approximates the probability distribution. Given N IID random variables X_1, ..., X_N, define for a bin [c_1, c_2) the indicator Y_i = 1 if c_1 \le X_i < c_2 and Y_i = 0 otherwise. According to the law of large numbers, \bar{Y} = \frac{\sum_{i=1}^{N} Y_i}{N} \to E[Y_i] as N \to \infty. As we know, for an indicator function E[Y_i] = P(c_1 \le X_i < c_2) = P(c_1 \le X < c_2), so the fraction of samples falling in each bin approximates that bin's probability.
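
A short check of this claim (Python with NumPy; the die and the bin [2, 4) are our own choices for illustration):

import numpy as np

rng = np.random.default_rng(1)
samples = rng.integers(1, 7, size=100_000)  # IID rolls of a fair die
c1, c2 = 2, 4                               # one histogram bin [c1, c2)
Y = (c1 <= samples) & (samples < c2)        # indicator Y_i for each sample
print(Y.mean())                             # close to P(2 <= X < 4) = 2/6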

  8. Objectives: Bernoulli distribution (Bernoulli trials); binomial distribution; geometric distribution; discrete uniform distribution; continuous random variables.

  9. Random variables. A random variable maps random outcomes \omega to numbers: X(\omega). For the Bernoulli example, X(\omega) = 1 if \omega is a head and X(\omega) = 0 if \omega is a tail. It's a function!

  10. Bernoulli random variable. X(\omega) = 1 if \omega is in the event A (head), and X(\omega) = 0 otherwise, with P(A) = p. What is P(X = k)?

  11. Bernoulli distribution. E[X] = ? var[X] = ? E[X] = \sum_x x\, p(x) = 1 \cdot p + 0 \cdot (1-p) = p. var[X] = E[X^2] - E[X]^2 = (1^2 \cdot p + 0^2 \cdot (1-p)) - p^2 = p - p^2 = p(1-p).

  12. Bernoulli distribution (the same computation, shown with the PMF). The PMF p(x) puts mass 1-p at x = 0 and p at x = 1; again E[X] = \sum_x x\, p(x) = p and var[X] = E[X^2] - E[X]^2 = p - p^2 = p(1-p).

  13. Bernoulli distribution. A random variable X is Bernoulli if it takes on two values 0 and 1 such that P(X = x) = p if x = 1 and 1-p if x = 0. E[X] = p, var[X] = p(1-p). Jacob Bernoulli (1654-1705). Credit: wikipedia.
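
A quick numerical sanity check of these two formulas (Python with NumPy; p = 0.3 is an arbitrary choice):

import numpy as np

p = 0.3
rng = np.random.default_rng(2)
X = (rng.random(100_000) < p).astype(int)  # Bernoulli(p) samples
print(X.mean(), X.var())                   # close to p = 0.3 and p(1-p) = 0.21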

  14. Bernoulli distribution. Examples: tossing a biased (or fair) coin; making a free throw; rolling a six-sided die and checking if it shows 6; any indicator function of a random variable: I_A = 1 if event A happens and 0 otherwise, with P(A) = p, so E[I_A] = 1 \cdot P(A) + 0 \cdot (1 - P(A)) = P(A).

  15. Binomial distribution. The binomial RV X_s is the sum of N independent Bernoulli RVs: X_s = \sum_{i=1}^{N} X_i, where X_i(\omega) = 1 if \omega is a head and 0 otherwise. What is the range of X_s? The integers 0, 1, ..., N.

  16. Binomial distribution. Toss a biased coin N times: how many heads? P(X_s = k) = \binom{N}{k} p^k (1-p)^{N-k} for k \in \{0, 1, \dots, N\}.

  17. (Sketch: among the N positions in the sequence of tosses, choosing which k positions are heads gives the factor \binom{N}{k}.)

  18. Expectations of binomial distribution. A discrete random variable X is binomial if P(X = k) = \binom{N}{k} p^k (1-p)^{N-k} for integer 0 \le k \le N, with E[X] = Np and var[X] = Np(1-p). (With N = 1 this reduces to a Bernoulli.) One could compute E[X] = \sum_k k\, p(k) and var[X] = E[X^2] - E[X]^2 directly from the PMF, but the next two slides use the sum-of-Bernoullis view instead; a numeric check follows below.
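
A minimal sketch that checks E[X] = Np and var[X] = Np(1-p) by brute-force summation over the PMF (Python; N = 10 and p = 0.3 are arbitrary choices):

from math import comb

def binom_pmf(k, N, p):
    # P(X = k) = C(N, k) p^k (1-p)^(N-k)
    return comb(N, k) * p**k * (1 - p)**(N - k)

N, p = 10, 0.3
mean = sum(k * binom_pmf(k, N, p) for k in range(N + 1))
var = sum((k - mean)**2 * binom_pmf(k, N, p) for k in range(N + 1))
print(mean, N * p)           # both 3.0
print(var, N * p * (1 - p))  # both 2.1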

  19. Expectation: with X_s = \sum_{i=1}^{N} X_i, E[X_s] = E\left[\sum_{i=1}^{N} X_i\right] = \sum_{i=1}^{N} E[X_i] = Np, since each X_i is a Bernoulli RV with E[X_i] = p.

  20. Variance: var[X_s] = var\left[\sum_{i=1}^{N} X_i\right] = \sum_{i=1}^{N} var[X_i], because the X_i are independent and identically distributed, so the cross (covariance) terms vanish. Each Bernoulli has var[X_i] = p(1-p), giving var[X_s] = Np(1-p).

  21. Binomial distribution. Normalization: \sum_{k=0}^{N} \binom{N}{k} p^k q^{N-k} = (p+q)^N = 1, since q = 1-p. (Plot of the PMF for p = 0.5.) Credit: Prof. Grinstead.

  22. Binomial distribution: die example. Let X be the number of sixes in 36 rolls of a fair six-sided die. What is P(X = k) for k = 5, 6, 7? With p = \frac{1}{6} and N = 36: P(X = k) = \binom{36}{k} \left(\frac{1}{6}\right)^k \left(\frac{5}{6}\right)^{36-k}. Calculate E[X] and var[X]: E[X] = Np = 36 \cdot \frac{1}{6} = 6 and var[X] = Np(1-p) = 36 \cdot \frac{1}{6} \cdot \frac{5}{6} = 5.
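
Evaluating the three probabilities asked for on this slide (Python, a sketch):

from math import comb

N, p = 36, 1/6
for k in (5, 6, 7):
    # P(X = k) = C(36, k) (1/6)^k (5/6)^(36-k)
    print(k, comb(N, k) * p**k * (1 - p)**(N - k))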

  23. Geometric distribution. Flip a coin with P(H) = p until the first head appears: P(1 flip) = P(H) = p; P(2 flips) = P(TH) = (1-p)p; P(3 flips) = P(TTH) = (1-p)^2 p; ...; P(k flips) = (1-p)^{k-1} p for k \ge 1.

  24. Geometric distribution. A discrete random variable X is geometric if P(X = k) = (1-p)^{k-1} p for k \ge 1 (the outcomes H, TH, TTH, TTTH, TTTTH, TTTTTH, ...). Expected value and variance: E[X] = \frac{1}{p} and var[X] = \frac{1-p}{p^2}.
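
A sketch that checks both moments against truncated sums of the PMF (Python; p = 0.25 and the truncation point are arbitrary choices):

def geom_pmf(k, p):
    # P(X = k) = (1-p)^(k-1) p, k >= 1
    return (1 - p)**(k - 1) * p

p = 0.25
mean = sum(k * geom_pmf(k, p) for k in range(1, 2000))
var = sum(k * k * geom_pmf(k, p) for k in range(1, 2000)) - mean**2
print(mean, 1 / p)          # both ~ 4.0
print(var, (1 - p) / p**2)  # both ~ 12.0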

  25. Geometric distribution. P(X = k) = (1-p)^{k-1} p, k \ge 1. (Plots of the PMF for p = 0.2 and p = 0.5.) Credit: Prof. Grinstead.

  26. Geometric distribution. Examples: How many rolls of a six-sided die will it take to see the first 6? How many Bernoulli trials must be done before the first 1? How many experiments are needed to have the first success? The geometric distribution plays an important role in the theory of queues.

  27. Derivation of geometric expected value. E[X] = \sum_{k=1}^{\infty} k\, p(k) = \sum_{k=1}^{\infty} k (1-p)^{k-1} p = p \sum_{k=1}^{\infty} k (1-p)^{k-1} = \frac{p}{1-p} \sum_{k=1}^{\infty} k (1-p)^k.

  31. Derivation of geometric expected value (completed). E[X] = \frac{p}{1-p} \sum_{k=1}^{\infty} k (1-p)^k. For |x| < 1 we have the power series \sum_{n=1}^{\infty} n x^n = \frac{x}{(1-x)^2}; with x = 1-p this gives E[X] = \frac{p}{1-p} \cdot \frac{1-p}{p^2} = \frac{1}{p}.

  32. Derivation of the power series S(x) = \sum_{n=1}^{\infty} n x^n = \frac{x}{(1-x)^2}, |x| < 1. Proof: \sum_{n=0}^{\infty} x^n = \frac{1}{1-x} for |x| < 1, and \frac{S(x)}{x} = \sum_{n=1}^{\infty} n x^{n-1}, so \int_0^x \frac{S(t)}{t}\, dt = \sum_{n=1}^{\infty} x^n = \frac{x}{1-x}. Differentiating both sides, \frac{S(x)}{x} = \left(\frac{x}{1-x}\right)' = \frac{1}{(1-x)^2}, hence S(x) = \frac{x}{(1-x)^2}.
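
A one-line numeric check of this identity (Python; x = 0.4 and the truncation at n = 500 are arbitrary choices):

x = 0.4
S = sum(n * x**n for n in range(1, 500))  # truncated infinite sum
print(S, x / (1 - x)**2)                  # both ~ 1.1111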

  33. Geometric distribution: die example. Let X be the number of rolls of a fair six-sided die needed to see the first 6. What is P(X = k) for k = 1, 2? With p = \frac{1}{6}: P(X = 1) = p = \frac{1}{6} and P(X = 2) = (1-p)p = \frac{5}{6} \cdot \frac{1}{6} = \frac{5}{36}. Calculate E[X] and var[X]: E[X] = \frac{1}{p} = 6 and var[X] = \frac{1-p}{p^2} = \frac{5/6}{1/36} = 30.
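
The same example by simulation (Python with NumPy; NumPy's geometric sampler counts the trial on which the first success occurs, matching the convention k >= 1 used here):

import numpy as np

rng = np.random.default_rng(3)
waits = rng.geometric(p=1/6, size=100_000)  # rolls needed to see the first 6
print(waits.mean(), waits.var())            # close to E[X] = 6 and var[X] = 30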

  34. Betting brainteaser. What would you rather bet on: how many rolls of a fair six-sided die it will take to see the first 6, or how many sixes will appear in 36 rolls of a fair six-sided die? Why? (Hint: both quantities have expected value 6, but their variances, 30 versus 5, are very different.)
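
One way to see the answer empirically (Python with NumPy, a sketch comparing the two bets):

import numpy as np

rng = np.random.default_rng(4)
first_six = rng.geometric(p=1/6, size=100_000)         # rolls until first 6
sixes_in_36 = rng.binomial(n=36, p=1/6, size=100_000)  # sixes in 36 rolls
print(first_six.mean(), first_six.var())      # mean ~ 6, variance ~ 30
print(sixes_in_36.mean(), sixes_in_36.var())  # mean ~ 6, variance ~ 5
# Same mean, but the binomial count is far more predictable.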

  35. Multinomial distribution. A discrete random variable X is multinomial if P(X_1 = n_1, X_2 = n_2, \dots, X_k = n_k) = \frac{N!}{n_1!\, n_2! \cdots n_k!}\, p_1^{n_1} p_2^{n_2} \cdots p_k^{n_k}, where N = n_1 + n_2 + \dots + n_k. This is the probability, when throwing a k-sided die N times, of getting value 1 a total of n_1 times, value 2 a total of n_2 times, ..., value k a total of n_k times.

  36. Multinomial distribution. The same multinomial coefficient counts arrangements: for example, the number of distinct orderings of the letters of ILLINOIS (three I's, two L's, and one each of N, O, S) is \frac{8!}{3!\, 2!\, 1!\, 1!\, 1!}.

  37. Multinomial distribution. Examples: If we roll a six-sided die N times, how many of each value will we see? What are the counts in N independent and identically distributed trials? The multinomial is very widely used in genetics.

  38. Multinomial distribution: die example. What is the probability of seeing 1 one, 2 twos, 3 threes, 4 fours, 5 fives and 0 sixes in 15 rolls of a fair six-sided die? P = \frac{15!}{1!\, 2!\, 3!\, 4!\, 5!\, 0!} \left(\frac{1}{6}\right)^{15}.
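
Evaluating that expression (Python, a sketch):

from math import factorial

counts = [1, 2, 3, 4, 5, 0]  # ones, twos, threes, fours, fives, sixes
N = sum(counts)              # 15 rolls in total
coef = factorial(N)          # multinomial coefficient N! / (n_1! ... n_k!)
for n in counts:
    coef //= factorial(n)
print(coef * (1 / 6)**N)     # each face has p_i = 1/6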

  39. Discrete uniform distribution. A discrete random variable X is uniform if it takes k different values x_1, \dots, x_k and P(X = x_i) = \frac{1}{k} for all x_i that X can take. For example: rolling a fair k-sided die; tossing a fair coin (k = 2).

  40. Discrete uniform distribution. Expectation of a discrete random variable X that takes k different values uniformly: E[X] = \frac{1}{k} \sum_{i=1}^{k} x_i. Variance of a uniformly distributed random variable X: var[X] = \frac{1}{k} \sum_{i=1}^{k} (x_i - E[X])^2.
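
Applying both formulas to the fair six-sided die (Python, a sketch):

xs = [1, 2, 3, 4, 5, 6]  # values of a fair six-sided die, each with prob 1/6
k = len(xs)
mean = sum(xs) / k                        # E[X] = 3.5
var = sum((x - mean)**2 for x in xs) / k  # 35/12 ~ 2.9167
print(mean, var)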
