  1. ENGG 2430 / ESTR 2004: Probability and Statistics Spring 2019 12. Classical statistics Andrej Bogdanov

  2. Estimators X = (X1, …, Xn) independent samples. Unbiased: E[Q̂n] = q. Consistent: Q̂n converges to q in probability.

  3. Estimating the mean X = (X1, …, Xn) independent samples of X. M̂ = (X1 + … + Xn)/n. Unbiased? Consistent?
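
A minimal simulation sketch (assuming Exponential samples with mean 0.5, which the slide leaves unspecified) illustrating both properties: the average of M̂ over many trials stays near the true mean for every n, while its spread shrinks as n grows.

```python
# Checks empirically that the sample mean M^ = (X1 + ... + Xn)/n is
# unbiased (its average over many trials is close to E[X] = 0.5) and
# consistent (it concentrates around 0.5 as n grows).
import numpy as np

rng = np.random.default_rng(0)
true_mean = 0.5                        # assumed E[X]

for n in [10, 100, 10_000]:
    samples = rng.exponential(scale=true_mean, size=(5_000, n))
    means = samples.mean(axis=1)       # one M^ per trial
    print(n, means.mean(), means.std())
```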

  4. Maximum likelihood Bayesian MAP estimate: maximize f Q|X(q|x) = f X|Q(x|q) f Q(q) / f X(x). Classical ML (maximum likelihood) estimate: maximize f X|Q(x|q).
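
A small sketch contrasting the two estimates for the coin-flip example that follows, under an assumed Beta(2, 2) prior on the bias; the prior is not given in the slides.

```python
# ML maximizes the likelihood f_X|Q(x|q); MAP additionally weighs in the
# prior f_Q(q).  Observations: HHT (2 heads, 1 tail); prior assumed Beta(2, 2).
import numpy as np

q = np.linspace(0.001, 0.999, 999)
likelihood = q**2 * (1 - q)              # f_X|Q(HHT | q)
prior = q * (1 - q)                      # Beta(2, 2) density, up to a constant
print(q[np.argmax(likelihood)])          # ML estimate, about 2/3
print(q[np.argmax(likelihood * prior)])  # MAP estimate, about 3/5
```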

  5. Coin flip sequence HHT. What is the ML estimate of the bias?

  6. Maximum likelihood for Bernoulli(p): k heads, n – k tails. What is the ML estimate of the bias p?
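
A sketch checking the closed-form answer p̂ = k/n against a direct grid maximization of the Bernoulli likelihood p^k (1 − p)^(n − k).

```python
# For each (k, n), the grid argmax of the likelihood matches k/n
# to grid precision.
import numpy as np

p = np.linspace(0.001, 0.999, 999)
for k, n in [(2, 3), (7, 10), (1, 20)]:
    likelihood = p**k * (1 - p)**(n - k)
    print(k / n, p[np.argmax(likelihood)])
```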

  7. Within the first 3 seconds, raindrops arrive at times 1.2, 1.9, and 2.5. What is the estimated rate?

  8. Within the first 3 seconds, raindrops arrive at times 1.2, 1.9, and 2.5. What is the estimated rate?

  9. The first 3 raindrops arrive at 1.2, 1.9, and 2.5 sec. What is the estimated rate?

  10. Maximum likelihood for Exponential(λ)
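
A sketch of the resulting estimate λ̂ = n / (X1 + … + Xn), applied to the raindrop data read as interarrival gaps observed up to the third drop (the slide-9 setting; this reading is an assumption, not spelled out in the slides).

```python
# Arrivals at 1.2, 1.9, 2.5 sec give gaps 1.2, 0.7, 0.6; the ML rate is
# the number of gaps divided by their sum.
arrivals = [1.2, 1.9, 2.5]
gaps = [b - a for a, b in zip([0.0] + arrivals, arrivals)]
rate = len(gaps) / sum(gaps)             # 3 / 2.5 = 1.2 drops/sec
print(gaps, rate)
```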

  11. A Normal(µ, σ) RV takes values 2.9, 3.3. What is the ML estimate for µ?

  12. A Normal(µ, σ) RV takes values 2.9, 3.3. What is the ML estimate for v = σ²?

  13. Maximum likelihood for Normal(µ, σ): (X1, …, Xn) independent Normal(µ, σ). Joint ML estimate (M̂, V̂) of (µ, v = σ²): M̂ = (X1 + … + Xn)/n and V̂ = ((X1 – M̂)² + … + (Xn – M̂)²)/n.
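
A sketch evaluating these formulas on the two observations 2.9 and 3.3 from the preceding slides.

```python
# M^ is the sample mean and V^ divides the squared deviations by n
# (not n - 1).
import numpy as np

x = np.array([2.9, 3.3])
m_hat = x.mean()                         # (2.9 + 3.3)/2 = 3.1
v_hat = ((x - m_hat)**2).mean()          # (0.2^2 + 0.2^2)/2 = 0.04
print(m_hat, v_hat)
```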

  14. E[V̂] = ((n – 1)/n) v, so V̂ is a biased estimate of v.

  15. (X1, …, Xn) independent Normal(µ, σ): M̂ = (X1 + … + Xn)/n, and the unbiased variance estimate is (n/(n – 1)) V̂ = ((X1 – M̂)² + … + (Xn – M̂)²)/(n – 1).
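
A simulation sketch of the bias: with n = 5 Normal(0, 1) samples, V̂ (divide by n) averages near (n − 1)/n = 0.8, while dividing by n − 1 averages near the true variance 1.

```python
# Compares the ML variance estimate with the n-1 corrected one over
# many trials.
import numpy as np

rng = np.random.default_rng(1)
n = 5
x = rng.normal(size=(100_000, n))
v_hat = x.var(axis=1, ddof=0)            # divide by n
v_unbiased = x.var(axis=1, ddof=1)       # divide by n - 1
print(v_hat.mean(), v_unbiased.mean())   # about 0.8 and about 1.0
```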

  16. A Normal(µ, 1) RV takes values X1, X2. You estimate the mean by M̂ = (X1 + X2)/2. What is the probability that |M̂ – µ| > 1?

  17. For which value of t can we guarantee |M̂ – µ| ≤ t with 95% probability?
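
A sketch answering both questions: M̂ = (X1 + X2)/2 is Normal with mean µ and standard deviation 1/√2, so each reduces to a standard normal tail probability (computed here with scipy).

```python
# Tail probability and 95% half-width for the two-sample mean estimate.
from math import sqrt
from scipy.stats import norm

sd = 1 / sqrt(2)                         # standard deviation of M^
print(2 * norm.sf(1 / sd))               # P(|M^ - mu| > 1), about 0.157
print(norm.ppf(0.975) * sd)              # t with |M^ - mu| <= t at 95%, about 1.386
```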

  18. Confidence intervals A p-confidence interval is a pair Q̂−, Q̂+ so that P(q is between Q̂− and Q̂+) ≥ p.

  19. A car-jack detector outputs Normal(0, 1) if there is no intruder and Normal(1, 1) if there is. You want to catch 95% of intrusions. What is the probability of a false positive?
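
A sketch assuming the detector raises an alarm when its output exceeds a threshold c, chosen so that 95% of Normal(1, 1) outputs trigger it; the false positive rate is then how often Normal(0, 1) outputs also do.

```python
# Threshold for 95% detection under Normal(1, 1), and the resulting
# false positive probability under Normal(0, 1).
from scipy.stats import norm

c = norm.ppf(0.05, loc=1, scale=1)       # threshold, about -0.645
false_positive = norm.sf(c)              # P(Normal(0,1) > c), about 0.74
print(c, false_positive)
```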

  20. Hypothesis testing

  21. Neyman-Pearson Lemma Among all X1 vs. X0 tests with a given false negative probability, the false positive probability is minimized by the one that picks the samples x with the largest likelihood ratio f X1(x) / f X0(x).
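
A sketch of the lemma in the detector setting above (X0 Normal(0, 1), X1 Normal(1, 1)): the likelihood ratio is increasing in x, so thresholding the ratio is the same as thresholding the raw output.

```python
# The ratio f_X1(x)/f_X0(x) = exp(x - 1/2) is monotone in x.
import numpy as np
from scipy.stats import norm

x = np.linspace(-3, 4, 8)
ratio = norm.pdf(x, loc=1) / norm.pdf(x, loc=0)
print(np.all(np.diff(ratio) > 0))        # True: ratio increases with x
```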

  22. Rain usually falls at 1 drop/sec. You want to test whether today's rate is 5 drops/sec based on the first drop. How do you set up a test with a 5% false negative probability?
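
One way to read the problem (an assumption, since the slide leaves the model implicit): the first drop time T is Exponential(5) under today's hypothesized rate and Exponential(1) otherwise, and the Neyman-Pearson test declares rate 5 when T is small.

```python
# The likelihood ratio f_5(t)/f_1(t) = 5*exp(-4t) is decreasing in t,
# so the test accepts "rate 5" when T <= c, with c set so that only 5%
# of rate-5 days are missed.
from scipy.stats import expon

c = expon.ppf(0.95, scale=1/5)           # P(T > c | rate 5) = 0.05, c ~ 0.6 sec
false_positive = expon.cdf(c, scale=1)   # P(T <= c | rate 1), about 0.45
print(c, false_positive)
```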
