An Introduction to Bayesian Inference and MCMC Methods



  1. An Introduction to Bayesian Inference and MCMC Methods for Capture-Recapture. Trinity River Restoration Program Workshop on Outmigration: Population Estimation, October 6–8, 2009

  2. An Introduction to Bayesian Inference
     1. The Binomial Model: Maximum Likelihood Estimation; Bayesian Inference and the Posterior Density; Summarizing the Posterior Density
     2. MCMC Methods and the Binomial Model: An Introduction to MCMC; An Introduction to WinBUGS
     3. Two-Stage Capture-Recapture Models: The Simple-Petersen Model; The Stratified-Petersen Model; The Hierarchical-Petersen Model
     4. Further Issues in Bayesian Statistics and MCMC: Monitoring MCMC Convergence; Model Selection and the DIC; Goodness-of-Fit and Bayesian p-values
     5. Bayesian Penalized Splines

  3. An Introduction to Bayesian Inference
     1. The Binomial Model: Maximum Likelihood Estimation; Bayesian Inference and the Posterior Density; Summarizing the Posterior Density

  4. An Introduction to Bayesian Inference
     1. The Binomial Model: Maximum Likelihood Estimation; Bayesian Inference and the Posterior Density; Summarizing the Posterior Density

  5. Maximum Likelihood Estimation: The Binomial Distribution
     Setup
     ◮ a population contains a fixed and known number of marked individuals (n)
     Assumptions
     ◮ every individual has the same probability of being captured (p)
     ◮ individuals are captured independently
     Probability Mass Function
     The probability that m of the n individuals are captured is:
     P(m | p) = \binom{n}{m} p^m (1 - p)^{n - m}

  6. Maximum Likelihood Estimation: The Binomial Distribution
     If n = 30 and p = 0.8:
     [Figure: probability mass function of the Binomial(30, 0.8) distribution, probability plotted against m = 0, ..., 30]
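
     This plot can be reproduced with base R's dbinom; a minimal sketch (variable names are illustrative, not from the workshop files):

        # Plot the Binomial(n = 30, p = 0.8) probability mass function
        n <- 30; p <- 0.8
        m <- 0:n
        plot(m, dbinom(m, size = n, prob = p), type = "h",
             xlab = "m", ylab = "Probability", main = "Probability Mass Function")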

  7. Maximum Likelihood Estimation: The Likelihood Function
     Definition
     The likelihood function is equal to the probability mass function of the observed data, viewed as a function of the parameter values with the data held fixed.
     The likelihood function for the binomial experiment is:
     L(p | m) = \binom{n}{m} p^m (1 - p)^{n - m}

  8. Maximum Likelihood Estimation: The Likelihood Function
     If n = 30 and m = 24:
     [Figure: likelihood function L(p | m) plotted against p]
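
     A minimal R sketch of this likelihood curve, evaluating the binomial probability over a grid of p values (illustrative):

        # Likelihood for n = 30, m = 24 as a function of p
        n <- 30; m <- 24
        p <- seq(0, 1, by = 0.001)
        plot(p, dbinom(m, size = n, prob = p), type = "l",
             xlab = "p", ylab = "Likelihood")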

  9. Maximum Likelihood Estimation: Maximum Likelihood Estimates
     Definition
     The maximum likelihood estimator is the value of the parameter which maximizes the likelihood function for the observed data.
     The maximum likelihood estimator of p for the binomial experiment is:
     \hat{p} = m / n

  10. Maximum Likelihood Estimation: Maximum Likelihood Estimates
      If n = 30 and m = 24 then \hat{p} = 24/30 = 0.8:
      [Figure: likelihood function plotted against p, with its maximum marked at \hat{p} = 0.8]
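
      A quick numerical check in R that the likelihood is maximized at m/n (illustrative):

        # Maximize the likelihood numerically and compare with the closed-form MLE
        n <- 30; m <- 24
        fit <- optimize(function(p) dbinom(m, size = n, prob = p),
                        interval = c(0, 1), maximum = TRUE)
        fit$maximum   # approximately 0.8
        m / n         # closed-form MLE, 0.8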

  11. Maximum Likelihood Estimation: Measures of Uncertainty
      Imagine that the same experiment could be repeated many times without changing the value of the parameter.
      Definition 1
      The standard error of the estimator is the standard deviation of the estimates computed from each of the resulting data sets.
      Definition 2
      A 95% confidence interval is a pair of values which, computed in the same manner for each data set, would bound the true value for at least 95% of the repetitions.
      The estimated standard error for the capture probability is:
      SE_{\hat{p}} = \sqrt{\hat{p}(1 - \hat{p}) / n}
      A 95% confidence interval has bounds \hat{p} - 1.96 SE_{\hat{p}} and \hat{p} + 1.96 SE_{\hat{p}}.

  12. Maximum Likelihood Estimation: Measures of Uncertainty
      If n = 30 and m = 24 then:
      ◮ the standard error of \hat{p} is SE_{\hat{p}} = 0.07
      ◮ a 95% confidence interval for p is (0.66, 0.94)
      [Figure: likelihood function plotted against p, with the maximum likelihood estimate marked]
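
      These values follow directly from the formulas on the previous slide; an illustrative R check:

        # Standard error and 95% Wald confidence interval for n = 30, m = 24
        n <- 30; m <- 24
        p_hat <- m / n
        se <- sqrt(p_hat * (1 - p_hat) / n)
        round(se, 2)                              # 0.07
        round(p_hat + c(-1.96, 1.96) * se, 2)     # 0.66 0.94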

  13. An Introduction to Bayesian Inference
      1. The Binomial Model: Maximum Likelihood Estimation; Bayesian Inference and the Posterior Density; Summarizing the Posterior Density

  14. Bayesian Inference and the Posterior Density: Combining Data from Multiple Experiments
      Pilot Study
      ◮ Data: n = 20, m = 10
      ◮ Likelihood: \binom{20}{10} p^{10} (1 - p)^{10}
      Full Experiment
      ◮ Data: n = 30, m = 24
      ◮ Likelihood: \binom{30}{24} p^{24} (1 - p)^{6}
      Combined Analysis
      ◮ Likelihood: \binom{20}{10} \binom{30}{24} p^{34} (1 - p)^{16}
      ◮ Estimate: \hat{p} = 34/50 = 0.68
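
      Because the two experiments are independent, the combined likelihood is the product of the two binomial likelihoods, and it is maximized at the pooled proportion; a quick R check (illustrative):

        # Pool the counts from the pilot study and the full experiment
        n <- 20 + 30; m <- 10 + 24
        m / n   # 34/50 = 0.68, the combined MLE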

  15. Bayesian Inference and the Posterior Density: Combining Data with Prior Beliefs I
      Prior Beliefs
      ◮ Hypothetical Data: n = 20, m = 10
      ◮ Prior Density: \binom{20}{10} p^{10} (1 - p)^{10}
      Full Experiment
      ◮ Data: n = 30, m = 24
      ◮ Likelihood: \binom{30}{24} p^{24} (1 - p)^{6}
      Posterior Beliefs
      ◮ Posterior Density: \binom{20}{10} \binom{30}{24} p^{34} (1 - p)^{16}
      ◮ Estimate: \hat{p} = 34/50 = 0.68

  16. Bayesian Inference and the Posterior Density: Combining Data with Prior Beliefs I
      [Figure: prior density, likelihood, and posterior density plotted against p]
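
      Up to normalizing constants, the prior here is proportional to p^{10} (1 - p)^{10}, a Beta(11, 11) density, and the posterior is a Beta(35, 17) density. A minimal R sketch of this figure (illustrative only, not the workshop's exercise code):

        # Prior Beta(11, 11), likelihood from Binomial(30, p) with m = 24, posterior Beta(35, 17)
        p    <- seq(0, 1, by = 0.001)
        lik  <- dbinom(24, size = 30, prob = p)
        post <- dbeta(p, 35, 17)
        plot(p, post, type = "l", xlab = "p", ylab = "Density")   # posterior
        lines(p, dbeta(p, 11, 11), lty = 2)                       # prior
        lines(p, lik / max(lik) * max(post), lty = 3)             # likelihood, rescaled for display
        legend("topleft", c("posterior", "prior", "likelihood"), lty = c(1, 2, 3))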

  17. Bayesian Inference and the Posterior Density: Combining Data with Prior Beliefs II
      Prior Beliefs
      ◮ Hypothetical Data: n = 2, m = 1
      ◮ Prior Density: \binom{2}{1} p (1 - p)
      Full Experiment
      ◮ Data: n = 30, m = 24
      ◮ Likelihood: \binom{30}{24} p^{24} (1 - p)^{6}
      Posterior Beliefs
      ◮ Posterior Density: \binom{2}{1} \binom{30}{24} p^{25} (1 - p)^{7}
      ◮ Estimate: \hat{p} = 25/32 = 0.78

  18. Bayesian Inference and the Posterior Density: Combining Data with Prior Beliefs II
      [Figure: prior density, likelihood, and posterior density plotted against p]
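
      Here the posterior is proportional to p^{25} (1 - p)^{7}, a Beta(26, 8) density, so the estimate quoted above is its mode; a quick check in R (illustrative):

        # Mode of the Beta(26, 8) posterior
        a <- 26; b <- 8
        (a - 1) / (a + b - 2)   # 25/32 = 0.78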

  19. An Introduction to Bayesian Inference
      1. The Binomial Model: Maximum Likelihood Estimation; Bayesian Inference and the Posterior Density; Summarizing the Posterior Density

  20. Summarizing the Posterior Density
      Fact
      A Bayesian posterior density is a true probability density which can be used to make direct probability statements about a parameter.
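
      For example, with the Beta(35, 17) posterior from the first prior example, a statement such as P(p > 0.7 | data) can be computed directly (an illustrative sketch, not part of the workshop code):

        # Posterior probability that the capture probability exceeds 0.7
        1 - pbeta(0.7, 35, 17)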

  21. Summarizing the Posterior Density: Bayesian Point Estimates
      Classical Point Estimates
      ◮ Maximum Likelihood Estimate
      Bayesian Point Estimates
      ◮ Posterior Mode

  22. Summarizing the Posterior Density: Bayesian Point Estimates
      Classical Point Estimates
      ◮ Maximum Likelihood Estimate
      Bayesian Point Estimates
      ◮ Posterior Mode
      ◮ Posterior Mean
      ◮ Posterior Median

  23. Summarizing the Posterior Density: Bayesian Uncertainty Estimates
      Classical Measures of Uncertainty
      ◮ Standard Error
      ◮ 95% Confidence Interval
      Bayesian Measures of Uncertainty
      ◮ Posterior Standard Deviation: the standard deviation of the posterior density
      ◮ 95% Credible Interval: any interval which contains 95% of the posterior density
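
      As an illustration with the Beta(35, 17) posterior from the first prior example, these Bayesian summaries can be computed from the Beta distribution in R (a sketch, not the workshop's code):

        # Posterior summaries for a Beta(35, 17) posterior
        a <- 35; b <- 17
        a / (a + b)                               # posterior mean
        qbeta(0.5, a, b)                          # posterior median
        (a - 1) / (a + b - 2)                     # posterior mode (= 0.68)
        sqrt(a * b / ((a + b)^2 * (a + b + 1)))   # posterior standard deviation
        qbeta(c(0.025, 0.975), a, b)              # equal-tailed 95% credible interval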

  24. Exercises
      1. Bayesian inference for the binomial experiment
      File: Intro to splines\Exercises\binomial 1.R
      This file contains code for plotting the prior density, likelihood function, and posterior density for the binomial model. Vary the values of n, m, and alpha to see how the shapes of these functions and the corresponding posterior summaries are affected.

  25. An Introduction to Bayesian Inference
      2. MCMC Methods and the Binomial Model: An Introduction to MCMC; An Introduction to WinBUGS

  26. An Introduction to Bayesian Inference
      2. MCMC Methods and the Binomial Model: An Introduction to MCMC; An Introduction to WinBUGS

  27. An Introduction to MCMC: Sampling from the Posterior
      Concept
      If the posterior density is too complicated to work with analytically, then we can estimate posterior quantities by generating a sample from the posterior density and computing sample statistics.
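
      For instance, if we could draw directly from the Beta(35, 17) posterior of the earlier example, sample statistics would approximate the corresponding posterior quantities (an illustrative R sketch):

        # Monte Carlo approximation of posterior summaries from Beta(35, 17) draws
        draws <- rbeta(10000, 35, 17)
        mean(draws)                          # approximates the posterior mean
        sd(draws)                            # approximates the posterior standard deviation
        quantile(draws, c(0.025, 0.975))     # approximates a 95% credible interval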

  28. An Introduction to MCMC: The Very Basics of Markov chain Monte Carlo
      Definition
      A Markov chain is a sequence of events such that the probabilities for one event depend only on the outcome of the previous event in the sequence.
      Key Property
      If we construct the Markov chain properly, then the probability density of the events can be made to match any probability density, including the posterior density.
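
      A minimal sketch of one such construction, a random-walk Metropolis sampler for the binomial example with the Beta(11, 11) prior and data n = 30, m = 24 (illustrative only; it is not the algorithm used by WinBUGS in the workshop):

        # Random-walk Metropolis sampler for the capture probability p
        set.seed(1)
        n <- 30; m <- 24
        log_post <- function(p) dbinom(m, n, p, log = TRUE) + dbeta(p, 11, 11, log = TRUE)
        n_iter <- 10000
        chain <- numeric(n_iter)
        p_cur <- 0.5
        for (i in 1:n_iter) {
          p_prop <- p_cur + rnorm(1, 0, 0.1)               # propose a small move
          if (p_prop > 0 && p_prop < 1 &&
              log(runif(1)) < log_post(p_prop) - log_post(p_cur)) {
            p_cur <- p_prop                                # accept the proposal
          }
          chain[i] <- p_cur                                # otherwise keep the current value
        }
        mean(chain)   # close to the Beta(35, 17) posterior mean, 35/52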
