

1. Prior Choice. Ahmad Parsian, School of Mathematics, Statistics and Computer Science, University of Tehran. April 2014.


2. Different types of Bayesians:
- Classical Bayesians,
- Modern Parametric Bayesians,
- Subjective Bayesians.

Prior choice:
- Informative prior, based on
  - expert knowledge (subjective),
  - historical data (objective).
  Subjective information is based on personal opinions and feelings rather than facts; objective information is based on facts.
- Uninformative prior, representing ignorance:
  - the Jeffreys prior,
  - priors based on the data in some way (reference priors).


3. Classical Bayesians:
- The prior is a necessary evil.
- Choose priors that interject the least information possible ("the least" = the minimum that should be done in the situation).

Modern Parametric Bayesians:
- The prior is a useful convenience.
- Choose prior distributions with desirable properties (e.g., conjugacy).
- Given a distributional choice, prior parameters are chosen to interject the least information.

Subjective Bayesians:
- The prior is a summary of old beliefs.
- Choose prior distributions based on previous knowledge (either the results of earlier studies or non-scientific opinion).


4. Example (Modern Parametric Bayesians). Suppose X ∼ N(θ, σ²). Let τ = 1/σ².

Q: What prior distribution would a Modern Parametric Bayesian choose to satisfy the demand of convenience?

A: Using the definition π(θ, τ) = π(θ | τ) π(τ), the prior choice is

  θ | τ ∼ N(µ₀, σ₀²/τ),   τ ∼ Gamma(α, β),

and you know that

  θ | τ, x ∼ Normal,   τ | x ∼ Gamma

(the explicit updates are sketched below).
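The slide asserts these conjugate conditionals without stating the update formulas. For a single observation x they can be written out explicitly; this is a sketch under the prior above, writing κ₀ = 1/σ₀² (a shorthand introduced here, not in the slides):

```latex
% Normal-Gamma conjugate update, assuming x | theta, tau ~ N(theta, 1/tau),
% theta | tau ~ N(mu_0, 1/(kappa_0 tau)), and tau ~ Gamma(alpha, beta):
\begin{align*}
  \theta \mid \tau, x &\sim N\!\left(\frac{\kappa_0\mu_0 + x}{\kappa_0 + 1},\;
                                     \frac{1}{(\kappa_0 + 1)\,\tau}\right),\\[4pt]
  \tau \mid x &\sim \mathrm{Gamma}\!\left(\alpha + \tfrac{1}{2},\;
                   \beta + \frac{\kappa_0\,(x - \mu_0)^2}{2(\kappa_0 + 1)}\right).
\end{align*}
```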


5. Example (continued). Q: What prior distribution would a Lazy Modern Parametric Bayesian choose to satisfy the demand of convenience?

A: Suppose you do not want to think too hard about the prior and factor it into independent pieces, π(θ, τ) = π(θ) π(τ). The prior choice is

  θ ∼ N(0, t),   τ ∼ Gamma(α, β).

Obviously, the marginal posteriors from this model would be a bit difficult analytically (in general), but it is easy to implement a Gibbs sampler (sketched below).
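A minimal Python sketch of that Gibbs sampler, assuming n i.i.d. observations xᵢ | θ, τ ∼ N(θ, 1/τ) and the independent prior above; the data and the hyperparameter values t, α, β are placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder data and hyperparameters, chosen only for illustration.
x = rng.normal(2.0, 1.5, size=50)   # pretend observations x_i | theta, tau ~ N(theta, 1/tau)
n = x.size
t, alpha, beta = 10.0, 2.0, 2.0     # prior: theta ~ N(0, t), tau ~ Gamma(alpha, rate beta)

def gibbs(n_iter=5000, theta=0.0, tau=1.0):
    """Alternate draws from the two full conditionals."""
    draws = np.empty((n_iter, 2))
    for i in range(n_iter):
        # theta | tau, x ~ Normal with precision 1/t + n*tau
        prec = 1.0 / t + n * tau
        theta = rng.normal(tau * x.sum() / prec, 1.0 / np.sqrt(prec))
        # tau | theta, x ~ Gamma(alpha + n/2, rate = beta + 0.5*sum((x_i - theta)^2))
        rate = beta + 0.5 * np.sum((x - theta) ** 2)
        tau = rng.gamma(alpha + n / 2.0, 1.0 / rate)  # numpy parameterizes by scale = 1/rate
        draws[i] = theta, tau
    return draws

samples = gibbs()
print("posterior means (theta, tau):", samples[1000:].mean(axis=0))
```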


6. The Main Talk.

  X = (X₁, …, Xₙ) ∼ f_θ(x)
  θ ∼ π(θ)
  θ | x ∼ π(θ | x)

  π(θ | x) = f_θ(x) π(θ) / m(x),

where m(x) = ∫ f_θ(x) π(θ) dθ is the marginal distribution of X.
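Since m(x) is an integral over θ, it rarely has a closed form; a grid approximation makes the mechanics concrete. A sketch with placeholder Bernoulli data and a Beta(2, 2) prior, both chosen only for illustration:

```python
import numpy as np

# Grid approximation of pi(theta | x) for Bernoulli data with a Beta(2, 2) prior.
x = np.array([1, 0, 0, 1, 0, 0, 0, 1, 0, 0])      # hypothetical 0/1 data
grid = np.linspace(0.001, 0.999, 999)
d = grid[1] - grid[0]
likelihood = grid ** x.sum() * (1 - grid) ** (x.size - x.sum())
prior = 6 * grid * (1 - grid)                      # Beta(2, 2) density
m_x = np.sum(likelihood * prior) * d               # marginal m(x) by Riemann sum
posterior = likelihood * prior / m_x               # pi(theta | x) on the grid
print("posterior mean:", np.sum(grid * posterior) * d)  # ~ (3 + 2)/(10 + 4) = 0.357
```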


7. Numerical Example. Let us concentrate on the following problem. Suppose X₁, …, Xₙ are i.i.d. B(1, θ); then Y = Σ Xᵢ ∼ B(n, θ). We need a prior on θ: take θ ∼ Beta(α, β). (Remember that this is a perfectly subjective choice and anybody can use their own.) So

  θ | y ∼ Beta(y + α, n − y + β).

Under Squared Error Loss (SEL), the Bayes estimate is

  δ^π(y) = (y + α) / (n + α + β)
         = [n / (n + α + β)] · (y / n) + [(α + β) / (n + α + β)] · [α / (α + β)],

which is a linear combination of the sample mean and the prior mean (checked numerically below).
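A quick numerical check of the convex-combination identity; the values of n, y, α, β are placeholders:

```python
# Check that delta(y) = (y + a)/(n + a + b) equals the convex combination
# w*(y/n) + (1 - w)*(a/(a + b)) with w = n/(n + a + b).
n, y, a, b = 10, 3, 2.0, 2.0
bayes = (y + a) / (n + a + b)
w = n / (n + a + b)                   # weight on the sample mean
combo = w * (y / n) + (1 - w) * (a / (a + b))
assert abs(bayes - combo) < 1e-12
print(bayes)                          # 0.3571...: shrinks y/n = 0.3 toward the prior mean 0.5
```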


8. Numerical Example. We have a coin. Is this a fair coin, i.e., is θ = 1/2? Suppose you flip it 10 times, and it comes up heads 3 times.

As a frequentist: we use the sample mean, i.e., θ̂ = 3/10 = 0.3.

As a Bayesian: we have to completely specify the prior distribution, i.e., we have to choose α and β. The choice again depends on our belief. Notice that:
- To estimate θ, a Bayesian analyst would put a prior distribution on θ and use the posterior distribution of θ to draw various conclusions, estimating θ with the posterior mean.
- When there is no strong prior opinion on what θ is, it is desirable to pick a prior that is NON-INFORMATIVE (a comparison of common choices is sketched below).
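For the 3-heads-in-10-flips data, the posterior means under a few standard priors can be compared directly; Beta(1, 1) and Beta(1/2, 1/2) are the usual non-informative choices, and Beta(100, 100) anticipates the next slide:

```python
# Posterior mean (y + a)/(n + a + b) for 3 heads in 10 flips under common priors.
n, y = 10, 3
priors = [("uniform Beta(1, 1)", 1.0, 1.0),
          ("Jeffreys Beta(1/2, 1/2)", 0.5, 0.5),
          ("fair-coin Beta(100, 100)", 100.0, 100.0)]
for name, a, b in priors:
    print(f"{name}: {(y + a) / (n + a + b):.4f}")
# uniform -> 0.3333, Jeffreys -> 0.3182, fair-coin -> 0.4905
```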


9. Numerical Example. If we feel strongly that this coin is like any other coin and therefore really should be a fair coin, we should choose α and β so that the prior puts all its weight around 1/2. E.g., with α = β = 100,

  E(θ) = α / (α + β) = 1/2 and Var(θ) = αβ / [(α + β + 1)(α + β)²] ≈ 0.0012.

Therefore,

  δ^π(3) = (3 + 100) / (10 + 100 + 100) ≈ 0.4905.
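A short check of these numbers:

```python
# Verify the Beta(100, 100) prior moments and the Bayes estimate quoted above.
a = b = 100.0
n, y = 10, 3
prior_mean = a / (a + b)                             # 0.5
prior_var = a * b / ((a + b + 1) * (a + b) ** 2)     # 1/804 ~ 0.0012
bayes = (y + a) / (n + a + b)                        # 103/210
print(prior_mean, round(prior_var, 4), round(bayes, 4))  # 0.5 0.0012 0.4905
```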
