Inference under the entropy-maximizing Bayesian model of sufficient evidence

  1. Inference under the entropy-maximizing Bayesian model of sufficient evidence. The Third International Conference on Mathematical and Computational Medicine, 18 May 2016. David Bickel, University of Ottawa.

  2. Loss function example. What act is appropriate?
     • Given multiple priors of sufficient evidence (agreement with the data)
     • With or without confidence sets

  3. Adequate models have sufficient evidence

  4. Assessing multiple models: of all models, those in conflict with the data are set aside, leaving the adequate models (models of sufficient evidential support).
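
     A minimal formalization of this assessment (notation mine, not from the slides): writing S(M; x) for the evidential support of model M given data x and s_0 for an adequacy threshold, the adequate models form the set
     \[
     \Gamma(x) \;=\; \{\, M \in \mathcal{M} : S(M; x) \ge s_0 \,\},
     \]
     and the models with S(M; x) < s_0 are those in conflict with the data.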

  5. Decisions under robust Bayes rules applied to models of sufficient evidence (Bickel, International Journal of Approximate Reasoning 66, 53-72). Adequate models give adequate posteriors, which together with the loss function yield robust Bayes acts:
     • E-admissible
     • Hurwicz criterion
     • Γ-minimax
     By contrast, a single model gives a single posterior and a single Bayes act.
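
     For concreteness, the Γ-minimax act over the set Γ of adequate posteriors is the standard one (general definition, not specific to this deck): it minimizes the worst-case posterior expected loss,
     \[
     a_{\Gamma\text{-minimax}} \;=\; \operatorname*{arg\,min}_{a \in \mathcal{A}} \; \sup_{P \in \Gamma} \; \mathbb{E}_{P}\!\left[ L(\theta, a) \right],
     \]
     while an act is E-admissible if it is a Bayes act under at least one P in Γ.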

  6. Bayesian model assessment. Does the prior or parametric family conflict with the data?
     • Yes, if it has low evidential support compared to alternatives
     • Measures of support satisfying the criteria of Bickel, International Statistical Review 81, 188-206:
       • Likelihood ratio (limited to comparing two simple models)
       • Bayes factor
       • Penalized likelihood ratio
     If no: use the model to take the Bayes act (minimize posterior expected loss), even if other models are adequate. If yes: change the model to an adequate model, a model of sufficient support.
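
     As a reminder of the second support measure listed above (standard definition, not taken from the slides), the Bayes factor comparing models M1 and M2 on data x is the ratio of their marginal likelihoods,
     \[
     \mathrm{BF}_{12}(x) \;=\; \frac{p(x \mid M_1)}{p(x \mid M_2)}
     \;=\; \frac{\int f_1(x \mid \theta)\,\pi_1(\theta)\,d\theta}{\int f_2(x \mid \theta)\,\pi_2(\theta)\,d\theta},
     \]
     which reduces to the likelihood ratio when both models are simple.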

  7. Decisions under the entropy-maximizing Bayesian model of sufficient evidence (Bickel, International Journal of Approximate Reasoning 66, 53-72). Adequate models give adequate posteriors; blending them, using nested confidence sets, produces a single posterior, which with the loss function yields the Bayes act.
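
     One way to read "entropy-maximizing" in the slide title (a sketch of one possible construction, not a definition taken from the deck or the cited paper): among the adequate posteriors, act under the one of maximum entropy,
     \[
     P^{\star} \;=\; \operatorname*{arg\,max}_{P \in \Gamma} \; H(P),
     \qquad
     H(P) \;=\; -\int p(\theta)\,\log p(\theta)\,d\theta,
     \]
     so that the least informative posterior consistent with sufficient evidence determines the Bayes act.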

  9. Decisions under entropy-based pooling of Bayesian models of sufficient evidence (Bickel, International Journal of Approximate Reasoning 66, 53-72). Adequate models give adequate posteriors; pooling them produces a single posterior, and the act minimizing its expected loss is taken.
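
     A common entropy-based pooling operator (shown here as an illustration under my assumptions; the deck defers to the cited paper for the actual rule) is the logarithmic opinion pool, the distribution minimizing a weighted sum of Kullback-Leibler divergences to the adequate posteriors p_1, ..., p_k:
     \[
     p_{\mathrm{pool}}(\theta) \;\propto\; \prod_{i=1}^{k} p_i(\theta)^{w_i},
     \qquad
     w_i \ge 0, \quad \sum_{i=1}^{k} w_i = 1 .
     \]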

  10. The three players
     • Combiner is you, the statistician who combines the candidate distributions. Goal 1: minimize error; Goal 2: beat Chooser.
     • Chooser is the imaginary statistician who instead chooses a single candidate distribution. Goal 1: minimize error; Goal 2: beat Combiner.
     • Nature picks the true distribution among those that are plausible in order to help Chooser.

  11. Information game
     • Information divergence and inferential gain:
     • Utility paid to Statistician 2 (Combiner or Chooser):
     • Reduction to game of Combiner v. Nature-Chooser Coalition:
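
     A standard way to formalize such an information game (a sketch under assumed definitions, using the Kullback-Leibler divergence and a baseline distribution P_0 of my choosing; not necessarily the equations on the original slide): with P* the true distribution picked by Nature and Q the distribution reported by Statistician 2, the inferential gain of Q, and hence the utility paid to Statistician 2, could be taken as
     \[
     U(Q; P^{*}) \;=\; D\!\left(P^{*} \,\|\, P_0\right) \;-\; D\!\left(P^{*} \,\|\, Q\right),
     \qquad
     D(P \,\|\, Q) \;=\; \int \log\frac{dP}{dQ}\, dP ,
     \]
     so Combiner and Chooser each try to stay close to the truth in divergence, while Nature, in coalition with Chooser, picks P* among the plausible distributions to favor Chooser.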

  12. Robust Bayes acts from the adequate models and adequate posteriors:
     • E-admissible
     • Hurwicz criterion
     • Γ-minimax
     Alternatively, blending or pooling the adequate posteriors, with nested confidence sets, gives a single posterior and its Bayes act under the loss function. Funded by the Natural Sciences and Engineering Research Council of Canada.
