Inference under the entropy-maximizing Bayesian model of sufficient evidence


SLIDE 1

Inference under the entropy-maximizing Bayesian model of sufficient evidence

The Third International Conference on Mathematical and Computational Medicine 18 May 2016 David Bickel University of Ottawa

SLIDE 2

Loss function example

What act is appropriate?

  • Given multiple priors of sufficient evidence, i.e., priors in agreement with the data
  • With or without confidence sets
SLIDE 3

Adequate models have sufficient evidence

SLIDE 4

Assessing multiple models

[Diagram: within the set of all models, the adequate models (models of sufficient evidential support) are separated from the models in conflict with the data]

SLIDE 5

Adequate models

[Diagram: each adequate model (1 model) yields 1 posterior; the adequate posteriors, together with a loss function, determine the Bayes act via robust Bayes acts:]

  • E-admissible
  • Hurwicz criterion
  • Γ-minimax

Decisions under robust Bayes rules applied to models of sufficient evidence

Bickel, International Journal of Approximate Reasoning 66, 53-72
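The Γ-minimax criterion above lends itself to a short numerical sketch. In the following toy example the posteriors, acts, and loss table are invented for illustration (they are not from the talk or the paper); the criterion picks the act whose worst-case posterior expected loss over the set of adequate posteriors is smallest:

```python
import numpy as np

# Hypothetical setup: two adequate posteriors over a binary parameter theta,
# and three acts with a loss table (rows = acts, columns = theta values).
posteriors = np.array([[0.7, 0.3],    # adequate posterior 1
                       [0.4, 0.6]])   # adequate posterior 2
loss = np.array([[0.0, 1.0],          # act 0: bet on theta = 0
                 [1.0, 0.0],          # act 1: bet on theta = 1
                 [0.4, 0.4]])         # act 2: a hedging act

# Posterior expected loss for every (posterior, act) pair.
expected_loss = posteriors @ loss.T   # shape (n_posteriors, n_acts)

# Gamma-minimax: minimize the worst-case expected loss over the posteriors.
gamma_minimax_act = int(np.argmin(expected_loss.max(axis=0)))
print(gamma_minimax_act)
```

Here the hedging act wins: its worst-case expected loss (0.4) beats the worst cases of the two committed acts (0.6 and 0.7).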

SLIDE 6

Bayesian model assessment

Does the prior or parametric family conflict with the data?

  • Yes, if it has low evidential support compared to alternatives
  • Measures of support satisfying the criteria of Bickel, International Statistical Review 81, 188-206:

  • Likelihood ratio (limited to comparing two simple models)
  • Bayes factor
  • Penalized likelihood ratio

  • Yes → change the model to an adequate model, a model of sufficient support
  • No → use the model to take the Bayes act (minimize posterior expected loss), even if other models are adequate
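Of the listed measures of support, the Bayes factor is the easiest to sketch numerically. In this minimal, hypothetical example (not from the talk), two simple models for binomial data are compared; for simple models the Bayes factor coincides with the likelihood ratio:

```python
from math import comb

# Hypothetical data: n coin flips with k heads. Compare two simple models,
# M1: theta = 0.5 versus M2: theta = 0.8, by their Bayes factor.
def binom_lik(k, n, theta):
    """Binomial likelihood of k successes in n trials."""
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

n, k = 20, 15
bayes_factor = binom_lik(k, n, 0.8) / binom_lik(k, n, 0.5)

# A Bayes factor well above 1 indicates the data support M2 over M1;
# a model with low support relative to alternatives would be judged
# in conflict with the data.
print(bayes_factor)
```

With these numbers the Bayes factor is about 11.8 in favor of theta = 0.8, so the theta = 0.5 model has comparatively low evidential support.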

SLIDE 7

Adequate models

[Diagram: each adequate model yields 1 posterior; the adequate posteriors are blended using nested confidence sets, and the blended posterior together with a loss function yields the Bayes act]

Decisions under the entropy-maximizing Bayesian model of sufficient evidence

Bickel, International Journal of Approximate Reasoning 66, 53-72

SLIDE 8
SLIDE 9
SLIDE 10
SLIDE 11

Adequate models

[Diagram: each adequate model yields 1 posterior; the adequate posteriors are blended using nested confidence sets, and the blended posterior together with a loss function yields the Bayes act]

Decisions under the entropy-maximizing Bayesian model of sufficient evidence

Bickel, International Journal of Approximate Reasoning 66, 53-72

SLIDE 12

Adequate models

[Diagram: each adequate model yields 1 posterior; the adequate posteriors are pooled, and the expected loss under the pooled posterior, given a loss function, is minimized]

Decisions under entropy-based pooling of Bayesian models of sufficient evidence

Bickel, International Journal of Approximate Reasoning 66, 53-72
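Two standard ways of pooling a set of posteriors can illustrate the step above. This is a hedged sketch with invented numbers and equal weights, not the paper's exact pooling operator; logarithmic pooling (a renormalized weighted geometric mean) is the rule usually associated with entropy-based combination:

```python
import numpy as np

# Hypothetical adequate posteriors over a binary parameter, equal weights.
posteriors = np.array([[0.7, 0.3],
                       [0.4, 0.6]])
weights = np.array([0.5, 0.5])

# Linear pooling: weighted arithmetic mean of the probabilities.
linear_pool = weights @ posteriors

# Logarithmic pooling: weighted geometric mean, renormalized to sum to 1.
log_pool = np.exp(weights @ np.log(posteriors))
log_pool /= log_pool.sum()

print(linear_pool, log_pool)
```

Either pooled distribution can then be combined with a loss function so that the expected loss is minimized, as in the diagram.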

SLIDE 13

The three players

  • Combiner is you, the statistician who combines the candidate distributions
    • Goal 1: minimize error; Goal 2: beat Chooser
  • Chooser is the imaginary statistician who instead chooses a single candidate distribution
    • Goal 1: minimize error; Goal 2: beat Combiner
  • Nature picks the true distribution among those that are plausible in order to help Chooser

SLIDE 14

Information game

  • Information divergence and inferential gain:
  • Utility paid to Statistician 2 (Combiner or Chooser):
  • Reduction to game of Combiner v. Nature-Chooser Coalition:
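The slide's formulas are not preserved in this transcript. As a hedged illustration only, inferential gain in such games is commonly formalized via Kullback-Leibler divergence; the sketch below (all distributions invented) scores Combiner's pooled distribution and Chooser's single candidate against a truth that Nature picks from the plausible set so as to help Chooser:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(np.where(p > 0, p * np.log(p / q), 0.0)))

plausible = [np.array([0.7, 0.3]), np.array([0.4, 0.6])]  # Nature's options
combiner = np.array([0.55, 0.45])   # a pooled distribution (hypothetical)
chooser = plausible[0]              # Chooser commits to one candidate

# Nature helps Chooser: pick the truth that minimizes Chooser's divergence.
truth = min(plausible, key=lambda p: kl(p, chooser))

# Lower divergence from the truth corresponds to higher inferential gain.
print(kl(truth, combiner), kl(truth, chooser))
```

Here Nature's favoritism gives Chooser zero divergence while Combiner pays a small positive divergence, which is exactly the adversarial structure of the Nature-Chooser coalition.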
SLIDE 15

Adequate models

[Diagram summary: the adequate models yield adequate posteriors; pooling and blending with nested confidence sets combine them, and together with a loss function they determine the Bayes act via robust Bayes acts:]

  • E-admissible
  • Hurwicz criterion
  • Γ-minimax

Funded by the Natural Sciences and Engineering Research Council of Canada