

SLIDE 1

Uncertainty and Sensitivity Analysis for Complex Simulation Models

Jeremy Oakley

School of Mathematics and Statistics, University of Sheffield

jeremy-oakley.staff.shef.ac.uk

Jeremy Oakley (Sheffield) Uncertainty and Sensitivity Analysis April 2017 1 / 21

SLIDE 5

Computer models

A computer model ('simulator') is represented by a function f with inputs x and outputs y, so that y = f(x). f is usually not available in closed form. f is constructed from the modeller's understanding of the process.

There may be no physical input-output data.

f may be deterministic. A computer experiment: evaluating f at different choices of x.

A 'model run': evaluating f at a single choice of x.

SLIDE 13

Uncertainty in model inputs

The model may be set up to accept 'controllable' inputs only, but there may be other parameters/coefficients/variables 'hard-wired' within the model. Define the input x to include these other numerical values used to calculate the outputs. Suppose that there is a true input value, X, with at least some elements of X uncertain. What is our uncertainty about Y = f(X)? We quantify uncertainty about X with a probability distribution pX.

If using expert judgement, methods & software are available at http://www.tonyohagan.co.uk/shelf/

We then need to obtain the distribution pY. We can propagate uncertainty using Monte Carlo: sample X1, . . . , XN from pX and evaluate f(X1), . . . , f(XN). What do we do if f is computationally expensive?
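The Monte Carlo propagation step can be sketched as follows. The two-input formula and the input distributions here are illustrative assumptions standing in for a real black-box simulator and a genuinely elicited pX:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for an expensive simulator f (in practice f would
# be a black-box code, not a one-line formula).
def f(x):
    return np.exp(-x[:, 0]) + 0.5 * x[:, 1] ** 2

# Quantify input uncertainty p_X: two independent inputs (an assumption
# made only for this illustration).
N = 100_000
X = np.column_stack([rng.uniform(0.0, 2.0, N),   # X1 ~ U[0, 2]
                     rng.normal(1.0, 0.2, N)])   # X2 ~ N(1, 0.2^2)

# Propagate: the empirical distribution of Y approximates p_Y.
Y = f(X)
print(Y.mean(), Y.std(), np.quantile(Y, [0.025, 0.975]))
```

The sample of Y values approximates pY, so summaries such as the mean, standard deviation and a 95% interval follow directly; the difficulty addressed next is that each row of X costs one model run.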

SLIDE 17

Computationally expensive models

[Figure: a 1-D function f(x), evaluated at a small number of design points]

We want f(x1), . . . , f(xN), but can only evaluate f(x1), . . . , f(xn), for n << N. We could estimate f given f(x1), . . . , f(xn), but can we quantify uncertainty in the estimate?

A statistical inference problem:

Treat f as an uncertain function. Derive a probability distribution for f given f(x1), . . . , f(xn) (an "emulator").

Popular technique: Gaussian process emulation (Sacks et al., 1989).

SLIDE 18

Gaussian process emulators

[Figure: a Gaussian process emulator of a 1-D function, showing predictions and uncertainty around the observed model runs]
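A minimal Gaussian process emulator can be sketched in a few lines. This version is a simplification of what the slides describe: it assumes a zero prior mean and fixed, hand-picked hyperparameters (in practice both would be estimated), and uses a toy one-dimensional simulator:

```python
import numpy as np

def sq_exp(a, b, ell=1.0, var=1.0):
    # Squared-exponential covariance: k(a, b) = var * exp(-(a-b)^2 / (2 ell^2)).
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

def gp_posterior(x_train, y_train, x_test, ell=1.0, var=1.0, jitter=1e-10):
    # Condition a zero-mean GP prior on the n model runs. The simulator is
    # deterministic, so no noise term: only a tiny jitter for stability.
    K = sq_exp(x_train, x_train, ell, var) + jitter * np.eye(len(x_train))
    Ks = sq_exp(x_test, x_train, ell, var)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks @ alpha                         # posterior mean at x_test
    v = np.linalg.solve(L, Ks.T)
    post_var = var - np.sum(v ** 2, axis=0)   # posterior variance at x_test
    return mean, post_var

# Toy "expensive" simulator, evaluated at only n = 6 design points.
f = lambda x: x * np.sin(x)
x_train = np.linspace(-4.0, 4.0, 6)
y_train = f(x_train)

x_test = np.linspace(-4.0, 4.0, 101)
mean, post_var = gp_posterior(x_train, y_train, x_test)
```

Because the simulator is deterministic, the posterior mean interpolates the training runs and the posterior variance shrinks to (numerically) zero there, growing between design points.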

SLIDE 26

The emulator does not replace the simulator

In a computer experiment, we may want to know f(x1), . . . , f(xN), but can only observe f(x1), . . . , f(xn), with n < N. As part of the analysis, we work with p{f(xn+1), . . . , f(xN) | f(x1), . . . , f(xn)}, which we get from the emulator. If the simulator has given us the value of f(x), the emulator will give us the same value.

[Figure: emulator predictions passing exactly through the observed model runs]

SLIDE 28

Example: 18-input climate model, 255 model runs

[Figure: emulator means and 95% intervals plotted against simulator output for validation runs]

SLIDE 32

Need to think carefully about input distributions

Consider f(x) = exp(−x), with Y = f(X) and X ∼ U[0, b]. In this case we have

Var(Y) = {b − 2 + 4 exp(−b) − (b + 2) exp(−2b)} / (2b²).

Increasing b increases the variance of X but decreases the variance of Y.
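The formula above can be checked numerically. The quick sketch below (not part of the original analysis) compares it with a Monte Carlo estimate and confirms that widening the input distribution shrinks Var(Y):

```python
import numpy as np

def var_y_analytic(b):
    # Var(Y) = {b - 2 + 4 exp(-b) - (b + 2) exp(-2b)} / (2 b^2)
    # for Y = exp(-X), X ~ U[0, b].
    return (b - 2 + 4 * np.exp(-b) - (b + 2) * np.exp(-2 * b)) / (2 * b ** 2)

rng = np.random.default_rng(1)
N = 1_000_000
for b in (5.0, 10.0):
    y = np.exp(-rng.uniform(0.0, b, N))   # Monte Carlo sample of Y
    print(b, var_y_analytic(b), y.var())
# var_y_analytic(10) < var_y_analytic(5): a more uncertain X gives a
# less uncertain Y here.
```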

SLIDE 33

[Figure: densities of Y = exp(−X) for X ∼ U[0, 5] and X ∼ U[0, 10]]

SLIDE 37

Probabilistic sensitivity analysis of model outputs

We are interested in Y = f(X), where X is uncertain with distribution pX. Sensitivity analysis: which elements in X = {X1, . . . , Xd} are most responsible for the uncertainty in Y = f(X)? This depends on both f and pX.

Typically, we cannot quantify the 'importance' of an input by studying f alone.

SLIDE 44

Variance based sensitivity analysis

(See, e.g., Saltelli et al., 2008.) Investigate how {X1, . . . , Xd} contribute to Var(Y). We can consider the expected reduction in Var(Y) achieved by learning the true value of Xi:

Var(Y) − E_Xi{Var_X−i(Y | Xi)} = Var_Xi{E_X−i(Y | Xi)}.

A small value does not imply 'unimportant': the contribution to the variance may be through interactions.

"Total effect variance": measures the contribution of Xi to the variance, including interactions. Here a small value does imply 'unimportant', but this requires independence between {X1, . . . , Xd}.

Computation can be sped up with an emulator (Oakley & O'Hagan, 2004).
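The main-effect variance Var_Xi{E_X−i(Y | Xi)} can be estimated by Monte Carlo with a Sobol'-style 'pick-freeze' scheme (one of the estimators covered by the Saltelli et al. reference). The linear test model below is an illustrative assumption, chosen so the true indices are known exactly:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative model: Y = X1 + 2*X2 with X1, X2 independent N(0,1), so
# Var(Y) = 5, Var{E(Y|X1)} = 1 and Var{E(Y|X2)} = 4, giving main-effect
# indices 0.2 and 0.8.
def f(x):
    return x[:, 0] + 2.0 * x[:, 1]

def main_effect_indices(f, d, N):
    # Pick-freeze estimator of Var{E(Y|Xi)} / Var(Y): evaluate f on two
    # input matrices that share ONLY column i, and take their covariance.
    A = rng.standard_normal((N, d))
    B = rng.standard_normal((N, d))
    yA = f(A)
    var = yA.var()
    S = np.empty(d)
    for i in range(d):
        ABi = B.copy()
        ABi[:, i] = A[:, i]          # only Xi is common to the two runs
        yABi = f(ABi)
        S[i] = (np.mean(yA * yABi) - yA.mean() * yABi.mean()) / var
    return S

S = main_effect_indices(f, d=2, N=200_000)
print(S)   # close to [0.2, 0.8]
```

With an expensive simulator, f here would be replaced by the emulator's predictions, which is how the emulator speeds up the computation.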

SLIDE 45

Variance-based sensitivity analysis

[Figure: input densities pX1(x1) and pX2(x2) with the conditional expectations E(Y | X1 = x1) and E(Y | X2 = x2)]

SLIDE 51

Example [1]: modelling rotavirus

The model was developed by GlaxoSmithKline. It predicts the incidence of rotavirus in a population before and after a vaccine is administered to a proportion of the infant population. It is a deterministic compartmental model with 672 compartments (16 disease stages × 42 age classes). Inputs include transmission rates between age groups and the reduction in risk following each infection. Outputs: time series of rotavirus incidence for six age groups following the vaccination programme. The GSK analysis investigated the sensitivity of the output to 9 inputs, using 8200 model runs; we consider the sensitivity of the output to 20 inputs, using 340 model runs.

[1] MUCM case study: analysis by John Paul Gosling, Hugo Maruri-Aguilar, Alexis Boukouvalas

SLIDE 52

Variance based sensitivity analysis

Analysis for an individual output: the number of infections in the 2-3 age group after 2 years.

[Figure: main effect and interaction contributions of each of the 20 inputs, as a percentage of total output variance]

SLIDE 53

SA for a global aerosol model: Lee et al. 2013

[Figure: global maps (latitude against longitude) of the percentage of output variance explained by four inputs: width of Aitken mode aerosol, biomass burning emission size, sea-spray flux, and dry deposition velocity]

Acknowledgement: figure kindly provided by Lindsay Lee

SLIDE 58

Measuring input importance for decision making

Suppose the output Y = f(X1, . . . , Xd) is used to make a decision. Is Xi 'important'; is it worth learning Xi before making the decision? The decision maker's utility for decision a and 'true' model output Y is U(a, Y). The 'baseline' utility of the optimum decision based on no further information is

U* = max_a E_Y{U(a, Y)}.

The expected value of learning Xi before making the decision is

E_Xi [ max_a E{U(a, Y) | Xi} ] − U*,

the partial EVPI of Xi (Expected Value of Perfect Information).
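Partial EVPI can be estimated with a nested Monte Carlo loop: for each hypothetical value of Xi, re-optimise the decision under the remaining uncertainty, then average and subtract the baseline. Everything in the sketch below (the model, the two decisions and their utilities) is an illustrative assumption, not the case-study model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy setting: output Y = f(X1, X2) = X1 + 0.2*X2 with X1, X2 ~ N(0,1)
# independent, and two decisions with utilities U(d1, Y) = Y (risky) and
# U(d2, Y) = 0.5 (a certain payoff).
def f(x1, x2):
    return x1 + 0.2 * x2

def expected_utilities(y):
    # Expected utility of each decision, given a sample of outputs y.
    return np.array([y.mean(), 0.5])

# Baseline: best decision under current uncertainty, U* = max_a E{U(a, Y)}.
N = 200_000
X1 = rng.standard_normal(N)
X2 = rng.standard_normal(N)
u_star = expected_utilities(f(X1, X2)).max()

# Partial EVPI for X1: outer loop over hypothetical values of X1, inner
# Monte Carlo over the remaining uncertainty in X2, re-deciding each time.
M, K = 2_000, 2_000
outer = rng.standard_normal(M)
u_given_x1 = np.empty(M)
for j, x1 in enumerate(outer):
    u_given_x1[j] = expected_utilities(f(x1, rng.standard_normal(K))).max()
evpi_x1 = u_given_x1.mean() - u_star
print(evpi_x1)   # theoretical value is about 0.198 in this toy problem
```

The same nested structure, with the model evaluations supplied by an emulator or a regression approximation, is what makes partial EVPI feasible for expensive models.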

SLIDE 59

Sensitivity analysis for decision making

[Figure: conditional expected utilities E{U(d1, Y) | Xi} and E{U(d2, Y) | Xi} plotted against Xi, together with the density p(Xi)]

SLIDE 63

Case study: health economic modelling (Strong and Oakley, 2014)

[Figure: four-state Markov model: State 1 (CD4 > 200, CD4 < 500), State 2 (CD4 < 200), State 3 (AIDS), State 4 (Death)]

The model is described in Chancellor et al. (1997). It predicts utilities ('net health benefits') for two different treatment options for patients with HIV. There are various uncertain inputs related to transition probabilities, costs, and the effectiveness of each treatment. We consider extra parameters to represent model structure uncertainty.

SLIDE 67

Partial EVPI: expected value, in £ per patient treated, of learning parameters before choosing the treatment option for the patient population. Consider partial EVPIs for four groups of parameters:

1. Transition probabilities
2. Difference in drug effectiveness (relative risk)
3. Treatment costs
4. 'Discrepancy' parameters: model structure uncertainty (three scenarios)

Parameter                  Base case   Scenario 1   Scenario 2   Scenario 3
Transition probabilities   £0          £0           £0           £1.17
Relative risk              £169.91     £193.09      £64.63       £164.55
Costs                      £194.41     £201.72      £65.17       £167.53
Discrepancy terms          -           £7.86        £110.21      £699.06
Overall EVPI               £365.42     £401.53      £333.43      £957.28

SLIDE 73

Closing comments

Emulators for computationally expensive models:

facilitate various analyses using the available model runs; emulators do not replace models.

Sensitivity analysis tools for investigating input uncertainty:

the variance-based approach as a 'general purpose' method; EVPI for the more focussed decision-making context.

Careful specification of input uncertainty is important!

SLIDE 74

References

Sacks, J., Welch, W. J., Mitchell, T. J. and Wynn, H. P. (1989). Design and analysis of computer experiments. Statistical Science, 4, 409-435.

Oakley, J. and O'Hagan, A. (2004). Probabilistic sensitivity analysis of complex models: a Bayesian approach. Journal of the Royal Statistical Society Series B, 66, 751-769.

Saltelli, A. et al. (2008). Global Sensitivity Analysis: The Primer. New York: Wiley.

Lee, L. A. et al. (2013). The magnitude and causes of uncertainty in global model simulations of cloud condensation nuclei. Atmospheric Chemistry and Physics, 13, 8879-8914.

Strong, M. and Oakley, J. E. (2014). When is a model good enough? Deriving the expected value of model improvement via specifying internal model discrepancies. SIAM/ASA Journal on Uncertainty Quantification, 2(1), 106-125.