Gaussian process regression for Sensitivity analysis (GPSS Workshop presentation)



SLIDE 1

Introduction · FANOVA · Pol. Chaos · GPR · Sensitivity Analysis · Conclusion

Gaussian process regression for Sensitivity analysis

GPSS Workshop on UQ, Sheffield, September 2016 Nicolas Durrande, Mines St-Étienne, durrande@emse.fr

GPSS workshop on UQ GPs for sensitivity analysis 1 / 43

SLIDE 2

Outline:
  • Introduction
  • FANOVA (i.e. HDMR, i.e. the Sobol-Hoeffding representation)
  • Polynomial Chaos
  • Gaussian process Regression
  • Sensitivity Analysis
  • Conclusion

SLIDE 3

We assume we are interested in a function f : R^d → R with d ≥ 2, and we want to get some understanding of the "structure" of f: What is the effect of each input variable on the output? Do some variables have more influence than others? Do some variables interact? The talk will be illustrated on the following test function:

f : [0, 1]^6 → R,  x ↦ 10 sin(π x_1 x_2) + 20 (x_3 − 0.5)^2 + 10 x_4 + 5 x_5
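For reference, the test function takes a few lines of NumPy. This is a minimal sketch: the function itself is from the slide, while the name `f`, the seed and the sample layout are my own choices.

```python
import numpy as np

def f(x):
    """Test function on [0, 1]^6; note that x6 has no influence at all."""
    x = np.atleast_2d(x)
    return (10 * np.sin(np.pi * x[:, 0] * x[:, 1])
            + 20 * (x[:, 2] - 0.5) ** 2
            + 10 * x[:, 3]
            + 5 * x[:, 4])

rng = np.random.default_rng(0)
X = rng.uniform(size=(100, 6))   # 100 points uniform over [0, 1]^6
F = f(X)                         # e.g. f(0.5, ..., 0.5) = 14.571...
```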

SLIDE 4

The first thing one can do is to plot the output against each input. For 100 samples uniformly distributed over [0, 1]^6 we get:

[Figure: f(x) (values roughly in [5, 25]) plotted against each input x_1, ..., x_6 ∈ [0, 1], one scatter panel per input.]

SLIDE 5

In a similar fashion, we can fix all variables except one. In the graphs below, all non-plotted variables are set to 0.5.

[Figure: f plotted against each of x_1, ..., x_6 in turn, with the remaining inputs fixed at 0.5, one panel per input.]

SLIDE 6

To get an insight into the interactions between variables, we can look at the influence of changing the reference value.

[Figure: f against x_1 (left) and x_4 (right), each panel showing one curve per reference value x_2 = 0, 0.2, 0.4, 0.5, 0.7, 0.9.]

SLIDE 7

Outline: Introduction · FANOVA (i.e. HDMR, i.e. Sobol-Hoeffding representation) · Polynomial Chaos · Gaussian process Regression · Sensitivity Analysis · Conclusion

SLIDE 8

One common tool for analysing the structure of f is to look at its FANOVA representation:

f(x) = f_0 + \sum_{i=1}^{d} f_i(x_i) + \sum_{i<j} f_{i,j}(x_i, x_j) + \cdots + f_{1,\dots,d}(x)

This decomposition is such that:
  • f_0 accounts for the constant term ⇒ all f_I are zero mean (for I ≠ ∅);
  • f_1 accounts for all the signal that can be explained by x_1 alone:
    ∫ f_I(x) dx_{−1} = 0 for all I ∉ {∅, {1}}  ⇒  ∫ f_I(x) dx_1 = 0 for all I ⊃ {1}.

In other words, this decomposition is such that all the terms are orthogonal in L².

SLIDE 9

The expressions of the f_I are:

f_0 = ∫ f(x) dx
f_i(x_i) = ∫ f(x) dx_{−i} − f_0
f_{i,j}(x_i, x_j) = ∫ f(x) dx_{−ij} − f_i(x_i) − f_j(x_j) + f_0

It can also be interesting to look at the total effect of some inputs:

\tilde{f}_1(x_1) = ∫ f(x) dx_{−1}
\tilde{f}_{1,2}(x_1, x_2) = ∫ f(x) dx_{−{1,2}}
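These integrals can be approximated by Monte Carlo. A sketch for the test function of the introduction (the estimator layout and sample size are my own choices; since x_4 enters f additively as 10 x_4, its main effect is exactly 10 x_4 − 5):

```python
import numpy as np

def f(x):
    return (10 * np.sin(np.pi * x[:, 0] * x[:, 1]) + 20 * (x[:, 2] - 0.5) ** 2
            + 10 * x[:, 3] + 5 * x[:, 4])

rng = np.random.default_rng(1)
X = rng.uniform(size=(200_000, 6))
f0 = f(X).mean()                         # f0 = ∫ f(x) dx

def main_effect(i, xi):
    """f_i(xi) = ∫ f(x) dx_{-i} - f0, integrating out the other inputs."""
    Xc = X.copy()
    Xc[:, i] = xi                        # freeze input i at the value xi
    return f(Xc).mean() - f0

print(main_effect(3, 1.0))               # ≈ 10*1 - 5 = 5
```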

SLIDE 10

On the previous example we obtain:

[Figure: the FANOVA main effects f_i (values roughly in [−4, 4]) against each input x_1, ..., x_6 ∈ [0, 1], one panel per input.]

SLIDE 11

We can also look at 2nd order interactions:

[Figure: surface plots of f_{1,2}(x_1, x_2) and f_{1,3}(x_1, x_3), labelled "Interaction x1, x2" and "Interaction x1, x3".]

SLIDE 12

The total effect of (x_1, x_2) is thus

\tilde{f}_{1,2}(x_1, x_2) = f_0 + f_1(x_1) + f_2(x_2) + f_{1,2}(x_1, x_2)

SLIDE 13

In practical applications f is not analytical, so the above method requires computing the integrals numerically. If there is a cost associated with each evaluation of f, surrogate models are useful. Some models are naturally easy to interpret, for example

m(x) = β_0 + β_1 x_1 + β_2 x_2,

but it soon becomes trickier:

m(x) = β_0 + β_1 x_1 + β_2 x_2 + β_{1,2} x_1 x_2.
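To see why the interaction term makes the reading trickier: on [0, 1]² the FANOVA main effect of x_1 for this model is (β_1 + β_{1,2}/2)(x_1 − 1/2), so β_{1,2} leaks into the apparent effect of x_1. A small Monte Carlo check (the coefficient values below are arbitrary, chosen for illustration only):

```python
import numpy as np

b0, b1, b2, b12 = 1.0, 2.0, -1.0, 4.0
m = lambda x: b0 + b1 * x[:, 0] + b2 * x[:, 1] + b12 * x[:, 0] * x[:, 1]

rng = np.random.default_rng(2)
X = rng.uniform(size=(500_000, 2))
m0 = m(X).mean()                 # = b0 + b1/2 + b2/2 + b12/4 = 2.5

Xc = X.copy()
Xc[:, 0] = 1.0                   # main effect of x1, evaluated at x1 = 1
m1_at_1 = m(Xc).mean() - m0      # = (b1 + b12/2) * (1 - 1/2) = 2, not b1/2 = 1
print(m1_at_1)
```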

SLIDE 14

In GPR, the mean can be seen either as a linear combination of the observations, m(x) = α^t F, or as a linear combination of the kernel evaluated at X, m(x) = k(x, X) β. For example, for a squared exponential kernel:

[Figure: left, the basis functions k(x, X); right, the conditional process Z(x) | Z(X) = F.]

The basis functions have a local influence, which makes the interpretation difficult.

SLIDE 15

Outline: Introduction · FANOVA (i.e. HDMR, i.e. Sobol-Hoeffding representation) · Polynomial Chaos · Gaussian process Regression · Sensitivity Analysis · Conclusion

SLIDE 16

The principle of polynomial chaos is to project f onto a basis of orthonormal polynomials.

One dimension: for x ∈ R, the h_i are of order i. Starting from the constant function h_0 ∝ 1, the following ones can be obtained using Gram-Schmidt orthonormalisation:

h_0(x) = 1 / ‖1‖
h_1(x) = (x − ⟨x, h_0⟩ h_0) / ‖x − ⟨x, h_0⟩ h_0‖
h_2(x) = (x² − ⟨x², h_0⟩ h_0 − ⟨x², h_1⟩ h_1) / ‖x² − ⟨x², h_0⟩ h_0 − ⟨x², h_1⟩ h_1‖

d dimensions: in R^d, the basis is obtained by tensor products of one-dimensional bases. For example, if d = 2:

h_{00}(x) = 1 × 1
h_{10}(x) = h_1(x_1) × 1
h_{01}(x) = 1 × h_1(x_2)
h_{11}(x) = h_1(x_1) × h_1(x_2)
h_{20}(x) = h_2(x_1) × 1
...
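The Gram-Schmidt construction is easy to check numerically. The sketch below orthonormalises the monomials 1, x, x² for the L² inner product on [−1, 1], approximated on a fine grid (grid size and quadrature rule are my own choices):

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 20001)
w = np.full_like(x, 2.0 / (len(x) - 1))   # trapezoid quadrature weights
w[0] /= 2; w[-1] /= 2

def inner(u, v):
    return float(np.sum(u * v * w))       # <u, v> = ∫ u(x) v(x) dx on [-1, 1]

basis = []
for k in range(3):
    h = x ** k
    for b in basis:                       # remove the components along h_0 .. h_{k-1}
        h = h - inner(h, b) * b
    basis.append(h / np.sqrt(inner(h, h)))

h0, h1, h2 = basis   # h1 = sqrt(3/2) x, h2 = sqrt(5/8) (3x^2 - 1): normalised Legendre
```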

SLIDE 17

The orthonormal basis H depends on the measure over the input space D.

A uniform (Lebesgue) measure over D = [−1, 1] gives the normalised Legendre basis, h_n(x) = \sqrt{(2n+1)/2}\, P_n(x):

h_0(x) = \sqrt{1/2}
h_1(x) = \sqrt{3/2}\, x
h_2(x) = \sqrt{5/2}\, (3x² − 1)/2
h_3(x) = \sqrt{7/2}\, (5x³ − 3x)/2
h_4(x) = \sqrt{9/2}\, (35x⁴ − 30x² + 3)/8
... = ...

A standard Gaussian measure over R gives the normalised Hermite basis, h_n(x) = He_n(x)/\sqrt{n!}:

h_0(x) = 1
h_1(x) = x
h_2(x) = (x² − 1)/\sqrt{2}
h_3(x) = (x³ − 3x)/\sqrt{6}
h_4(x) = (x⁴ − 6x² + 3)/\sqrt{24}
... = ...

SLIDE 18

Legendre basis in 1D

[Figure: the first five normalised Legendre basis functions h_0, ..., h_4 on [−1, 1].]

SLIDE 19

Legendre basis in 2D

[Figure: surface plots of the tensorised basis functions h_{00}, h_{10}, h_{01}, h_{11}, h_{20}, h_{21} over (x_1, x_2).]

SLIDE 20

If we consider a linear regression model based on polynomial chaos basis functions,

m(x) = \sum_{I ⊂ \{0,\dots,p\}} β_I h_I(x),

the FANOVA representation of m is straightforward. For example, in 2D:

m_0 = ∫ m(x) dx_1 dx_2 = ∫ β_{00} h_{00}(x) dx_1 dx_2 = β_{00}

m_1(x_1) = ∫ m(x) dx_2 − m_0
         = ∫ (β_{00} h_{00}(x) + β_{10} h_{10}(x) + β_{20} h_{20}(x)) dx_2 − m_0
         = β_{10} h_{10}(x_1) + β_{20} h_{20}(x_1)

m_{1,2}(x) = ... = β_{11} h_{11}(x) + β_{12} h_{12}(x) + β_{21} h_{21}(x) + β_{22} h_{22}(x)

SLIDE 21

We obtain on the motivating example:

[Figure: polynomial chaos estimates of the main effects (values roughly in [−6, 6]) against each input x_1, ..., x_6 ∈ [0, 1].]

SLIDE 22

Same figure without cheating:

[Figure: the same main-effect panels, values roughly in [−6, 6], for x_1, ..., x_6 ∈ [0, 1].]

SLIDE 23

Outline: Introduction · FANOVA (i.e. HDMR, i.e. Sobol-Hoeffding representation) · Polynomial Chaos · Gaussian process Regression · Sensitivity Analysis · Conclusion

SLIDE 24

A first idea is to consider ANOVA kernels [Stitson et al., 1997]:

k(x, y) = \prod_{i=1}^{d} (1 + k_i(x_i, y_i))
        = 1 + \underbrace{\sum_{i=1}^{d} k_i(x_i, y_i)}_{\text{additive part}}
            + \underbrace{\sum_{i<j} k_i(x_i, y_i)\, k_j(x_j, y_j)}_{\text{2nd order interactions}}
            + \cdots
            + \underbrace{\prod_{i=1}^{d} k_i(x_i, y_i)}_{\text{full interaction}}

The associated GP is

Z(x) = \underbrace{Z_0}_{\text{cst}} + \underbrace{\sum_{i=1}^{d} Z_i(x_i)}_{\text{additive part}} + \underbrace{\sum_{i<j} Z_{i,j}(x_i, x_j)}_{\text{2nd order interactions}} + \cdots + \underbrace{Z_{1\dots d}(x)}_{\text{full interaction}}

However, the Z_I do not satisfy ∫ Z_I(x) dx_i = 0.

SLIDE 25

If we build a GPR model based on this kernel, we obtain:

m(x) = k(x, X) k(X, X)^{-1} F
     = \Big(1 + \sum_{i=1}^{d} k_i(x_i, X_i) + \sum_{i<j} k_i(x_i, X_i)\, k_j(x_j, X_j) + \cdots \Big) k(X, X)^{-1} F
     = 1^t k(X, X)^{-1} F + \sum_{i=1}^{d} \underbrace{k_i(x_i, X_i)\, k(X, X)^{-1} F}_{m_i(x_i)} + \sum_{i<j} \underbrace{k_i(x_i, X_i)\, k_j(x_j, X_j)\, k(X, X)^{-1} F}_{m_{i,j}(x_i, x_j)} + \cdots

As previously, the m_I do not satisfy ∫ m_I(x) dx_i = 0.

SLIDE 26

Samples with zero integrals

We are interested in building a GP such that the integrals of the samples are exactly zero. Let us consider the associated conditional GP:

Z_0 \overset{\text{law}}{=} Z \,\Big|\, ∫ Z(s) ds = 0

Let μ_0(x) = E[Z(x) | ∫ Z(s) ds = 0] denote the conditional expectation and k_0(x, x′) = cov[Z(x), Z(x′) | ∫ Z(s) ds = 0] the conditional covariance. Then:

μ_0(x) = ∫ k(x, s) ds \Big(∬ k(s, t) ds\, dt\Big)^{-1} × 0 = 0

k_0(x, x′) = k(x, x′) − ∫ k(x, s) ds \Big(∬ k(s, t) ds\, dt\Big)^{-1} ∫ k(x′, s) ds

SLIDE 27

Samples from Z_0 have the required property:

μ_0(x) = 0
k_0(x, y) = k(x, y) − \frac{∫ k(x, s) ds \; ∫ k(y, s) ds}{∬ k(s, t) ds\, dt}

[Figure: sample paths of Z_0 on [0, 1]; each path integrates to zero.]
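This k_0 construction can be verified numerically. Below is a sketch with a squared-exponential base kernel on [0, 1] (grid size, lengthscale and jitter are my own choices): every row of the discretised k_0 integrates to zero, and so does a sample drawn from it.

```python
import numpy as np

def k(a, b, theta=0.2):                   # squared-exponential base kernel
    return np.exp(-0.5 * (np.subtract.outer(a, b) / theta) ** 2)

s = np.linspace(0.0, 1.0, 500)
w = np.full_like(s, 1.0 / (len(s) - 1))   # trapezoid weights on [0, 1]
w[0] /= 2; w[-1] /= 2

K = k(s, s)
kint = K @ w                              # ∫ k(x, s) ds at every grid point x
J = w @ K @ w                             # ∫∫ k(s, t) ds dt
K0 = K - np.outer(kint, kint) / J         # k0 on the grid

rng = np.random.default_rng(3)
L = np.linalg.cholesky(K0 + 1e-8 * np.eye(len(s)))
Z = L @ rng.standard_normal(len(s))       # one sample path of Z0
print(np.max(np.abs(K0 @ w)), Z @ w)      # both ≈ 0
```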

SLIDE 28

These 1-dimensional kernels are of great importance for creating ANOVA kernels dedicated to sensitivity analysis:

k_{SA}(x, y) = \prod_{i=1}^{d} (1 + k_0(x_i, y_i))
             = 1 + \underbrace{\sum_{i=1}^{d} k_0(x_i, y_i)}_{\text{additive part}}
                 + \underbrace{\sum_{i<j} k_0(x_i, y_i)\, k_0(x_j, y_j)}_{\text{2nd order interactions}}
                 + \cdots
                 + \underbrace{\prod_{i=1}^{d} k_0(x_i, y_i)}_{\text{full interaction}}

The associated GP naturally writes

Z_{SA}(x) = \underbrace{Z_0}_{\text{cst}} + \underbrace{\sum_{i=1}^{d} Z_i(x_i)}_{\text{additive part}} + \underbrace{\sum_{i<j} Z_{i,j}(x_i, x_j)}_{\text{2nd order interactions}} + \cdots + \underbrace{Z_{1\dots d}(x)}_{\text{full interaction}}

Now, the Z_I do satisfy ∫ Z_I(x) dx_i = 0.

SLIDE 29

We get the following decomposition of samples:

[Figure: a 2D sample Z decomposed into Z_{00}, Z_{10}, Z_{01} and Z_{11}.]

SLIDE 30

Furthermore, the GPR model inherits these properties.

2D example:

k(x, y) = \prod_{i=1}^{2} (1 + k_0(x_i, y_i)) = 1 + k_0(x_1, y_1) + k_0(x_2, y_2) + k_0(x_1, y_1)\, k_0(x_2, y_2)

The mean writes

m(x) = \big(1 + k_0(x_1, X_1) + k_0(x_2, X_2) + k_0(x_1, X_1)\, k_0(x_2, X_2)\big)^t k(X, X)^{-1} F
     = m_0 + m_1(x_1) + m_2(x_2) + m_{12}(x)

These terms correspond to the FANOVA representation of m. The sub-models are conditional expectations: m_I(x) = E(Z_I(x) | Z(X) = F). We can thus associate a predictive covariance to each sub-model!
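A numerical sketch of this 2D decomposition (squared-exponential base kernel, 1D integrals by trapezoid quadrature, toy data; all names and values below are my own choices, not from the talk). The point is that the four sub-model terms sum exactly to the full predictive mean, and the main-effect term integrates to zero:

```python
import numpy as np

s = np.linspace(0.0, 1.0, 400)
w = np.full_like(s, 1.0 / (len(s) - 1)); w[0] /= 2; w[-1] /= 2

def k(a, b, theta=0.3):
    return np.exp(-0.5 * (np.subtract.outer(a, b) / theta) ** 2)

J = w @ k(s, s) @ w
def k0(a, b):                              # zero-integral 1D kernel
    Ia, Ib = k(a, s) @ w, k(b, s) @ w
    return k(a, b) - np.outer(Ia, Ib) / J

rng = np.random.default_rng(4)
X = rng.uniform(size=(30, 2))
F = X[:, 0] + X[:, 1] ** 2 + X[:, 0] * X[:, 1]         # toy responses

K = (1 + k0(X[:, 0], X[:, 0])) * (1 + k0(X[:, 1], X[:, 1]))
alpha = np.linalg.solve(K + 1e-6 * np.eye(len(F)), F)  # K^{-1} F (with jitter)

x = np.array([0.3, 0.7])                  # one prediction point
k1 = k0(x[:1], X[:, 0])[0]
k2 = k0(x[1:], X[:, 1])[0]
m0 = alpha.sum()                          # 1^t K^{-1} F
m1 = k1 @ alpha                           # m_1(x_1)
m2 = k2 @ alpha                           # m_2(x_2)
m12 = (k1 * k2) @ alpha                   # m_{12}(x)
m_full = ((1 + k1) * (1 + k2)) @ alpha    # full predictive mean
print(np.isclose(m_full, m0 + m1 + m2 + m12))   # True
```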

SLIDE 31

We obtain on the motivating example:

[Figure: GPR estimates of the main effects (values roughly in [−4, 4]) against each input x_1, ..., x_6 ∈ [0, 1].]

SLIDE 32

We obtain on the motivating example:

[Figure]

SLIDE 33

Outline: Introduction · FANOVA (i.e. HDMR, i.e. Sobol-Hoeffding representation) · Polynomial Chaos · Gaussian process Regression · Sensitivity Analysis · Conclusion

SLIDE 34

The principle of sensitivity analysis is to quantify how much each input, or group of inputs, influences the output:
  • local sensitivity analysis
  • global sensitivity analysis

The probabilistic framework has proven to be very interesting: if we introduce randomness in the inputs, how random is the output?

Hereafter, we focus on variance-based global sensitivity analysis.

SLIDE 35

Let X be the random vector representing our uncertainty on the inputs. We assume its probability distribution factorises (i.e. the X_i are independent):

μ(x) = μ_1(x_1) × μ_2(x_2) × · · · × μ_d(x_d)

This factorisation of μ allows using it in the FANOVA representation of f:

f(x) = f_0 + \sum_{i=1}^{d} f_i(x_i) + \sum_{i<j} f_{i,j}(x_i, x_j) + \cdots + f_{1,\dots,d}(x)

where the f_I are orthogonal for μ. We now plug X into this expression:

f(X) = f_0 + \sum_{i=1}^{d} f_i(X_i) + \sum_{i<j} f_{i,j}(X_i, X_j) + \cdots + f_{1,\dots,d}(X)

and we get interesting results...

SLIDE 36

The f_I(X_I) are centred and mutually orthogonal (uncorrelated):

E(f_I(X_I)) = ∫ f_I(x_I) dμ(x) = 0  (for I ≠ ∅)
cov(f_I(X_I), f_J(X_J)) = E(f_I(X_I) f_J(X_J)) = ∫ f_I(x_I) f_J(x_J) dμ(x) = 0  (for I ≠ J)

As a consequence, we get

var(f(X)) = \sum_{i=1}^{d} var(f_i(X_i)) + \sum_{i<j} var(f_{i,j}(X_i, X_j)) + \cdots + var(f_{1,\dots,d}(X))

The Sobol indices are defined as

S_I = var(f_I(X_I)) / var(f(X)).

These indices are in [0, 1], and their sum is 1.

SLIDE 37

In practice, these indices can be computed using Monte Carlo methods ⇒ this requires lots of evaluations of f. Another approach is to use surrogate models: if the model is well chosen, the computation is almost free!
  • Polynomial Chaos
  • GPR with k_{SA} kernels

SLIDE 38

Polynomial Chaos

In the case of polynomial chaos, the Sobol indices are given by the squares of the β coefficients:

D_i = Var_X[E_X(H(X)β | X_i)] = Var_X[H_i(X_i) β_i] = β_i²
S_i = D_i / \sum_k D_k

For more details, see the work of Bruno Sudret.
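In code this is just bookkeeping over the multi-indices: square the coefficients, drop the constant term, and normalise. A toy sketch (the coefficient values below are hypothetical, purely for illustration):

```python
# Hypothetical PC coefficients beta_I, keyed by multi-index (degree in x1, degree in x2)
beta = {(0, 0): 1.2, (1, 0): 0.8, (0, 1): -0.5, (1, 1): 0.3, (2, 0): 0.1}

D = {I: b ** 2 for I, b in beta.items() if any(I)}   # partial variances, constant dropped
D_total = sum(D.values())                            # variance of the PC surrogate
S = {I: d / D_total for I, d in D.items()}           # Sobol index of every term

# first-order index of x1: sum over the terms that involve x1 only
S1 = sum(v for I, v in S.items() if I[0] > 0 and I[1] == 0)
print(S1)
```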

SLIDE 39

GPR with k_{SA} kernel

The sensitivity indices can be obtained analytically:

S_I = \frac{var(m_I(X_I))}{var(m(X))}
    = \frac{F^T K^{-1} \big(\bigodot_{i∈I} Γ_i\big) K^{-1} F}{F^T K^{-1} \big(\bigodot_{i=1}^{d} (1_{n×n} + Γ_i) − 1_{n×n}\big) K^{-1} F}

where Γ_i is the matrix Γ_i = ∫_{D_i} k_0^i(s_i)\, k_0^i(s_i)^T ds_i and ⊙ denotes the entry-wise (Hadamard) product.

SLIDE 40

The computation of Sobol indices on the mean gives:

           S1    S2    S3    S4    S5    S6    S12
model 1    0.20  0.20  0.09  0.35  0.09  0.00  0.07
model 2    0.20  0.20  0.08  0.37  0.09  0.00  0.05
truth      0.20  0.20  0.09  0.35  0.09  0.00  0.07

For this test function, 50 observations are enough!

SLIDE 41

Outline: Introduction · FANOVA (i.e. HDMR, i.e. Sobol-Hoeffding representation) · Polynomial Chaos · Gaussian process Regression · Sensitivity Analysis · Conclusion

SLIDE 42

Sensitivity analysis:
  • There are interesting tools to get an insight into what happens inside high-dimensional functions.
  • The effective dimensionality can be much smaller.
  • Monte Carlo or model-based approaches.

Some modelling tips:
  • What is the purpose of the model?
  • GPR models are not necessarily black boxes...
  • It is possible to include fancy observations in GPR (integrals, derivatives, ...).

This talk unfortunately focused on sensitivity analysis of the mean; the proper way is to perform SA on the conditional sample paths. See the work of Marrel, Iooss et al., SAMO 2007.

SLIDE 43

Using appropriate kernels, the computation of Sobol indices on the samples gives:

D_i = Var_X[E_X(Z(X) | X_i)] = Var_X[Z_i(X_i)]

We can easily sample from this distribution.

[Figure: prior distributions of D_1, D_2 and D_{12}.]

Similarly, we can sample from the posterior to get an uncertainty measure on the indices.