Kullback-Leibler Designs - Astrid JOURDAN, Jessica FRANCO (PowerPoint presentation, ENBIS 2009 / Saint-Etienne)


  1. Kullback-Leibler Designs. Astrid JOURDAN, Jessica FRANCO. ENBIS 2009 / Saint-Etienne

  2. Contents
     • Introduction
     • Kullback-Leibler divergence
     • Estimation by a Monte Carlo method
     • Design comparison
     • Conclusion

  3. Introduction: computer experiments
     When physical experimentation is impossible, a mathematical model implemented as a computer code maps input parameters x to outputs y(x). Because the simulations are time-consuming, a metamodel is built and used for sensitivity analysis, optimization and uncertainty quantification.

  4. Introduction: design constraints
     Space-filling and exploratory designs should:
     • contain no replications, in particular when the design is projected onto a subset of parameters (non-collapsing);
     • provide information about all parts of the experimental region;
     • allow one to fit a variety of statistical models.
     Goal: fill up the space with the design points in a uniform fashion.

  5. Kullback-Leibler divergence (section divider)

  6. Kullback-Leibler divergence: goal
     Suppose that the design points X_1, ..., X_n are n independent observations of a random vector X = (X^1, ..., X^d) with absolutely continuous density function f. The goal is to select the design points in such a way that this density is "close" to the uniform density.
     The Kullback-Leibler (KL) divergence measures the difference between two density functions f and g (with f << g):
     D(f, g) = \int f(x) \, \ln\!\left( \frac{f(x)}{g(x)} \right) dx
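To make the definition concrete, here is a minimal Monte Carlo sketch (not from the slides) that estimates D(f, g) by sampling from f; the function names and the Beta example are illustrative assumptions.

```python
import numpy as np
from scipy.stats import beta

def kl_divergence_mc(f, g, sample_from_f, n_samples=100_000, rng=None):
    """Monte Carlo estimate of D(f, g) = E_f[ln f(X) - ln g(X)].

    f, g          : vectorized density functions on the design space
    sample_from_f : callable (n, rng) -> (n, d) array of draws from f
    """
    rng = np.random.default_rng(rng)
    x = sample_from_f(n_samples, rng)
    return np.mean(np.log(f(x)) - np.log(g(x)))

# Example: D(Beta(2,2) x Beta(2,2), Uniform[0,1]^2)
d = 2
f = lambda x: np.prod(beta.pdf(x, 2, 2), axis=1)
g = lambda x: np.ones(len(x))                        # uniform density on [0,1]^d
draw = lambda n, rng: rng.beta(2, 2, size=(n, d))
print(kl_divergence_mc(f, g, draw))                  # close to the true value 2*(ln 6 - 5/3) ≈ 0.25
```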

  7. Kullback-Leibler divergence: properties
     • The KL divergence is not a metric (it is not symmetric and does not satisfy the triangle inequality).
     • The KL divergence is always non-negative, and D(f, g) = 0 implies f = g almost everywhere.
     • If {P_1, ..., P_n} is a sequence of distributions, convergence in KL divergence implies convergence in total variation: D(P_n, P) -> 0 as n -> +infinity implies P_n -> P in total variation. Minimizing the KL divergence is therefore a meaningful objective (a quantitative bound is given below).
     • The KL divergence is invariant under parameter transformations, so the design space can be taken to be the unit cube.
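The convergence statement above can be made quantitative through Pinsker's inequality (a standard result, not shown on the slide): the total variation distance is controlled by the KL divergence,

\| P_n - P \|_{TV} \le \sqrt{ \tfrac{1}{2} \, D(P_n, P) },

so driving D(P_n, P) to zero forces convergence in total variation.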

  8. The KL divergence and the Shannon entropy
     If g is the uniform density function on [0,1]^d, then
     D(f, g) = \int f(x) \, \ln f(x) \, dx = -H(f),
     where H(f) is the Shannon entropy. Minimizing the KL divergence is therefore equivalent to maximizing the entropy. If f is supported by [0,1]^d, one always has H(f) <= 0, and the maximum value of H(f), zero, is uniquely attained by the uniform density.
     An exchange algorithm, driven by an entropy estimate, is used to build an "optimal" design (a generic sketch follows below).
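The slides do not detail the exchange algorithm; the following is a minimal generic sketch of a point-exchange search of the kind described, where `entropy_estimate` stands for the estimator introduced on the next slides and the random candidate pool is an assumption.

```python
import numpy as np

def exchange_algorithm(entropy_estimate, n, d, n_iter=2000, rng=None):
    """Greedy exchange algorithm: start from a random design in [0,1]^d and
    repeatedly replace one point by a random candidate, keeping the exchange
    only if the estimated entropy increases (i.e. the KL divergence to the
    uniform density decreases)."""
    rng = np.random.default_rng(rng)
    design = rng.random((n, d))
    best = entropy_estimate(design)
    for _ in range(n_iter):
        i = rng.integers(n)              # index of the point to exchange
        candidate = design.copy()
        candidate[i] = rng.random(d)     # random replacement in the unit cube
        value = entropy_estimate(candidate)
        if value > best:                 # keep the exchange if the entropy grows
            design, best = candidate, value
    return design
```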

  9. Estimation by a Monte Carlo method (section divider)

  10. Estimation by a Monte Carlo method
      The entropy can be written as an expectation:
      H(f) = -\int f(x) \, \ln f(x) \, dx = -E_f[\ln f(X)].
      The Monte Carlo (MC) method provides an unbiased and consistent estimate of the entropy,
      \hat{H}(X) = -\frac{1}{n} \sum_{i=1}^{n} \ln f(X_i),
      where X_1, ..., X_n are the design points. Since the density f is unknown, it is replaced by its kernel density estimate (Ahmad and Lin, 1976).
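As a sanity check of the estimator itself (with f known, before the kernel estimate is introduced), a small sketch; the Beta(2,2) example is purely illustrative and not from the slides.

```python
import numpy as np
from scipy.stats import beta

# Monte Carlo entropy estimate H_hat = -(1/n) * sum_i ln f(X_i),
# checked on a Beta(2,2) density whose differential entropy is known.
rng = np.random.default_rng(0)
x = rng.beta(2, 2, size=10_000)
h_hat = -np.mean(np.log(beta.pdf(x, 2, 2)))
print(h_hat)   # close to the exact value 5/3 - ln(6) ≈ -0.125
```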

  11. Estimation by a Monte Carlo method
      Joe (1989) obtained asymptotic bias and variance terms for the estimator
      \hat{H}(X) = -\frac{1}{n} \sum_{i=1}^{n} \ln \hat{f}(X_i),
      where \hat{f} is the kernel estimate
      \hat{f}(x) = \frac{1}{n h^d} \sum_{i=1}^{n} K\!\left( \frac{x - X_i}{h} \right), \quad \forall x \in [0,1]^d.
      The bias depends on the size n, the dimension d and the bandwidth h; since these are held fixed during the exchange algorithm, the bias is fixed as well.

  12. The kernel density estimation: the bandwidth
      The bandwidth h plays an important role in the estimation (compare h = 0.4 and h = 0.1 in the figure). Scott's rule, applied with the standard deviation of the uniform distribution, gives
      \hat{h}_j = \hat{\sigma}_j \, n^{-1/(d+4)} = \frac{1}{\sqrt{12}} \, n^{-1/(d+4)}, \quad j = 1, \ldots, d.
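A direct transcription of this rule, assuming as on the slide that each coordinate is uniform on [0,1] so that \hat{\sigma}_j = 1/\sqrt{12}:

```python
import numpy as np

def scott_bandwidth(n, d):
    """Scott's rule bandwidth for a design of n points in [0,1]^d,
    using the standard deviation of the uniform distribution, 1/sqrt(12)."""
    return n ** (-1.0 / (d + 4)) / np.sqrt(12.0)

print(scott_bandwidth(100, 10))   # bandwidth for the d = 10, n = 100 case
```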

  13. The kernel density estimation: the kernel
      The choice of the kernel function is much less important than that of the bandwidth. A multidimensional Gaussian kernel is used,
      K(z) = \frac{(2\pi)^{-d/2}}{s^d} \exp\!\left( -\frac{\|z\|^2}{2 s^2} \right), \quad z = \frac{X_i - X_j}{h}, \quad i, j = 1, \ldots, n,
      and \|z\|^2 ranges over [0, d/h^2] (for d = 10 and n = 100, \|z\|^2 \in [0, 231.7]). Kernel functions with bounded support (Epanechnikov, uniform, ...) are therefore not desirable.
      Remark: \hat{f} is no longer supported by [0,1]^d.
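Putting the last three slides together, here is a compact sketch of the kernel-based entropy estimate (a possible implementation, not the authors' code; the kernel scale s is set to 1). It can be passed directly as the `entropy_estimate` callable of the exchange-algorithm sketch given after slide 8.

```python
import numpy as np

def kde_entropy(design, s=1.0):
    """Monte Carlo entropy estimate H_hat = -(1/n) * sum_i ln f_hat(X_i),
    where f_hat is a Gaussian kernel density estimate evaluated at the design
    points, with Scott's-rule bandwidth h = n**(-1/(d+4)) / sqrt(12)."""
    n, d = design.shape
    h = n ** (-1.0 / (d + 4)) / np.sqrt(12.0)
    # squared norms ||(X_i - X_j) / h||^2 for all pairs of design points
    diff = (design[:, None, :] - design[None, :, :]) / h
    z2 = np.sum(diff ** 2, axis=-1)
    kernel = np.exp(-z2 / (2.0 * s ** 2)) / ((2.0 * np.pi) ** (d / 2) * s ** d)
    f_hat = kernel.sum(axis=1) / (n * h ** d)     # f_hat(X_i), i = 1, ..., n
    return -np.mean(np.log(f_hat))

rng = np.random.default_rng(0)
print(kde_entropy(rng.random((100, 10))))         # entropy of a random design
```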

  14. Estimation by a Monte Carlo method: convergence
      [Figure, d = 3: on the left, the entropy estimate versus the design size n (up to n = 3000); on the right, the entropy versus the number of exchanges for n = 30.]
      The entropy estimation converges slowly towards 0, while the exchange algorithm converges rapidly.

  15. Design comparison (section divider)

  16. Design comparison: improvement of the initial setting
      [Figure, d = 2 and n = 20: two initial designs and the corresponding final KL designs in [0,1]^2.]
      • The final design is quasi-independent of the initial setting.
      • The algorithm converges towards a quasi-periodical distribution of the points.

  17. Design comparison: projections
      The design points generally lie on the boundary of the design space, especially for a small size n.
      [Figure, d = 10 and n = 100: projection of the design onto the 2D plane (X1, X2) and projections onto each of the 10 axes.]

  18. Usual space-filling designs
      • The maximin criterion (Maximin) maximizes the minimal distance between the design points (Johnson et al., 1990): \min_{1 \le i < j \le n} d(x_i, x_j).
      • The entropy criterion (Dmax) maximizes the determinant of a covariance matrix (Shewry & Wynn, 1987), with correlation R(x_i, x_j) = \exp\!\left( -\sum_{k=1}^{d} \theta_k \, | x_i^k - x_j^k |^p \right).
      • Two kinds of designs are based on the analogy of minimizing forces between charged particles: the Audze-Eglais (1977) criterion (AE) minimizes \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} \frac{1}{d(x_i, x_j)^2}, and Strauss designs (Strauss) are built with an MCMC method (Franco, 2008). (Sketches of the Maximin and AE criteria follow below.)
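Minimal transcriptions of the two distance-based criteria just listed, using the formulas as written above (illustrative helper names, not the authors' code):

```python
import numpy as np
from scipy.spatial.distance import pdist

def maximin(design):
    """Maximin criterion: minimal pairwise distance between design points (to be maximized)."""
    return pdist(design).min()

def audze_eglais(design):
    """Audze-Eglais criterion: sum over pairs of 1 / d(x_i, x_j)^2 (to be minimized)."""
    return np.sum(1.0 / pdist(design) ** 2)

rng = np.random.default_rng(0)
x = rng.random((100, 10))                  # a random design with d = 10, n = 100
print(maximin(x), audze_eglais(x))
```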

  19. Design comparison: usual criteria (d = 10 and n = 100)
      Distance criteria quantify how the points fill up the space.
      [Figure: boxplots of the cover measure (Cov) and of the maximin distance for the Maximin, KL, Dmax, Strauss and AE designs.]
      • The cover measure calculates the difference between the design and a uniform mesh (to be minimized).
      • The Maximin criterion is the minimal distance between the design points (to be maximized).
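The slide does not give the cover-measure formula; a commonly used version (the coverage measure of Gunzburger and Burkardt, assumed here) is the coefficient of variation of the nearest-neighbour distances:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def cover_measure(design):
    """Coverage measure: coefficient of variation of the nearest-neighbour
    distances gamma_i = min_{j != i} d(x_i, x_j). It is zero for a regular
    mesh, and small values indicate a quasi-uniform spread (to be minimized)."""
    dist = squareform(pdist(design))
    np.fill_diagonal(dist, np.inf)         # ignore the zero self-distances
    gamma = dist.min(axis=1)
    return gamma.std() / gamma.mean()

rng = np.random.default_rng(0)
print(cover_measure(rng.random((100, 10))))
```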

  20. Design comparison: usual criteria (d = 10 and n = 100)
      Uniformity criteria measure how close the points are to being uniformly distributed.
      [Figure: boxplots of the L2 discrepancy (DL2) and of the KL criterion for the Maximin, KL, Dmax, Strauss and AE designs, with a zoom on the KL values around -2.831.]
      • The discrepancy measures the difference between the empirical cumulative distribution of the design points and the uniform one (to be minimized).
      • The KL criterion, evaluated through the entropy estimate, is to be maximized.
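The exact variant behind the DL2 label is not specified on the slide; as an illustration, here is the classical L2-star discrepancy, which admits the closed-form Warnock expression implemented below (an assumption, not necessarily the discrepancy used by the authors):

```python
import numpy as np

def l2_star_discrepancy(design):
    """L2-star discrepancy via Warnock's closed-form expression:
    D^2 = 3^(-d) - 2^(1-d)/n * sum_i prod_k (1 - x_ik^2)
          + 1/n^2 * sum_{i,j} prod_k (1 - max(x_ik, x_jk))."""
    n, d = design.shape
    term1 = 3.0 ** (-d)
    term2 = (2.0 ** (1 - d) / n) * np.sum(np.prod(1.0 - design ** 2, axis=1))
    pair_max = np.maximum(design[:, None, :], design[None, :, :])
    term3 = np.sum(np.prod(1.0 - pair_max, axis=-1)) / n ** 2
    return np.sqrt(term1 - term2 + term3)

rng = np.random.default_rng(0)
print(l2_star_discrepancy(rng.random((100, 10))))
```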

  21. Conclusion (section divider)

  22. Conclusion
      Results
      • The KL criterion spreads the points evenly throughout the unit cube.
      • The KL designs outperform the usual space-filling designs.
      Outlooks
      • Estimation based on the nearest-neighbour distances (gains in CPU time and in handling the support of f); see the sketch below.
      • Construction of optimal Latin hypercubes (to improve the projection properties).
      • Tsallis entropy (analytic expression), Rényi entropy (estimated by a minimal spanning tree, MST).
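As an illustration of the nearest-neighbour route mentioned in the outlook, one common form of the Kozachenko-Leonenko entropy estimator is sketched below. The additive constants vary between references, so this should be read as an assumption about the estimator, not as the authors' choice.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gammaln

def nn_entropy(design):
    """Kozachenko-Leonenko nearest-neighbour entropy estimate (one common form):
    H_hat = (d/n) * sum_i ln(rho_i) + ln(V_d) + ln(n - 1) + euler_gamma,
    where rho_i is the distance from X_i to its nearest neighbour and
    V_d = pi^(d/2) / Gamma(d/2 + 1) is the volume of the unit d-ball."""
    n, d = design.shape
    rho = cKDTree(design).query(design, k=2)[0][:, 1]   # nearest-neighbour distances
    log_vd = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    return d * np.mean(np.log(rho)) + log_vd + np.log(n - 1) + np.euler_gamma

rng = np.random.default_rng(0)
print(nn_entropy(rng.random((100, 10))))
```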
