Ionic optimisation




  1. Ionic optimisation
Georg Kresse
Institut für Materialphysik and Center for Computational Material Science
Universität Wien, Sensengasse 8, A-1090 Wien, Austria
VASP (Vienna Ab-initio Simulation Package)

  2. Overview
– the mathematical problem
  – minimisation of functions
  – role of the Hessian matrix
  – how to overcome slow convergence
– the three implemented algorithms
  – Quasi-Newton (DIIS)
  – conjugate gradient (CG)
  – damped MD
– strengths and weaknesses
– a little bit on molecular dynamics

  3. The mathematical problem
– search for a local minimum of a function f(x)
– for simplicity we will consider a simple quadratic function
    f(x) = a + b·x + 1/2 x·B·x = ā + 1/2 (x - x_0)·B·(x - x_0)
  where B is the Hessian matrix
    B_ij = ∂²f / ∂x_i ∂x_j
– for a stationary point one requires
    g(x) = ∂f/∂x = B (x - x_0) = 0,   i.e.   g_i(x) = ∂f/∂x_i = ∑_j B_ij (x_j - x_j^0) = 0
– at the minimum the Hessian matrix must additionally be positive definite
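To make the notation concrete, here is a minimal numerical sketch of such a quadratic model (the 2x2 Hessian, the minimum position and the constant are made-up values; Python/NumPy is used throughout these sketches and is not part of the slides):

```python
import numpy as np

# hypothetical model: f(x) = a + 1/2 (x - x0).B.(x - x0)
B  = np.array([[4.0, 1.0],
               [1.0, 2.0]])          # Hessian (symmetric, positive definite)
x0 = np.array([1.0, -0.5])           # location of the minimum
a  = 3.0

def f(x):
    d = x - x0
    return a + 0.5 * d @ B @ d

def g(x):
    # gradient g(x) = B (x - x0); it vanishes at the stationary point x0
    return B @ (x - x0)

print(g(x0))                              # [0. 0.]
print(np.all(np.linalg.eigvalsh(B) > 0))  # True: B is positive definite
```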

  4. The Newton algorithm (educational example)
– start with an arbitrary start point x_1
– calculate the gradient g(x_1), multiply with the inverse of the Hessian matrix and perform a step
    x_2 = x_1 - B⁻¹ g(x_1)
– by inserting g(x_1) = ∂f/∂x|_(x_1) = B (x_1 - x_0), one immediately recognises that x_2 = x_0,
  hence one can find the minimum in one step
– in practice, the calculation of B is not possible in a reasonable time-span, and one needs to replace B by some reasonable approximation
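For the quadratic model a single Newton step indeed lands on x_0 from any starting point. A minimal sketch, again with a made-up Hessian:

```python
import numpy as np

B  = np.array([[4.0, 1.0], [1.0, 2.0]])  # made-up Hessian
x0 = np.array([1.0, -0.5])               # minimum of the quadratic
g  = lambda x: B @ (x - x0)              # gradient

x1 = np.array([10.0, 10.0])              # arbitrary start point
x2 = x1 - np.linalg.solve(B, g(x1))      # x2 = x1 - B^-1 g(x1)
print(np.allclose(x2, x0))               # True: minimum found in one step
```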

  5. Steepest descent
– approximate B by the largest eigenvalue Γ_max of the Hessian matrix
– steepest descent algorithm (Jacobi algorithm for linear equations):
  1. initial guess x_1
  2. calculate the gradient g(x_1)
  3. make a step into the direction of steepest descent
       x_2 = x_1 - (1/Γ_max) g(x_1)
  4. repeat steps 2 and 3 until convergence is reached
– for functions with long steep valleys (Γ_max ≫ Γ_min) convergence can be very slow
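A sketch of this iteration with the fixed step 1/Γ_max; the diagonal Hessian with a large Γ_max/Γ_min ratio is made up to exhibit the slow convergence in a long, steep valley:

```python
import numpy as np

B    = np.diag([100.0, 1.0])      # long steep valley: Gamma_max/Gamma_min = 100
g    = lambda x: B @ x            # quadratic with minimum at the origin
gmax = np.linalg.eigvalsh(B).max()

x = np.array([1.0, 1.0])          # initial guess
for step in range(5000):
    if np.linalg.norm(g(x)) < 1e-6:
        break
    x = x - g(x) / gmax           # step along the steepest descent direction
print(step, x)                    # on the order of a thousand steps: the soft mode dominates
```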

  6. Speed of convergence
– how many steps are required to converge to a predefined accuracy?
– assume that B is diagonal with eigenvalues Γ_1 > Γ_2 > … > Γ_n, and start from
    x_1 - x_0 = (1, …, 1)
– the gradient is then
    g(x_1) = B (x_1 - x_0) = (Γ_1, …, Γ_n)
– the components of x_2 - x_0 after one steepest descent step, x_2 = x_1 - (1/Γ_1) g(x_1), are
    (x_2 - x_0)_i = 1 - Γ_i / Γ_1
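A toy check of this statement with made-up eigenvalues: after one step of length 1/Γ_1 each error component is multiplied by exactly 1 - Γ_i/Γ_1:

```python
import numpy as np

gammas = np.array([10.0, 4.0, 1.0, 0.1])   # made-up eigenvalues Gamma_1 > ... > Gamma_n
err1   = np.ones_like(gammas)              # x_1 - x_0 = (1, ..., 1)
grad   = gammas * err1                     # g(x_1) = B (x_1 - x_0) for diagonal B
err2   = err1 - grad / gammas.max()        # one steepest descent step with 1/Gamma_1

print(err2)                                # [0.   0.6  0.9  0.99]
print(1.0 - gammas / gammas.max())         # the same: reduction factor per component
```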

  7. Convergence
– the error reduction in component i is given by the factor 1 - Γ_i/Γ_max
– the error is reduced for each component
– in the highest frequency component the error vanishes after one step
– for the low frequency components the reduction is smallest
[figure: reduction factors 1-Γ/Γ_max and 1-2Γ/Γ_max as a function of Γ for eigenvalues Γ_1, …, Γ_5]

  8.
– the derivation is also true for non-diagonal matrices; in this case the eigenvalues of the Hessian matrix are relevant
– for ionic relaxation, the eigenvalues of the Hessian matrix correspond to the vibrational frequencies of the system
– the highest frequency mode determines the maximum stable step-width ("hard modes limit the step-size"), but the soft modes converge slowest
– to reduce the error in all components to a predefined fraction ε, k iterations are required:
    (1 - Γ_min/Γ_max)^k = ε
    k ln(1 - Γ_min/Γ_max) = ln ε
    k ≈ -(Γ_max/Γ_min) ln ε   for Γ_min ≪ Γ_max,   i.e.   k ∝ Γ_max/Γ_min
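A quick numerical check of this estimate (the eigenvalue spectrum and the target ε are made up): the exact count from (1 - Γ_min/Γ_max)^k = ε and the leading-order estimate k ≈ -(Γ_max/Γ_min) ln ε nearly coincide:

```python
import numpy as np

gamma_max, gamma_min = 100.0, 1.0          # made-up eigenvalue spectrum
eps = 1e-3                                 # target error fraction

k_exact  = np.log(eps) / np.log(1.0 - gamma_min / gamma_max)
k_approx = -(gamma_max / gamma_min) * np.log(eps)
print(round(k_exact), round(k_approx))     # approximately 687 vs 691
```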

  9. Pre-conditioning
– if an approximation P of the inverse Hessian matrix B⁻¹ is known, the convergence speed can be much improved:
    x_(N+1) = x_N - λ P g(x_N)
– in this case the convergence speed depends on the eigenvalue spectrum of PB
– the Newton algorithm is obtained for P = B⁻¹
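A sketch of the preconditioned step; taking P as the inverse of the diagonal of a made-up Hessian is a cheap illustrative choice of this example, not something prescribed by the slides:

```python
import numpy as np

B   = np.array([[100.0, 2.0],
                [  2.0, 1.0]])    # made-up ill-conditioned Hessian
g   = lambda x: B @ x             # quadratic with minimum at the origin
P   = np.diag(1.0 / np.diag(B))   # cheap approximation of the inverse Hessian
lam = 1.0                         # step size lambda

x = np.array([1.0, 1.0])
for step in range(1000):
    if np.linalg.norm(g(x)) < 1e-6:
        break
    x = x - lam * P @ g(x)        # x_{N+1} = x_N - lambda P g(x_N)
print(step, x)                    # converges in a handful of steps here
```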

  10. Variable-metric schemes, Quasi-Newton scheme
– variable-metric schemes maintain an iteration history
– they construct an implicit or explicit approximation B⁻¹_approx of the inverse Hessian matrix
– search directions are given by
    Δx = -B⁻¹_approx g(x)
– the asymptotic convergence rate is given by
    number of iterations ∝ √(Γ_max / Γ_min)
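The classic textbook realisation of such a scheme is the BFGS update of the inverse Hessian. The sketch below applies that generic update to the quadratic model, with an exact line minimisation that exploits the linearity of the gradient; it illustrates the idea of building B⁻¹_approx from the iteration history and is not the specific quasi-Newton update implemented in VASP:

```python
import numpy as np

def quasi_newton(g, x, max_steps=20, tol=1e-10):
    """Variable-metric sketch: build up an approximation of the inverse Hessian
    from the history of positions and gradients (BFGS update)."""
    n = len(x)
    H = np.eye(n)                            # initial guess for the inverse Hessian
    grad = g(x)
    for _ in range(max_steps):
        if np.linalg.norm(grad) < tol:
            break
        d = -H @ grad                        # search direction -B_approx^-1 g
        t = -(grad @ d) / (d @ (g(x + d) - grad))   # exact line minimum (quadratic f)
        x_new = x + t * d
        grad_new = g(x_new)
        s, y = x_new - x, grad_new - grad
        rho = 1.0 / (y @ s)
        I = np.eye(n)
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)           # BFGS update of the inverse Hessian
        x, grad = x_new, grad_new
    return x

B = np.array([[100.0, 2.0], [2.0, 1.0]])     # made-up Hessian, minimum at the origin
print(quasi_newton(lambda x: B @ x, np.array([1.0, 1.0])))   # close to [0, 0]
```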

  11. Simple Quasi-Newton scheme, DIIS
– direct inversion in the iterative subspace (DIIS)
– given a set of points x_i and their gradients g_i = g(x_i), i = 1, …, N
– search for the linear combination of the x_i which minimises the gradient, under the constraint ∑_i α_i = 1:
    g(∑_i α_i x_i) = B (∑_i α_i x_i - x_0) = ∑_i α_i B (x_i - x_0) = ∑_i α_i g_i
– the gradient is linear in its arguments for a quadratic function

  12. Full DIIS algorithm
1. start from a single initial point x_1
2. calculate the gradient g_1 = g(x_1) and move along it (steepest descent):
     x_2 = x_1 - λ g_1
3. calculate the new gradient g_2 = g(x_2)
4. search in the space spanned by the g_i, i = 1, …, N, for the minimal gradient
     g_opt = ∑_i α_i g_i
   and calculate the corresponding position
     x_opt = ∑_i α_i x_i
5. construct a new point x_3 by moving from x_opt along g_opt:
     x_3 = x_opt - λ g_opt
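A sketch of this loop on the quadratic model; the step size λ, the made-up Hessian, and the use of a bordered least-squares system to solve the constrained minimisation of |∑ α_i g_i| are illustrative choices of this example:

```python
import numpy as np

def diis_mix(xs, gs):
    """Minimise |sum_i a_i g_i|^2 subject to sum_i a_i = 1 (Lagrange multiplier);
    return the mixed position x_opt and gradient g_opt."""
    n = len(gs)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = [[gi @ gj for gj in gs] for gi in gs]
    A[:n, n] = A[n, :n] = 1.0
    rhs = np.zeros(n + 1)
    rhs[n] = 1.0
    a = np.linalg.lstsq(A, rhs, rcond=None)[0][:n]
    return (sum(ai * xi for ai, xi in zip(a, xs)),
            sum(ai * gi for ai, gi in zip(a, gs)))

B   = np.array([[10.0, 2.0], [2.0, 1.0]])   # made-up Hessian, minimum at the origin
g   = lambda x: B @ x
lam = 0.05                                  # illustrative step size

xs = [np.array([1.0, 1.0])]                 # 1. single initial point
gs = [g(xs[0])]
xs.append(xs[0] - lam * gs[0])              # 2. steepest descent step
for _ in range(10):
    gs.append(g(xs[-1]))                    # 3. new gradient
    if np.linalg.norm(gs[-1]) < 1e-6:
        break
    x_opt, g_opt = diis_mix(xs, gs)         # 4. minimal gradient in the subspace
    xs.append(x_opt - lam * g_opt)          # 5. new point from x_opt along g_opt
print(xs[-1])                               # close to the minimum at the origin
```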

  13. DIIS illustrated in two dimensions
1. steepest descent step from x_0 to x_1 (arrows correspond to the gradients g_0 and g_1)
2. the gradient along the indicated red line is now known: determine the optimal position x_opt^1
3. another steepest descent step from x_opt^1 along g_opt^1
4. calculate the gradient g(x_2); now the gradient is known in the entire two-dimensional space (linearity condition) and the function can be minimised exactly
[figure: points x_0, x_1, x_opt^1, x_2 and the line a_0 x^0 + a_1 x^1 with a_0 + a_1 = 1]

  14. Conjugate gradient
– the first step is a steepest descent step with line minimisation
– subsequent search directions are "conjugated" to the previous search directions:
  1. calculate the gradient g(x_N) at the current position
  2. conjugate this gradient to the previous search direction using
       s_N = g(x_N) + γ s_(N-1),   γ = (g(x_N) - g(x_(N-1)))·g(x_N) / (g(x_(N-1))·g(x_(N-1)))
  3. perform a line minimisation along this search direction s_N
  4. continue with step 1 if the gradient is not sufficiently small
– the search directions satisfy
    s_N·B·s_M = 0   for N ≠ M
– the conjugate gradient algorithm finds the minimum of a quadratic function with k degrees of freedom in k + 1 steps exactly
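A sketch of the conjugate gradient loop on the quadratic model, using the convention s_N = g(x_N) + γ s_(N-1) with the γ given above; the exact line minimisation again exploits the linearity of the gradient, and the Hessian is made up:

```python
import numpy as np

B = np.array([[100.0, 2.0], [2.0, 1.0]])   # made-up Hessian, minimum at the origin
g = lambda x: B @ x

x = np.array([1.0, 1.0])
grad, grad_old, s = g(x), None, None
for step in range(10):
    if np.linalg.norm(grad) < 1e-8:
        break
    if s is None:
        s = grad                           # first step: along the gradient
    else:
        gamma = (grad - grad_old) @ grad / (grad_old @ grad_old)
        s = grad + gamma * s               # conjugate to the previous direction
    t = -(grad @ s) / (s @ (g(x + s) - grad))   # exact line minimum along s
    x = x + t * s
    grad_old, grad = grad, g(x)
print(step, x)                             # the 2d quadratic is minimised after 2 line minimisations
```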

  15. Conjugate gradient illustrated in two dimensions
1. steepest descent step from x_0: search for the minimum along g_0 by performing several trial steps (crosses; at least one trial step is required)
2. determine the new gradient g_1 = g(x_1) and conjugate it to obtain s_1 (green arrow); for 2d functions the conjugated direction now points directly to the minimum
3. minimisation along the search direction s_1
[figure: trial steps (crosses) along g_0 from x_0; from x_1 the conjugated direction s_1 leads to the minimum x_2]

  16. Asymptotic convergence rate
– the asymptotic convergence rate is the convergence behaviour for the case that the number of degrees of freedom is much larger than the number of steps, e.g. 100 degrees of freedom but only 10-20 steps performed
– how quickly do the forces decrease? this depends entirely on the eigenvalue spectrum of the Hessian matrix:
  – steepest descent: Γ_max/Γ_min steps are required to reduce the forces to a fraction ε
  – DIIS, CG, damped MD: √(Γ_max/Γ_min) steps are required to reduce the forces to a fraction ε
– Γ_max and Γ_min are the maximum and minimum eigenvalues of the Hessian matrix
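A back-of-the-envelope comparison of the two scalings, keeping the ln ε factor from the earlier estimate (the condition number and the tolerance are made up):

```python
import numpy as np

ratio = 400.0            # hypothetical Gamma_max / Gamma_min
eps   = 1e-3             # desired reduction of the forces

steepest_descent = ratio * abs(np.log(eps))            # ~ Gamma_max/Gamma_min
diis_cg_damped   = np.sqrt(ratio) * abs(np.log(eps))   # ~ sqrt(Gamma_max/Gamma_min)
print(round(steepest_descent), round(diis_cg_damped))  # roughly 2763 vs 138
```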
