Solving large scale eigenvalue problems




  1. Solving large scale eigenvalue problems
     Lecture 9, April 25, 2018: Lanczos and Arnoldi methods
     http://people.inf.ethz.ch/arbenz/ewp/
     Peter Arbenz, Computer Science Department, ETH Zürich
     E-mail: arbenz@inf.ethz.ch
     Large scale eigenvalue problems, Lecture 9, April 25, 2018 — 1/44

  2. Survey of today's lecture
     We continue with the Arnoldi algorithm and its 'symmetric cousin', the Lanczos algorithm.
     ◮ The Lanczos algorithm and its deficiencies
     ◮ Loss of orthogonality
     ◮ Limiting the memory consumption of Arnoldi: restarting Lanczos/Arnoldi algorithms

  3. Reminder: the Arnoldi algorithm
     ◮ The Arnoldi algorithm constructs orthonormal bases for the Krylov spaces
           K_j(x) = K_j(x, A) := R([x, A x, ..., A^{j−1} x]),  [x, A x, ..., A^{j−1} x] ∈ R^{n×j},  j = 1, 2, ...
     ◮ These bases are nested.
     ◮ Let {v_1, ..., v_j} be an orthonormal basis of K_j(x, A). We obtain v_{j+1} by orthogonalizing A v_j against {v_1, ..., v_j}:
           r_j = A v_j − V_j V_j^* A v_j = A v_j − Σ_{i=1}^{j} v_i (v_i^* A v_j),
           v_{j+1} = r_j / ‖r_j‖.
     ◮ This is the Gram–Schmidt orthogonalization procedure.

  4. Arnoldi algorithm
      1: Let A ∈ R^{n×n}. This algorithm computes an orthonormal basis for K_j(x).
      2: v_1 = x / ‖x‖_2;
      3: for j = 1, 2, ... do
      4:   r_j := A v_j;
      5:   for i = 1, ..., j do  {Gram–Schmidt orthogonalization}
      6:     h_{ij} := v_i^* r_j;  r_j := r_j − v_i h_{ij};
      7:   end for
      8:   h_{j+1,j} := ‖r_j‖;
      9:   if h_{j+1,j} = 0 then  {found an invariant subspace}
     10:     return (v_1, ..., v_j, H ∈ R^{j×j})
     11:   end if
     12:   v_{j+1} = r_j / h_{j+1,j};
     13: end for
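The pseudocode above translates almost line for line into NumPy. The following is an illustrative sketch, not the lecture's reference code; the function name and array layout are my own choices:

```python
import numpy as np

def arnoldi(A, x, maxit):
    """Arnoldi iteration: builds an orthonormal basis V of the Krylov space
    K_maxit(x, A) and the Hessenberg matrix H of recurrence coefficients,
    so that A V[:, :j] = V[:, :j+1] @ H[:j+1, :j]."""
    n = A.shape[0]
    V = np.zeros((n, maxit + 1))
    H = np.zeros((maxit + 1, maxit))
    V[:, 0] = x / np.linalg.norm(x)
    for j in range(maxit):
        r = A @ V[:, j]
        for i in range(j + 1):             # Gram-Schmidt against v_1, ..., v_j
            H[i, j] = V[:, i] @ r
            r = r - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(r)
        if H[j + 1, j] == 0.0:             # found an invariant subspace
            return V[:, :j + 1], H[:j + 1, :j + 1]
        V[:, j + 1] = r / H[j + 1, j]
    return V, H
```

The eigenvalues of the leading j×j block of H (the Ritz values) then approximate eigenvalues of A. In practice the Gram–Schmidt loop is often run a second time (reorthogonalization) to combat the loss of orthogonality mentioned in the survey.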

  5. Lanczos algorithm = Arnoldi + symmetry
     Let A be symmetric, A = A^*. In the Arnoldi algorithm we form
           r_j = A v_j − Σ_{i=1}^{j} v_i (v_i^* A v_j).
     Since v_i^* A v_j = (A v_i)^* v_j and A v_i ∈ K_{i+1}(x), we have
           A v_i ⊥ v_j for i + 1 < j  ⟹  v_i^* A v_j = 0 for i + 1 < j.
     Thus,
           r_j = A v_j − v_j (v_j^* A v_j) − v_{j−1} (v_{j−1}^* A v_j) =: A v_j − v_j α_j − v_{j−1} β_{j−1}.

  6. Lanczos algorithm = Arnoldi + symmetry (cont.)
           ‖r_j‖ = v_{j+1}^* r_j = v_{j+1}^* (A v_j − α_j v_j − β_{j−1} v_{j−1}) = v_{j+1}^* A v_j = β̄_j.
     From this it follows that β_j ∈ R. Therefore,
           β_j v_{j+1} = r_j,  β_j = ‖r_j‖.
     Altogether,
           A v_j = β_{j−1} v_{j−1} + α_j v_j + β_j v_{j+1}.

  7. Lanczos algorithm = Arnoldi + symmetry (cont.)
     Gathering these equations for j = 1, ..., k we get

           A V_k = V_k T_k + β_k [0, ..., 0, v_{k+1}],

     with the real symmetric tridiagonal matrix

                  ⎡ α_1  β_1                      ⎤
                  ⎢ β_1  α_2  β_2                 ⎥
           T_k =  ⎢      β_2  α_3    ⋱            ⎥  ∈ R^{k×k}.
                  ⎢            ⋱     ⋱    β_{k−1} ⎥
                  ⎣                 β_{k−1}   α_k ⎦

     The equation above is called the Lanczos relation.

  8. Lanczos algorithm
      1: Let A ∈ R^{n×n} be symmetric. This algorithm computes an orthonormal basis V_m = [v_1, ..., v_m] for K_m(x), where m is the smallest index such that K_m(x) = K_{m+1}(x), and the matrix T_m.
      2: v := x / ‖x‖;  V_1 = [v];
      3: r := A v;
      4: α_1 := v^* r;  r := r − α_1 v;
      5: β_1 := ‖r‖;
      6: for j = 2, 3, ... do
      7:   q = v;  v := r / β_{j−1};  V_j := [V_{j−1}, v];
      8:   r := A v − β_{j−1} q;
      9:   α_j := v^* r;  r := r − α_j v;
     10:   β_j := ‖r‖;
     11:   if β_j = 0 then
     12:     return (V ∈ R^{n×j}; α_1, ..., α_j; β_1, ..., β_{j−1})
     13:   end if
     14: end for
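For symmetric A the Arnoldi loop body thus collapses to a three-term recurrence. A minimal NumPy sketch; the function name and the small breakdown tolerance are my own (the pseudocode above tests β_j = 0 exactly):

```python
import numpy as np

def lanczos(A, x, maxit):
    """Lanczos recurrence for symmetric A. Returns the basis V, the
    coefficients alpha (diagonal of T) and beta (off-diagonal of T,
    with a trailing entry beta_j = ||r_j|| serving as residual norm)."""
    v = x / np.linalg.norm(x)
    V = [v]
    r = A @ v
    alpha = [v @ r]
    r = r - alpha[0] * v
    beta = [np.linalg.norm(r)]
    for j in range(1, maxit):
        if beta[-1] < 1e-14:               # invariant subspace found
            break
        q, v = v, r / beta[-1]             # only q, v, r are live vectors
        V.append(v)
        r = A @ v - beta[-1] * q
        alpha.append(v @ r)
        r = r - alpha[-1] * v
        beta.append(np.linalg.norm(r))
    return np.array(V).T, np.array(alpha), np.array(beta)
```

Only the three vectors q, v, r are needed if T_m alone is wanted; V is accumulated here so that Ritz vectors V_m s_i can be formed afterwards.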

  9. Discussion of the Lanczos algorithm
     ◮ The Lanczos algorithm needs just three vectors to compute T_m.
     ◮ The cost of an iteration step j does not depend on the index j.
     ◮ The storage requirement depends on j.
     ◮ Remark relevant for very large eigenvalue problems: from A V_m = V_m T_m and T_m s_i^{(m)} = ϑ_i^{(m)} s_i^{(m)} we have
           A y_i^{(m)} = ϑ_i^{(m)} y_i^{(m)},  y_i^{(m)} = V_m s_i^{(m)}.
     ◮ In general m is very large. We do not want to go so far. When should we stop?
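The lifting y_i^{(m)} = V_m s_i^{(m)} from the small tridiagonal eigenproblem back to R^n can be sketched as follows. This assumes alpha and beta hold the Lanczos coefficients, with beta possibly carrying a trailing residual norm; these conventions are mine, not the lecture's:

```python
import numpy as np

def ritz_pairs(V, alpha, beta):
    """Form the Ritz pairs (theta_i, y_i = V_m s_i) from the tridiagonal
    T_m assembled out of the Lanczos coefficients alpha and beta."""
    m = len(alpha)
    T = np.diag(alpha) + np.diag(beta[:m - 1], 1) + np.diag(beta[:m - 1], -1)
    theta, S = np.linalg.eigh(T)           # small m-by-m eigenproblem
    return theta, V[:, :m] @ S             # columns of Y are the Ritz vectors
```

The whole point of the method is that the eigenproblem solved here is m×m, independent of the (possibly huge) dimension n.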

  10. The Lanczos process as an iterative method
      ◮ We have seen earlier that eigenvalues at the ends of the spectrum are approximated very quickly in Krylov spaces.
      ◮ Thus, only very few iteration steps may be required to get those eigenvalues (and corresponding eigenvectors) to the desired accuracy.
      ◮ Can we check this? Can we check whether |ϑ_i^{(j)} − λ_i| is small?

      Lemma (Eigenvalue inclusion of Krylov–Bogoliubov)
      Let A ∈ R^{n×n} be symmetric. Let ϑ ∈ R and x ∈ R^n with x ≠ 0 be arbitrary. Set τ := ‖(A − ϑ I) x‖ / ‖x‖. Then there is an eigenvalue of A in the interval [ϑ − τ, ϑ + τ].
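The Lemma is easy to test numerically. An illustrative check with a made-up small symmetric matrix, using the Rayleigh quotient of an arbitrary test vector as ϑ:

```python
import numpy as np

def kb_interval(A, theta, x):
    """Krylov-Bogoliubov: returns (lo, hi) such that the interval
    [lo, hi] = [theta - tau, theta + tau] contains an eigenvalue of the
    symmetric matrix A, where tau = ||(A - theta I) x|| / ||x||."""
    tau = np.linalg.norm(A @ x - theta * x) / np.linalg.norm(x)
    return theta - tau, theta + tau

# Example data (made up for illustration)
A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 4.]])
x = np.ones(3)                       # arbitrary nonzero test vector
theta = (x @ A @ x) / (x @ x)        # Rayleigh quotient as the shift
lo, hi = kb_interval(A, theta, x)
lam = np.linalg.eigvalsh(A)
assert np.any((lam >= lo) & (lam <= hi))   # an eigenvalue lies in the interval
```

Note that the interval can be quite wide for a poor x; the point of the next slides is that Lanczos produces vectors for which τ is cheap to evaluate and eventually small.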

  11. The Lanczos process as an iterative method (cont.)
      Apply the Lemma with x = y_i^{(j)} = V_j s_i^{(j)} and ϑ = ϑ_i^{(j)}, a Ritz pair of the j-th step of the Lanczos algorithm, i.e., T_j s_i^{(j)} = ϑ_i^{(j)} s_i^{(j)}:

            ‖A y_i^{(j)} − ϑ_i^{(j)} y_i^{(j)}‖ = ‖A V_j s_i^{(j)} − ϑ_i^{(j)} V_j s_i^{(j)}‖
                                                = ‖(A V_j − V_j T_j) s_i^{(j)}‖          (Lanczos relation)
                                                = ‖β_j v_{j+1} e_j^* s_i^{(j)}‖ = |β_j| |e_j^* s_i^{(j)}| = |β_j| |s_{ji}^{(j)}|.

      Here s_{ji}^{(j)} is the j-th, i.e., the last, element of the eigenvector s_i of T_j.

      Exercise: Sketch an algorithm that computes the eigenvalues of a real symmetric tridiagonal matrix plus the last component of all its eigenvectors. This is the Golub–Welsch algorithm [3].
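Since the bound involves only the Ritz values and the last components s_{ji}^{(j)}, it can be evaluated from T_j alone, without touching A or V_j. A sketch using a dense symmetric eigensolver (for large j one would use a tridiagonal routine such as scipy.linalg.eigh_tridiagonal); the coefficient conventions are my own:

```python
import numpy as np

def ritz_residual_bounds(alpha, beta):
    """Given Lanczos coefficients alpha (length j) and beta (length j,
    with beta[-1] = ||r_j||), return the Ritz values theta_i of T_j and
    the residual bounds |beta_j| * |s_{ji}| from the last row of S."""
    j = len(alpha)
    T = np.diag(alpha) + np.diag(beta[:j - 1], 1) + np.diag(beta[:j - 1], -1)
    theta, S = np.linalg.eigh(T)          # T_j S = S diag(theta)
    return theta, abs(beta[-1]) * np.abs(S[-1, :])
```

Each bound beta[-1] * |S[-1, i]| is a radius around theta[i] that is guaranteed to contain an eigenvalue of A; a Ritz pair can be accepted as soon as this radius drops below the desired tolerance.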

  12. The Lanczos process as an iterative method (cont.)
      By the Lemma, there is an eigenvalue λ of A such that

            |λ − ϑ_i^{(j)}| ≤ β_j |s_{ji}^{(j)}|.   (1)

      It is possible to get good eigenvalue approximations even if β_j is not small! Further, it is also known that

            sin ∠(y_i^{(j)}, z) ≤ β_j |s_{ji}^{(j)}| / γ,   (2)

      where z is the eigenvector corresponding to λ in the Lemma and γ is the gap between λ and the next eigenvalue ≠ λ of A. γ may be estimated by |λ − ϑ_k^{(j)}|, k ≠ i.

  13. Numerical example
      Matrix: A = diag(0, 1, 2, 3, 4, 100000). Initial vector: x = (1, 1, 1, 1, 1, 1)^T / √6.
      ◮ The Lanczos algorithm should stop after m = n = 6 iteration steps with the complete Lanczos relation.
      ◮ Up to rounding error, we expect that β_6 = 0 and that the eigenvalues of T_6 are identical with those of A.
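The first steps of this example can be reproduced with the plain three-term recurrence (unmodified Lanczos, no reorthogonalization); the last digits may differ slightly from the slides depending on the floating-point environment:

```python
import numpy as np

A = np.diag([0., 1., 2., 3., 4., 100000.])
x = np.ones(6) / np.sqrt(6.)

# Plain Lanczos recurrence, collecting alpha_j and beta_j
v = x.copy()
alpha, beta = [], []
r = A @ v
a = v @ r
r = r - a * v
alpha.append(a)
beta.append(np.linalg.norm(r))
for j in range(1, 6):
    v_prev, v = v, r / beta[-1]
    r = A @ v - beta[-1] * v_prev
    a = v @ r
    r = r - a * v
    alpha.append(a)
    beta.append(np.linalg.norm(r))

print(alpha[0], beta[0])   # alpha_1 = 16668.333..., beta_1 = 37267.054...
```

Note that alpha_1 is just the Rayleigh quotient of x: (0 + 1 + 2 + 3 + 4 + 100000)/6 = 16668.333..., matching the value on the next slide.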

  14. Numerical example (cont.)
      j = 1:  α_1 = 16668.33333333334,  β_1 = 37267.05429136513.
      j = 2:  α_2 = 83333.66652666384,  β_2 = 3.464101610531258.
      The diagonal of the eigenvalue matrix Θ_2 is
            diag(Θ_2) = (1.999959999195565, 99999.99989999799)^T.
      The last row of β_2 S_2 is
            β_2 S_{2,:} = (1.414213562613906, 3.162277655014521).

  15. Numerical example (cont.)
      The matrix of Ritz vectors Y_2 = Q_2 S_2 is

            ⎡ −2.0000·10^−5    −0.44722 ⎤
            ⎢ −9.9998·10^−6    −0.44722 ⎥
            ⎢  4.0002·10^−10   −0.44721 ⎥
            ⎢  1.0001·10^−5    −0.44721 ⎥
            ⎢  2.0001·10^−5    −0.44720 ⎥
            ⎣  4.4723·10^−10    1.0000  ⎦

  16. Numerical example (cont.)
      j = 3:  α_3 = 2.000112002245340,  β_3 = 1.183215957295906.
      The diagonal of the eigenvalue matrix is
            diag(Θ_3) = (0.5857724375775532, 3.414199561869119, 99999.99999999999)^T.
      The largest eigenvalue has converged already. This is not surprising: even simple vector iteration would reduce its error by the factor λ_2/λ_1 = 4·10^−5 in each step.
      The last row of β_3 S_3 is
            β_3 S_{3,:} = (0.83665, 0.83667, 3.74173·10^−5).
