
Matrix Calculations: Diagonalisation, Orthogonality, and Applications (PowerPoint PPT presentation)



  1. Matrix Calculations: Diagonalisation, Orthogonality, and Applications
     A. Kissinger
     Institute for Computing and Information Sciences, Radboud University Nijmegen
     Version: autumn 2018

  2. Last time
     • Vectors look different in different bases, e.g. for:
       $B = \left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ -1 \end{pmatrix} \right\}, \qquad C = \left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ 2 \end{pmatrix} \right\}$
     • we have:
       $\begin{pmatrix} 1 \\ 0 \end{pmatrix}_S = \begin{pmatrix} 2 \\ -1 \end{pmatrix}_C = \begin{pmatrix} 1/2 \\ 1/2 \end{pmatrix}_B$

  3. Last time
     $B = \left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ -1 \end{pmatrix} \right\}, \qquad C = \left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ 2 \end{pmatrix} \right\}$
     • We can transform bases using basis transformation matrices. Going to the standard basis is easy (basis elements are the columns):
       $T_{B \Rightarrow S} = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} \qquad T_{C \Rightarrow S} = \begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix}$
     • ...coming back means taking the inverse:
       $T_{S \Rightarrow B} = (T_{B \Rightarrow S})^{-1} = \frac{1}{2} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$
       $T_{S \Rightarrow C} = (T_{C \Rightarrow S})^{-1} = \begin{pmatrix} 2 & -1 \\ -1 & 1 \end{pmatrix}$
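A quick numerical check of these numbers (our own sketch, not part of the slides), using numpy; variable names like T_BtoS are ours:

```python
import numpy as np

# Basis-change matrices to the standard basis: basis vectors as columns.
T_BtoS = np.array([[1.0,  1.0],
                   [1.0, -1.0]])
T_CtoS = np.array([[1.0,  1.0],
                   [1.0,  2.0]])

# Coming back means taking the inverse.
T_StoB = np.linalg.inv(T_BtoS)   # = (1/2) * [[1, 1], [1, -1]]
T_StoC = np.linalg.inv(T_CtoS)   # = [[2, -1], [-1, 1]]

# Recover the coordinates of (1, 0) from the previous slide:
v = np.array([1.0, 0.0])
print(T_StoC @ v)   # [ 2. -1.]  -> (2, -1) in C
print(T_StoB @ v)   # [0.5 0.5]  -> (1/2, 1/2) in B
```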

  4. Last time
     • The change of basis of a vector is computed by applying the matrix. For example, changing from S to B is:
       $v' = T_{S \Rightarrow B} \cdot v$
     • The change of basis for a matrix is computed by surrounding it with basis-change matrices.
     • Changing a matrix $A$ in S to a matrix $A'$ in B is:
       $A' = T_{S \Rightarrow B} \cdot A \cdot T_{B \Rightarrow S}$
     • (Memory aid: look at the first matrix after the equals sign to see which basis transformation you are doing.)
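A sketch of how the sandwich formula behaves in practice (ours, not the slides'; the matrix A is borrowed from the worked example later in the deck):

```python
import numpy as np

T_BtoS = np.array([[1.0,  1.0],
                   [1.0, -1.0]])
T_StoB = np.linalg.inv(T_BtoS)

# A linear map written in the standard basis (the example matrix used later).
A = np.array([[1.0, -2.0],
              [0.0, -1.0]])

# The same map written in the basis B:
A_prime = T_StoB @ A @ T_BtoS

# Sanity check: applying A in S and then translating the result to B
# agrees with translating first and then applying A' in B.
v = np.array([3.0, 5.0])
print(T_StoB @ (A @ v))      # [-6. -1.]
print(A_prime @ (T_StoB @ v))  # [-6. -1.]
```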

  5. Last time
     • Many linear maps have their 'own' basis, their eigenbasis, which has the property that all basis elements $v \in B$ do this:
       $A \cdot v = \lambda v$
     • $\lambda$ is called an eigenvalue, $v$ is called an eigenvector.
     • Eigenvalues are computed by solving:
       $\det(A - \lambda I) = 0$
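One step the slide leaves implicit is why solving $\det(A - \lambda I) = 0$ yields exactly the eigenvalues; a short derivation:

$A v = \lambda v$ for some $v \neq 0$
$\iff (A - \lambda I) \cdot v = 0$ has a non-zero solution
$\iff A - \lambda I$ is not invertible
$\iff \det(A - \lambda I) = 0$.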

  6. Outline
     • Eigenvectors and diagonalisation
     • Inner products and orthogonality
     • Wrapping up

  7. Computing eigenvectors
     • For an n × n matrix, the equation $\det(A - \lambda I) = 0$ has n solutions (counted with multiplicity; they need not all be distinct), which we'll write as:
       $\lambda_1, \lambda_2, \ldots, \lambda_n$
     • (e.g. a 2 × 2 matrix involves solving a quadratic equation, which has 2 solutions $\lambda_1$ and $\lambda_2$)
     • For each of these solutions, we get a homogeneous system:
       $\underbrace{(A - \lambda_i I)}_{\text{matrix}} \cdot v_i = 0$
     • Solving this homogeneous system gives us the associated eigenvector $v_i$ for the eigenvalue $\lambda_i$. A sketch of the whole computation in code follows below.
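The same two-step computation can be checked mechanically; a minimal numpy sketch (ours, not the slides'). Note that np.linalg.eig returns unit-length eigenvectors, so its answers may differ from hand-picked ones by a scalar factor:

```python
import numpy as np

# The example matrix from the next slide.
A = np.array([[1.0, -2.0],
              [0.0, -1.0]])

# np.linalg.eig solves the characteristic polynomial and the homogeneous
# systems in one call; eigenvectors come back as the *columns* of the
# second result, scaled to unit length.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)         # [ 1. -1.]
print(eigenvectors[:, 0])  # [1. 0.]              -> a multiple of v1 = (1, 0)
print(eigenvectors[:, 1])  # [0.7071... 0.7071...] -> a multiple of v2 = (1, 1)
```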

  8. Example
     • This matrix:
       $A = \begin{pmatrix} 1 & -2 \\ 0 & -1 \end{pmatrix}$
     • has characteristic polynomial:
       $\det \begin{pmatrix} -\lambda + 1 & -2 \\ 0 & -\lambda - 1 \end{pmatrix} = \lambda^2 - 1$
     • The equation $\lambda^2 - 1 = 0$ has 2 solutions: $\lambda_1 = 1$ and $\lambda_2 = -1$.

  9. Example
     • For $\lambda_1 = 1$, we get a homogeneous system:
       $(A - \lambda_1 \cdot I) \cdot v_1 = 0$
     • Computing $A - (1) \cdot I$:
       $\begin{pmatrix} 1 & -2 \\ 0 & -1 \end{pmatrix} - (1) \cdot \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 0 & -2 \\ 0 & -2 \end{pmatrix}$
     • So, we need to find a non-zero solution for:
       $\begin{pmatrix} 0 & -2 \\ 0 & -2 \end{pmatrix} \cdot v_1 = 0$
       (just like in lecture 2)
     • This works: $v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$

  10. Example
     • For $\lambda_2 = -1$, we get another homogeneous system:
       $(A - \lambda_2 \cdot I) \cdot v_2 = 0$
     • Computing $A - (-1) \cdot I$:
       $\begin{pmatrix} 1 & -2 \\ 0 & -1 \end{pmatrix} - (-1) \cdot \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 2 & -2 \\ 0 & 0 \end{pmatrix}$
     • So, we need to find a non-zero solution for:
       $\begin{pmatrix} 2 & -2 \\ 0 & 0 \end{pmatrix} \cdot v_2 = 0$
     • This works: $v_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$

  11. Example
     So, for the matrix $A$, we computed 2 eigenvalue/eigenvector pairs:
     $\lambda_1 = 1, \quad v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$
     and
     $\lambda_2 = -1, \quad v_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$
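A quick sanity check (ours) that these pairs really satisfy $A \cdot v = \lambda v$:

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [0.0, -1.0]])
v1 = np.array([1.0, 0.0])   # eigenvector for lambda_1 = 1
v2 = np.array([1.0, 1.0])   # eigenvector for lambda_2 = -1

print(A @ v1)   # [1. 0.]   ==  1 * v1
print(A @ v2)   # [-1. -1.] == -1 * v2
```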

  12. Theorem
     If the eigenvalues of a matrix $A$ are all different, then their associated eigenvectors form a basis.

     Proof. We need to prove the $v_i$ are all linearly independent. Suppose (for contradiction) that $v_1, \ldots, v_n$ are linearly dependent, i.e.:
     $c_1 v_1 + c_2 v_2 + \ldots + c_n v_n = 0$
     with $k \geq 1$ non-zero coefficients. Then, using that they are eigenvectors:
     $A \cdot (c_1 v_1 + \ldots + c_n v_n) = 0 \implies \lambda_1 c_1 v_1 + \ldots + \lambda_n c_n v_n = 0$
     Suppose $c_j \neq 0$ (choosing $j$ such that also $\lambda_j \neq 0$, which is possible since the $\lambda_i$ are distinct, so at most one of them is zero). Subtract $\frac{1}{\lambda_j}$ times the 2nd equation from the 1st:
     $c_1 v_1 + c_2 v_2 + \ldots + c_n v_n - \frac{1}{\lambda_j} (\lambda_1 c_1 v_1 + \ldots + \lambda_n c_n v_n) = 0$
     This has $k - 1$ non-zero coefficients: the $v_j$ term drops out, and (because all the $\lambda_i$'s are distinct) no other non-zero coefficient is cancelled. Repeat until just one non-zero coefficient $c_m$ remains, which leaves:
     $c_m v_m = 0 \implies v_m = 0$
     but eigenvectors are always non-zero, so this is a contradiction.

  13. Changing basis
     • Once we have a basis of eigenvectors $B = \{v_1, v_2, \ldots, v_n\}$, translating to B gives us a diagonal matrix whose diagonal entries are the eigenvalues:
       $T_{S \Rightarrow B} \cdot A \cdot T_{B \Rightarrow S} = D \quad \text{where} \quad D = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{pmatrix}$
     • Going the other direction, we can always write $A$ in terms of a diagonal matrix:
       $A = T_{B \Rightarrow S} \cdot D \cdot T_{S \Rightarrow B}$

  14. Definition
     For a matrix $A$ with eigenvalues $\lambda_1, \ldots, \lambda_n$ and eigenvectors $B = \{v_1, \ldots, v_n\}$, decomposing $A$ as:
     $A = T_{B \Rightarrow S} \cdot \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{pmatrix} \cdot T_{S \Rightarrow B}$
     is called diagonalising the matrix $A$.

  15. Summary: diagonalising a matrix (study this slide!)
     We diagonalise a matrix $A$ as follows:
     1. Compute each eigenvalue $\lambda_1, \lambda_2, \ldots, \lambda_n$ by solving the characteristic polynomial.
     2. For each eigenvalue, compute the associated eigenvector $v_i$ by solving the homogeneous system $(A - \lambda_i I) \cdot v_i = 0$.
     3. Write down $A$ as the product of three matrices: $A = T_{B \Rightarrow S} \cdot D \cdot T_{S \Rightarrow B}$, where:
        • $T_{B \Rightarrow S}$ has the eigenvectors $v_1, \ldots, v_n$ (in order!) as its columns
        • $D$ has the eigenvalues (in the same order!) down its diagonal, and zeroes everywhere else
        • $T_{S \Rightarrow B}$ is the inverse of $T_{B \Rightarrow S}$.
     The whole recipe, run on the example above, is sketched in code below.
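A minimal sketch (ours) of the recipe applied to the example matrix from slides 8–11:

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [0.0, -1.0]])

# Step 3, using the eigenpairs computed by hand above:
T_BtoS = np.array([[1.0, 1.0],    # columns are the eigenvectors
                   [0.0, 1.0]])   # v1 = (1, 0) and v2 = (1, 1), in order
D      = np.diag([1.0, -1.0])     # eigenvalues, in the same order
T_StoB = np.linalg.inv(T_BtoS)

# The decomposition recovers A:
print(T_BtoS @ D @ T_StoB)   # [[ 1. -2.]
                             #  [ 0. -1.]]
```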

  16. Example: political swingers, part I
     • We take an extremely crude view on politics and distinguish only left and right wing political supporters
     • We study changes in political views, per year
     • Suppose we observe, for each year:
       • 80% of lefties remain lefties and 20% become righties
       • 90% of righties remain righties, and 10% become lefties
     Questions...
     • start with a population L = 100, R = 150, and compute the number of lefties and righties after one year;
     • similarly, after 2 years, and 3 years, ...
     • We can represent these computations conveniently using matrix multiplication.

  17. Political swingers, part II
     • So if we start with a population L = 100, R = 150, then after one year we have:
       • lefties: 0.8 · 100 + 0.1 · 150 = 80 + 15 = 95
       • righties: 0.2 · 100 + 0.9 · 150 = 20 + 135 = 155
     • If $\begin{pmatrix} L \\ R \end{pmatrix} = \begin{pmatrix} 100 \\ 150 \end{pmatrix}$, then after one year we have:
       $P \cdot \begin{pmatrix} 100 \\ 150 \end{pmatrix} = \begin{pmatrix} 0.8 & 0.1 \\ 0.2 & 0.9 \end{pmatrix} \cdot \begin{pmatrix} 100 \\ 150 \end{pmatrix} = \begin{pmatrix} 95 \\ 155 \end{pmatrix}$
     • After two years we have:
       $P \cdot \begin{pmatrix} 95 \\ 155 \end{pmatrix} = \begin{pmatrix} 0.8 & 0.1 \\ 0.2 & 0.9 \end{pmatrix} \cdot \begin{pmatrix} 95 \\ 155 \end{pmatrix} = \begin{pmatrix} 91.5 \\ 158.5 \end{pmatrix}$
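A sketch (ours, not the slides') of the year-by-year iteration, plus a glimpse of why diagonalisation matters here: powers of P reduce to powers of its eigenvalues. The deck's title promises applications; this specific long-run computation is our illustration:

```python
import numpy as np

P = np.array([[0.8, 0.1],
              [0.2, 0.9]])
pop = np.array([100.0, 150.0])   # (lefties, righties)

# Iterate year by year:
for year in range(1, 4):
    pop = P @ pop
    print(year, pop)   # 1: [95. 155.], 2: [91.5 158.5], 3: [89.05 160.95]

# Diagonalising P makes long-run predictions cheap: P^n = T . D^n . T^{-1},
# so only the diagonal entries get raised to the n-th power.
eigvals, T = np.linalg.eig(P)    # eigenvalues 1 and 0.7
P50 = T @ np.diag(eigvals**50) @ np.linalg.inv(T)
print(P50 @ np.array([100.0, 150.0]))   # ~[83.33 166.67], the steady state
```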
