Linear Algebra

Chapter 5. Eigenvalues and Eigenvectors
Section 5.1. Eigenvalues and Eigenvectors—Proofs of Theorems

April 10, 2020

Table of contents

1. Page 300 Number 8
2. Page 300 Number 14
3. Theorem 5.1. Properties of Eigenvalues and Eigenvectors
4. Page 298 Example 8
5. Page 300 Number 18
6. Page 301 Number 30
7. Page 301 Number 32
8. Page 302 Number 38
9. Page 302 Number 40

Page 300 Number 8

Page 300 Number 8. Find the characteristic polynomial, the real eigenvalues, and the corresponding eigenvectors for

  A =
    [ −1   0   0 ]
    [ −4   2  −1 ]
    [  4   0   3 ]

Solution. We have

  A − λI =
    [ −1   0   0 ]       [ 1  0  0 ]     [ −1−λ    0     0  ]
    [ −4   2  −1 ]  − λ  [ 0  1  0 ]  =  [  −4   2−λ   −1  ]
    [  4   0   3 ]       [ 0  0  1 ]     [   4     0   3−λ ]

So the characteristic polynomial is, expanding along the first row,

  p(λ) = det(A − λI) = (−1 − λ) det [ 2−λ   −1  ]  − (0) + (0)
                                    [  0   3−λ ]

       = (−1 − λ)((2 − λ)(3 − λ) − (−1)(0)) = (−1 − λ)(2 − λ)(3 − λ).



Page 300 Number 8 (continued 1)

Solution (continued). So the eigenvalues are λ1 = −1, λ2 = 2, λ3 = 3. To find the eigenvectors corresponding to each eigenvalue, we consider the formula A v = λ v, or (A − λI) v = 0 (see Note 5.1.A).

λ1 = −1. With v1 = [v1, v2, v3]ᵀ an eigenvector corresponding to the eigenvalue λ1 = −1, we need (A − λ1 I) v1 = 0. So we consider the augmented matrix

  [ −1−(−1)    0       0    | 0 ]     [  0   0   0 | 0 ]
  [  −4     2−(−1)   −1     | 0 ]  =  [ −4   3  −1 | 0 ]
  [   4        0    3−(−1)  | 0 ]     [  4   0   4 | 0 ]

R3 → R3 + R2:
  [  0   0   0 | 0 ]
  [ −4   3  −1 | 0 ]
  [  0   3   3 | 0 ]

R2 → R2 − R3:
  [  0   0   0 | 0 ]
  [ −4   0  −4 | 0 ]
  [  0   3   3 | 0 ]

R2 → R2/(−4), R3 → R3/3:
  [  0   0   0 | 0 ]
  [  1   0   1 | 0 ]
  [  0   1   1 | 0 ]

R1 ↔ R2:
  [  1   0   1 | 0 ]
  [  0   0   0 | 0 ]
  [  0   1   1 | 0 ]



Page 300 Number 8 (continued 2)

Solution (continued). Continuing the row reduction,

  [  1   0   1 | 0 ]            [  1   0   1 | 0 ]
  [  0   0   0 | 0 ]  R2 ↔ R3  [  0   1   1 | 0 ]
  [  0   1   1 | 0 ]            [  0   0   0 | 0 ]

So we need v1 + v3 = 0 and v2 + v3 = 0, or v1 = −v3, v2 = −v3, v3 = v3. With r = v3 as a free variable, v1 = −r, v2 = −r, v3 = r. So the collection of all eigenvectors of λ1 = −1 is

  v1 = r [ −1, −1, 1 ]ᵀ where r ∈ ℝ, r ≠ 0.


Page 300 Number 8 (continued 3)

Solution (continued). λ2 = 2. As above, we consider (A − 2I) v2 = 0 and the augmented matrix

  [ −1−(2)    0      0    | 0 ]     [ −3   0   0 | 0 ]
  [  −4     2−(2)   −1    | 0 ]  =  [ −4   0  −1 | 0 ]
  [   4       0    3−(2)  | 0 ]     [  4   0   1 | 0 ]

R1 → R1/(−3):
  [  1   0   0 | 0 ]
  [ −4   0  −1 | 0 ]
  [  4   0   1 | 0 ]

R2 → R2 + 4R1, R3 → R3 − 4R1:
  [  1   0   0 | 0 ]
  [  0   0  −1 | 0 ]
  [  0   0   1 | 0 ]

R3 → R3 + R2:
  [  1   0   0 | 0 ]
  [  0   0  −1 | 0 ]
  [  0   0   0 | 0 ]

R2 → −R2:
  [  1   0   0 | 0 ]
  [  0   0   1 | 0 ]
  [  0   0   0 | 0 ]

Page 300 Number 8 (continued 4)

Solution (continued). So we need v1 = 0 and v3 = 0, or v1 = 0, v2 = v2, v3 = 0. With s = v2 as a free variable, v1 = 0, v2 = s, v3 = 0. So the collection of all eigenvectors of λ2 = 2 is

  v2 = s [ 0, 1, 0 ]ᵀ where s ∈ ℝ, s ≠ 0.


Page 300 Number 8 (continued 5)

Solution (continued). λ3 = 3. As above, we consider (A − 3I) v3 = 0 and the augmented matrix

  [ −1−(3)    0      0    | 0 ]     [ −4   0   0 | 0 ]
  [  −4     2−(3)   −1    | 0 ]  =  [ −4  −1  −1 | 0 ]
  [   4       0    3−(3)  | 0 ]     [  4   0   0 | 0 ]

R2 → R2 − R1, R3 → R3 + R1:
  [ −4   0   0 | 0 ]
  [  0  −1  −1 | 0 ]
  [  0   0   0 | 0 ]

R1 → R1/(−4), R2 → −R2:
  [  1   0   0 | 0 ]
  [  0   1   1 | 0 ]
  [  0   0   0 | 0 ]

So we need v1 = 0 and v2 + v3 = 0, or v1 = 0, v2 = −v3, v3 = v3. . . .


Page 300 Number 8 (continued 6)

Solution (continued). . . . With t = v3 as a free variable, v1 = 0, v2 = −t, v3 = t. So the collection of all eigenvectors of λ3 = 3 is

  v3 = t [ 0, −1, 1 ]ᵀ where t ∈ ℝ, t ≠ 0.

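As a quick numerical check (not part of the text's solution), each claimed eigenpair of Number 8 can be verified directly from the defining equation A v = λ v, using a small hand-rolled matrix-vector product:

```python
# Check A v = lambda v for the three eigenpairs found in Number 8.
A = [[-1, 0, 0],
     [-4, 2, -1],
     [4, 0, 3]]

def matvec(M, v):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(row[j] * v[j] for j in range(len(v))) for row in M]

eigenpairs = [(-1, [-1, -1, 1]),   # lambda_1 = -1, v_1 = r[-1, -1, 1]^T
              (2, [0, 1, 0]),      # lambda_2 = 2,  v_2 = s[0, 1, 0]^T
              (3, [0, -1, 1])]     # lambda_3 = 3,  v_3 = t[0, -1, 1]^T

for lam, v in eigenpairs:
    assert matvec(A, v) == [lam * vi for vi in v]
```

Scaling any of these vectors by a nonzero constant leaves the equation A v = λ v intact, which is why the solution reports each eigenspace with a free parameter.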


Page 300 Number 14

Page 300 Number 14. Find the characteristic polynomial, the real eigenvalues, and the corresponding eigenvectors for

  A =
    [ 4   0   0 ]
    [ 8   4   8 ]
    [ 0   0   4 ]

Solution. We have

  A − λI =
    [ 4   0   0 ]       [ 1  0  0 ]     [ 4−λ    0     0  ]
    [ 8   4   8 ]  − λ  [ 0  1  0 ]  =  [  8   4−λ    8  ]
    [ 0   0   4 ]       [ 0  0  1 ]     [  0     0   4−λ ]

So the characteristic polynomial is, expanding along the first row,

  p(λ) = det(A − λI) = (4 − λ) det [ 4−λ   8  ]  − 0 + 0
                                   [  0   4−λ ]

       = (4 − λ)((4 − λ)(4 − λ) − (8)(0)) = (4 − λ)³.



Page 300 Number 14 (continued 1)

Solution (continued). The eigenvalues can be found from the characteristic equation p(λ) = 0: (4 − λ)³ = 0. By Note 5.1.A, the only eigenvalue is λ = 4. To find the eigenvectors corresponding to λ = 4 we consider the formula A v = λ v, or (A − λI) v = 0. This leads to the augmented matrix

  [ 4−(4)    0      0    | 0 ]     [ 0   0   0 | 0 ]
  [  8     4−(4)    8    | 0 ]  =  [ 8   0   8 | 0 ]
  [  0       0    4−(4)  | 0 ]     [ 0   0   0 | 0 ]

R1 ↔ R2:
  [ 8   0   8 | 0 ]
  [ 0   0   0 | 0 ]
  [ 0   0   0 | 0 ]

R1 → R1/8:
  [ 1   0   1 | 0 ]
  [ 0   0   0 | 0 ]
  [ 0   0   0 | 0 ]


Page 300 Number 14 (continued 2)

Solution (continued). So we need v1 + v3 = 0, or v1 = −v3, v2 = v2, v3 = v3. With r = v2 and s = v3 as free variables, v1 = −s, v2 = r, v3 = s. So the collection of all eigenvectors of λ = 4 is

  v = r [ 0, 1, 0 ]ᵀ + s [ −1, 0, 1 ]ᵀ where r, s ∈ ℝ and not both r = 0 and s = 0.

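Here λ = 4 is a triple root of the characteristic polynomial, but the eigenspace is only two-dimensional. As an aside (my own check, not part of the text), every combination r[0, 1, 0]ᵀ + s[−1, 0, 1]ᵀ can be confirmed to satisfy A v = 4 v:

```python
# Check A v = 4 v for vectors in the two-parameter eigenspace of Number 14.
A = [[4, 0, 0],
     [8, 4, 8],
     [0, 0, 4]]

def matvec(M, v):
    return [sum(row[j] * v[j] for j in range(len(v))) for row in M]

for r, s in [(1, 0), (0, 1), (2, -3)]:   # a few choices of the free variables
    v = [-s, r, s]                        # v = r[0,1,0]^T + s[-1,0,1]^T
    assert matvec(A, v) == [4 * vi for vi in v]
```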


Theorem 5.1. Properties of Eigenvalues and Eigenvectors

Theorem 5.1. Properties of Eigenvalues and Eigenvectors.
Let A be an n × n matrix.
2. If λ is an eigenvalue of an invertible matrix A with v as a corresponding eigenvector, then λ ≠ 0 and 1/λ is an eigenvalue of A⁻¹, again with v as a corresponding eigenvector.

Proof. Page 301 Number 28. If λ = 0 is an eigenvalue of matrix A, then there is a nonzero vector v such that A v = λ v = 0. But then the homogeneous system of equations associated with A v = 0 has a nontrivial solution. This implies that A is not invertible (by Theorem 1.16). But λ is given to be an eigenvalue of an invertible matrix, so it must be that, in fact, λ ≠ 0.

If λ is an eigenvalue of A with eigenvector v, then A v = λ v. Therefore A⁻¹A v = A⁻¹λ v or, by Theorem 1.3.A(7), "Scalars Pull Through," v = λA⁻¹ v. So A⁻¹ v = (1/λ) v, and 1/λ is an eigenvalue of A⁻¹ with v as a corresponding eigenvector.

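Theorem 5.1(2) can be illustrated with a small invertible matrix. The matrix below is my own illustrative choice (not from the text): A = [[4, 1], [2, 3]] has eigenvalue 5 with eigenvector [1, 1]ᵀ, and its inverse has eigenvalue 1/5 with the same eigenvector. Exact rational arithmetic keeps the check clean:

```python
from fractions import Fraction as F

# Illustrative matrix (not from the text): A has eigenvalue 5 with eigenvector [1, 1]^T.
A = [[F(4), F(1)],
     [F(2), F(3)]]

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]       # det(A) = 10, so A is invertible
A_inv = [[A[1][1] / det, -A[0][1] / det],
         [-A[1][0] / det, A[0][0] / det]]         # standard 2x2 inverse formula

def matvec(M, v):
    return [sum(row[j] * v[j] for j in range(len(v))) for row in M]

v = [F(1), F(1)]
assert matvec(A, v) == [5 * vi for vi in v]       # A v = 5 v
assert matvec(A_inv, v) == [vi / 5 for vi in v]   # A^{-1} v = (1/5) v, same eigenvector
```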


Page 298 Example 8

Page 298 Example 8. Let D∞ be the vector space of all functions mapping ℝ into ℝ and having derivatives of all orders. Let T : D∞ → D∞ be the differentiation map, so that T(f) = f′. Describe all eigenvalues and eigenvectors of T. (Notice that by Example 3.4.5, T actually is linear.)

Solution. We need scalars λ and nonzero functions f where T(f) = λf.

Case 1. If λ = 0, then we need T(f) = 0f = 0, or f′ = 0. So f must be a constant function. Eigenvectors are nonzero by definition, so the eigenvectors associated with eigenvalue 0 are all f(x) = k where k ∈ ℝ, k ≠ 0.

Case 2. If λ ≠ 0, then we need T(f) = λf, or f′ = λf. That is, dy/dx = λy or (as a separable differential equation) dy/y = λ dx, and so ∫(1/y) dy = ∫λ dx, or ln |y| = λx + c, or e^(ln |y|) = e^(λx+c), or |y| = e^c e^(λx), or y = ±e^c e^(λx), or y = k e^(λx) where we set k = e^c or k = −e^c (so k ≠ 0). So the eigenvectors associated with eigenvalue λ ≠ 0 are all y = k e^(λx) where k ≠ 0.

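The conclusion of Example 8, that f(x) = k e^(λx) satisfies f′ = λf, can be checked numerically. The sketch below (my own, not from the text) approximates the derivative by a central difference and compares it with λf(x):

```python
import math

# Numerically check that f(x) = k*exp(lam*x) satisfies f' = lam*f,
# i.e. that f is an "eigenvector" of differentiation with eigenvalue lam.
def check(lam, k, x, h=1e-6):
    f = lambda t: k * math.exp(lam * t)
    derivative = (f(x + h) - f(x - h)) / (2 * h)   # central difference
    return abs(derivative - lam * f(x)) < 1e-4

assert check(lam=2.0, k=3.0, x=0.5)
assert check(lam=-1.5, k=-2.0, x=1.0)
assert check(lam=0.0, k=7.0, x=0.3)    # lambda = 0: the constant functions of Case 1
```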


Page 300 Number 18

Page 300 Number 18. Find the eigenvalues and corresponding eigenvectors for the linear transformation T([x, y]) = [x − y, −x + y].

Solution. We apply Corollary 2.3.A, "Standard Matrix Representation of Linear Transformations," to find the matrix representing T. We have T(î) = T([1, 0]) = [(1) − (0), −(1) + (0)] = [1, −1] and T(ĵ) = T([0, 1]) = [(0) − (1), −(0) + (1)] = [−1, 1]. Hence the standard matrix representation of T is

  A =
    [  1  −1 ]
    [ −1   1 ]

By Note 5.1.B, the eigenvalues and eigenvectors of T are the same as those of A. So we consider the characteristic polynomial

  p(λ) = det(A − λI) = det ( [  1  −1 ]  − λ [ 1  0 ] )  =  det [ 1−λ   −1  ]
                             [ −1   1 ]      [ 0  1 ]           [ −1   1−λ ]

       = (1 − λ)(1 − λ) − (−1)(−1) = 1 − 2λ + λ² − 1 = λ² − 2λ = λ(λ − 2).

We find the eigenvalues from the characteristic equation p(λ) = λ(λ − 2) = 0. So the eigenvalues of T are λ1 = 0 and λ2 = 2.



Page 300 Number 18 (continued 1)

Solution (continued). To find the eigenvectors corresponding to each eigenvalue, we consider the formula A v = λ v, or (A − λI) v = 0.

λ1 = 0. With v1 = [v1, v2]ᵀ an eigenvector corresponding to eigenvalue λ1 = 0, we need (A − 0I) v1 = 0. So we consider the augmented matrix

  [ 1−(0)   −1    | 0 ]     [  1  −1 | 0 ]  R2 → R2 + R1  [ 1  −1 | 0 ]
  [  −1    1−(0)  | 0 ]  =  [ −1   1 | 0 ]                [ 0   0 | 0 ]

So we need v1 − v2 = 0, or v1 = v2, v2 = v2. With r = v2 as a free variable, v1 = r, v2 = r. So the collection of all eigenvectors of λ1 = 0 is

  v1 = r [ 1, 1 ]ᵀ where r ∈ ℝ, r ≠ 0.


Page 300 Number 18 (continued 2)

Solution (continued). λ2 = 2. As above, we need (A − 2I) v2 = 0 and consider the augmented matrix

  [ 1−(2)   −1    | 0 ]     [ −1  −1 | 0 ]  R2 → R2 − R1  [ −1  −1 | 0 ]  R1 → −R1  [ 1  1 | 0 ]
  [  −1    1−(2)  | 0 ]  =  [ −1  −1 | 0 ]                [  0   0 | 0 ]            [ 0  0 | 0 ]

So we need v1 + v2 = 0, or v1 = −v2, v2 = v2. With s = v2 as a free variable, v1 = −s, v2 = s. So the collection of all eigenvectors of λ2 = 2 is

  v2 = s [ −1, 1 ]ᵀ where s ∈ ℝ, s ≠ 0.

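Since T is given by an explicit formula, the eigenpairs of Number 18 can be checked without the matrix at all, by applying T directly (a quick check of my own, not from the text):

```python
# Check the eigenpairs of T([x, y]) = [x - y, -x + y] found above.
def T(v):
    x, y = v
    return [x - y, -x + y]

# lambda_1 = 0 with eigenvectors r[1, 1]^T: T collapses them to the zero vector.
assert T([1, 1]) == [0, 0]

# lambda_2 = 2 with eigenvectors s[-1, 1]^T: T doubles them.
assert T([-1, 1]) == [2 * c for c in [-1, 1]]
```

Geometrically, T projects onto the line y = −x (up to a factor of 2): vectors along y = x are killed, and vectors along y = −x are stretched by 2.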


Page 301 Number 30

Page 301 Number 30. Prove that a square matrix is invertible if and only if no eigenvalue is zero.

Proof. Suppose A is invertible. Then by Theorem 4.3, "Determinant Criterion for Invertibility," det(A) ≠ 0. Now if λ = 0 were an eigenvalue, then det(A − λI) = det(A − 0I) = det(A) would equal 0, a contradiction; so 0 cannot be an eigenvalue.

Conversely, suppose λ = 0 is an eigenvalue. Then, again, det(A − λI) = det(A − 0I) = det(A) = 0. So by Theorem 4.3, A is not invertible.

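Both directions of Number 30 can be seen in small examples (the 2×2 matrices below are my own illustrations, not from the text): a singular matrix has 0 as an eigenvalue, while an invertible one has p(0) = det(A) ≠ 0.

```python
# Illustrative 2x2 examples (not from the text).
def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def matvec(M, v):
    return [sum(row[j] * v[j] for j in range(len(v))) for row in M]

# A singular matrix: det = 0, and indeed 0 is an eigenvalue (A v = 0 for some v != 0).
A = [[1, 2],
     [2, 4]]
assert det2(A) == 0
assert matvec(A, [2, -1]) == [0, 0]   # eigenvector [2, -1]^T for eigenvalue 0

# An invertible matrix: det != 0, so p(0) = det(A - 0I) = det(A) != 0
# and 0 is not a root of the characteristic polynomial.
B = [[1, 2],
     [3, 4]]
assert det2(B) != 0
```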


Page 301 Number 32

Page 301 Number 32. Let A be an n × n matrix and let I be the n × n identity matrix. Compare the eigenvectors and eigenvalues of A with those of A + rI for a scalar r.

Solution. Suppose λ is an eigenvalue of A with corresponding eigenvector v. Then A v = λ v. So

  (A + rI) v = A v + rI v = A v + r v = λ v + r v = (λ + r) v.

So λ + r is an eigenvalue of A + rI with v as a corresponding eigenvector. Conversely, if λ + r is an eigenvalue of A + rI with eigenvector w, then (A + rI) w = (λ + r) w, or A w + r w = λ w + r w, or A w = λ w, so w is an eigenvector of A corresponding to eigenvalue λ. So the eigenvalues of A + rI are precisely those of the form λ + r where λ is an eigenvalue of A, and the eigenvectors of A + rI corresponding to λ + r are precisely the eigenvectors of A corresponding to λ.

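The shift in Number 32 can be demonstrated concretely. Taking the matrix A from Number 18 and the illustrative choice r = 3 (my own, not from the text), A + 3I keeps the same eigenvectors while the eigenvalues move from 0 and 2 to 3 and 5:

```python
# Shift example: A from Number 18, with r = 3 (an illustrative choice).
A = [[1, -1],
     [-1, 1]]
r = 3
A_shift = [[A[i][j] + (r if i == j else 0) for j in range(2)] for i in range(2)]  # A + rI

def matvec(M, v):
    return [sum(row[j] * v[j] for j in range(len(v))) for row in M]

# Eigenpairs of A: (0, [1,1]^T) and (2, [-1,1]^T). A + rI has the SAME
# eigenvectors, with eigenvalues shifted to 0 + r and 2 + r.
for lam, v in [(0, [1, 1]), (2, [-1, 1])]:
    assert matvec(A_shift, v) == [(lam + r) * vi for vi in v]
```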

Page 302 Number 38

Page 302 Number 38. Let A be an n × n matrix and let C be an invertible n × n matrix. Prove that the eigenvalues of A and of C⁻¹AC are the same.

Solution. Notice that

  C⁻¹AC − λI = C⁻¹AC − λC⁻¹C
             = C⁻¹AC − C⁻¹(λC)   by Theorem 1.3.A(7), "Scalars Pull Through"
             = C⁻¹(AC − λC)      by Theorem 1.3.A(10), "Distribution Law of Matrix Multiplication"
             = C⁻¹(A − λI)C      by Theorem 1.3.A(10).


Page 302 Number 38 (continued)

Solution (continued). Recall that det(C⁻¹) = 1/det(C) by Exercise 4.2.31. So the characteristic polynomial for C⁻¹AC is

  det(C⁻¹AC − λI) = det(C⁻¹(A − λI)C)            as just shown
                  = det(C⁻¹) det(A − λI) det(C)  by Theorem 4.4, "The Multiplicative Property"
                  = (1/det(C)) det(A − λI) det(C)
                  = det(A − λI).

Now det(A − λI) is the characteristic polynomial of A, so A and C⁻¹AC have the same characteristic polynomial. These polynomials have the same roots (of course), and since the eigenvalues of a matrix are the roots of its characteristic polynomial (see Note 5.1.A), A and C⁻¹AC have the same eigenvalues, as claimed.

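For 2×2 matrices the characteristic polynomial is p(λ) = λ² − tr(M)λ + det(M), so Number 38 predicts that a similarity transform preserves both the trace and the determinant. A small check with matrices of my own choosing (not from the text):

```python
from fractions import Fraction as F

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

# Illustrative 2x2 matrices (not from the text).
A = [[F(1), F(2)], [F(3), F(4)]]
C = [[F(1), F(1)], [F(0), F(1)]]
C_inv = [[F(1), F(-1)], [F(0), F(1)]]   # inverse of C (det(C) = 1)
assert matmul(C_inv, C) == [[1, 0], [0, 1]]

B = matmul(C_inv, matmul(A, C))         # B = C^{-1} A C

# Equal trace and determinant here means equal characteristic polynomials,
# hence equal eigenvalues, matching the general proof above.
assert trace(B) == trace(A) and det2(B) == det2(A)
```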


slide-57
SLIDE 57

Page 302 Number 40

Page 302 Number 40

Page 302 Number 40. The Cayley-Hamilton Theorem states:

Cayley-Hamilton Theorem. Every square matrix A satisfies its characteristic equation. That is, if p(λ) = a_nλ^n + a_{n−1}λ^{n−1} + ··· + a_1λ + a_0 is the characteristic polynomial of A, then p(A) = a_nA^n + a_{n−1}A^{n−1} + ··· + a_1A + a_0I = O (where O is the n × n zero matrix).

Use the Cayley-Hamilton Theorem to prove that, for an invertible n × n matrix A, A^{-1} can be computed as a linear combination of A^0 = I, A, A^2, . . . , A^{n−1}.

  • Proof. Let A be an invertible n × n matrix and let p(λ) be the characteristic polynomial of A. Then by the Cayley-Hamilton Theorem,

p(A) = a_nA^n + a_{n−1}A^{n−1} + ··· + a_1A + a_0I = O.

So a_nA^n + a_{n−1}A^{n−1} + ··· + a_2A^2 + a_1A = −a_0I. Multiplying both sides of this equation on the right by A^{-1} gives

(a_nA^n + a_{n−1}A^{n−1} + ··· + a_2A^2 + a_1A)A^{-1} = (−a_0I)A^{-1} . . .

() Linear Algebra April 10, 2020 22 / 23

slide-59
SLIDE 59

Page 302 Number 40

Page 302 Number 40 (continued)

Proof (continued). . . . or, by Theorem 1.3.A(1), "Distribution Law of Matrix Multiplication,"

a_nA^nA^{-1} + a_{n−1}A^{n−1}A^{-1} + ··· + a_2A^2A^{-1} + a_1AA^{-1} = (−a_0I)A^{-1},

or by Theorem 1.3.A(10) and Theorem 1.3.A(6), the "Associative Law of Matrix Multiplication,"

a_nA^{n−1}(AA^{-1}) + a_{n−1}A^{n−2}(AA^{-1}) + ··· + a_2A(AA^{-1}) + a_1(AA^{-1}) = −a_0IA^{-1},

or

a_nA^{n−1} + a_{n−1}A^{n−2} + ··· + a_2A + a_1I = −a_0A^{-1}.

Since A is invertible, 0 is not an eigenvalue of A by Exercise 30, so p(0) = a_0 ≠ 0. We then have

A^{-1} = −(a_n/a_0)A^{n−1} − (a_{n−1}/a_0)A^{n−2} − ··· − (a_2/a_0)A − (a_1/a_0)I.

So A^{-1} is a linear combination of A^{n−1}, A^{n−2}, . . . , A, I, as claimed.

() Linear Algebra April 10, 2020 23 / 23
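The closing formula can be turned into a short computation. The sketch below is my own illustration (the helper name inverse_via_cayley_hamilton is made up, and NumPy is assumed): np.poly supplies the monic characteristic polynomial det(λI − A) = λ^n + c_1λ^{n−1} + ··· + c_n, from which, exactly as in the proof, A^{-1} = −(1/c_n)(A^{n−1} + c_1A^{n−2} + ··· + c_{n−1}I).

```python
import numpy as np

def inverse_via_cayley_hamilton(A):
    """Compute A^{-1} as a linear combination of I, A, ..., A^{n-1}."""
    n = A.shape[0]
    # Coefficients [1, c1, ..., cn] of det(lambda*I - A), highest power first.
    c = np.poly(A)
    # c_n = (-1)^n det(A); it vanishes exactly when 0 is an eigenvalue.
    if np.isclose(c[-1], 0.0):
        raise ValueError("0 is an eigenvalue, so A is not invertible")
    # Horner-style accumulation of B = A^(n-1) + c1*A^(n-2) + ... + c_(n-1)*I.
    B = np.eye(n)
    for k in range(1, n):
        B = A @ B + c[k] * np.eye(n)
    # By Cayley-Hamilton, A @ B = -c_n * I, so A^{-1} = -B / c_n.
    return -B / c[-1]

A = np.array([[2.0, 1.0], [1.0, 3.0]])
A_inv = inverse_via_cayley_hamilton(A)
print(A_inv)
```

The Horner loop avoids forming each power of A separately; it uses only n − 1 matrix products, mirroring the linear combination A^{n−1}, A^{n−2}, . . . , A, I from the proof.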
