Linear algebra and differential equations (Math 54): Lecture 16

Vivek Shende

March 14, 2019

Hello and welcome to class!

Last time

We discussed complex eigenvalues and eigenvectors, and then began the discussion of length, orthogonality, and the dot product.

This time

Orthogonal projection, orthogonal sets, orthonormal bases.

Midterm 2

Covers chapters 4-6.5 (depending on where we get next time), but you may need the material from the previous chapters to solve the problems. It will be, in format, much like midterm 1. Also like the practice midterm, available at https://math.berkeley.edu/~vivek/54slides/prmid2.pdf

Dot product and transpose

The dot product can be represented as multiplication by the transpose. If we write our vectors as columns, then

v · w = vᵀw

Example

(1, 2, 3) · (4, 5, 6) = 1 · 4 + 2 · 5 + 3 · 6 = 32, which is the matrix product of the row vector [1 2 3] with the column vector (4, 5, 6).

Orthogonal complements

Consider a linear subspace V ⊂ Rⁿ. We write V⊥ for the set of all vectors which are orthogonal to every vector in V. It is called the orthogonal complement of V.

Equivalently, V⊥ is the set of all vectors which have dot product zero with every element of V. Note that v · v = ||v||², which is zero if and only if v is zero. So V ∩ V⊥ = {0}.

The orthogonal complement is a vector subspace. I.e., if x, y ∈ V⊥ are any vectors orthogonal to all vectors in V, then any linear combination of these vectors is also orthogonal to all vectors in V. Proof: If x · v = 0 and y · v = 0, then

(ax + by) · v = a(x · v) + b(y · v) = 0.

The orthogonal complement of V can be characterized as the set of vectors orthogonal to all elements of some basis for V. Indeed, suppose v1, . . . , vk is such a basis, and x · vi = 0 for all i. Then for any v ∈ V, we can expand v = ∑ ci vi, and observe

x · v = x · ∑ ci vi = ∑ ci (x · vi) = 0.

Thus x is orthogonal to any element of V.

This allows us to compute (bases for) orthogonal complements. Given a basis v1, . . . , vk for V, to find a basis for V⊥:

◮ Write the matrix [v1, . . . , vk]ᵀ, whose rows are the vi.

◮ Note that the kernel of this matrix is precisely the set of vectors whose dot product with all the vi is zero.

◮ Determine (a basis for) the kernel of this matrix.

Example

Find a basis for the orthogonal complement in R⁴ of the subspace spanned by (1, 0, 1, 0) and (2, 3, 4, 5). This is the same as finding a basis for the kernel of the matrix

[ 1 0 1 0 ]
[ 2 3 4 5 ]

We row reduce:

[ 1 0 1 0 ]     [ 1 0 1 0 ]     [ 1 0  1    0  ]
[ 2 3 4 5 ]  ~  [ 0 3 2 5 ]  ~  [ 0 1 2/3 5/3 ]

The kernel is spanned by (−1, −2/3, 1, 0) and (0, −5/3, 0, 1).
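The answer can be checked without redoing the row reduction: each claimed kernel vector must have dot product zero with each spanning vector of V. A short sketch in Python, using exact rational arithmetic (the names `V` and `K` are ours):

```python
from fractions import Fraction as F

def dot(v, w):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(v, w))

# Spanning vectors of V from the example.
V = [(1, 0, 1, 0), (2, 3, 4, 5)]
# Claimed basis for the orthogonal complement, with exact fractions.
K = [(-1, F(-2, 3), 1, 0), (0, F(-5, 3), 0, 1)]

# Every kernel vector is orthogonal to every spanning vector of V.
assert all(dot(k, v) == 0 for k in K for v in V)
```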
Try it yourself!

Find a basis for the orthogonal complement in R³ of the subspace spanned by the vector (1, 2, 3). This is the same as finding the kernel of the matrix [1 2 3]. It's already row reduced. The kernel is spanned by (−2, 1, 0) and (−3, 0, 1).

Orthogonal complements

(For notation, recall V was a subspace of Rⁿ.) V⊥ is the kernel of a matrix with dim V linearly independent rows and n columns, so dim V⊥ = n − dim V, i.e.

dim V + dim V⊥ = dim Rⁿ

Recall from before, V ∩ V⊥ = {0}.

V⊥ = "everything orthogonal to everything in V"

(V⊥)⊥ = "everything orthogonal to everything orthogonal to everything in V"

Everything in V is orthogonal to V⊥, so V ⊂ (V⊥)⊥. But from the previous slide, we have dim V + dim V⊥ = n, and similarly dim V⊥ + dim (V⊥)⊥ = n. So we learn dim V = dim (V⊥)⊥, hence V = (V⊥)⊥.

In words, V = (V⊥)⊥ is telling you that: everything orthogonal to everything orthogonal to everything in V is already in V.

Orthogonal complements

Theorem

Let V ⊂ Rⁿ be a linear subspace. Any vector in Rⁿ has a unique expression as a sum of a vector in V and a vector in V⊥.

In fact, something more general is true:

Theorem

Given any two subspaces V, W such that dim V + dim W = n and V ∩ W = {0}, any vector in Rⁿ can be written uniquely as a sum of a vector in V and a vector in W.
slide-77
SLIDE 77

Example

Consider the subspace V ⊂ R4 spanned by e1, e2.

slide-78
SLIDE 78

Example

Consider the subspace V ⊂ R4 spanned by e1, e2. Its orthogonal complement V ⊥ is spanned by e3, e4.

slide-79
SLIDE 79

Example

Consider the subspace V ⊂ R4 spanned by e1, e2. Its orthogonal complement V ⊥ is spanned by e3, e4. Any vector (w, x, y, z) can be written as (w, x, 0, 0) + (0, 0, y, z).
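In this example the decomposition is just coordinate slicing, which we can sketch directly (the helper name `split` is ours):

```python
def split(x):
    """Split a vector in R^4 into its span(e1, e2) and span(e3, e4) parts."""
    v = (x[0], x[1], 0, 0)        # component in V
    v_perp = (0, 0, x[2], x[3])   # component in V-perp
    return v, v_perp

v, vp = split((7, -2, 4, 9))
assert v == (7, -2, 0, 0) and vp == (0, 0, 4, 9)
# The two pieces sum back to the original vector.
assert tuple(a + b for a, b in zip(v, vp)) == (7, -2, 4, 9)
```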

Proof

Take bases v1, . . . , vp of V and w1, . . . , wq of W, where p = dim V and q = dim W. Suppose ∑ ai vi + ∑ bj wj = 0. Then ∑ ai vi = −∑ bj wj. One side is in V and the other is in W, so both are in V ∩ W, hence zero. Since {vi} and {wj} were bases, the ai and bj must all be zero.

Thus {v1, . . . , vp, w1, . . . , wq} is linearly independent. This linearly independent set has p + q = n elements, so it's a basis for Rⁿ. So any vector can be written as ∑ ai vi + ∑ bj wj. This is a sum of a vector in V and a vector in W.

Such an expression is unique: if v, v′ ∈ V and w, w′ ∈ W and v + w = v′ + w′, then v − v′ = w − w′ is in V and in W, hence is zero. So v = v′ and w = w′.

Orthogonal projection

We have seen that if V ⊂ Rⁿ is a vector subspace, any vector x can be written uniquely as x = v + v⊥ for some v ∈ V and v⊥ ∈ V⊥. The vector v above is said to be the orthogonal projection of x to V.

Computing orthogonal projections

Special case: V is one dimensional, hence the span of one vector v. Writing x as the sum of a vector in V and a vector in V⊥ means: writing x = λv + w where v · w = 0. Taking dot products,

v · x = λ(v · v) + v · w = λ(v · v)

Or in other words, λ = (v · x)/(v · v). So the orthogonal projection of x to Span(v) is

((v · x)/(v · v)) v
slide-116
SLIDE 116

Example

The orthogonal projection of (1, 2, 3, 4) to the line spanned by (1, 1, 1, 1) is

slide-117
SLIDE 117

Example

The orthogonal projection of (1, 2, 3, 4) to the line spanned by (1, 1, 1, 1) is (1, 1, 1, 1) · (1, 2, 3, 4) (1, 1, 1, 1) · (1, 1, 1, 1)

  • (1, 1, 1, 1) = 10

4 · (1, 1, 1, 1)
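The one-dimensional projection formula translates directly into code. A minimal sketch in Python with exact fractions (the name `project_onto_line` is ours):

```python
from fractions import Fraction as F

def dot(v, w):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(v, w))

def project_onto_line(x, v):
    """Orthogonal projection of x onto Span(v): ((v . x)/(v . v)) v."""
    lam = F(dot(v, x), dot(v, v))
    return [lam * vi for vi in v]

x = [1, 2, 3, 4]
v = [1, 1, 1, 1]
p = project_onto_line(x, v)
assert p == [F(5, 2)] * 4          # (10/4)(1, 1, 1, 1), as on the slide
# The residual x - p lies in the orthogonal complement of Span(v).
r = [xi - pi for xi, pi in zip(x, p)]
assert dot(r, v) == 0
```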

Computing orthogonal projections

More generally, suppose you have a basis v1, · · · , vk for V such that vi · vj = 0 for i ≠ j. Such a basis is called an orthogonal basis. Then the orthogonal projection of x to V is the sum of the orthogonal projections to the vi. That is,

ProjV(x) = ∑i ((vi · x)/(vi · vi)) vi

Proof: We compute the dot product vj · (x − ProjV(x)):

vj · x − vj · ∑i ((vi · x)/(vi · vi)) vi = vj · x − ((vj · x)/(vj · vj))(vj · vj) = 0

So x − ProjV(x) is orthogonal to every basis vector of V, hence lies in V⊥, while ProjV(x) lies in V.
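The formula is just a sum of one-dimensional projections, one per orthogonal basis vector. A sketch in Python (the name `proj_V` and the sample plane are ours; note the basis must be orthogonal for this to be correct):

```python
from fractions import Fraction as F

def dot(v, w):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(v, w))

def proj_V(x, basis):
    """Projection onto V, given an ORTHOGONAL basis: sum of 1-D projections."""
    out = [F(0)] * len(x)
    for v in basis:
        lam = F(dot(v, x), dot(v, v))
        out = [o + lam * vi for o, vi in zip(out, v)]
    return out

# An orthogonal basis for a plane in R^3: the two vectors have dot product 0.
basis = [(1, 0, 1), (1, 0, -1)]
x = [3, 7, 5]
p = proj_V(x, basis)
assert p == [3, 0, 5]
# The residual x - ProjV(x) is orthogonal to every basis vector, as proved above.
r = [xi - pi for xi, pi in zip(x, p)]
assert all(dot(r, v) == 0 for v in basis)
```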
Orthonormal bases

The formula

ProjV(x) = ∑i ((vi · x)/(vi · vi)) vi

simplifies further when vi · vi = ||vi||² = 1 for all i: it becomes ProjV(x) = ∑i (vi · x) vi. Such a basis is called an orthonormal basis.

Example

Any subset of the standard basis is an orthonormal basis for the linear space it spans.

Finding orthonormal bases

If V is one dimensional: choose any nonzero vector v1, and then divide it by its length. Now you have a unit vector,

v1′ = (1/||v1||) v1

If V is two dimensional: choose any nonzero vector v1 and normalize it as above to get a unit vector v1′. Next, choose any vector v2 outside the span of v1′, and compute its orthogonal projection, writing

v2 = λv1′ + v2′

with v2′ orthogonal to v1′. The vectors v1′, v2′ form an orthogonal basis. Finally, rescale:

v2′′ = (1/||v2′||) v2′

The vectors v1′, v2′′ form an orthonormal basis.
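The two-dimensional procedure above (the first steps of what the textbook develops as the Gram-Schmidt process) can be sketched in floating-point Python; the function name `orthonormal_basis_2d` and the sample vectors are ours:

```python
import math

def dot(v, w):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(v, w))

def normalize(v):
    """Divide a nonzero vector by its length, yielding a unit vector."""
    n = math.sqrt(dot(v, v))
    return [vi / n for vi in v]

def orthonormal_basis_2d(v1, v2):
    """The slide's two steps: normalize v1, subtract v2's projection, renormalize."""
    u1 = normalize(v1)                          # v1' = v1 / ||v1||
    lam = dot(u1, v2)                           # component of v2 along v1'
    w = [a - lam * b for a, b in zip(v2, u1)]   # v2' = v2 - lam v1'
    u2 = normalize(w)                           # v2'' = v2' / ||v2'||
    return u1, u2

u1, u2 = orthonormal_basis_2d([3, 4], [1, 0])
# The result is orthogonal and each vector has unit length.
assert abs(dot(u1, u2)) < 1e-12
assert abs(dot(u1, u1) - 1) < 1e-12 and abs(dot(u2, u2) - 1) < 1e-12
```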