SLIDE 1
Linear algebra and differential equations (Math 54): Lecture 16
Vivek Shende
March 14, 2019
SLIDE 6
Hello and welcome to class!
Last time
We discussed complex eigenvalues and eigenvectors, and then began the discussion of length, orthogonality, and the dot product.
This time
Orthogonal projection, orthogonal sets, orthonormal bases.
SLIDE 11
Midterm 2
Covers chapters 4-6.5 (depending on where we get next time), but you may need the material from the previous chapters to solve the problems. It will be, in format, much like midterm 1. Also like the practice midterm, available at https://math.berkeley.edu/~vivek/54slides/prmid2.pdf
SLIDE 15
Dot product and transpose
The dot product can be represented as multiplication by the transpose. If we write our vectors as columns, then

v · w = vᵀw
Example
(1, 2, 3) · (4, 5, 6) = 1 · 4 + 2 · 5 + 3 · 6 = [1 2 3] [4 5 6]ᵀ
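As a quick sanity check, here is a small Python/NumPy sketch (NumPy is my addition for illustration, not part of the lecture) verifying v · w = vᵀw on the example above:

import numpy as np

v = np.array([1, 2, 3])
w = np.array([4, 5, 6])

print(v @ w)                               # the dot product: 32
print(v.reshape(1, 3) @ w.reshape(3, 1))   # row vᵀ times column w: [[32]]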
SLIDE 19
Orthogonal complements
Consider a linear subspace V ⊂ Rn. We write V ⊥ for the set of all vectors which are orthogonal to every vector in V. It is called the orthogonal complement of V.
SLIDE 22
Orthogonal complements
Equivalently, V ⊥ is the set of all vectors which have dot product zero with every element of V. Note that v · v = ||v||², which is zero if and only if v is zero. So V ∩ V ⊥ = {0}.
SLIDE 29
Orthogonal complements
The orthogonal complement is a vector subspace. I.e., if x, y ∈ V ⊥ are any vectors orthogonal to all vectors in V, then any linear combination of these vectors is also orthogonal to all vectors in V. Proof: If x · v = 0 and y · v = 0, then

(ax + by) · v = a(x · v) + b(y · v) = 0
SLIDE 34
Orthogonal complements
The orthogonal complement of V can be characterized as the vectors orthogonal to all elements of some basis for V.
Indeed, suppose v1, . . . , vn is such a basis, and x · vi = 0 for all i. Then for any v ∈ V, we can expand v = Σi civi, and observe

x · v = x · Σi civi = Σi ci(x · vi) = 0
Thus x is orthogonal to any element of V .
SLIDE 40
Orthogonal complements
This allows us to compute (bases for) orthogonal complements. Given a basis v1, . . . , vk for V, to find a basis for V ⊥ (see the sketch below):
◮ Write the matrix [v1, . . . , vk]ᵀ, whose rows are the vi.
◮ Note that the kernel of this matrix is precisely the set of vectors whose dot product with all the vi is zero.
◮ Determine (a basis for) the kernel of this matrix.
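A minimal computational sketch of this procedure, using SymPy's exact nullspace (my choice of tool, not the lecture's; any kernel computation works):

from sympy import Matrix

def orthogonal_complement_basis(vectors):
    # Stack the basis vectors of V as the rows of a matrix;
    # its kernel is exactly V-perp.
    return Matrix(vectors).nullspace()

# The example from the next slide:
for b in orthogonal_complement_basis([[1, 0, 1, 0], [2, 3, 4, 5]]):
    print(b.T)   # (-1, -2/3, 1, 0) and (0, -5/3, 0, 1)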
SLIDE 45
Example
Find a basis for the orthogonal complement in R4 of the subspace spanned by (1, 0, 1, 0) and (2, 3, 4, 5). This is the same as finding a basis for the kernel of the matrix

[1 0 1 0]
[2 3 4 5]

We row reduce:

[1 0 1 0]     [1 0 1 0]     [1 0  1    0 ]
[2 3 4 5]  →  [0 3 2 5]  →  [0 1 2/3 5/3]

The kernel is spanned by (−1, −2/3, 1, 0) and (0, −5/3, 0, 1).
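A quick NumPy check (my addition) that both kernel vectors really do lie in the orthogonal complement, i.e. are orthogonal to both spanning vectors:

import numpy as np

A = np.array([[1, 0, 1, 0], [2, 3, 4, 5]], dtype=float)  # rows span V
for x in [np.array([-1, -2/3, 1, 0]), np.array([0, -5/3, 0, 1])]:
    print(A @ x)   # both dot products at once; should print [0. 0.]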
SLIDE 49
Try it yourself!
Find a basis for the orthogonal complement in R3 of the subspace spanned by the vector (1, 2, 3). This is the same as finding the kernel of the matrix [1 2 3]. It's already row reduced. The kernel is spanned by (−2, 1, 0) and (−3, 0, 1).
SLIDE 55
Orthogonal complements
(For notation, recall V was a subspace of Rn.) V ⊥ is the kernel of a matrix with dim V linearly independent rows and n columns, so dim V ⊥ = n − dim V, i.e.

dim V + dim V ⊥ = dim Rn

Recall from before, V ∩ V ⊥ = {0}.
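A numerical illustration of dim V + dim V ⊥ = n (SciPy's null_space is my choice here; the slide argues this abstractly):

import numpy as np
from scipy.linalg import null_space

A = np.array([[1, 0, 1, 0], [2, 3, 4, 5]], dtype=float)  # rows: a basis of V, n = 4
dim_V = np.linalg.matrix_rank(A)
dim_V_perp = null_space(A).shape[1]   # columns form a basis of V-perp
print(dim_V, dim_V_perp, dim_V + dim_V_perp)   # 2 2 4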
SLIDE 66
Orthogonal complements
V ⊥ = “everything orthogonal to everything in V ”
(V ⊥)⊥ = “everything orthogonal to everything orthogonal to everything in V ”
Everything in V is orthogonal to V ⊥, so V ⊂ (V ⊥)⊥. But from the previous slide, we have dim V + dim V ⊥ = n, and similarly dim V ⊥ + dim (V ⊥)⊥ = n. So we learn dim V = dim (V ⊥)⊥, hence V = (V ⊥)⊥.
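The identity V = (V ⊥)⊥ can also be checked numerically: taking the kernel twice recovers a subspace with the same span as V. A sketch (again with SciPy, my addition):

import numpy as np
from scipy.linalg import null_space

V = np.array([[1, 0, 1, 0], [2, 3, 4, 5]], dtype=float)  # rows span V
V_perp = null_space(V).T              # rows span V-perp
V_perp_perp = null_space(V_perp).T    # rows span (V-perp)-perp

# The spans agree iff stacking them does not raise the rank above dim V = 2.
print(np.linalg.matrix_rank(np.vstack([V, V_perp_perp])))   # 2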
SLIDE 71
Orthogonal complements
In words, V = (V ⊥)⊥ is telling you that: Everything orthogonal to everything orthogonal to everything in V is already in V .
SLIDE 76
Orthogonal complements
Theorem
Let V ⊂ Rn be a linear subspace. Any vector in Rn has a unique expression as a sum of a vector in V and a vector in V ⊥. In fact, something more general is true:
Theorem
Given any two subspaces V, W such that dim V + dim W = n and V ∩ W = {0}, any vector in Rn can be written uniquely as a sum of a vector in V and a vector in W.
SLIDE 79
Example
Consider the subspace V ⊂ R4 spanned by e1, e2. Its orthogonal complement V ⊥ is spanned by e3, e4. Any vector (w, x, y, z) can be written as (w, x, 0, 0) + (0, 0, y, z).
SLIDE 88
Proof
Take bases v1, . . . , vk of V and w1, . . . , wl of W (so k + l = n). Suppose Σ aivi + Σ bjwj = 0. Then Σ aivi = −Σ bjwj. One side is in V and the other is in W, so both are in V ∩ W, hence zero. {vi} and {wj} were bases, so the ai and bj must all be zero.
SLIDE 93
Proof
Thus {v1, . . . , vk, w1, . . . , wl} is linearly independent. This linearly independent set has n elements, so it's a basis for Rn. So any vector can be written as Σ aivi + Σ bjwj. This is a sum of a vector in V and a vector in W.
SLIDE 99
Proof
Such an expression is unique: If v, v′ ∈ V and w, w′ ∈ W and v + w = v′ + w′, then v − v′ = w′ − w is in V and in W, hence is zero. So v = v′ and w = w′.
SLIDE 103
Orthogonal projection
We have seen that if V ⊂ Rn is a vector subspace, any vector x can be written uniquely as x = v + v⊥ for some v ∈ V and v⊥ ∈ V ⊥. The vector v above is said to be the orthogonal projection of x to V.
SLIDE 115
Computing orthogonal projections
Special case: V is one dimensional, hence the span of one vector v. Writing x as the sum of a vector in V and a vector in V ⊥ means: writing x = λv + w where v · w = 0. Taking dot products,

v · x = λ(v · v) + v · w = λ(v · v)

Or in other words, λ = (v · x)/(v · v). So the orthogonal projection of x to Span(v) is

((v · x)/(v · v)) v
SLIDE 117
Example
The orthogonal projection of (1, 2, 3, 4) to the line spanned by (1, 1, 1, 1) is

((1, 1, 1, 1) · (1, 2, 3, 4))/((1, 1, 1, 1) · (1, 1, 1, 1)) (1, 1, 1, 1) = (10/4)(1, 1, 1, 1)
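Here is the one-dimensional projection formula as a NumPy sketch (my addition), checked on this example:

import numpy as np

def project_onto_line(x, v):
    # Orthogonal projection of x onto Span(v): ((v . x)/(v . v)) v
    return (v @ x) / (v @ v) * v

x = np.array([1, 2, 3, 4], dtype=float)
v = np.array([1, 1, 1, 1], dtype=float)
p = project_onto_line(x, v)
print(p)            # [2.5 2.5 2.5 2.5], i.e. (10/4)(1, 1, 1, 1)
print(v @ (x - p))  # 0.0: the remainder lies in V-perp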
SLIDE 126
Computing orthogonal projections
More generally, suppose you have a basis v1, . . . , vk for V such that vi · vj = 0 for i ≠ j. Such a basis is called an orthogonal basis. Then the orthogonal projection of x to V is the sum of the orthogonal projections to the vi. That is,

ProjV(x) = Σi ((vi · x)/(vi · vi)) vi

Proof: We compute the dot product vj · (x − ProjV(x)):

vj · x − vj · Σi ((vi · x)/(vi · vi)) vi = vj · x − ((vj · x)/(vj · vj)) (vj · vj) = 0
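A NumPy sketch of this formula (my addition; it assumes the rows of B really are pairwise orthogonal, as the statement requires):

import numpy as np

def project_onto_subspace(x, B):
    # Sum the one-dimensional projections onto each orthogonal basis vector.
    return sum((v @ x) / (v @ v) * v for v in B)

B = np.array([[2, 0, 0, 0], [0, 3, 0, 0]], dtype=float)  # orthogonal, not orthonormal
x = np.array([1, 2, 3, 4], dtype=float)
p = project_onto_subspace(x, B)
print(p)             # [1. 2. 0. 0.]
print(B @ (x - p))   # [0. 0.]: x - ProjV(x) is orthogonal to V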
SLIDE 129
Orthonormal bases
The formula

ProjV(x) = Σi ((vi · x)/(vi · vi)) vi

simplifies further when vi · vi = ||vi||² = 1 for all i. Such a basis is called an orthonormal basis.
Example
Any subset of the standard basis is an orthonormal basis for the linear space it spans.
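With an orthonormal basis the denominators are all 1, so ProjV(x) = Σi (vi · x) vi. A one-line check on the standard-basis example (NumPy, my addition):

import numpy as np

e1 = np.array([1, 0, 0, 0], dtype=float)
e2 = np.array([0, 1, 0, 0], dtype=float)   # a subset of the standard basis
x = np.array([1, 2, 3, 4], dtype=float)
print((e1 @ x) * e1 + (e2 @ x) * e2)   # [1. 2. 0. 0.]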
SLIDE 134
Finding orthonormal bases:
If V is one dimensional: choose any nonzero vector v1, and then divide it by its length. Now you have a unit vector, v1′ = v1/||v1||.
SLIDE 138
Finding orthonormal bases:
If V is two dimensional: choose any nonzero vector v1, and then divide it by its length. Now you have a unit vector, v1′ = v1/||v1||. Next, choose any vector v2 outside the span of v1′. Compute its orthogonal projection:
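A minimal NumPy sketch of this two-dimensional procedure, assuming the intended next steps (not shown above) are the usual ones: subtract the orthogonal projection of v2 onto v1′, then normalize what remains:

import numpy as np

def orthonormal_basis_2d(v1, v2):
    u1 = v1 / np.linalg.norm(v1)   # unit vector v1' = v1/||v1||
    w = v2 - (u1 @ v2) * u1        # remove the component of v2 along v1'
    u2 = w / np.linalg.norm(w)     # normalize the remainder
    return u1, u2

u1, u2 = orthonormal_basis_2d(np.array([1., 0., 1., 0.]),
                              np.array([2., 3., 4., 5.]))
print(u1 @ u2, u1 @ u1, u2 @ u2)   # approximately 0, 1, 1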