

SLIDE 1

Announcements

Monday, November 19

◮ You should already have the link to view your graded midterm online.

◮ Course grades will be curved at the end of the semester. The percentage of A's, B's, and C's to be awarded depends on many factors, and will not be determined until all grades are in.

◮ Individual exam grades are not curved.

◮ Send regrade requests by tomorrow.

◮ WeBWorK 6.6, 7.1, 7.2 are due the Wednesday after Thanksgiving.

◮ No more quizzes!

◮ My office is Skiles 244 and Rabinoffice hours are: Mondays, 12–1pm; Wednesdays, 1–3pm. (But not this Wednesday.)

SLIDE 2

Section 7.2

Orthogonal Complements

SLIDE 3

Orthogonal Complements

Definition

Let W be a subspace of R^n. Its orthogonal complement is

    W⊥ = { v in R^n | v · w = 0 for all w in W },    read "W perp".

(Here W⊥ denotes the orthogonal complement and A^T denotes the transpose.)
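For instance, if W = Span{(1, 1)} in R^2, then W⊥ = { (v1, v2) | v1 + v2 = 0 } = Span{(1, −1)}: the line perpendicular to W.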

Pictures:

◮ The orthogonal complement of a line in R^2 is the perpendicular line. [interactive]

◮ The orthogonal complement of a line in R^3 is the perpendicular plane. [interactive]

◮ The orthogonal complement of a plane in R^3 is the perpendicular line. [interactive]

SLIDE 4

Poll

SLIDE 5

Orthogonal Complements

Basic properties

Let W be a subspace of R^n. Facts:

1. W⊥ is also a subspace of R^n.

2. (W⊥)⊥ = W.

3. dim W + dim W⊥ = n.

4. If W = Span{v1, v2, . . . , vm}, then

       W⊥ = all vectors orthogonal to each of v1, v2, . . . , vm
          = { x in R^n | x · vi = 0 for all i = 1, 2, . . . , m }
          = Nul A, where A is the m × n matrix with rows v1^T, v2^T, . . . , vm^T.

SLIDE 6

Orthogonal Complements

Computation

Problem: if W = Span{ (1, 1, −1), (1, 1, 1) }, compute W⊥.

[interactive]

Recall: Span{v1, v2, . . . , vm}⊥ = Nul A, where A is the matrix with rows v1^T, v2^T, . . . , vm^T.
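One way to fill in the computation: here W⊥ = Nul A for A with rows (1, 1, −1) and (1, 1, 1). Subtracting the first row from the second gives (0, 0, 2), so the reduced row echelon form has rows (1, 1, 0) and (0, 0, 1). Hence x1 = −x2 and x3 = 0, and W⊥ = Span{(−1, 1, 0)}.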

SLIDE 7

Orthogonal Complements

Row space, column space, null space

Definition

The row space of an m × n matrix A is the span of the rows of A. It is denoted Row A. Equivalently, it is the column space of A^T: Row A = Col A^T. It is a subspace of R^n.

We showed before that if A has rows v1^T, v2^T, . . . , vm^T, then

    Span{v1, v2, . . . , vm}⊥ = Nul A.

Hence we have shown:

Fact: (Row A)⊥ = Nul A.

Replacing A by A^T, and remembering Row A^T = Col A:

Fact: (Col A)⊥ = Nul A^T.

Using property 2 and taking the orthogonal complements of both sides, we get:

Fact: (Nul A)⊥ = Row A and Col A = (Nul A^T)⊥.
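These facts are easy to sanity-check with SymPy on a small matrix (the matrix is my own example):

    from sympy import Matrix

    A = Matrix([[1, 2, 0],
                [0, 1, 1]])

    # (Row A)-perp = Nul A: every null space basis vector is
    # orthogonal to every row space basis vector.
    for n in A.nullspace():
        for r in A.rowspace():
            assert r.dot(n) == 0

    # dim Row A + dim Nul A = n (here 2 + 1 = 3).
    assert len(A.rowspace()) + len(A.nullspace()) == A.cols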

SLIDE 8

Orthogonal Complements

Reference sheet

Orthogonal Complements of Most of the Subspaces We’ve Seen

For any vectors v1, v2, . . . , vm:

    Span{v1, v2, . . . , vm}⊥ = Nul A, where A is the matrix with rows v1^T, v2^T, . . . , vm^T.

For any matrix A:

    Row A = Col A^T
    (Row A)⊥ = Nul A        Row A = (Nul A)⊥
    (Col A)⊥ = Nul A^T      Col A = (Nul A^T)⊥

For any other subspace W, first find a basis v1, . . . , vm, then use the above trick to compute W⊥ = Span{v1, . . . , vm}⊥.

SLIDE 9

Section 7.3

Orthogonal Projections

SLIDE 10

Best Approximation

Suppose you measure a data point x which you know for theoretical reasons must lie on a subspace W .

[Figure: the subspace W, the measured point x, the closest point y on W, and the difference x − y.]

Due to measurement error, though, the measured x is not actually in W.

Best approximation: y is the closest point to x on W.

How do you know that y is the closest point? The vector from y to x is orthogonal to W: it is in the orthogonal complement W⊥.
SLIDE 11

Orthogonal Decomposition

Theorem

Every vector x in R^n can be written as

    x = xW + xW⊥

for unique vectors xW in W and xW⊥ in W⊥.

The equation x = xW + xW⊥ is called the orthogonal decomposition of x (with respect to W). The vector xW is the orthogonal projection of x onto W. The vector xW is the closest vector to x on W.

[interactive 1] [interactive 2]

SLIDE 12

Orthogonal Decomposition

Justification

Theorem

Every vector x in R^n can be written as x = xW + xW⊥ for unique vectors xW in W and xW⊥ in W⊥.

Why?
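One quick argument for uniqueness (existence follows from the A^T A trick later in this section): if x = w + w′ = u + u′ with w, u in W and w′, u′ in W⊥, then w − u = u′ − w′ lies in both W and W⊥, so it is orthogonal to itself. Then (w − u) · (w − u) = 0 forces w = u, and hence w′ = u′.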

SLIDE 13

Orthogonal Decomposition

Example

Let W be the xy-plane in R^3. Then W⊥ is the z-axis.

    x = (2, 1, 3)  ⇒  xW = (2, 1, 0) and xW⊥ = (0, 0, 3).
    x = (a, b, c)  ⇒  xW = (a, b, 0) and xW⊥ = (0, 0, c).

This is just decomposing a vector into a "horizontal" component (in the xy-plane) and a "vertical" component (on the z-axis).

[interactive]

SLIDE 14

Orthogonal Decomposition

Computation?

Problem: Given x and W, how do you compute the decomposition x = xW + xW⊥?

Observation: It is enough to compute xW, because xW⊥ = x − xW.

SLIDE 15

The A^T A Trick

Theorem (The A^T A Trick)

Let W be a subspace of R^n, let v1, v2, . . . , vm be a spanning set for W (e.g., a basis), and let A be the n × m matrix with columns v1, v2, . . . , vm. Then for any x in R^n, the matrix equation

    A^T A v = A^T x

(in the unknown vector v) is consistent, and xW = Av for any solution v.

Recipe for Computing x = xW + xW⊥:

◮ Write W as the column space of a matrix A.

◮ Find a solution v of A^T A v = A^T x (by row reducing).

◮ Then xW = Av and xW⊥ = x − xW.
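Here is a minimal SymPy sketch of this recipe (the helper name and the example matrix are my own; it assumes the columns of A are independent, so that A^T A is invertible and LUsolve applies):

    from sympy import Matrix

    def orthogonal_projection(A, x):
        # Solve the normal equations A^T A v = A^T x, then x_W = A v.
        v = (A.T * A).LUsolve(A.T * x)
        return A * v

    # W = the xy-plane in R^3, written as a column space.
    A = Matrix([[1, 0],
                [0, 1],
                [0, 0]])
    x = Matrix([2, 1, 3])

    xW = orthogonal_projection(A, x)
    print(xW.T)         # (2, 1, 0)
    print((x - xW).T)   # (0, 0, 3), the W-perp part

For a spanning set that is not a basis, A^T A need not be invertible, but the theorem still guarantees a solution exists; row reduce the augmented matrix [A^T A | A^T x] instead.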

SLIDE 16

The A^T A Trick

Example

Problem: Compute the orthogonal projection of a vector x = (x1, x2, x3) in R^3 onto the xy-plane.
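One way to fill in this computation using the recipe: take A to be the 3 × 2 matrix with columns (1, 0, 0) and (0, 1, 0). Then A^T A is the 2 × 2 identity and A^T x = (x1, x2), so v = (x1, x2) and xW = Av = (x1, x2, 0), with xW⊥ = (0, 0, x3).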
SLIDE 17

The A^T A Trick

Another Example

Problem: Let x = (1, 2, 3) and W = { (x1, x2, x3) in R^3 | x1 − x2 + x3 = 0 }. Compute the distance from x to W.

SLIDE 18

The A^T A Trick

Another Example, Continued

Problem: Let x = (1, 2, 3) and W = { (x1, x2, x3) in R^3 | x1 − x2 + x3 = 0 }. Compute the distance from x to W.

[interactive]
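One way to finish the computation, using one convenient basis of W: take v1 = (1, 1, 0) and v2 = (0, 1, 1), and let A be the matrix with these columns. Then A^T A = [[2, 1], [1, 2]] and A^T x = (3, 5), so solving A^T A v = A^T x gives v = (1/3, 7/3) and xW = Av = (1/3, 8/3, 7/3). Hence xW⊥ = x − xW = (2/3, −2/3, 2/3), and the distance from x to W is ‖xW⊥‖ = (2/3)√3 = 2√3/3.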

SLIDE 19

The A^T A Trick

Proof

Theorem (The A^T A Trick)

Let W be a subspace of R^n, let v1, v2, . . . , vm be a spanning set for W (e.g., a basis), and let A be the n × m matrix with columns v1, v2, . . . , vm. Then for any x in R^n, the matrix equation A^T A v = A^T x (in the unknown vector v) is consistent, and xW = Av for any solution v.

Proof:
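One standard argument, filling in the blank: since xW lies in W = Col A, we can write xW = Av for some vector v. Then x − Av = xW⊥ lies in W⊥ = (Col A)⊥ = Nul A^T, so A^T(x − Av) = 0, which rearranges to A^T A v = A^T x; in particular the equation is consistent. Conversely, if v is any solution, then Av is in W and A^T(x − Av) = 0 puts x − Av in Nul A^T = W⊥, so Av = xW by the uniqueness of the orthogonal decomposition.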

SLIDE 20

Orthogonal Projection onto a Line

Problem: Let L = Span{u} be a line in R^n and let x be a vector in R^n. Compute xL.

Projection onto a Line

The projection of x onto the line L = Span{u} is

    xL = ((u · x)/(u · u)) u,    xL⊥ = x − xL.
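One way to see this formula: it is the one-column case of the A^T A trick. With A = u (a single column), A^T A is the 1 × 1 matrix [u · u] and A^T x = [u · x], so v = (u · x)/(u · u) and xL = Av = ((u · x)/(u · u)) u.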

SLIDE 21

Orthogonal Projection onto a Line

Example

Problem: Compute the orthogonal projection of x = (−6, 4) onto the line L spanned by u = (3, 2), and find the distance from x to L.

Here u · x = 3(−6) + 2(4) = −10 and u · u = 9 + 4 = 13, so

    xL = ((u · x)/(u · u)) u = −(10/13) (3, 2).

[interactive]
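A quick SymPy check of these numbers, including the requested distance (a sketch; the setup is mine):

    from sympy import Matrix

    u = Matrix([3, 2])
    x = Matrix([-6, 4])

    # Projection of x onto the line L = Span{u}.
    xL = (u.dot(x) / u.dot(u)) * u
    print(xL.T)              # (-30/13, -20/13), i.e. -(10/13)*(3, 2)

    # The distance from x to L is the length of the perpendicular part.
    print((x - xL).norm())   # 24*sqrt(13)/13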
SLIDE 22

Summary

Let W be a subspace of R^n.

◮ The orthogonal complement W⊥ is the set of all vectors orthogonal to everything in W.

◮ We have (W⊥)⊥ = W and dim W + dim W⊥ = n.

◮ Row A = Col A^T, (Row A)⊥ = Nul A, Row A = (Nul A)⊥, (Col A)⊥ = Nul A^T, Col A = (Nul A^T)⊥.

◮ Orthogonal decomposition: any vector x in R^n can be written in a unique way as x = xW + xW⊥ for xW in W and xW⊥ in W⊥. The vector xW is the orthogonal projection of x onto W.

◮ The vector xW is the closest point to x in W: it is the best approximation.

◮ The distance from x to W is ‖xW⊥‖.

◮ If W = Col A, then to compute xW, solve the equation A^T A v = A^T x; then xW = Av.

◮ If W = L = Span{u} is a line, then xL = ((u · x)/(u · u)) u.