SLIDE 1

A Review of Linear Algebra

Mohammad Emtiyaz Khan, CS, UBC

A Review of Linear Algebra – p.1/13

SLIDE 2

Basics

Column vector x ∈ Rn, row vector xT, matrix A ∈ Rm×n.
Matrix multiplication: (m × n)(n × k) ⇒ m × k; in general AB ≠ BA.
Transpose AT, (AB)T = BTAT; A is symmetric if A = AT.
Inverse A−1 does not always exist; (AB)−1 = B−1A−1.

xTx is a scalar, xxT is a matrix.

Ax = b, three ways of expressing it:

  • 1. Σj aijxj = bi, ∀i
  • 2. riT x = bi, ∀i, where ri is the ith row.
  • 3. x1a1 + x2a2 + . . . + xnan = b (linear combination, l.c., of the columns).

System of equations: non-singular (unique solution) vs. singular (no solution or infinitely many solutions).
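The three views can be compared numerically; a minimal NumPy sketch on a hypothetical 2 × 2 example:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 3.]])
x = np.array([1., 2.])
m, n = A.shape

# View 1: entrywise sums, b_i = sum_j a_ij x_j
b_sum = np.array([sum(A[i, j] * x[j] for j in range(n)) for i in range(m)])

# View 2: inner product of each row r_i with x
b_row = np.array([A[i, :] @ x for i in range(m)])

# View 3: linear combination of the columns, x1*a1 + x2*a2
b_col = x[0] * A[:, 0] + x[1] * A[:, 1]

print(b_sum, b_row, b_col)  # all three agree with A @ x
```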


SLIDE 3

LU factorization

   2 1 1 4 −6 −2 7 2       x1 x2 x3    =    5 −2 9    ⇒ Ax = b    2 1 1 −8 −2 1   

  • U

   x1 x2 x3    =    5 −12 2    ⇒ Ux = EFGb    1 1 1 1   

  • E

   1 1 1 1   

  • F

   1 −2 1 1 1   

  • G

,

L = G−1F −1E−1


SLIDE 4

LU factorization

(First non-singular case) If no row exchanges are required, then A = LU (unique). Solve Lc = b, then Ux = c. Another form: A = LDU.
(Second non-singular case) There exists a permutation matrix P that reorders the rows, so that PA = LU.
(Singular case) No such P exists.
(Cholesky decomposition) If A is symmetric and A = LU can be found without any row exchanges, then A = LLT (also called the square root of a matrix). (proof)

A positive definite matrix always has a Cholesky decomposition.
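The recipe (factor once, then solve two triangular systems) can be sketched in NumPy; the elimination loop is a simple Doolittle variant that assumes nonzero pivots (the first non-singular case), applied to the example system from the previous slide:

```python
import numpy as np

A = np.array([[ 2.,  1., 1.],
              [ 4., -6., 0.],
              [-2.,  7., 2.]])
b = np.array([5., -2., 9.])

# LU without row exchanges (Doolittle); assumes nonzero pivots.
n = len(A)
L, U = np.eye(n), A.copy()
for k in range(n - 1):
    for i in range(k + 1, n):
        L[i, k] = U[i, k] / U[k, k]   # multiplier stored in L
        U[i] -= L[i, k] * U[k]        # eliminate below the pivot

c = np.linalg.solve(L, b)   # forward substitution: Lc = b
x = np.linalg.solve(U, c)   # back substitution:    Ux = c

# Cholesky for a symmetric positive definite matrix: S = G G^T
S = A @ A.T                 # SPD since A is non-singular
G = np.linalg.cholesky(S)
```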


SLIDE 5

Vector Space, Subspace and Matrix

(Real vector space) A set of "vectors" with rules for vector addition and multiplication by real numbers. E.g. R1, R2, . . . , R∞, Hilbert space.
(8 conditions) Include a zero vector, additive inverses, closure under addition and scalar multiplication, etc.
(Subspace) A subset of a vector space, closed under addition and scalar multiplication (it must contain zero).

Subspace "spanned" by a matrix (outline the concept):

x1 [1; 5; 2] + x2 [0; 4; 4] = [b1; b2; b3]
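One way to outline the concept numerically: b lies in the subspace spanned by the columns iff appending b does not increase the rank. A sketch using the slide's columns (the entry lost in extraction is taken to be 0):

```python
import numpy as np

A = np.array([[1., 0.],
              [5., 4.],
              [2., 4.]])

def in_column_space(A, b, tol=1e-10):
    # b is a l.c. of the columns iff rank([A | b]) == rank(A)
    return (np.linalg.matrix_rank(np.column_stack([A, b]), tol)
            == np.linalg.matrix_rank(A, tol))

b_inside = A @ np.array([2., 3.])    # by construction a l.c. of the columns
b_outside = np.array([0., 0., 1.])   # not reachable from these two columns
```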


SLIDE 6

Linear Independence, Basis, Dimension

(Linear independence, l.i.) If x1a1 + x2a2 + . . . + xnan = 0 only happens when x1 = x2 = . . . = xn = 0, then {ak} are called linearly independent. A set of n vectors in Rm cannot be l.i. if n > m (proof).
(Span) If every vector v in V can be expressed as a l.c. of {ak}, then {ak} are said to span V.
(Basis) {ak} are called a basis of V if they are l.i. and span V (not too many, not too few; every vector has a unique expansion).
(Dimension) The number of vectors in any basis is called the dimension (and is the same for all bases).
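Independence can be tested numerically via the rank; a minimal sketch illustrating that three vectors in R2 cannot be l.i.:

```python
import numpy as np

def is_linearly_independent(vectors):
    # {a_k} are l.i. iff the matrix [a_1 ... a_n] has rank n
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == M.shape[1]

two_in_R2 = [np.array([1., 0.]), np.array([1., 1.])]
three_in_R2 = two_in_R2 + [np.array([2., 3.])]   # n > m, must be dependent
```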


SLIDE 7

Four Fundamental Spaces

Fundamental Theorem of Linear Algebra I

  • 1. R(A) = Column Space of A; l.c. of columns; dim r.
  • 2. N(A) = Nullspace of A; All x : Ax = 0; dim n − r.
  • 3. R(AT) = Row space of A; l.c. of rows; dim r.
  • 4. N(AT) = Left nullspace of A; All y : ATy = 0; dim m − r.

(Rank) r is called the rank of the matrix. The inverse exists iff the rank is as large as possible (r = m = n). Question: what is the rank of uvT?
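The four dimensions (and the question about uvT) can be checked with NumPy on a hypothetical rank-1 example:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])   # second row = 2 * first row, so r = 1
m, n = A.shape
r = np.linalg.matrix_rank(A)

dims = {"R(A)": r, "N(A)": n - r, "R(A^T)": r, "N(A^T)": m - r}

# rank of an outer product u v^T is 1 for nonzero u, v
u, v = np.array([1., 2.]), np.array([3., 4., 5.])
rank_uvT = np.linalg.matrix_rank(np.outer(u, v))
```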


SLIDE 8

Orthogonality

(Norm) ||x||² = xTx = x1² + . . . + xn²
(Inner product) xTy = x1y1 + . . . + xnyn
(Orthogonal) xTy = 0. Orthogonal (nonzero) vectors are l.i. (proof).
(Orthonormal basis) Orthogonal vectors with norm 1.
(Orthogonal subspaces) V ⊥ W if v ⊥ w, ∀v ∈ V, w ∈ W.
(Orthogonal complement) The space of all vectors orthogonal to V, denoted V⊥.

The row space is orthogonal to the nullspace (in Rn) and the column space is orthogonal to the left nullspace (in Rm). (proof)
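The last statement can be verified numerically: compute a basis of N(A) from the SVD and check that every row of A is orthogonal to it. A sketch on a hypothetical rank-1 matrix:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])

# rows of Vt whose singular values are (numerically) zero span N(A)
_, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))
null_basis = Vt[r:]            # here: 2 vectors spanning the nullspace

dots = A @ null_basis.T        # row_i . null_j for every pair
```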


SLIDE 9

Finally...

Fundamental Theorem of Linear Algebra II

  • 1. R(AT)⊥ = N(A)
  • 2. R(A)⊥ = N(AT)

Any vector x ∈ Rn can be expressed as

x = (x1b1 + . . . + xrbr) + (xr+1br+1 + . . . + xnbn)   (1)
  = xr + xn,   (2)

where xr lies in the row space and xn in the nullspace.

Every matrix transforms its row space to its column space (Comments about pseudo-inverse and invertibility)
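The splitting x = xr + xn can be computed with the pseudo-inverse mentioned above, since A⁺A projects onto the row space. A sketch on a hypothetical rank-1 matrix:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])
x = np.array([1., 1., 1.])

P = np.linalg.pinv(A) @ A   # orthogonal projector onto the row space
x_r = P @ x                 # row-space component
x_n = x - x_r               # nullspace component
```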


SLIDE 10

Gram-Schmidt Orthogonalization

(Projection) of b on a is (aTb/aTa)a; for a unit vector a, it is (aTb)a.
(Schwarz inequality) |aTb| ≤ ||a|| ||b||
(Orthogonal matrix) Q = [q1 . . . qn], QTQ = I. (proof)
(Length preservation) ||Qx|| = ||x|| (proof)

Given vectors {ak}, construct orthonormal vectors {qk}:

  • 1. q1 = a1/||a1||
  • 2. for each j, a′j = aj − (q1Taj)q1 − . . . − (qj−1Taj)qj−1
  • 3. qj = a′j/||a′j||

QR decomposition (example)
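The three steps translate directly into code; a minimal sketch, assuming the input vectors are linearly independent:

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt; assumes the inputs are linearly independent."""
    qs = []
    for a in vectors:
        a_prime = np.array(a, dtype=float)
        for q in qs:
            a_prime -= (q @ a) * q          # subtract projection onto earlier q's
        qs.append(a_prime / np.linalg.norm(a_prime))
    return np.column_stack(qs)              # Q with orthonormal columns

Q = gram_schmidt([np.array([1., 1., 0.]),
                  np.array([1., 0., 1.])])
```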


SLIDE 11

Eigenvalues and Eigenvectors

(Invariance) Ax = λx.
(Characteristic equation) (A − λI)x = 0; x lies in the nullspace of A − λI.
λ1 + . . . + λn = a11 + . . . + ann (the trace). λ1 · · · λn = det(A).
(A = SΛS−1) Suppose there exist n linearly independent eigenvectors of A. If S is the matrix whose columns are those eigenvectors, then A = SΛS−1, where Λ = diag(λ1, . . . , λn).
Diagonalizability is concerned with the eigenvectors, invertibility with the eigenvalues.
(Real symmetric matrix) The eigenvectors are orthogonal, so A = QΛQT. (Spectral theorem)
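For a real symmetric example the whole slide can be checked at once (trace, determinant, and A = QΛQT); a sketch with NumPy's eigh, which returns orthonormal eigenvectors:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])             # real symmetric
lam, Q = np.linalg.eigh(A)           # eigenvalues and orthonormal eigenvectors

recon = Q @ np.diag(lam) @ Q.T       # A = Q Lambda Q^T
```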


SLIDE 12

Singular Value Decomposition

Any matrix can be factorized as A = UΣV T, where U and V are orthogonal and Σ is diagonal. Insightful? Finish.
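A sketch of the factorization for a hypothetical rectangular example; U and V have orthonormal columns and Σ holds the singular values:

```python
import numpy as np

A = np.array([[1., 0., 1.],
              [0., 1., 1.]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)

recon = U @ np.diag(s) @ Vt          # A = U Sigma V^T
```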


SLIDE 13

Finish

Thanks to Maria (Marisol Flores Gorrido) for helping me with this tutorial.
