ECS231 Mathematics Review I: Linear Algebra
Reference: Chap.1 of Solomon
Vector spaces over R
◮ Denote an (abstract) vector by v. A vector space V is a collection of vectors v which satisfies:
◮ All v, w ∈ V can be added and multiplied by α ∈ R:
      v + w ∈ V,   α · v ∈ V.
◮ The operations ‘+, ·’ must satisfy the axioms: for arbitrary u, v, w ∈ V and α, β ∈ R,
      (u + v) + w = u + (v + w),   v + w = w + v,
      there is a zero vector 0 with v + 0 = v, and each v has a negative −v with v + (−v) = 0,
      α · (v + w) = α · v + α · w,   (α + β) · v = α · v + β · v,
      (αβ) · v = α · (β · v),   1 · v = v.
◮ Euclidean space: Rn = {(v1, . . . , vn) : vi ∈ R}.
◮ Addition: (v1, . . . , vn) + (w1, . . . , wn) = (v1 + w1, . . . , vn + wn).
◮ Multiplication: α · (v1, . . . , vn) = (αv1, . . . , αvn).
◮ Illustration in R2: addition follows the parallelogram rule; scalar multiplication stretches (or flips) a vector along its line.
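The componentwise operations on Rn above can be sketched concretely; a minimal illustration using NumPy (NumPy is an assumption here, the slides define the operations abstractly):

```python
import numpy as np

# Vectors in R^3 represented as NumPy arrays.
v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])
alpha = 2.0

# Componentwise addition and scalar multiplication, exactly as defined above.
s = v + w        # (v1 + w1, ..., vn + wn)
p = alpha * v    # (alpha*v1, ..., alpha*vn)
```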
◮ Polynomials: R[x] = {a0 + a1x + · · · + anx^n : ai ∈ R, n ≥ 0}.
◮ Addition and multiplication by a scalar work in the usual way, coefficient by coefficient:
◮ Addition: (Σ_i ai x^i) + (Σ_i bi x^i) = Σ_i (ai + bi) x^i.
◮ Multiplication: α · Σ_i ai x^i = Σ_i (α ai) x^i.
◮ Starting with v1, . . . , vn ∈ V and ai ∈ R, we can define the linear combination
      a1v1 + a2v2 + · · · + anvn = Σ_{i=1}^n ai vi.
◮ For a set of vectors S ⊆ V, the span of S is the set of all linear combinations of vectors in S:
      span S = {Σ_i ai vi : vi ∈ S, ai ∈ R}.
◮ Observation from (c): adding a new vector to S does not necessarily enlarge the span; if the vector already lies in span S, the span is unchanged.
◮ A set S of vectors is linearly dependent if it contains a finite subset {v1, . . . , vk} admitting scalars a1, . . . , ak, not all zero, with
      Σ_{i=1}^k ai vi = 0.
◮ Otherwise, S is called linearly independent.
◮ Two other equivalent defs. of linear dependence:
◮ There exists {v1, . . . , vk} ⊂ S\{0} such that vk = Σ_{i=1}^{k−1} ci vi for some ci ∈ R.
◮ There exists v ∈ S such that span S = span(S\{v}).
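In Rn, linear dependence can be tested numerically: stack the vectors as columns and compare the matrix rank to the number of vectors. A sketch with a hypothetical example (the vectors and the use of NumPy are illustrative, not from the slides):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + 2 * v2   # a linear combination, so {v1, v2, v3} is dependent

# rank < number of vectors  <=>  the columns are linearly dependent.
S = np.column_stack([v1, v2, v3])
dependent = np.linalg.matrix_rank(S) < S.shape[1]
```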
◮ Given a vector space V, it is natural to build a finite set of linearly independent vectors that spans V.
◮ The max number n of such linearly independent vectors defines the dimension of V: dim V = n.
◮ Any set S of n such linearly independent vectors is a basis of V, and satisfies span S = V.
◮ The standard basis for Rn is given by the n vectors
      ei = (0, . . . , 0, 1, 0, . . . , 0),
  with i − 1 zeros before the 1 and n − i zeros after it.
◮ ei is not a linear combination of the rest of the vectors.
◮ For all c ∈ Rn, we have c = Σ_{i=1}^n ci ei.
◮ A basis of the polynomials R[x] is given by the monomials 1, x, x^2, x^3, . . .
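The expansion c = Σ_i ci ei can be checked directly; a small NumPy sketch (NumPy and the sample coordinates are assumptions for illustration):

```python
import numpy as np

n = 4
I = np.eye(n)   # the columns of the identity are the standard basis e_1, ..., e_n
c = np.array([3.0, -1.0, 0.5, 2.0])

# Rebuild c from its coordinates: c = sum_i c_i e_i.
rebuilt = sum(c[i] * I[:, i] for i in range(n))
```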
◮ Dot product: for a = (a1, . . . , an), b = (b1, . . . , bn) ∈ Rn,
      a · b = Σ_{i=1}^n ai bi.
◮ Length of a vector: ∥a∥ = √(a1^2 + · · · + an^2) = √(a · a).
◮ Angle between two vectors: cos θ = (a · b)/(∥a∥∥b∥).
◮ Vectors a, b are orthogonal if a · b = 0, i.e. cos θ = cos 90° = 0.
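The dot product, length, and angle formulas can be exercised on a pair of orthogonal vectors; a minimal NumPy sketch (the vectors are illustrative):

```python
import numpy as np

a = np.array([1.0, 0.0])
b = np.array([0.0, 2.0])

dot = a @ b                  # a . b = sum_i a_i b_i
norm_a = np.sqrt(a @ a)      # ||a|| = sqrt(a . a)
cos_theta = dot / (np.linalg.norm(a) * np.linalg.norm(b))
# dot == 0 and cos_theta == 0: a and b are orthogonal.
```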
◮ Given two vector spaces V, V′, a function L: V → V′ is a linear map if it preserves vector addition and scalar multiplication.
◮ Namely, for all v1, v2 ∈ V and c ∈ R,
◮ L[v1 + v2] = L[v1] + L[v2].
◮ L[cv1] = cL[v1].
◮ L is completely defined by its action on a basis of V:
      L[v] = L[Σ_i ci vi] = Σ_i ci L[vi],
  where v = Σ_i ci vi and {v1, v2, . . . } is a basis of V.
◮ Linear map in Rn: for instance, scaling by a fixed α ∈ R, L[v] = αv.
◮ Integration operator: f ↦ ∫ f(x) dx is a linear map, since ∫ (f + g) dx = ∫ f dx + ∫ g dx and ∫ cf dx = c ∫ f dx.
◮ Write vectors in Rm in ‘column form’, e.g., v = [v1, v2, . . . , vm]^T, an m × 1 array.
◮ Putting n such columns together we obtain an m × n matrix A = [a1, a2, . . . , an] with aj ∈ Rm.
◮ The space of all such matrices is denoted by Rm×n.
◮ A scalar c ∈ R is viewed as a 1 × 1 matrix.
◮ A column vector v ∈ Rn is viewed as an n × 1 matrix.
◮ A matrix V = [v1, . . . , vn] ∈ Rm×n can be multiplied by a vector c ∈ Rn:
      Vc = c1v1 + c2v2 + · · · + cnvn,
  a linear combination of the columns of V.
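The column view of the matrix–vector product can be verified directly; a NumPy sketch with illustrative data:

```python
import numpy as np

V = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])   # V in R^{3x2}, columns v1 and v2
c = np.array([10.0, -1.0])

# V c = c1*v1 + c2*v2: the product is a linear combination of the columns.
by_columns = c[0] * V[:, 0] + c[1] * V[:, 1]
direct = V @ c
```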
◮ Matrix–vector multiplication can be denoted by b = Mc, with entries bi = Σ_{j=1}^n Mij cj.
◮ M ∈ Rm×n multiplied by another matrix in Rn×k can be defined column by column: M[c1, . . . , ck] = [Mc1, . . . , Mck].
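The column-by-column definition of the matrix product can be checked against the built-in product; a NumPy sketch with illustrative matrices:

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
C = np.array([[5.0, 6.0, 7.0],
              [8.0, 9.0, 0.0]])   # columns c1, c2, c3

# M [c1, ..., ck] = [M c1, ..., M ck]: build the product one column at a time.
col_by_col = np.column_stack([M @ C[:, j] for j in range(C.shape[1])])
```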
◮ Identity matrix: In×n = [e1, e2, . . . , en], which satisfies In×n x = x for all x ∈ Rn.
◮ Linear map L[(x, y)] = (3x, 2x + y, −y) satisfies
      L[(x, y)] = A (x, y)^T,  with  A = [3 0; 2 1; 0 −1] ∈ R3×2.
◮ All linear maps L: Rn → Rm can be expressed as a matrix–vector product L[x] = Ax for some A ∈ Rm×n; the j-th column of A is L[ej].
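Building the matrix of a linear map from its action on the standard basis, using the slide's map L[(x, y)] = (3x, 2x + y, −y) (the NumPy realization is an illustration):

```python
import numpy as np

def L(v):
    # The linear map from the slide: (x, y) -> (3x, 2x + y, -y).
    x, y = v
    return np.array([3 * x, 2 * x + y, -y])

# The j-th column of the matrix is L applied to the j-th standard basis vector.
A = np.column_stack([L(e) for e in np.eye(2)])

v = np.array([1.0, 2.0])
# A @ v reproduces L(v) for any v.
```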
◮ Use Aij to denote the element of A at row i, column j.
◮ The transpose of A ∈ Rm×n is defined as AT ∈ Rn×m with (AT)ij = Aji.
◮ Basic identities: (AT)T = A, (A + B)T = AT + BT, (AB)T = BTAT.
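The transpose identities can be spot-checked on random matrices; a NumPy sketch (shapes and the random data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))
C = rng.standard_normal((3, 4))

ok_double = np.allclose(A.T.T, A)             # (A^T)^T = A
ok_sum = np.allclose((A + C).T, A.T + C.T)    # (A + B)^T = A^T + B^T
ok_product = np.allclose((A @ B).T, B.T @ A.T)  # (AB)^T = B^T A^T (order reverses)
```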
◮ Dot product of a, b ∈ Rn: a · b = aTb = Σ_{i=1}^n ai bi.
◮ Residual norm of r = Ax − b:
      ∥r∥_2^2 = (Ax − b)T(Ax − b) = ∥b∥_2^2 − 2bTAx + ∥Ax∥_2^2.
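The expansion of the squared residual norm can be verified numerically; a NumPy sketch on random data (the sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))
x = rng.standard_normal(3)
b = rng.standard_normal(5)

r = A @ x - b
lhs = r @ r                                          # ||Ax - b||_2^2
rhs = b @ b - 2 * b @ (A @ x) + (A @ x) @ (A @ x)    # ||b||^2 - 2 b^T A x + ||Ax||^2
```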
◮ Storage of matrices in memory: an m × n matrix takes mn numbers, laid out either row by row (row-major order) or column by column (column-major order).
◮ Multiplication b = Ax for A ∈ Rm×n and x ∈ Rn: mn multiplications and m(n − 1) additions, i.e. O(mn) operations.
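The O(mn) operation count follows directly from the explicit loops; a plain-Python sketch of b = Ax (the sample data is illustrative):

```python
def matvec(A, x):
    """Compute b = A x with explicit loops: the inner loop does n
    multiply-adds per row, for O(m*n) operations overall."""
    m, n = len(A), len(x)
    b = [0.0] * m
    for i in range(m):
        for j in range(n):
            b[i] += A[i][j] * x[j]
    return b

A = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
x = [1.0, -1.0]
b = matvec(A, x)   # [-1.0, -1.0, -1.0]
```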
◮ Example: find (x, y, z) satisfying a system of three linear equations in three unknowns.
◮ Given A = [a1, . . . , an] ∈ Rm×n and b ∈ Rm, find x ∈ Rn such that Ax = b.
◮ A solution exists if b is in the column space of A: b ∈ span{a1, . . . , an}.
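Whether b lies in the column space of A can be tested by a rank comparison: appending b as an extra column leaves the rank unchanged exactly when b is a combination of the existing columns. A NumPy sketch with a hypothetical helper `in_column_space` (name and example data are assumptions):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])          # columns a1, a2 span a plane in R^3
b_in = A @ np.array([2.0, 3.0])     # in the column space by construction
b_out = np.array([0.0, 0.0, 1.0])   # not a combination of a1, a2

def in_column_space(A, b):
    # b in span{a1, ..., an}  <=>  rank([A | b]) == rank(A).
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)
```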
◮ Let A ∈ Rn×n be a square matrix, and suppose Ax = b has a unique solution x for every b ∈ Rn; then A is invertible, with inverse A−1.
◮ The inverse satisfies (why?) A−1A = AA−1 = In×n.
◮ Hence, for any b, we can express the solution as x = A−1b.
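The formula x = A−1b can be checked numerically; a NumPy sketch (the 2 × 2 system is illustrative; in practice `np.linalg.solve` is preferred over forming the inverse explicitly):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)          # solve Ax = b directly
x_via_inv = np.linalg.inv(A) @ b   # x = A^{-1} b, as on the slide
```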