Graphics 2014, Linear Algebra II: Linear Maps & Matrices
CORE: core topics (important)
Linear Maps & Matrices

Linear Combinations

Linear Combinations as Mappings
- Fix vectors x_1, …, x_n ∈ ℝ^d.
- Factors λ_1, …, λ_n ∈ ℝ.

y = ∑_{i=1}^{n} λ_i x_i

(Figure: the linear combination 2x_2 + x_1 of two vectors x_1, x_2.)
Linear Mappings

Linear Map
- Fix vectors x_1, …, x_n ∈ ℝ^d
- Input coordinates λ_1, …, λ_n
- Output vector y ∈ ℝ^d

Linear combination (fixed: x_1, …, x_n):

y = ∑_{i=1}^{n} λ_i x_i

The map (λ_1, …, λ_n) ↦ y is called a linear map.
Linear Mappings

Linear combination with fixed x_1, …, x_n ∈ ℝ^d, input λ = (λ_1, …, λ_n)^T, output y = (y_1, …, y_d)^T; for example y = λ_1 x_1 + λ_2 x_2.

How the machine works:

y = ∑_{i=1}^{n} λ_i x_i
  = [ x_1 ⋯ x_n ] · (λ_1, …, λ_n)^T
  = ∑_{i=1}^{n} λ_i (x_{1,i}, …, x_{d,i})^T
  = ( x_{1,1} ⋯ x_{1,n} ; ⋮ ⋱ ⋮ ; x_{d,1} ⋯ x_{d,n} ) · (λ_1, …, λ_n)^T

Matrix Representation

Short:  y = M · λ

Matrix:  M = ( x_{1,1} ⋯ x_{1,n} ; ⋮ ⋱ ⋮ ; x_{d,1} ⋯ x_{d,n} )
Vectors: λ = (λ_1, …, λ_n)^T,  y = (y_1, …, y_d)^T
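As a quick numerical check of this construction (a minimal NumPy sketch; all variable names here are our own, not from the slides):

```python
import numpy as np

# Fixed vectors x_1, x_2, x_3 in R^2, stacked as the columns of M.
x1, x2, x3 = np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])
M = np.column_stack([x1, x2, x3])

# Input coordinates lambda_1, ..., lambda_n.
lam = np.array([2.0, -1.0, 0.5])

# Linear combination y = sum_i lambda_i * x_i ...
y_sum = lam[0] * x1 + lam[1] * x2 + lam[2] * x3
# ... equals the matrix-vector product y = M . lambda.
y_mat = M @ lam

print(y_sum, y_mat)
```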
Convention
(taken from the textbook [Shirley et al.])
- Matrix elements m_{row,column}
- Row first, then column
- The "y"-coordinate of the array comes first (unintuitive, but a common convention)

( m_{1,1} ⋯ m_{1,n} ; ⋮ ⋱ ⋮ ; m_{d,1} ⋯ m_{d,n} )   (first index: row, second index: column)
Matrix Representation

Matrix-vector product
- Maps from ℝ^n → ℝ^d
- λ ∈ ℝ^n
- x_i ∈ ℝ^d ⇒ y ∈ ℝ^d
- Columns of M = images of the basis vectors of ℝ^n

y(λ) = [ x_1 ⋯ x_n ] · (λ_1, …, λ_n)^T
Example

Example: rotation matrix

R_rot = ( cos θ  −sin θ ; sin θ  cos θ )
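In code, the columns of the rotation matrix are exactly the images of the basis vectors (NumPy sketch; the function name is ours):

```python
import numpy as np

def rotation_2d(theta):
    """2D rotation matrix; its columns are the rotated basis vectors."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

R = rotation_2d(np.pi / 2)        # rotate by 90 degrees
print(R @ np.array([1.0, 0.0]))   # e1 maps to the first column of R
```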
General Matrix Product (Notation)

Algebraic rule:
- Matrix-vector product: M · x = y

(Schematic: the matrix M, the input vector x, and the output vector y.)
Matrix Representation

Matrix-Vector Multiplication

( m_{1,1} ⋯ m_{1,n} ; ⋮ ⋱ ⋮ ; m_{d,1} ⋯ m_{d,n} ) · (λ_1, …, λ_n)^T
  = ∑_{i=1}^{n} λ_i (m_{1,i}, …, m_{d,i})^T
  = ( λ_1 m_{1,1} + ⋯ + λ_n m_{1,n} ; ⋮ ; λ_1 m_{d,1} + ⋯ + λ_n m_{d,n} )
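The component formula translates directly into two nested loops (an illustrative sketch, not an optimized routine; in practice use `M @ lam`):

```python
import numpy as np

def matvec(M, lam):
    """Matrix-vector product via the component formula above."""
    d, n = M.shape
    y = np.zeros(d)
    for j in range(d):          # row j of the result
        for i in range(n):      # sum over the columns
            y[j] += lam[i] * M[j, i]
    return y

M = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])
lam = np.array([1.0, 2.0, 3.0])
print(matvec(M, lam))           # same as M @ lam
```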
BASIC: basic topics (study completely)
Standard Transformations

Identity Transform

Example: identity matrix (2D):  E = ( 1 0 ; 0 1 )

General case E: ℝ^n → ℝ^n,  E = ( 1 ⋯ 0 ; ⋮ ⋱ ⋮ ; 0 ⋯ 1 )
Scaling (Center = Origin)

2D: S_s = ( s 0 ; 0 s )

General case S_s: ℝ^n → ℝ^n,  S_s = ( s ⋯ 0 ; ⋮ ⋱ ⋮ ; 0 ⋯ s )
Non-Uniform Scaling

2D: S = ( s_1 0 ; 0 s_2 )

General case S: ℝ^n → ℝ^n,  S = ( s_1 ⋯ 0 ; ⋮ ⋱ ⋮ ; 0 ⋯ s_n )
Rotation (2D)

R_rot = ( cos θ  −sin θ ; sin θ  cos θ )
Rotation (3D)

Rotation in V = ℝ³ about one of the coordinate axes x, y, z (figure: rotations about the x-, y-, and z-axis):

R_x = ( 1 0 0 ; 0 cos θ −sin θ ; 0 sin θ cos θ )
R_y = ( cos θ 0 −sin θ ; 0 1 0 ; sin θ 0 cos θ )
R_z = ( cos θ −sin θ 0 ; sin θ cos θ 0 ; 0 0 1 )
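Two of the axis rotations sketched in NumPy (we check the defining properties, orthogonality and determinant +1; function names are ours):

```python
import numpy as np

def rot_x(theta):
    """Rotation about the x-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1, 0, 0],
                     [0, c, -s],
                     [0, s, c]])

def rot_z(theta):
    """Rotation about the z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0],
                     [s, c, 0],
                     [0, 0, 1]])

# Any product of rotations is again a rotation: orthogonal, determinant +1.
R = rot_x(0.3) @ rot_z(1.2)
print(np.allclose(R.T @ R, np.eye(3)), np.isclose(np.linalg.det(R), 1.0))
```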
Reflection

2D example: R_refl = ( −1 0 ; 0 1 )   (reflection axis: the y-axis)

General case R: ℝ^n → ℝ^n, a diagonal matrix with entries ±1; each −1 flips the corresponding axis:
R = ( −1 ⋯ 0 ; ⋮ ⋱ ⋮ ; 0 ⋯ 1 )
Shearing

Example:  S_shear = ( 1 s ; 0 1 )   (figure: shear with s = 0.5)
General Case

You can combine all of these. Example: rotation about a general axis
- First rotate the rotation axis onto the x-axis
- Rotate around x
- Rotate back

Question
- How do we combine multiple matrix multiplications?
BASIC: basic topics (study completely)
Combining Transformations

Matrix Products

Matrix Multiplication

Execute multiple linear maps, one after another
- Written as a product
- M · N · x:
  - Apply N to x first
  - Then M
- M · N is again a matrix
How does it work?

Consider S · R:
- Rotate first (R)
- Then scale (S)

R = ( cos θ  −sin θ ; sin θ  cos θ ),   S = ( 2 0 ; 0 2 )

How does it work?

How to compute S · R?
- Transform the basis vectors (the columns of R)
- Transform again (apply S)

Matrix product:  S · R = ( 2 cos θ  −2 sin θ ; 2 sin θ  2 cos θ )
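The scale-after-rotate example as a NumPy sketch (θ chosen arbitrarily):

```python
import numpy as np

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
S = np.array([[2.0, 0.0],
              [0.0, 2.0]])          # uniform scaling by 2

v = np.array([1.0, 0.0])
# (S . R) . v first rotates v, then scales the result.
print(np.allclose((S @ R) @ v, S @ (R @ v)))
print(S @ R)                        # equals 2 * R here
```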
Matrix Multiplication

(Schematic: the product A · B is built column by column; column i of A · B is A applied to column b_i of B.)
Matrix Multiplication

General matrix products:
- A · B is possible if #Columns(A) = #Rows(B)

A = ( a_{1,1} ⋯ a_{1,n} ; ⋮ ⋱ ⋮ ; a_{d,1} ⋯ a_{d,n} ) ∈ ℝ^{d×n}
B = ( b_{1,1} ⋯ b_{1,k} ; ⋮ ⋱ ⋮ ; b_{n,1} ⋯ b_{n,k} ) ∈ ℝ^{n×k}

C = A · B ∈ ℝ^{d×k},   c_{i,j} = ∑_{l=1}^{n} a_{i,l} · b_{l,j}
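The entry formula c_{i,j} = ∑_l a_{i,l} b_{l,j} written out as three nested loops (illustrative only; `A @ B` does the same):

```python
import numpy as np

def matmul(A, B):
    """c_{i,j} = sum_l a_{i,l} * b_{l,j}; needs #columns(A) == #rows(B)."""
    d, n = A.shape
    n2, k = B.shape
    assert n == n2, "inner dimensions must match"
    C = np.zeros((d, k))
    for i in range(d):
        for j in range(k):
            for l in range(n):
                C[i, j] += A[i, l] * B[l, j]
    return C

A = np.arange(6.0).reshape(2, 3)
B = np.arange(12.0).reshape(3, 4)
print(np.allclose(matmul(A, B), A @ B))
```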
Rules for Matrix Multiplication

Matrix multiplication is
- Associative: (M · N) · O = M · (N · O)
- Includes matrix-vector multiplication: (M · N) · v = M · (N · v)
- In general, not commutative: it might be that M · N ≠ N · M
- Linear: M · (v + w) = M · v + M · w,   M · (λ · v) = λ · (M · v)

(Remark: linearity is used to define linear maps axiomatically.)

Settings: λ ∈ ℝ; M, N, O: matrices; v, w: vectors
CORE: core topics (important)
Reversing Transformations

Matrix Inversion

Inverse Matrix

Can we find the inverse matrix?
- "Undo effect"
- Formally: M⁻¹ · M = E

Inverse Matrix

Examples
- Rotation matrix: R_rot = ( cos θ  −sin θ ; sin θ  cos θ )
- Inverse? Rotate by −θ:

R_rot⁻¹ = ( cos(−θ)  −sin(−θ) ; sin(−θ)  cos(−θ) ) = ( cos θ  sin θ ; −sin θ  cos θ )
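Checking numerically that rotating by −θ undoes rotating by θ, and that this inverse is simply the transpose (NumPy sketch):

```python
import numpy as np

theta = 0.5
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
R_inv = np.array([[np.cos(-theta), -np.sin(-theta)],
                  [np.sin(-theta),  np.cos(-theta)]])

# R_inv undoes R, and equals the transpose of R.
print(np.allclose(R_inv @ R, np.eye(2)), np.allclose(R_inv, R.T))
```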
Inverse Matrix

Examples
- Null matrix: M = 0
- Inverse? None: the null matrix maps every vector to 0, so the effect cannot be undone.
Inverse Matrix

Examples
- Projection matrix (remove x-component): M_proj = ( 0 0 ; 0 1 )
- Inverse? None: the x-component is lost and cannot be recovered.
Inverse Matrix

Examples
- Another singular example: M = ( 2 4 ; 1 2 )
- Inverse? None: the columns are linearly dependent, so the map is not invertible.
Invertible Matrices

Invertible matrices
- Are always square (#rows = #columns)
- In addition: columns are linearly independent

Equivalent characterizations:
- Square and rows are linearly independent
- Columns form a basis of the vector space
- Rows form a basis of the vector space

Invertible Matrices

Rank
- Number of linearly independent columns
- Dimension of span{columns of M}

Theorem
- Rank = number of linearly independent rows

Full rank
- rank(M) = n for M ∈ ℝ^{n×n}
- Then: M is invertible
Linear Systems of Equations

First consider a simpler case
- Say, we know that M · x = y
- Square matrix M ∈ ℝ^{n×n}
- Vectors x, y ∈ ℝ^n

Knowns & Unknowns
- We are given M, y
- We should compute x
- This is a linear system of equations

Linear Systems of Equations

M · x = y
⇔ ( m_{1,1} ⋯ m_{1,n} ; ⋮ ⋱ ⋮ ; m_{n,1} ⋯ m_{n,n} ) · (x_1, …, x_n)^T = (y_1, …, y_n)^T
⇔ m_{1,1} x_1 + ⋯ + m_{1,n} x_n = y_1
  and m_{2,1} x_1 + ⋯ + m_{2,n} x_n = y_2
  ⋮
  and m_{n,1} x_1 + ⋯ + m_{n,n} x_n = y_n
Gaussian Elimination

Linear system
m_{1,1} x_1 + ⋯ + m_{1,n} x_n = y_1
m_{2,1} x_1 + ⋯ + m_{2,n} x_n = y_2
⋮
m_{n,1} x_1 + ⋯ + m_{n,n} x_n = y_n

Row operations
- Swap rows i, j
- Scale row i by a factor λ ≠ 0
- Add a multiple of row i to row j, j ≠ i (i.e., row_j += λ · row_i)

Convert to Upper Triangular Matrix

(Schematic: M · x = y; row operations zero out all entries below the diagonal.)
Convert to Diagonal Matrix

(Schematic: further row operations clear the entries above the diagonal as well, leaving a diagonal system with entries m′_{i,i} and right-hand side y′_i; dividing row i by m′_{i,i} turns the diagonal into ones, and the solution can be read off as x_i = y′_i / m′_{i,i}.)
Gauss Algorithm

- Subtract rows to cancel the leading coefficients
- Create an upper triangular matrix first
- Then create a diagonal matrix
- If the current row starts with 0:
  - Swap with another row
  - If all remaining rows start with 0: the matrix is not invertible
- Diagonal form: the solution can be read off
- Data structure:
  - Modify the matrix M and the "right-hand side" y
  - x remains unknown (no change)
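A compact sketch of the algorithm (a Gauss-Jordan variant that goes straight to diagonal form, including the row swap for zero pivots; the helper name is ours, not library code):

```python
import numpy as np

def gauss_solve(M, y):
    """Solve M x = y by Gaussian elimination with row swaps."""
    M = M.astype(float).copy()      # modify copies of M and the
    y = y.astype(float).copy()      # right-hand side, as described above
    n = len(y)
    for col in range(n):
        # If the pivot is (almost) 0, swap with the best row below.
        p = col + int(np.argmax(np.abs(M[col:, col])))
        if np.isclose(M[p, col], 0.0):
            raise ValueError("matrix not invertible")
        M[[col, p]] = M[[p, col]]
        y[[col, p]] = y[[p, col]]
        # Subtract multiples of the pivot row to clear the column.
        for row in range(n):
            if row != col:
                f = M[row, col] / M[col, col]
                M[row] -= f * M[col]
                y[row] -= f * y[col]
    return y / np.diag(M)           # diagonal form: read off the solution

M = np.array([[2.0, 1.0], [1.0, 3.0]])
y = np.array([3.0, 5.0])
print(gauss_solve(M, y))
```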
Matrix Inverse

Solve for
M · x_1 = (1, 0, …, 0)^T,  M · x_2 = (0, 1, …, 0)^T,  …,  M · x_n = (0, …, 0, 1)^T

- The resulting x_1, x_2, …, x_n are the columns of M⁻¹:

M⁻¹ = [ x_1 ⋯ x_n ]

Matrix Inverse

Algorithm
- Simultaneous Gaussian elimination
- Start with the block [ M | E ]
- Handle all right-hand sides simultaneously
- After the Gauss algorithm, the right-hand matrix is the inverse

(Schematic: row-reduce [ M | E ]; when the left block has become the identity E, the right block is M⁻¹.)
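The [M | E] scheme as a sketch (a hypothetical helper built from the same elimination steps):

```python
import numpy as np

def invert(M):
    """Invert M by Gauss-Jordan elimination on the block [M | E]."""
    n = M.shape[0]
    A = np.hstack([M.astype(float), np.eye(n)])   # start: [ M | E ]
    for col in range(n):
        p = col + int(np.argmax(np.abs(A[col:, col])))
        if np.isclose(A[p, col], 0.0):
            raise ValueError("matrix not invertible")
        A[[col, p]] = A[[p, col]]
        A[col] /= A[col, col]
        for row in range(n):
            if row != col:
                A[row] -= A[row, col] * A[col]
    return A[:, n:]                               # right block is M^-1

M = np.array([[2.0, 1.0], [1.0, 3.0]])
print(np.allclose(invert(M) @ M, np.eye(2)))
```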
Alternative: Cramer's Rule

Small matrices
- Direct formula based on determinants: "Cramer's rule" (more later)
- A naive implementation has run-time O(n!); Gauss: O(n³)
- Not advised for n > 3
BASIC: basic topics (study completely)
More Vector Operations: Scalar Products

Additional Vector Operations

Length of Vectors
- "Length" or "norm" ‖v‖ yields a real number ≥ 0
- (Figure: two vectors v_1, v_2 with their lengths measured in cm.)

Additional Vector Operations

Angle between Vectors
- The angle ∠(v_1, v_2) yields a real number in [0, 2π) = [0°, 360°)
- (Figure: the angle θ = ∠(v_1, v_2) between two vectors.)
Additional Vector Operations

Angle between Vectors
- (Figure: two vectors v_1, v_2 at a right angle, 90°.)

Additional Vector Operations

Projection
- Projection: determine the length of v along the direction of w
- (Figure: v, w, and the perpendicular drop of v onto w: "v prj on w".)
Additional Vector Operations

Scalar Product*)

v · w = ‖v‖ · ‖w‖ · cos ∠(v, w)

*) also known as inner product or dot product; also written ⟨v, w⟩

Signature
- Input: two vectors
- Output: one real number (e.g., 42.0)
- Operator "·": the scalar product (dot product, inner product)
Additional Vector Operations

Scalar Product*)

v · w = ‖v‖ · ‖w‖ · cos ∠(v, w)

Comprises: length, projection, angles
*) also known as inner product or dot product

Length:     ‖v‖ = √(v · v)
Angle:      ∠(v, w) = arccos( (v · w) / (‖v‖ · ‖w‖) )
Projection: ‖v prj on w‖ = (v · w) / ‖w‖
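These three formulas map directly onto the dot product in code (NumPy sketch with arbitrary example vectors):

```python
import numpy as np

v = np.array([3.0, 2.0])
w = np.array([1.0, 2.0])

length = np.sqrt(v @ v)                                  # ||v||
angle = np.arccos((v @ w) / (np.sqrt(v @ v) * np.sqrt(w @ w)))
proj_len = (v @ w) / np.sqrt(w @ w)                      # length of v along w

print(length, angle, proj_len)
```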
BASIC: basic topics (study completely)
Algebraic Representation (Implementation)

Scalar Product

Scalar Product*)

v · w = (v_1, v_2)^T · (w_1, w_2)^T ≔ v_1 · w_1 + v_2 · w_2
Theorem: v · w = ‖v‖ · ‖w‖ · cos ∠(v, w)

Scalar Product

Example: v = (3, 2)^T, w = (1, 2)^T  ⇒  v · w = 3 · 1 + 2 · 2 = 7

Scalar Product

2D scalar product:     v · w = (v_1, v_2)^T · (w_1, w_2)^T ≔ v_1 · w_1 + v_2 · w_2
d-dim scalar product:  v · w = (v_1, …, v_d)^T · (w_1, …, w_d)^T ≔ v_1 · w_1 + ⋯ + v_d · w_d
Algebraic Properties

Properties
- Symmetry (commutativity): ⟨v, w⟩ = ⟨w, v⟩
- Bilinearity:
  ⟨λv, w⟩ = λ⟨v, w⟩ = ⟨v, λw⟩
  ⟨u + v, w⟩ = ⟨u, w⟩ + ⟨v, w⟩
  (by symmetry, the same holds in the second argument)
- Positive definite: ⟨u, u⟩ ≥ 0, and ⟨u, u⟩ = 0 ⇔ u = 0

These three properties form the axiomatic definition.

Settings: λ ∈ ℝ; u, v, w ∈ ℝ^d
Attention!

Do not mix up
- The scalar-vector product
- The inner (scalar) product

In general: ⟨x, y⟩ · z ≠ x · ⟨y, z⟩
Beware of the notation: (x · y) · z ≠ x · (y · z)

(This does not violate associativity: the two "·" are different operations; details later.)
core topics important
CORE
Applications of the Scalar Product
Applications
Obvious applications
- Measuring length
- Measuring angles
- Projections
More complex applications
- Creating orthogonal (90ยฐ) pairs of vectors
- Creating orthogonal bases
Projection

Scalar Product*)

v · w = ‖v‖ · ‖w‖ · cos ∠(v, w)

Projection

Normalized direction: w_n = w / ‖w‖ = w / √⟨w, w⟩

Projection (length):  ⟨v, w⟩ / ‖w‖
Projection vector:    ⟨v, w_n⟩ · w_n = ⟨v, w⟩ · w / ⟨w, w⟩
Orthogonalization

Orthogonalize v with respect to w: subtract the projection vector

v′ = v − ( ⟨v, w⟩ / ⟨w, w⟩ ) · w

(Figure: v, w, and the orthogonalized v′ at 90° to w.)
Gram-Schmidt Orthogonalization

Orthogonal basis
- All vectors at a 90° angle to each other: ⟨b_i, b_j⟩ = 0 for i ≠ j

Create orthogonal bases
- Start with an arbitrary basis
- Orthogonalize b_2 with respect to b_1
- Orthogonalize b_3 with respect to b_1, then b_2
- Orthogonalize b_4 with respect to b_1, then b_2, then b_3
- ...
Orthonormal Basis

Orthonormal bases
- Orthogonal, and all vectors have unit length

Computation
- Orthogonalize first
- Then scale each vector b_i by 1/‖b_i‖
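The full procedure (orthogonalize against all previous vectors, then normalize) in a short sketch:

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn linearly independent vectors into an orthonormal basis."""
    basis = []
    for v in vectors:
        v = v.astype(float).copy()
        for b in basis:
            v -= (v @ b) * b               # subtract the projection onto b
        basis.append(v / np.sqrt(v @ v))   # normalize to unit length
    return basis

b1, b2 = gram_schmidt([np.array([2.0, 0.0]), np.array([1.0, 1.0])])
print(b1 @ b2, b1 @ b1, b2 @ b2)           # orthogonal, unit length
```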
Matrices

Orthogonal Matrices
- A matrix with orthonormal columns is called an orthogonal matrix
- (Yes, this terminology is not quite logical...)

Orthogonal matrices are always
- Rotation matrices,
- or reflection matrices,
- or products of the two
CORE: core topics (important)
Further Operations

Cross Product

Cross Product: Exists Only for 3D Vectors!
- x, y ∈ ℝ³
- x × y = (x_1, x_2, x_3)^T × (y_1, y_2, y_3)^T ≔ ( x_2 y_3 − x_3 y_2,  x_3 y_1 − x_1 y_3,  x_1 y_2 − x_2 y_1 )^T

Geometrically: Theorem
- x × y is orthogonal to x and y
- x, y, x × y form a right-handed system
- ‖x × y‖ = ‖x‖ · ‖y‖ · sin ∠(x, y)

(Figure: x, y, and x × y.)
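The component formula and the orthogonality property, checked numerically (NumPy also provides `np.cross` for this):

```python
import numpy as np

def cross(x, y):
    """3D cross product, written out component by component."""
    return np.array([x[1] * y[2] - x[2] * y[1],
                     x[2] * y[0] - x[0] * y[2],
                     x[0] * y[1] - x[1] * y[0]])

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
c = cross(x, y)
print(c, c @ x, c @ y)     # c is orthogonal to both x and y
```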
Cross Product Properties

Bilinearity
- Distributive: u × (v + w) = u × v + u × w
- Scalar mult.: (λu) × v = u × (λv) = λ (u × v)

But beware:
- Anti-commutative: u × v = −(v × u)
- Not associative; we can have (u × v) × w ≠ u × (v × w)
Determinants

Determinants
- Square matrix M
- det(M) = |M| = volume of the parallelepiped spanned by the column vectors

M = [ v_1 v_2 v_3 ]   (figure: parallelepiped spanned by v_1, v_2, v_3)

Determinants

Sign:
- Positive for right-handed coordinates
- Negative for left-handed coordinates

M = [ v_1 v_2 v_3 ]: det M > 0
M′ = [ v_2 v_1 v_3 ]: det M′ < 0

Negative determinant ⇔ the map contains a reflection
Properties

A few properties:
- det(A) · det(B) = det(A · B)
- det(λA) = λ^d · det(A)   (d × d matrix A)
- det(A⁻¹) = det(A)⁻¹
- det(Aᵀ) = det(A)
- det(M) ≠ 0 ⇔ M invertible
- Efficient computation using Gaussian elimination
  (careful: row swaps flip the sign; reflections cancel each other pairwise, i.e. parity)
Computing Determinants

det( a b c ; d e f ; g h i ) = +a · det( e f ; h i ) − b · det( d f ; g i ) + c · det( d e ; g h )

Recursive formula (Laplace expansion)
- Sum over the first row
- Multiply each element with its subdeterminant
- Subdeterminant: leave out the row and column of the selected element
- The recursion ends with det(a) = |a| = a
- Alternate signs +/−/+/−/…

Beware of the O(n!) complexity!
Computing Determinants

Result in the 3D case (rule of Sarrus):

det( a b c ; d e f ; g h i ) = aei + bfg + cdh − ceg − bdi − afh
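The recursive Laplace expansion as a sketch (O(n!) as warned above, so only for tiny matrices; `np.linalg.det` is the practical choice):

```python
import numpy as np

def det(M):
    """Laplace expansion along the first row."""
    n = M.shape[0]
    if n == 1:
        return M[0, 0]                   # recursion ends with |a| = a
    total = 0.0
    for j in range(n):
        # Subdeterminant: leave out row 0 and column j.
        sub = np.delete(np.delete(M, 0, axis=0), j, axis=1)
        total += (-1) ** j * M[0, j] * det(sub)   # alternating signs
    return total

M = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])
print(det(M))
```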
Solving Linear Systems

Consider M · x = b
- Invertible matrix M ∈ ℝ^{n×n}
- Known vector b ∈ ℝ^n
- Unknown vector x ∈ ℝ^n

Solution with determinants (Cramer's rule):

x_i = det(M_i) / det(M),   M_i = [ v_1 ⋯ b ⋯ v_n ]   (b replaces column i of M)
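Cramer's rule as a sketch (fine for n ≤ 3; Gaussian elimination is preferable beyond that):

```python
import numpy as np

def cramer_solve(M, b):
    """Solve M x = b via x_i = det(M_i) / det(M)."""
    d = np.linalg.det(M)
    x = np.empty(len(b))
    for i in range(len(b)):
        Mi = M.astype(float).copy()
        Mi[:, i] = b                 # b replaces column i of M
        x[i] = np.linalg.det(Mi) / d
    return x

M = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
print(cramer_solve(M, b))
```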
ADV: advanced topics (main ideas)
Addendum

Matrix Algebra

Matrix Algebra

Define three operations
- Matrix addition (entry-wise): (A + B)_{i,j} = a_{i,j} + b_{i,j}
- Scalar-matrix multiplication (entry-wise): (λ · A)_{i,j} = λ · a_{i,j}
- Matrix-matrix multiplication: (A · B)_{i,j} = ∑_{k=1}^{n} a_{i,k} · b_{k,j}
Transposition

Matrix Transposition
- Swap rows and columns
- Formally: (Aᵀ)_{i,j} = a_{j,i}

(Schematic: the matrix is mirrored along its main diagonal.)
Vectors

Vectors
- Column matrices (d × 1)
- The matrix-vector product is consistent with this view

Co-Vectors
- "Projectors", "dual vectors", "linear forms", "row vectors" (1 × d)
- Vectors to be projected on

Transposition
- Converts vectors into projectors and vice versa
- x ∈ ℝ^d (column) ↔ xᵀ (row)

Vectors

Inner product (as a generalized "projection")
- Matrix product of a row vector and a column vector:
  "x · y" = ⟨x, y⟩ = xᵀ · y
- People use all three notations
- The meaning of "·" is clear from context
- xᵀ · y ∈ ℝ
Matrix-Vector Products

Two Interpretations
- Linear combination of the column vectors
- Projection onto the row (co-)vectors

(Schematic: in M · x = y, the result y is the sum of the columns of M weighted by the x_i, or equivalently each y_j is the scalar product of row j with x.)
Matrix Algebra

We can add and scalar-multiply
- Matrices and vectors (special case)

We can matrix-multiply
- Matrices with other matrices (execute one after another)
- Vectors in certain cases (next)

We can "divide" by some (not all) matrices
- Determine the inverse matrix
- Full-rank, square matrices only
Algebraic Rules: Addition

Addition: like real numbers ("commutative group")
- Prerequisites:
  - Number of rows matches
  - Number of columns matches
- Associative: (A + B) + C = A + (B + C)
- Commutative: A + B = B + A
- Subtraction: A + (−A) = 0
- Neutral op.: A + 0 = A

Settings: A, B, C ∈ ℝ^{d×n} (matrices of the same size)
Algebraic Rules: Scalar Multiplication

Scalar multiplication: vector space
- Prerequisites: always possible
- Repeated scaling: λ(μA) = (λμ)A
- Neutral operation: 1 · A = A
- Distributivity 1: λ(A + B) = λA + λB
- Distributivity 2: (λ + μ)A = λA + μA

So far:
- Matrices form a vector space
- Just different notation, same semantics!

Settings: λ, μ ∈ ℝ; A, B ∈ ℝ^{d×n} (same size)
Algebraic Rules: Multiplication

Multiplication: non-commutative ring / group
- Prerequisites: number of columns of the left factor = number of rows of the right factor
- Associative: (A · B) · C = A · (B · C)
- Not commutative: often A · B ≠ B · A
- Neutral op.: E · A = A
- Inverse: A · A⁻¹ = E
  - Additional prerequisites:
    → the matrix must be square!
    → the matrix must have full rank

Set of invertible matrices: GL(n) ⊂ ℝ^{n×n}, the "general linear group"

Settings: A ∈ ℝ^{d×n}, B ∈ ℝ^{n×k}, C ∈ ℝ^{k×l}
Transposition Rules

Transposition
- Addition: (A + B)ᵀ = Aᵀ + Bᵀ = Bᵀ + Aᵀ
- Scalar mult.: (λA)ᵀ = λAᵀ
- Multiplication: (A · B)ᵀ = Bᵀ · Aᵀ
- Self-inverse: (Aᵀ)ᵀ = A
- (Inversion:) (A · B)⁻¹ = B⁻¹ · A⁻¹
- Inverse-transpose: (Aᵀ)⁻¹ = (A⁻¹)ᵀ
- Orthogonality: Aᵀ = A⁻¹ ⇔ A is orthogonal
Matrix Multiplication

Matrix multiplication via scalar products:

A · B = ( rows a_iᵀ of A ) · ( columns b_j of B ), with entry (i, j) equal to ⟨a_i, b_j⟩

- Scalar products of rows and columns

Orthogonal Matrices

Orthogonal matrices (i.e., column vectors orthonormal)
- Aᵀ = A⁻¹
- Proof: previous slide; the entries of Aᵀ · A are the scalar products of the columns of A, which for orthonormal columns give the identity matrix.
Scalar Product

Matrix Algebra:
- The scalar product is a special case: ⟨x, y⟩ = xᵀ · y
- Caution when mixing it with the scalar-vector product!
  ⟨x, y⟩ · z ≠ x · ⟨y, z⟩
  (xᵀ · y) · z ≠ x · (yᵀ · z)
- Scalar multiplication is not a matrix product!

Scalar Product

What does work:
- Associativity with the outer product:
  x · ⟨y, z⟩ = x · (yᵀ · z) = (x · yᵀ) · z
ADV: advanced topics (main ideas)
Addendum

Axiomatic Mathematics

(This is not a core topic of the course; the material is provided just for your information.)

"Class Diagram" for Real Numbers

Hierarchy: binary operation → magma → semi-group → monoid → group → Abelian group; a field combines two such operations (+, ·), an ordered field adds a full order, and the real numbers additionally satisfy completeness.

Real Numbers

- Binary operation: template <set T, operator ∘> T operator∘(T, T) throws DoesNotCompute
- Magma: closed binary operation, T operator∘(T, T) no-exceptions
- Semi-group: associativity, (A ∘ B) ∘ C = A ∘ (B ∘ C)
- Monoid: identity element "id", id ∘ A = A ∘ id = A
- Group: inverse "A⁻¹", A ∘ A⁻¹ = A⁻¹ ∘ A = id
- Abelian group: commutativity, A ∘ B = B ∘ A
- Field: set with two operations, template<set F> F operator+(F, F), F operator*(F, F)
- Ordered field: full order, template<set F> bool operator<(F, F)
- Real numbers: completeness, "all Cauchy sequences converge"
ADV: advanced topics (main ideas)
Structure: Vector Space

Vector Spaces

Vector space:
- Set of vectors V
- Based on a field F (we use only F = ℝ)
- Two operations:
  - Adding vectors: u = v + w (u, v, w ∈ V)
  - Scaling vectors: w = λv (v ∈ V, λ ∈ F)

Vector Spaces

Vector space axioms:
- Vector addition forms an Abelian group:
  - ∀u, v, w ∈ V: (u + v) + w = u + (v + w)
  - ∀v, w ∈ V: v + w = w + v
  - ∃0 ∈ V: ∀v ∈ V: v + 0 = v
  - ∀v ∈ V: ∃(−v) ∈ V: v + (−v) = 0
- Compatibility with scalar multiplication:
  - ∀v ∈ V, λ, μ ∈ F: λ(μv) = (λμ)v
  - ∀v ∈ V: 1 · v = v
  - ∀v, w ∈ V, λ ∈ F: λ(v + w) = λv + λw
  - ∀v ∈ V, λ, μ ∈ F: (λ + μ)v = λv + μv

Settings: V: vector space, F: field (e.g., ℝ)
Properties

Some differences to our definition
- Abstract vector spaces can have infinite dimension
  - For example: the set of all functions f: ℝ → ℝ forms an ∞-dimensional vector space
- But they always have a basis → coordinate representation
- We can use other fields than ℝ, for example finite fields (ℤ mod p, p prime)
- We can recognize them before we have a coordinate representation

Theorem

Theorem ("Basis Isomorphism")
- Any finite-dimensional vector space can be represented by columns of numbers
- Use the n coordinates of the n basis vectors (dim = n)

Our definition makes sense
- Special case
ADV: advanced topics (main ideas)
Structure: Scalar Product

Scalar Product

Axiomatic Definition: Scalar Product
- A function with
  - two vector arguments (input)
  - one scalar output
- s: V × V → F   (think of s as "scalar product ⟨·, ·⟩")
- V is a vector space, F is a field (such as ℝ)

Settings: V: vector space, F: field (e.g., ℝ)

Axiomatic Definition: Scalar Product

Properties
- Symmetry: s(u, v) = s(v, u)
- Bilinearity: s(u + λv, w) = s(u, w) + λ · s(v, w)
  (linearity in the second argument follows from symmetry)
- Positive definite: s(u, u) ≥ 0, and s(u, u) = 0 ⇔ u = 0

A scalar product is a symmetric, positive-definite, bilinear function.

Settings: λ ∈ F; u, v, w ∈ V
General Scalar Product

Theorem
- In a finite-dimensional vector space, any scalar product has the following form:
  s(x, y) = (Mx) · (My) = xᵀ (MᵀM) y
- "·" is the standard scalar product as we defined it
- M is a square matrix with linearly independent columns
  → i.e., M transforms to a different coordinate frame

Our definition still makes sense…
- Special case: undistorted coordinates (M = E)
- General scalar products can take non-standard coordinate frames into account
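A sketch of such a general scalar product for one arbitrarily chosen invertible M (we check symmetry and positivity on example vectors; the matrix and names are our own):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [0.0, 1.0]])            # linearly independent columns

def s(x, y):
    """General scalar product s(x, y) = (Mx) . (My) = x^T (M^T M) y."""
    return (M @ x) @ (M @ y)

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])
print(np.isclose(s(x, y), s(y, x)), s(x, x) >= 0.0)
```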
ADV: advanced topics (main ideas)
Structure: Linear Map

Definition of Linear Maps

Axioms
- Linear map: a function f: V_1 → V_2 maps from one vector space (V_1) to another (V_2)
- Linearity requires:
  f(v + w) = f(v) + f(w)
  f(λ · v) = λ · f(v)

Theorem
- Linear maps between finite-dimensional vector spaces can always be represented by matrices
- Our definition makes sense: special case

Settings: f: linear map; v, w ∈ V_1: vectors; λ ∈ F