Matrix Arithmetic
(Multidimensional Math)
Shaina Race, PhD, Institute for Advanced Analytics
TABLE OF CONTENTS
- Element-wise Operations
- Linear Combinations of Matrices and Vectors
- Vector Multiplication: Inner Products and Matrix-Vector Multiplication
- Matrix Multiplication: Inner product and linear combination viewpoints
- Vector Multiplication: The Outer Product
Element-wise Operations
Two matrices can be added if and only if they have the same size. Addition is element-wise:
$(A + B)_{ij} = A_{ij} + B_{ij}$
Multiplying a matrix $M$ by a scalar is also element-wise: every entry of $M$ is multiplied by that scalar.
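A quick NumPy sketch of both element-wise operations (the code and numbers are illustrative, not from the slides):

```python
import numpy as np

# Matrices of the same size add element-wise: (A + B)_ij = A_ij + B_ij
A = np.array([[1, 2], [3, 4]])
B = np.array([[10, 20], [30, 40]])
S = A + B       # element-wise sum
M = 2 * A       # scalar multiplication scales every element
```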
Vector Addition and Scalar Multiplication
A vector such as $a = \begin{pmatrix} 1 \\ 2 \end{pmatrix}$ has both direction and magnitude. The direction is the arrow pointing from the origin to the point $(1, 2)$; the magnitude is the length of that arrow. #pythagoras
Scalar multiplication stretches the arrow: $2a$ points in the same direction as $a$ but is twice as long.
Vector addition is still commutative: $a + b = b + a$, the diagonal of the parallelogram formed by $a$ and $b$.
Average/Mean (Centroid)
The centroid of a set of points $x_1, x_2, \dots$ is their average (the mean point). Subtracting the mean from every point (mean-centering) shifts the whole cloud of points so that the new mean is the origin $(0, 0)$.
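Mean-centering in NumPy (a small sketch with made-up points, not the slide's data):

```python
import numpy as np

# Rows of X are points; the centroid is the column-wise mean.
X = np.array([[1.0, 2.0], [3.0, 6.0], [5.0, 4.0]])
centroid = X.mean(axis=0)      # the average point, here (3, 4)
X_centered = X - centroid      # subtract the mean from every row
# After centering, the new mean is the origin (0, 0).
```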
A linear combination of vectors is just a weighted sum:
$\sum_i \alpha_i v_i$, with scalar coefficients $\alpha_i$ and vectors $v_i$.
Any vector can be decomposed into "parts" using the columns of the identity matrix (the elementary vectors), where the parts give directions along the 3 coordinate axes (axis 1, axis 2, axis 3).
For example, $a - 3b$ is a linear combination of $a$ and $b$ with coefficients $1$ and $-3$.
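Both ideas as a NumPy sketch (the vectors are illustrative, not from the slides):

```python
import numpy as np

v = np.array([3.0, -1.0, 4.0])
e1, e2, e3 = np.eye(3)                      # elementary vectors (identity columns)
parts = 3.0 * e1 + (-1.0) * e2 + 4.0 * e3   # decompose v along the 3 axes
# A general linear combination, e.g. a - 3b:
a = np.array([2.0, 1.0])
b = np.array([1.0, 0.0])
combo = 1.0 * a + (-3.0) * b                # coefficients 1 and -3
```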
Throughout, all vectors are assumed to be columns. If $x$ is a column vector, then we can automatically assume that $x^T$ is a row vector:
$x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}, \qquad x^T = \begin{pmatrix} x_1 & x_2 & \cdots & x_n \end{pmatrix}$
The Inner Product
The inner product $a^T b$ is a row vector times a column vector; the result is a scalar:
$a^T b = \sum_{i=1}^{n} a_i b_i$
For example, $\begin{pmatrix} 1 & 2 & 3 \end{pmatrix} \begin{pmatrix} 4 \\ 5 \\ 6 \end{pmatrix} = 1 \cdot 4 + 2 \cdot 5 + 3 \cdot 6 = 32$.
To form the inner product, a and b must have the same number of elements.
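In NumPy the `@` operator computes the inner product of two vectors (a sketch, not from the slides):

```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([4, 5, 6])
# a and b must have the same number of elements.
inner = a @ b      # 1*4 + 2*5 + 3*6
```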
Matrix-Vector Multiplication: Two Views
Inner Product View (I-P View): each entry of $Ax$ is the inner product of a row of $A$ with $x$. Sizes must match up: $A$ must have as many columns as $x$ has elements.
Linear Combination View (L-C View): $Ax$ is a linear combination of the columns of $A$, with the entries of $x$ as coefficients. For example:
$\begin{pmatrix} 2 & 3 \\ -1 & 4 \\ 5 & 1 \end{pmatrix} \begin{pmatrix} 3 \\ 2 \end{pmatrix} = 3 \begin{pmatrix} 2 \\ -1 \\ 5 \end{pmatrix} + 2 \begin{pmatrix} 3 \\ 4 \\ 1 \end{pmatrix} = \begin{pmatrix} 12 \\ 5 \\ 17 \end{pmatrix}$
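The two views give the same answer, which a short NumPy sketch can confirm (using the same numbers as the example above):

```python
import numpy as np

A = np.array([[2, 3],
              [-1, 4],
              [5, 1]])
x = np.array([3, 2])

# I-P view: entry i of Ax is the inner product of row i of A with x.
ip_view = np.array([A[i] @ x for i in range(3)])
# L-C view: Ax is a linear combination of A's columns, coefficients from x.
lc_view = x[0] * A[:, 0] + x[1] * A[:, 1]
```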
Matrix Multiplication
A and B are dimension compatible for the product AB if the number of columns in A is equal to the number of rows in B. For compatible matrices, we compute the product AB by multiplying every row of A by every column of B: $(AB)_{ij}$ is the inner product of the $i$th row of A with the $j$th column of B,
$(AB)_{ij} = \sum_{k} A_{ik} B_{kj}.$
For example, if A is 4 x 3 and B is 3 x 4, then AB is 4 x 4. With
$A = \begin{pmatrix} 1 & 2 & 3 \\ 6 & 5 & 4 \\ 9 & 8 & 7 \end{pmatrix}$
and a matrix B whose third column is $(3, 4, 7)^T$, the entry $(AB)_{23}$ is the inner product of the 2nd row of A with the 3rd column of B: $6 \cdot 3 + 5 \cdot 4 + 4 \cdot 7 = 66$.
Note that it may be possible to compute the product AB when the reverse product, BA, is not even defined. Even when both products are defined, it is rarely the case that AB = BA.
Matrix multiplication is NOT commutative!
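A NumPy sketch of the dimension rules (random matrices, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((4, 3))
B = rng.random((3, 4))
AB = A @ B     # (4 x 3)(3 x 4) -> 4 x 4: columns of A match rows of B
BA = B @ A     # (3 x 4)(4 x 3) -> 3 x 3: defined here, but a different size,
               # so AB = BA cannot even be asked. If B were 3 x 5, BA would
               # not be defined at all.
```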
Multiplication by a Diagonal Matrix
When a diagonal matrix D multiplies A from the left (DA), the net effect is that the rows of A are scaled by the corresponding diagonal elements of D. Rather than computing DA, what if we instead put the diagonal matrix on the right-hand side and compute AD? Then it is the columns of A that are scaled by the corresponding diagonal elements.
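A tiny NumPy check of both effects (illustrative numbers):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
D = np.diag([10, 100])
DA = D @ A    # rows of A scaled by the diagonal: row i times D[i, i]
AD = A @ D    # columns of A scaled: column j times D[j, j]
```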
Matrix Multiplication as a Collection of Linear Combinations (L-C View)
Each column of the product AB is A times the corresponding column of B. So every column of AB is a linear combination of the same vectors (the columns of A) with different coefficients (the columns of B):
$(AB)_{\cdot j} = A b_j = \sum_{k=1}^{n} B_{kj} \, a_k,$
where $a_k$ is the $k$th column of $A$ and $b_j$ is the $j$th column of $B$.
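The column-by-column view, sketched in NumPy (matrices chosen for illustration):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [6, 5, 4],
              [9, 8, 7]])
B = np.array([[3, 1],
              [4, 0],
              [7, 2]])
AB = A @ B
# Column 0 of AB is a linear combination of A's columns,
# with coefficients taken from column 0 of B:
col0 = B[0, 0] * A[:, 0] + B[1, 0] * A[:, 1] + B[2, 0] * A[:, 2]
```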
The Outer Product
The outer product $ba^T$ is a column vector times a row vector:
$(m \times 1)(1 \times n) = (m \times n)$
Its entries are $(ba^T)_{ij} = b_i a_j$, so the $i$th row of $ba^T$ is $b_i \, a^T$. From this you can see that the rows of an outer product are necessarily multiples of each other (and likewise every column is a multiple of $b$).
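A NumPy sketch of an outer product (the vectors are illustrative, not the slide's):

```python
import numpy as np

b = np.array([1, 2, 3])        # column vector (m = 3)
a = np.array([4, 5])           # used as the row a^T (n = 2)
outer = np.outer(b, a)         # (3 x 1)(1 x 2) -> 3 x 2, entries b_i * a_j
# Every row is a multiple of a^T, e.g. row 2 equals b[2] * a:
row2 = outer[2]
```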
Matrix Multiplication as a Sum of Outer Products (O-P View)
We can write the product AB as a sum of outer products of the columns of A (m x n) and the rows of B (n x p):
$AB = \sum_{i=1}^{n} a_i b_i^T,$
where $a_i$ is the $i$th column of A and $b_i^T$ is the $i$th row of B. This view decomposes the product AB into the sum of n matrices, each of which has rank 1 (discussed later).
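The O-P view can be checked directly in NumPy (random matrices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((4, 3))
B = rng.random((3, 5))
# AB as the sum of n = 3 rank-1 outer products:
# column i of A times row i of B.
op_sum = sum(np.outer(A[:, i], B[i, :]) for i in range(3))
```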
Example: suppose a cohort of people is divided into 5 different groups each year for 20 years. Let Y be the matrix with one row per person and 100 binary columns indicating whether each individual was a member of each group (yLgK: yearLgroupK): (y1g1, y1g2, y1g3, y1g4, y1g5, y2g1, ..., y20g5). Then the product $C = YY^T$ counts co-memberships:
$C_{ij}$ = # times person i grouped with person j.
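A hypothetical toy version of this co-occurrence matrix (the data below is invented for illustration: 4 people, 2 years, 2 groups per year):

```python
import numpy as np

# Y[i, k] = 1 if person i belonged to grouping column k (a year-group pair).
Y = np.array([[1, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 1, 1, 0],
              [0, 1, 0, 1]])
C = Y @ Y.T   # C[i, j] = number of times persons i and j shared a group
```

The diagonal counts how many groupings each person appeared in (here, one per year), and C is symmetric because "i grouped with j" equals "j grouped with i".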
The Identity and the Inverse
The identity matrix I is to matrices what the number '1' is to scalars. Multiplying A by an (appropriately sized) identity matrix on either side does not change A: $AI = IA = A$. For example,
$I_4 = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}$
A square matrix A is invertible if a matrix $A^{-1}$ exists such that
$AA^{-1} = A^{-1}A = I.$
This is the analog of the multiplicative inverse in scalar algebra, where $\alpha \alpha^{-1} = 1$ with multiplicative identity 1. Such a matrix $A^{-1}$ is called the inverse of A.
Canceling numbers in scalar algebra: $\alpha x = \beta \Rightarrow x = \alpha^{-1} \beta$. Canceling matrices in linear algebra: $Ax = b \Rightarrow x = A^{-1} b$.
Inverses can help solve an equation... WHEN THEY EXIST!
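Solving $Ax = b$ with an inverse in NumPy (a sketch with an invertible 2 x 2 system chosen for illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])
x = np.linalg.inv(A) @ b          # x = A^{-1} b solves Ax = b, when A^{-1} exists
x_better = np.linalg.solve(A, b)  # preferred in practice: no explicit inverse
```

`np.linalg.inv` raises an error for singular matrices, which is exactly the "when they exist" caveat; `np.linalg.solve` is the more stable route to the same solution.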