Solving Linear Systems of Equations: The "Undo" Button for Linear Operations - PowerPoint PPT Presentation
The "Undo" Button for Linear Operations

Matrix-vector multiplication: given the data x and the operator A, we can find b such that b = A x.
What if we know b but not x? How can we "undo" the transformation?

    x --(A)-->    b      (forward transformation)
    b --(A⁻¹)-->  x?     (inverse transformation)

Solve A x = b for x.
Image Blurring Example
- The image is stored as a 2D array of real numbers between 0 and 1
  (0 represents a white pixel, 1 represents a black pixel)
- The image has 40 rows of pixels and 100 columns of pixels
- Flatten the 2D array into a 1D array
- x contains the 1D data, with dimension (4000,)
- Apply the blurring operation to the data x, i.e.
  b = A x, where A is the blur operator and b is the blurred image
Blur operator

    b = A x

where x is the "original" image (4000,), b is the blurred image (4000,), and A is the blur operator (4000, 4000).
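The blurring step is just a matrix-vector product. A minimal sketch, assuming NumPy; `blur_operator` is a hypothetical helper, and a short 1D signal stands in for the flattened 4000-pixel image:

```python
import numpy as np

# Hypothetical 1D moving-average blur operator for a flattened image:
# each output pixel is the average of the pixel and its neighbors.
def blur_operator(n, width=1):
    A = np.zeros((n, n))
    for i in range(n):
        lo, hi = max(0, i - width), min(n, i + width + 1)
        A[i, lo:hi] = 1.0 / (hi - lo)  # each row averages a small window
    return A

x = np.array([0.0, 0.0, 1.0, 1.0, 0.0, 0.0])  # sharp edge, 1 = black
A = blur_operator(len(x))
b = A @ x  # blurred image: the sharp edge is smeared out
```

Every row of this operator sums to 1, so pixel intensities stay in [0, 1].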
"Undo" Blur to recover the original image

Solve A x = b for x. Assumptions:
1. we know the blur operator A
2. the data set b does not have any noise ("clean data")

What happens if we add some noise to b?
"Undo" Blur to recover the original image

Solve A x = b for x, where the data now contains additive noise: b + 10⁻ᵏ ε, with ε drawn from N(0, 1), for increasing noise levels. How much noise can we add and still recover meaningful information from the original image? At which point does this inverse transformation fail? We will talk about the sensitivity of the "undo" operation later.
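The sensitivity experiment can be sketched as follows, assuming NumPy; the small weighted-average operator is a stand-in for the real 4000×4000 blur matrix, and the 10⁻⁶ noise level is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Small stand-in for the blur operator (the real one is 4000 x 4000):
# a weighted average of each pixel with its two neighbors. The center
# weight dominates, so A is strictly diagonally dominant and invertible.
n = 50
A = np.eye(n) * 0.6
for i in range(n - 1):
    A[i, i + 1] = 0.2
    A[i + 1, i] = 0.2

x = (np.arange(n) % 7 < 3).astype(float)  # synthetic "image"
b = A @ x                                 # blurred data

# Clean data: undo the blur exactly (up to roundoff).
x_clean = np.linalg.solve(A, b)

# Noisy data: the recovered image now differs from the original.
b_noisy = b + 1e-6 * rng.standard_normal(n)
x_noisy = np.linalg.solve(A, b_noisy)

err_clean = np.linalg.norm(x_clean - x)
err_noisy = np.linalg.norm(x_noisy - x)
```

Even this tiny perturbation of b shows up in the recovered x; how much it is amplified depends on the conditioning of A, which is the sensitivity question raised above.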
Linear System of Equations

How do we actually solve A x = b? We can start with an "easier" system of equations. Let's consider triangular matrices (lower and upper):

    [ a11              ] [ y1 ]   [ b1 ]        [ a11 a12 ⋯ a1n ] [ y1 ]   [ b1 ]
    [ a21 a22          ] [ y2 ] = [ b2 ]        [     a22 ⋯ a2n ] [ y2 ] = [ b2 ]
    [  ⋮   ⋮   ⋱       ] [ ⋮  ]   [ ⋮  ]        [          ⋱  ⋮ ] [ ⋮  ]   [ ⋮  ]
    [ an1 an2  ⋯  ann  ] [ yn ]   [ bn ]        [            ann] [ yn ]   [ bn ]

Example (lower triangular):

    [ 2          ] [ y1 ]   [ 2 ]
    [ 3  2       ] [ y2 ] = [ 2 ]
    [ 1  2  6    ] [ y3 ]   [ 6 ]
    [ 1  3  4  2 ] [ y4 ]   [ 4 ]
Example: Forward-substitution for lower triangular systems

    2 y1 = 2                      → y1 = 1
    3 y1 + 2 y2 = 2               → y2 = (2 − 3)/2 = −0.5
    1 y1 + 2 y2 + 6 y3 = 6        → y3 = (6 − 1 + 1)/6 = 1.0
    1 y1 + 3 y2 + 4 y3 + 2 y4 = 4 → y4 = (4 − 1 + 1.5 − 4)/2 = 0.25

    y = (1, −0.5, 1.0, 0.25)
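The steps above can be sketched as a short routine, assuming NumPy; `forward_substitution` is our own name, not a library function:

```python
import numpy as np

def forward_substitution(L, b):
    """Solve L y = b for lower triangular L, row by row from the top."""
    n = len(b)
    y = np.zeros(n)
    for k in range(n):
        # Subtract the already-known terms, then divide by the diagonal.
        y[k] = (b[k] - L[k, :k] @ y[:k]) / L[k, k]
    return y

L = np.array([[2.0, 0, 0, 0],
              [3.0, 2, 0, 0],
              [1.0, 2, 6, 0],
              [1.0, 3, 4, 2]])
b = np.array([2.0, 2, 6, 4])
y = forward_substitution(L, b)  # (1, -0.5, 1.0, 0.25)
```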
Example: Backward-substitution for upper triangular systems

    [ 2  8  4  2 ] [ y1 ]   [ 2 ]
    [    4  4  3 ] [ y2 ] = [ 4 ]
    [       6  2 ] [ y3 ]   [ 4 ]
    [          2 ] [ y4 ]   [ 1 ]

    y4 = 1/2
    y3 = (4 − 2·(1/2))/6 = 1/2
    y2 = (4 − 4·(1/2) − 3·(1/2))/4 = (1/2)/4 = 1/8
    y1 = (2 − 8·(1/8) − 4·(1/2) − 2·(1/2))/2 = −2/2 = −1
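The mirror-image routine works from the last row upward, again assuming NumPy with our own hypothetical function name:

```python
import numpy as np

def backward_substitution(U, b):
    """Solve U y = b for upper triangular U, from the last row upward."""
    n = len(b)
    y = np.zeros(n)
    for k in range(n - 1, -1, -1):
        # Entries y[k+1:] are already known; divide by the diagonal.
        y[k] = (b[k] - U[k, k + 1:] @ y[k + 1:]) / U[k, k]
    return y

U = np.array([[2.0, 8, 4, 2],
              [0.0, 4, 4, 3],
              [0.0, 0, 6, 2],
              [0.0, 0, 0, 2]])
b = np.array([2.0, 4, 4, 1])
y = backward_substitution(U, b)  # (-1, 1/8, 1/2, 1/2)
```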
LU Factorization

How do we solve A x = b when A is a non-triangular matrix? We can perform LU factorization: given an n×n matrix A, obtain a lower triangular matrix L and an upper triangular matrix U such that

    A = L U

where we set the diagonal entries of L to be equal to 1.

    [ 1              ] [ u11 u12 ⋯ u1n ]   [ A11 A12 ⋯ A1n ]
    [ l21  1         ] [     u22 ⋯ u2n ] = [ A21 A22 ⋯ A2n ]
    [  ⋮   ⋮   ⋱     ] [          ⋱  ⋮ ]   [  ⋮   ⋮   ⋱  ⋮ ]
    [ ln1 ln2  ⋯  1  ] [            unn]   [ An1 An2 ⋯ Ann ]
LU Factorization

Assuming the LU factorization is known, we can solve the general system.
LU Factorization (with pivoting)

Factorize: A = P L U, so that P L U x = b.

Forward-substitution:  solve L y = Pᵀ b  (solve for y)
Backward-substitution: solve U x = y     (solve for x)
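With SciPy this factor-then-substitute pipeline can be sketched as follows; `scipy.linalg.lu` returns P, L, U with A = P L U, and `solve_triangular` performs the two substitutions:

```python
import numpy as np
from scipy.linalg import lu, solve_triangular

A = np.array([[2.0, 8, 4, 1],
              [1.0, 2, 3, 3],
              [1.0, 2, 6, 2],
              [1.0, 3, 4, 2]])
b = np.array([2.0, 2, 1, 4])

P, L, U = lu(A)                               # factorize: A = P L U
y = solve_triangular(L, P.T @ b, lower=True)  # forward-substitution
x = solve_triangular(U, y)                    # backward-substitution
```

Since P is a permutation matrix, its inverse is its transpose, which is why the forward solve uses Pᵀ b.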
Example

Assume the A = L U factorization is known, yielding:

    L = [ 1                 ]      U = [ 2  8  4   1    ]
        [ 0.5  1            ]          [   −2  1   2.5  ]
        [ 0.5  1    1       ]          [       3  −1    ]
        [ 0.5  0.5  0.5  1  ]          [           0.75 ]

Determine the solution x that satisfies A x = b, when b = (2, 2, 1, 4).

First, solve the lower-triangular system L y = b for the variable y (since L (U x) = b):

    y = (2, 1, −1, 3)

Then, solve the upper-triangular system U x = y for the variable x:

    x = (−23, 5, 1, 4)
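A quick check of this example's numbers, assuming NumPy; `np.linalg.solve` stands in for the dedicated triangular routines:

```python
import numpy as np

L = np.array([[1.0, 0, 0, 0],
              [0.5, 1, 0, 0],
              [0.5, 1, 1, 0],
              [0.5, 0.5, 0.5, 1]])
U = np.array([[2.0, 8, 4, 1],
              [0.0, -2, 1, 2.5],
              [0.0, 0, 3, -1],
              [0.0, 0, 0, 0.75]])
b = np.array([2.0, 2, 1, 4])

y = np.linalg.solve(L, b)  # forward-substitution step
x = np.linalg.solve(U, y)  # backward-substitution step
```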
Methods to solve linear systems of equations

    A x = b

- LU factorization
- Cholesky factorization
- Sparse solvers
LU Factorization - Algorithm

2×2 LU factorization (simple example):

    [ A11 A12 ]   [ 1       ] [ u11 u12 ]
    [ A21 A22 ] = [ l21  1  ] [     u22 ]

    [ A11 A12 ]   [ u11       u12           ]
    [ A21 A22 ] = [ l21·u11   l21·u12 + u22 ]

Matching entries: u11 = A11, u12 = A12, l21 = A21/A11, u22 = A22 − l21·A12.
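The entry-matching above gives the 2×2 factorization in closed form; a sketch in plain Python, with `lu_2x2` our own hypothetical name:

```python
# 2x2 LU in closed form, read off by matching entries:
# u11 = A11, u12 = A12, l21 = A21/A11, u22 = A22 - l21*A12
def lu_2x2(A11, A12, A21, A22):
    l21 = A21 / A11
    return l21, A11, A12, A22 - l21 * A12

l21, u11, u12, u22 = lu_2x2(2.0, 8.0, 1.0, 6.0)
```

Multiplying the factors back reproduces the original entries, which is exactly the matching condition.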
LU Factorization

For a general n×n matrix, write A in block form:

    [ A11 A12 ⋯ A1n ]
    [ A21 A22 ⋯ A2n ]   [ a11  a12 ]   [ 1        ] [ u11  u12 ]
    [  ⋮   ⋮  ⋱   ⋮ ] = [ a21  A22 ] = [ l21  L22 ] [  0   U22 ]
    [ An1 An2 ⋯ Ann ]

where a11 is a scalar, a12 is a row vector (1×(n−1)), a21 is a column vector ((n−1)×1), and A22 is a matrix ((n−1)×(n−1)).

Multiplying out the blocks:

    [ a11  a12 ]   [ u11       u12               ]
    [ a21  A22 ] = [ u11·l21   l21·u12 + L22·U22 ]
LU Factorization

From the block equation we can read off the algorithm:

1) The first row of U is the first row of A (u11 = a11, u12 = a12)
2) The first column of L is the first column of A divided by u11 (l21 = a21 / u11)
3) L22 U22 = A22 − l21 u12. The right-hand side is known! We need another factorization, now of an (n−1)×(n−1) matrix.
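Steps 1-3 translate into a short routine, assuming NumPy; this sketch has no pivoting yet, so it can fail on a zero pivot, as discussed later:

```python
import numpy as np

def lu_factorize(A):
    """LU factorization without pivoting, following the recursion above:
    peel off the next row of U and column of L, then update the
    remaining (Schur complement) block A22 - l21 u12 in place."""
    A = A.astype(float).copy()
    n = A.shape[0]
    L = np.eye(n)
    U = np.zeros((n, n))
    for k in range(n):
        U[k, k:] = A[k, k:]                    # 1) next row of U
        L[k + 1:, k] = A[k + 1:, k] / A[k, k]  # 2) next column of L
        # 3) Schur complement: what remains to be factorized
        A[k + 1:, k + 1:] -= np.outer(L[k + 1:, k], U[k, k + 1:])
    return L, U

A = np.array([[2.0, 8, 4, 1],
              [1.0, 2, 3, 3],
              [1.0, 2, 6, 2],
              [1.0, 3, 4, 2]])
L, U = lu_factorize(A)
```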
Example

    A = [ 2  8  4  1 ]
        [ 1  2  3  3 ]
        [ 1  2  6  2 ]
        [ 1  3  4  2 ]

1) The first row of U is the first row of A: (2, 8, 4, 1)
2) The first column of L is the first column of A divided by u11 = 2: (1, 0.5, 0.5, 0.5)
3) Compute the Schur complement:

    L22 U22 = A22 − l21 u12 = [ 2 3 3 ]   [ 4 2 0.5 ]   [ −2  1  2.5 ]
                              [ 2 6 2 ] − [ 4 2 0.5 ] = [ −2  4  1.5 ]
                              [ 3 4 2 ]   [ 4 2 0.5 ]   [ −1  2  1.5 ]

Repeat on the 3×3 block: the second row of U is (−2, 1, 2.5), and the second column of L (below the diagonal) is (−2, −1)/(−2) = (1, 0.5). The next Schur complement is:

    [ 4  1.5 ]   [ 1    2.5  ]   [ 3    −1   ]
    [ 2  1.5 ] − [ 0.5  1.25 ] = [ 1.5  0.25 ]

Repeat on the 2×2 block: the third row of U is (3, −1) and l43 = 1.5/3 = 0.5. The last step:

    u44 = 0.25 − 0.5·(−1) = 0.75

Final factors:

    L = [ 1                 ]      U = [ 2  8  4   1    ]
        [ 0.5  1            ]          [   −2  1   2.5  ]
        [ 0.5  1    1       ]          [       3  −1    ]
        [ 0.5  0.5  0.5  1  ]          [           0.75 ]
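A quick sanity check that these factors multiply back to A, assuming NumPy:

```python
import numpy as np

A = np.array([[2.0, 8, 4, 1],
              [1.0, 2, 3, 3],
              [1.0, 2, 6, 2],
              [1.0, 3, 4, 2]])
L = np.array([[1.0, 0, 0, 0],
              [0.5, 1, 0, 0],
              [0.5, 1, 1, 0],
              [0.5, 0.5, 0.5, 1]])
U = np.array([[2.0, 8, 4, 1],
              [0.0, -2, 1, 2.5],
              [0.0, 0, 3, -1],
              [0.0, 0, 0, 0.75]])

ok = np.allclose(L @ U, A)  # the factorization reproduces A
```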
Cost of solving linear system of equations
Cost of solving triangular systems

Backward-substitution:

    yn = bn / ann
    yk = ( bk − Σ_{j=k+1}^{n} akj yj ) / akk,   k = n−1, n−2, …, 1

Forward-substitution:

    y1 = b1 / a11
    yk = ( bk − Σ_{j=1}^{k−1} akj yj ) / akk,   k = 2, 3, …, n

Operation count (in either case):

    n divisions
    n(n−1)/2 subtractions/additions
    n(n−1)/2 multiplications

Computational complexity is O(n²).
Cost of LU factorization

Side note:

    1 + 2 + … + n   = (1/2) n (n+1)
    1² + 2² + … + n² = (1/6) n (n+1) (2n+1)

Summing the work of the successive Schur-complement updates with these formulas gives a total cost that grows as O(n³).
Solving linear systems

In general, we can solve a linear system of equations following the steps:

1) Factorize the matrix A: A = L U (complexity O(n³))
2) Solve L y = b (complexity O(n²))
3) Solve U x = y (complexity O(n²))

But why should we decouple the factorization from the actual solve? (Remember from Linear Algebra: Gaussian Elimination does not decouple these two steps…)
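The payoff of decoupling is that we factorize once and reuse the factors for many right-hand sides. A sketch with SciPy's `lu_factor`/`lu_solve`; the random, diagonally dominant test matrix is illustrative:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(1)
n = 200
A = rng.standard_normal((n, n)) + n * np.eye(n)  # well-conditioned test matrix

lu_piv = lu_factor(A)  # O(n^3) factorization, done once

# Each additional right-hand side costs only O(n^2).
B = rng.standard_normal((n, 5))
X = np.column_stack([lu_solve(lu_piv, B[:, j]) for j in range(5)])
```

For many right-hand sides, the O(n²) solves are almost free compared with repeating the O(n³) factorization each time.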
Example

Let's assume that when solving the system of equations A x = b, we observe the following:

- When the matrix A has dimensions (100, 100), computing the LU factorization takes about 1 second and each solve (forward + backward substitution) takes about 0.01 seconds.

Estimate the total time it will take to find the solution x corresponding to 10 different vectors b when the matrix A has dimensions (1000, 1000):

    A) ~10 seconds
    B) ~10² seconds
    C) ~10³ seconds
    D) ~10⁴ seconds
    E) ~10⁵ seconds
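One way to reason about the estimate is a back-of-the-envelope sketch, assuming the O(n³) factorization and O(n²) solve scalings from the previous slides:

```python
# Back-of-the-envelope: LU cost scales like n^3, each solve like n^2.
t_lu_100, t_solve_100 = 1.0, 0.01   # observed timings at n = 100
scale = 1000 / 100                  # n grows by a factor of 10

t_lu_1000 = t_lu_100 * scale**3               # factorization, done once
t_solves_1000 = 10 * t_solve_100 * scale**2   # 10 forward/backward solves
t_total = t_lu_1000 + t_solves_1000
```

The factorization dominates: the 10 solves add only a small correction to the total.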
LU Factorization with pivoting

What can go wrong with the previous algorithm for LU factorization? Consider the earlier example with the (2,2) entry changed to 4:

    A = [ 2  8  4  1 ]      first row of U: (2, 8, 4, 1)
        [ 1  4  3  3 ]      first column of L: (1, 0.5, 0.5, 0.5)
        [ 1  2  6  2 ]
        [ 1  3  4  2 ]

    A22 − l21 u12 = [ 4 3 3 ]   [ 4 2 0.5 ]   [  0  1  2.5 ]
                    [ 2 6 2 ] − [ 4 2 0.5 ] = [ −2  4  1.5 ]
                    [ 3 4 2 ]   [ 4 2 0.5 ]   [ −1  2  1.5 ]

The next pivot is zero, so the next update for the lower triangular matrix will result in a division by zero! LU factorization fails. What can we do to get something like an LU factorization?
Pivoting

Approach:
1. Swap rows if there is a zero entry in the diagonal
2. Even better idea: find the largest entry (by absolute value) in the current column and swap its row into the pivot position (partial pivoting)
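Partial pivoting can be sketched as follows, assuming NumPy; `lu_partial_pivoting` is our own name, returning P, L, U with P A = L U:

```python
import numpy as np

def lu_partial_pivoting(A):
    """LU with partial pivoting: before each step, swap the row holding
    the largest |entry| of the current column into the pivot position.
    Returns P, L, U with P @ A = L @ U."""
    A = A.astype(float).copy()
    n = A.shape[0]
    P = np.eye(n)
    L = np.eye(n)
    for k in range(n - 1):
        p = k + np.argmax(np.abs(A[k:, k]))  # best pivot row
        if p != k:
            A[[k, p], k:] = A[[p, k], k:]    # swap rows of the working matrix,
            P[[k, p]] = P[[p, k]]            # the permutation, and the
            L[[k, p], :k] = L[[p, k], :k]    # already-computed columns of L
        L[k + 1:, k] = A[k + 1:, k] / A[k, k]
        A[k + 1:, k:] -= np.outer(L[k + 1:, k], A[k, k:])
    return P, L, np.triu(A)

A = np.array([[2.0, 8, 4, 1],
              [1.0, 4, 3, 3],
              [1.0, 2, 6, 2],
              [1.0, 3, 4, 2]])  # without pivoting, the second pivot is 0
P, L, U = lu_partial_pivoting(A)
```

With pivoting, every multiplier stored in L has magnitude at most 1, which is what keeps the factorization stable.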