ECS231 Least-squares problems
(Introduction to Randomized Algorithms)
May 21, 2019
Outline
1. Linear least squares review
2. Solving LS by sampling
3. Solving LS by randomized preconditioning
4. Gradient-based optimization
◮ Linear least squares problem:
      min_x ‖Ax − b‖_2
◮ Normal equation:
      A^T A x = A^T b
◮ Optimal solution (for A of full column rank):
      x = (A^T A)^{-1} A^T b
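As a quick check of the formulas above, here is a minimal MATLAB sketch (with made-up test data) comparing the normal-equation solution to the QR-based solution computed by backslash:

    % Solve min_x ||Ax - b||_2 two ways (made-up test data).
    m = 1000; n = 50;
    A = randn(m, n);
    b = randn(m, 1);
    x_ne = (A'*A) \ (A'*b);         % normal equation: A'A x = A'b
    x_qr = A \ b;                   % QR-based solve (backward stable)
    norm(x_ne - x_qr) / norm(x_qr)  % tiny; note A'A squares cond(A)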
◮ MATLAB demo code: lsbysampling.m
◮ Further reading: Avron et al., SIAM J. Sci. Comput., 32:1217–1236, 2010.
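The demo file itself is not reproduced here. The following is a minimal sketch of one common sampling scheme (sample-and-solve with row-norm probabilities); the sample size s and the test data are assumptions for illustration, not the contents of lsbysampling.m:

    % Approximate min_x ||Ax - b||_2 by solving a sampled subproblem.
    m = 5000; n = 50; s = 500;             % s = number of sampled rows (assumed)
    A = randn(m, n); b = randn(m, 1);      % made-up test data
    p = sum(A.^2, 2) / norm(A, 'fro')^2;   % row-norm sampling probabilities
    edges = [0; cumsum(p)]; edges(end) = 1;
    idx = discretize(rand(s, 1), edges);   % draw s indices, P(i) = p(i)
    D = 1 ./ sqrt(s * p(idx));             % rescale so E[As'*As] = A'*A
    As = D .* A(idx, :);                   % sampled, rescaled rows
    bs = D .* b(idx);
    x_approx = As \ bs;                    % solve the small LS problem
    norm(x_approx - A\b) / norm(A\b)       % small for large enough s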
◮ Linear least squares problem:
      min_x ‖Ax − b‖_2
◮ Normal equation:
      A^T A x = A^T b
◮ If we can find a P such that P^{-1}A is well-conditioned, then it yields a well-conditioned equivalent problem, on which an iterative solver such as LSQR converges in few iterations.
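A minimal sketch of this idea, assuming a Blendenpik-style construction: sketch the rows of A, take an economy QR of the sketch, and use the R factor as a right preconditioner (the equivalent A R^{-1} form) inside LSQR. The sketch size and test data are assumptions, and this is not the contents of lsbyrandprecond.m:

    % Randomized preconditioning for min_x ||Ax - b||_2.
    m = 5000; n = 50; s = 4*n;                   % sketch size (assumed)
    A = randn(m, n) * diag(logspace(0, 6, n));   % made-up ill-conditioned A
    b = randn(m, 1);
    S = randn(s, m) / sqrt(s);                   % Gaussian sketching matrix
    [~, R] = qr(S*A, 0);                         % economy QR of the sketch
    [y, flag] = lsqr(A/R, b, 1e-10, 100);        % A*inv(R) is well-conditioned
    x = R \ y;                                   % recover x = R^{-1} y
    norm(x - A\b) / norm(A\b)                    % close to the direct solve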
◮ MATLAB demo code: lsbyrandprecond.m
◮ Further reading: Coakley et al., SIAM J. Sci. Comput., 33:849–868, 2011.
◮ Optimization problem:
      min_x f(x)
◮ Gradient:
      ∇_x f(x) = (∂f/∂x_1, ∂f/∂x_2, ..., ∂f/∂x_n)^T
◮ Directional derivative of f at x in a unit direction u:
      ∂/∂α f(x + αu) |_{α=0} = u^T ∇_x f(x)
◮ To minimize f(x), we would like to find the direction u in which f decreases the fastest:
      min_{u, u^T u = 1} u^T ∇_x f(x) = min_{u, u^T u = 1} ‖u‖_2 ‖∇_x f(x)‖_2 cos θ,
  where θ is the angle between u and ∇_x f(x). The minimum is attained at cos θ = −1, i.e., for u pointing opposite the gradient: u = −∇_x f(x)/‖∇_x f(x)‖_2.
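A quick numerical check of the directional-derivative identity, using f(x) = (1/2)‖Ax − b‖_2^2 as an assumed concrete test function:

    % Check: d/da f(x + a*u) at a = 0 equals u' * grad f(x).
    A = randn(20, 5); b = randn(20, 1);       % made-up test data
    f = @(x) 0.5 * norm(A*x - b)^2;
    g = @(x) A' * (A*x - b);                  % gradient of f
    x = randn(5, 1);
    u = randn(5, 1); u = u / norm(u);         % unit direction
    h = 1e-6;
    fd = (f(x + h*u) - f(x - h*u)) / (2*h);   % central difference
    [fd, u' * g(x)]                           % the two values agree closely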
◮ The method of steepest descent: repeatedly step in the negative gradient direction,
      x ← x − ε ∇_x f(x),
  for a stepsize ε > 0.
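A minimal steepest-descent sketch on an assumed simple quadratic, stepping along the negative gradient with a fixed stepsize:

    % Steepest descent on f(x) = (x1 - 1)^2 + 10*(x2 + 2)^2 (made-up f).
    f = @(x) (x(1) - 1)^2 + 10*(x(2) + 2)^2;
    g = @(x) [2*(x(1) - 1); 20*(x(2) + 2)];   % gradient of f
    x = [0; 0]; eps_step = 0.04;              % stepsize (assumed)
    for k = 1:200
        d = -g(x);                            % steepest-descent direction
        if norm(d) < 1e-8, break; end
        x = x + eps_step * d;
    end
    x                                         % approaches the minimizer [1; -2]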
◮ Minimization problem:
      min_x f(x) = min_x (1/2) ‖Ax − b‖_2^2
◮ Gradient:
      ∇_x f(x) = A^T A x − A^T b
◮ The method of gradient descent (a MATLAB sketch follows the loop):
  ◮ set the stepsize ε and tolerance δ to small positive numbers
  ◮ while ‖A^T A x − A^T b‖_2 > δ do
        x ← x − ε (A^T A x − A^T b)
    end while
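A direct MATLAB transcription of this loop; the test data are made up, and the stepsize ε = 1/‖A‖_2^2 is one assumed safe choice:

    % Gradient descent for min_x 0.5*||Ax - b||^2.
    A = randn(200, 10); b = randn(200, 1);  % made-up test data
    eps_step = 1 / norm(A)^2;               % stepsize 1/||A'A||_2
    delta = 1e-8;                           % tolerance
    x = zeros(10, 1);
    g = A'*(A*x - b);                       % gradient A'Ax - A'b
    while norm(g) > delta
        x = x - eps_step * g;
        g = A'*(A*x - b);
    end
    norm(x - A\b)                           % matches the direct LS solution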
◮ Minimization problem:
      min_x f(x) = min_x (1/2) ‖Ax − b‖_2^2 = min_x (1/n) Σ_{i=1}^n f_i(x),
  where f_i(x) = (n/2) (⟨a_i, x⟩ − b_i)^2 and a_1, a_2, ..., a_n are the rows of A.
◮ Gradient: ∇_x f_i(x) = n (⟨a_i, x⟩ − b_i) a_i
◮ The stochastic gradient descent (SGD) method solves the LS problem by stepping along −∇_x f_i(x), with one index i drawn per iteration (see the sketch after this list), either
  ◮ uniformly at random, or
  ◮ by weighted sampling.
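A minimal SGD sketch using the gradient ∇_x f_i above with uniform sampling; the test data, stepsize, and iteration count are assumptions, and the weighted variant (noted in a comment) would pick i with probability proportional to ‖a_i‖_2^2:

    % SGD for min_x f(x) = (1/n) * sum_i f_i(x) on the LS problem.
    n = 500; d = 10;
    A = randn(n, d); b = A*randn(d, 1) + 0.01*randn(n, 1);
    x = zeros(d, 1);
    eta = 5e-5;                        % small constant stepsize (assumed)
    for k = 1:20000
        i = randi(n);                  % uniform sampling; a weighted variant
                                       % would draw i with prob ~ ||a_i||^2
        r = A(i,:)*x - b(i);           % <a_i, x> - b_i
        x = x - eta * n * r * A(i,:)'; % step along -grad f_i
    end
    norm(x - A\b)                      % near the LS solution, up to SGD noise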