
Lecture 7: More Math + Image Filtering (Justin Johnson, EECS 442 WI 2020)



  1. Lecture 7: More Math + Image Filtering (Justin Johnson, EECS 442 WI 2020, January 30, 2020)

  2. Administrative: HW0 was due yesterday! HW1 is due a week from yesterday.

  3. Cool talk today: https://cse.engin.umich.edu/event/numpy-a-look-at-the-past-present-and-future-of-array-computation

  4. Last Time: Matrices, Vectorization, Linear Algebra

  5. Eigensystems
  • An eigenvector w_j and eigenvalue λ_j of a matrix A satisfy A w_j = λ_j w_j (A w_j is scaled by λ_j)
  • Vectors and values always come in pairs, and you typically assume ||w_j|| = 1
  • The biggest eigenvalue of A gives a bound on how much f(x) = Ax stretches a vector x
  • Hints of what people really mean:
    • "Largest eigenvector" = the eigenvector with the largest eigenvalue
    • "Spectral" just means there are eigenvectors somewhere
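The eigenvector definition and the stretch bound above can be checked numerically. A minimal NumPy sketch (not from the slides; the example matrix is made up for illustration):

```python
import numpy as np

# A small diagonal matrix whose eigenvalues are easy to see by eye
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
vals, vecs = np.linalg.eig(A)   # columns of vecs are unit eigenvectors

for j in range(2):
    w = vecs[:, j]
    # Definition: A w_j is just w_j scaled by lambda_j
    assert np.allclose(A @ w, vals[j] * w)

# The largest |eigenvalue| bounds how much f(x) = Ax stretches a unit vector
x = np.array([1.0, 1.0]) / np.sqrt(2)
assert np.linalg.norm(A @ x) <= np.max(np.abs(vals)) + 1e-9
```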

  6. Suppose I have points in a grid

  7. Now I apply f(x) = Ax to these points. Pointy end of each arrow: Ax; non-pointy end: x.

  8. A = [1.1 0; 0 1.1]
  Red box: unit square; blue box: after f(x) = Ax. What are the yellow lines and why?

  9. A = [0.8 0; 0 1.25]
  Now I apply f(x) = Ax to these points. Pointy end of each arrow: Ax; non-pointy end: x.

  10. A = [0.8 0; 0 1.25]
  Red box: unit square; blue box: after f(x) = Ax. What are the yellow lines and why?

  11. A = [cos(θ) -sin(θ); sin(θ) cos(θ)]
  Red box: unit square; blue box: after f(x) = Ax. Can we draw any yellow lines?
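The answer to the rotation question can also be seen numerically: a 2D rotation leaves no real direction merely scaled, so its eigenvalues come out complex. A sketch (not from the slides), using θ = π/4:

```python
import numpy as np

theta = np.pi / 4
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

vals, vecs = np.linalg.eig(A)
# No real eigenvectors: the eigenvalues have non-zero imaginary parts
assert np.iscomplexobj(vals)
assert not np.allclose(vals.imag, 0.0)
# They lie on the unit circle, since a rotation preserves lengths
assert np.allclose(np.abs(vals), 1.0)
```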

  12. Eigenvectors of Symmetric Matrices
  • A symmetric matrix always has n mutually orthogonal eigenvectors, with n (not necessarily distinct) eigenvalues
  • For symmetric A, the eigenvector with the largest eigenvalue maximizes x^T A x / x^T x (and the one with the smallest eigenvalue minimizes it)
  • So for unit vectors (where x^T x = 1), that eigenvector maximizes x^T A x
  • A surprisingly large number of optimization problems rely on (max/min)imizing this
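A quick empirical check of the claims above (a sketch, not from the slides; the symmetric matrix is random): the eigenvectors returned by `np.linalg.eigh` are mutually orthogonal, and no random unit vector beats the top eigenvector on x^T A x.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B + B.T                       # symmetrize a random matrix

vals, vecs = np.linalg.eigh(A)    # eigh: eigenvalues ascending, vecs orthonormal

# Mutually orthogonal (in fact orthonormal) eigenvectors
assert np.allclose(vecs.T @ vecs, np.eye(4), atol=1e-8)

top = vecs[:, -1]                 # eigenvector of the largest eigenvalue
best = top @ A @ top              # equals the largest eigenvalue

# Random unit vectors never exceed the top eigenvector's quadratic form
for _ in range(200):
    x = rng.standard_normal(4)
    x /= np.linalg.norm(x)
    assert x @ A @ x <= best + 1e-9
```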

  13. Singular Value Decomposition
  Can always write an m×n matrix A as A = UΣV^T, with Σ = diag(σ1, σ2, σ3, …).
  U: rotation (eigenvectors of AA^T). Σ: scale (square roots of the eigenvalues of A^T A).

  14. Singular Value Decomposition
  Can always write an m×n matrix A as A = UΣV^T.
  U: rotation (eigenvectors of AA^T). Σ: scale (square roots of the eigenvalues of A^T A). V^T: rotation (eigenvectors of A^T A).

  15. Singular Value Decomposition
  • Every matrix is a rotation, a scaling, and a rotation
  • Number of non-zero singular values = rank / number of linearly independent vectors
  • "Closest" matrix to A with a lower rank
  A = U diag(σ1, σ2, σ3) V^T

  16. Singular Value Decomposition
  • Every matrix is a rotation, a scaling, and a rotation
  • Number of non-zero singular values = rank / number of linearly independent vectors
  • "Closest" matrix to A with a lower rank: zero out the smallest singular values
  Â = U diag(σ1, σ2, 0) V^T

  17. Singular Value Decomposition
  • Every matrix is a rotation, a scaling, and a rotation
  • Number of non-zero singular values = rank / number of linearly independent vectors
  • "Closest" matrix to A with a lower rank
  • Secretly behind many of the things you do with matrices
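The rank and low-rank-approximation claims from these slides can be verified with NumPy (a sketch on a random matrix, not from the slides): zeroing all but the largest singular value gives a rank-1 matrix, and the approximation error equals the norm of the discarded singular values.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 4))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Number of non-zero singular values = rank (4, almost surely, for a random A)
assert np.sum(s > 1e-10) == np.linalg.matrix_rank(A)

# Keep only the largest singular value -> best rank-1 approximation
k = 1
A_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
assert np.linalg.matrix_rank(A_hat) == 1

# "Closest" in the Frobenius norm: error is the norm of the dropped sigmas
assert np.allclose(np.linalg.norm(A - A_hat, 'fro'), np.linalg.norm(s[k:]))
```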

  18. Solving Least-Squares
  Start with two points (x_i, y_i): (x_1, y_1) and (x_2, y_2). Write y = Av:
  [y_1; y_2] = [x_1 1; x_2 1] [m; b] = [m x_1 + b; m x_2 + b]
  We know how to solve this: invert A and find v (i.e., the (m, b) that fits the points).

  19. Solving Least-Squares
  Start with two points (x_i, y_i), with y = Av as before:
  ||y - Av||^2 = (y_1 - (m x_1 + b))^2 + (y_2 - (m x_2 + b))^2
  The sum of squared differences between the actual value of y and what the model says y should be.

  20. Solving Least-Squares
  Suppose there are n > 2 points. y = Av becomes:
  [y_1; …; y_n] = [x_1 1; …; x_n 1] [m; b]
  Compute ||y - Av||^2 again:
  ||y - Av||^2 = Σ_{i=1}^{n} (y_i - (m x_i + b))^2

  21. Solving Least-Squares
  Given y, A, and v with y = Av overdetermined (A tall / more equations than unknowns).
  We want to minimize ||y - Av||^2, i.e., find:
  arg min_v ||y - Av||^2
  (the value of v that makes the expression smallest)
  The solution satisfies A^T A v* = A^T y, or v* = (A^T A)^{-1} A^T y
  (Don't actually compute the inverse!)
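Following the "don't compute the inverse" advice, `np.linalg.lstsq` solves the minimization directly and stably. A sketch with made-up data (true line y = 3x + 1 plus small noise; not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = 3.0 * x + 1.0 + 0.01 * rng.standard_normal(20)

# Each row of A is [x_i, 1], matching the slide's setup
A = np.stack([x, np.ones_like(x)], axis=1)

# Minimizes ||y - Av||^2 without explicitly forming (A^T A)^{-1}
v, *_ = np.linalg.lstsq(A, y, rcond=None)
m, b = v
assert abs(m - 3.0) < 0.1 and abs(b - 1.0) < 0.1
```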

  22. When is Least-Squares Possible?
  Given y, A, and v. Want y = Av.
  Want n outputs, have n knobs to fiddle with; every knob is useful if A is full rank.
  A: rows (outputs) > columns (knobs). Thus you can't get the precise output y you want (not enough knobs), so settle for the "closest" knob setting.

  23. When is Least-Squares Possible?
  Given y, A, and v. Want y = Av.
  Want n outputs, have n knobs to fiddle with; every knob is useful if A is full rank.
  A: columns (knobs) > rows (outputs). Thus any output can be expressed in infinitely many ways.

  24. Homogeneous Least-Squares
  Given a set of unit vectors (aka directions) x_1, …, x_n, I want the vector v that is as orthogonal to all the x_i as possible (for some definition of orthogonal).
  Stack the x_i^T as the rows of A and compute Av:
  Av = [- x_1^T -; …; - x_n^T -] v = [x_1^T v; …; x_n^T v]   (each entry is 0 if v is orthogonal to that x_i)
  Compute ||Av||^2 = Σ_i (x_i^T v)^2: the sum of how orthogonal v is to each x.

  25. Homogeneous Least-Squares
  • A lot of the time, given a matrix A, we want to find the v that minimizes ||Av||^2
  • I.e., we want v* = arg min_v ||Av||^2
  • What's a trivial solution? Set v = 0 → Av = 0
  • Exclude this by forcing v to have unit norm

  26. Homogeneous Least-Squares
  Let's look at ||Av||^2:
  ||Av||^2 = (Av)^T (Av)      (rewrite as a dot product)
  ||Av||^2 = v^T A^T A v      (distribute the transpose)
  We want the vector minimizing this quadratic form. Where have we seen this?

  27. Homogeneous Least-Squares
  Ubiquitous tool in vision:
  arg min_{||v|| = 1} ||Av||^2
  (1) "Smallest"* eigenvector of A^T A
  (2) "Smallest" right singular vector of A
  For min → max, switch smallest → largest.
  *Note: A^T A is positive semi-definite, so all its eigenvalues are non-negative.
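Recipe (2) above is one line of NumPy: the last row of V^T is the right singular vector with the smallest singular value. A sketch on a random tall matrix (not from the slides), checking that no other unit vector does better:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 3))

U, s, Vt = np.linalg.svd(A)
v = Vt[-1]                            # "smallest" right singular vector
assert np.isclose(np.linalg.norm(v), 1.0)   # already unit norm

# ||Av|| equals the smallest singular value...
assert np.isclose(np.linalg.norm(A @ v), s[-1])

# ...and no random unit vector achieves a smaller ||Av||
for _ in range(100):
    u = rng.standard_normal(3)
    u /= np.linalg.norm(u)
    assert np.linalg.norm(A @ v) <= np.linalg.norm(A @ u) + 1e-9
```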

  28. Derivatives

  29. Derivatives
  Remember derivatives? Derivative: the rate at which a function f(x) changes at a point, as well as the direction that increases the function.

  30. Given quadratic function f(x) = (x - 2)^2 + 5
  f(x) is a function; so is its derivative: g(x) = f'(x), aka g(x) = (d/dx) f(x)

  31. Given quadratic function f(x) = (x - 2)^2 + 5
  What's special about x = 2? f(x) is minimized at 2; g(x) = 0 at 2.
  a = minimum of f → g(a) = 0. The reverse is not true.

  32. Rates of Change
  f(x) = (x - 2)^2 + 5. Suppose I want to increase f(x) by changing x: blue area: move left; red area: move right.
  The derivative tells you the direction of ascent and the rate.
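For the running example f(x) = (x - 2)^2 + 5, the derivative is f'(x) = 2(x - 2). A small sketch (not from the slides) checks it against a centered finite difference and confirms the sign behavior from these slides:

```python
# f and its analytic derivative for the slides' running example
def f(x):
    return (x - 2) ** 2 + 5

def fprime(x):
    return 2 * (x - 2)

# Centered finite difference agrees with the analytic derivative
h = 1e-5
for x in [-1.0, 0.0, 2.0, 3.5]:
    numeric = (f(x + h) - f(x - h)) / (2 * h)
    assert abs(numeric - fprime(x)) < 1e-6

# f'(2) = 0 at the minimum; negative to the left of 2 (move left to
# increase f), positive to the right (move right to increase f)
assert fprime(2.0) == 0.0
assert fprime(1.0) < 0 < fprime(3.0)
```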

  33. Calculus to Know
  • Really need intuition
  • Need the chain rule
  • The rest you should look up / use a computer algebra system / use a cookbook
  • Partial derivatives (and that's it from multivariable calculus)
