

  1. Justin Solomon David Bommes Princeton University RWTH Aachen University

  2. Optimization. Synonym(-ish): Variational methods.

  3. Optimization. Synonym(-ish): Variational methods. Caveat: Slightly different connotation in ML

  4.  Client: Which optimization tool is relevant?  Designer: Can I design an algorithm for this problem?

  5. Patterns, algorithms, & examples common in geometry processing. Optimization is a huge field.

  6. Part I (Justin)  Vocabulary  Simple examples  Unconstrained optimization  Equality-constrained optimization

  7. Part II (David)  Inequality constraints  Advanced algorithms  Discrete problems  Conclusion

  8. Part I (Justin)  Vocabulary  Simple examples  Unconstrained optimization  Equality-constrained optimization

  9. Objective (“Energy Function”)

  10. Equality Constraints

  11. Inequality Constraints

  12. https://en.wikipedia.org/?title=Gradient Gradient

  13. http://math.etsu.edu/multicalc/prealpha/Chap2/Chap2-5/10-3a-t3.gif Hessian

  14. https://en.wikipedia.org/wiki/Jacobian_matrix_and_determinant Jacobian

  15. (unconstrained) Critical point: local min, local max, or saddle point

  16. Critical points may not be minima.

  17. More later: Numerical Algorithms, Solomon

  18. Part I (Justin)  Vocabulary  Simple examples  Unconstrained optimization  Equality-constrained optimization

  19. How effective are generic optimization tools?

  20. Try the simplest solver first.

  21. (assume A is symmetric and positive definite)

  22. Normal equations (better solvers for this case!)
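
A minimal least-squares sketch of the point above, in Python with NumPy (the matrix and right-hand side are random stand-ins): the normal equations give a symmetric positive definite system, but a QR/SVD-based solver such as lstsq is usually better conditioned.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 5))      # tall matrix: 100 observations, 5 unknowns
    b = rng.standard_normal(100)

    # Normal equations: A^T A x = A^T b (A^T A is SPD when A has full column rank).
    x_normal = np.linalg.solve(A.T @ A, A.T @ b)

    # Better-conditioned alternative: QR/SVD-based least-squares solver.
    x_lstsq = np.linalg.lstsq(A, b, rcond=None)[0]

    print(np.allclose(x_normal, x_lstsq))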

  23. G. Peyré, mesh processing course slides

  24.  w_ij ≡ 1: Tutte embedding  w_ij from the mesh geometry: harmonic embedding. Assumption: the weights w are symmetric.

  25. Laplacian matrix!

  26. K. Crane, brickisland.net Leads to famous cotangent weights! Useful for interpolation.
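
A small sketch of the embedding idea from the last few slides, assuming a hypothetical toy graph (one interior vertex, four pinned boundary vertices) and uniform weights w_ij ≡ 1; swapping in cotangent weights computed from the mesh would give the harmonic version.

    import numpy as np

    # Toy connectivity: interior vertex 0 joined to boundary vertices 1..4 (made-up graph).
    edges = [(0, 1), (0, 2), (0, 3), (0, 4), (1, 2), (2, 3), (3, 4), (4, 1)]
    n = 5
    L = np.zeros((n, n))
    for i, j in edges:                      # graph Laplacian L = D - W with w_ij = 1
        L[i, i] += 1; L[j, j] += 1
        L[i, j] -= 1; L[j, i] -= 1

    boundary, interior = [1, 2, 3, 4], [0]
    bpos = np.array([[1, 1], [-1, 1], [-1, -1], [1, -1]], dtype=float)  # pin boundary to a convex polygon

    # Solve L_II x_I = -L_IB x_B for the free (interior) positions.
    L_II = L[np.ix_(interior, interior)]
    L_IB = L[np.ix_(interior, boundary)]
    x_I = np.linalg.solve(L_II, -L_IB @ bpos)
    print(x_I)                              # each interior vertex lands at the average of its neighbors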

  27.  Never construct A⁻¹ explicitly (if you can avoid it)  Added structure helps: sparsity, symmetry, positive definiteness

  28.  Direct (explicit matrix)  Dense: Gaussian elimination/LU, QR for least-squares  Sparse: reordering (SuiteSparse, Eigen)  Iterative (apply the matrix repeatedly)  Positive definite: conjugate gradients  Symmetric (indefinite): MINRES  General: GMRES  Generic/least-squares: LSQR
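
A rough SciPy illustration of the solver families listed above, on an assumed SPD tridiagonal stand-in for a mesh Laplacian (not real mesh data):

    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 200
    A = sp.diags([-1, 2.1, -1], [-1, 0, 1], shape=(n, n), format="csc")  # SPD stand-in
    b = np.ones(n)

    x_direct = spla.spsolve(A, b)     # direct: sparse LU with fill-reducing reordering
    x_cg, info = spla.cg(A, b)        # iterative: conjugate gradients (wants SPD); info == 0 means converged
    x_lsqr = spla.lsqr(A, b)[0]       # generic: LSQR also handles rectangular least-squares systems

    print(info, np.linalg.norm(x_direct - x_cg))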

  29. Sparsity pattern is induced by the connectivity of the triangle mesh. Each CG iteration has only a local effect ⇒ precondition!
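
A sketch of the preconditioning point, again on a made-up SPD matrix: SciPy only ships incomplete LU (incomplete Cholesky would be the natural SPD choice), but wrapping its approximate solve as a preconditioner already cuts the CG iteration count sharply.

    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 2000
    A = sp.diags([-1, 2.01, -1], [-1, 0, 1], shape=(n, n), format="csc")  # SPD stand-in for a mesh Laplacian
    b = np.ones(n)

    ilu = spla.spilu(A, drop_tol=1e-5, fill_factor=20)
    M = spla.LinearOperator(A.shape, matvec=ilu.solve)   # approximate inverse used as a preconditioner

    def run(M=None):
        iters = []
        x, info = spla.cg(A, b, M=M, maxiter=5000, callback=lambda xk: iters.append(1))
        return len(iters), info

    print(run())        # plain CG: many iterations
    print(run(M=M))     # preconditioned CG: far fewer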

  30. What if V₀ = {} (no vertices are pinned)?

  31. Prevents the trivial solution x ≡ 0. Extract the smallest eigenvalue.

  32. Prevents the trivial solution x ≡ 0. N contains a basis for the null space of A. Extract the smallest nonzero eigenvalue.
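
A dense toy version of the eigenvalue extraction above, on a hypothetical small graph: the constant vector spans the Laplacian's null space, so the interesting mode is the eigenvector of the smallest nonzero eigenvalue (for large meshes one would use scipy.sparse.linalg.eigsh instead of a dense solve).

    import numpy as np

    edges = [(0, 1), (1, 2), (2, 3), (3, 0), (3, 4), (4, 5), (5, 6), (6, 3)]   # made-up graph
    n = 7
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, i] += 1; L[j, j] += 1
        L[i, j] -= 1; L[j, i] -= 1

    vals, vecs = np.linalg.eigh(L)   # ascending; vals[0] ~ 0 with the constant eigenvector
    fiedler = vecs[:, 1]             # eigenvector of the smallest *nonzero* eigenvalue
    print(np.round(vals, 4))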

  33. Mullen et al. “Spectral Conformal Parameterization.” SGP 2008.

  34. “Laplace-Beltrami eigenfunctions” (eigenfunctions 2-10 shown)

  35. Roughly: 1. Extract Laplace-Beltrami eigenfunctions. 2. Find the mapping matrix (a linear solve!). Ovsjanikov et al. “Functional Maps.” SIGGRAPH 2012.
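
A sketch of step 2 only, with random matrices standing in for the basis coefficients of corresponding descriptor functions (one column per function): recovering the functional map C really is a single least-squares solve.

    import numpy as np

    k, m = 20, 60                       # basis size, number of corresponding functions
    rng = np.random.default_rng(1)
    A = rng.standard_normal((k, m))     # source coefficients (stand-in data)
    C_true = rng.standard_normal((k, k))
    B = C_true @ A                      # target coefficients, noise-free for the sketch

    # min_C ||C A - B||_F  is equivalent to the column-wise problem  A^T C^T = B^T.
    C = np.linalg.lstsq(A.T, B.T, rcond=None)[0].T
    print(np.allclose(C, C_true))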

  36. Part I (Justin)  Vocabulary  Simple examples  Unconstrained optimization  Equality-constrained optimization

  37. Unstructured.

  38. Gradient descent

  39. Line search Gradient descent
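
A bare-bones gradient descent with backtracking (Armijo) line search, on an assumed toy quadratic rather than any geometry-processing energy:

    import numpy as np

    def f(x):    return (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
    def grad(x): return np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])

    x = np.zeros(2)
    for _ in range(200):
        g = grad(x)
        if np.linalg.norm(g) < 1e-8:
            break
        t = 1.0
        while f(x - t * g) > f(x) - 0.5 * t * (g @ g):   # backtrack until sufficient decrease
            t *= 0.5
        x = x - t * g
    print(x)   # converges to (1, -2)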

  40. Multiple optima! Gradient descent

  41. O(1/k²) convergence on convex problems! (Nesterov 1983) Accelerated gradient descent
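
A Nesterov-style acceleration sketch on a toy quadratic, assuming the gradient's Lipschitz constant is known (here it is just the largest Hessian eigenvalue):

    import numpy as np

    H = np.diag([1.0, 100.0])                     # Hessian of f(x) = 0.5 x^T H x
    grad = lambda x: H @ x
    Lip = 100.0                                   # Lipschitz constant of the gradient

    x = y = np.array([5.0, 5.0])
    t = 1.0
    for _ in range(300):
        x_new = y - grad(y) / Lip                 # gradient step from the extrapolated point
        t_new = 0.5 * (1 + np.sqrt(1 + 4 * t * t))
        y = x_new + (t - 1) / t_new * (x_new - x) # momentum / extrapolation
        x, t = x_new, t_new
    print(0.5 * x @ H @ x)                        # objective value after 300 accelerated steps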

  42. Line search for stability. Newton’s Method
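
A damped-Newton sketch on a made-up smooth convex objective; the backtracking step is what keeps the huge Newton steps taken far from the minimum from blowing up:

    import numpy as np

    def f(x):    return np.log(np.exp(x[0]) + np.exp(-x[0])) + x[1] ** 2
    def grad(x): return np.array([np.tanh(x[0]), 2 * x[1]])
    def hess(x): return np.diag([1 / np.cosh(x[0]) ** 2, 2.0])

    x = np.array([3.0, -2.0])
    for _ in range(100):
        g = grad(x)
        if np.linalg.norm(g) < 1e-10:
            break
        p = np.linalg.solve(hess(x), -g)                      # Newton direction
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):       # Armijo backtracking
            t *= 0.5
        x = x + t * p
    print(x)   # converges to (0, 0)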

  43. Hessian approximation  (Often sparse) approximation from previous samples and gradients  Inverse in closed form! Quasi-Newton: BFGS and friends
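
A toy BFGS sketch (not any library's implementation): the inverse-Hessian approximation is refreshed in closed form from the latest step s and gradient change y, so no linear solve is ever needed.

    import numpy as np

    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    f    = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x

    x = np.array([4.0, -3.0])
    H_inv = np.eye(2)                                          # initial inverse-Hessian guess
    for _ in range(50):
        g = grad(x)
        if np.linalg.norm(g) < 1e-10:
            break
        p = -H_inv @ g                                         # quasi-Newton direction
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):        # simple backtracking line search
            t *= 0.5
        s = t * p
        y = grad(x + s) - g
        rho = 1.0 / (y @ s)
        I = np.eye(2)
        # Closed-form update of the *inverse* Hessian approximation (BFGS).
        H_inv = (I - rho * np.outer(s, y)) @ H_inv @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)
        x = x + s
    print(x)   # approaches the minimizer at the origin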

  44. M. Kazhdan. Often continuous gradient descent.

  45. Fröhlich and Botsch. “Example-Driven Deformations Based on Discrete Shells.” CGF 2011.

  46. Roughly: 1. Linearly interpolate edge lengths and dihedral angles. 2. Nonlinear optimization for vertex positions. Sum of squares: Gauss-Newton
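
A Gauss-Newton sketch on an invented one-dimensional curve-fitting residual (not the discrete-shells energy): each iteration solves the linearized least-squares problem J p ≈ -r.

    import numpy as np

    t = np.linspace(0, 1, 30)
    y_obs = 2.0 * np.exp(-1.5 * t)                    # data generated from parameters (a, b) = (2, 1.5)

    def residual(x):                                  # r_i(x) = a * exp(-b * t_i) - y_i
        a, b = x
        return a * np.exp(-b * t) - y_obs

    def jacobian(x):
        a, b = x
        e = np.exp(-b * t)
        return np.column_stack([e, -a * t * e])

    x = np.array([1.0, 0.0])
    for _ in range(20):
        r, J = residual(x), jacobian(x)
        p = np.linalg.lstsq(J, -r, rcond=None)[0]     # Gauss-Newton step
        x = x + p
        if np.linalg.norm(p) < 1e-12:
            break
    print(x)   # recovers approximately (2.0, 1.5)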

  47.  Matlab: fminunc or minFunc  C++: libLBFGS, dlib, others. You typically provide callbacks for the objective and its gradient (and optionally, the Hessian). Try several!
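
Typical usage of such a solver, here SciPy's L-BFGS-B standing in for fminunc/minFunc/libLBFGS; the Rosenbrock function is just a placeholder objective:

    import numpy as np
    from scipy.optimize import minimize

    def fun(x):
        return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

    def jac(x):
        return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                          200 * (x[1] - x[0] ** 2)])

    res = minimize(fun, x0=np.zeros(2), jac=jac, method="L-BFGS-B")
    print(res.x, res.nit)   # close to (1, 1)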

  48. Part I (Justin)  Vocabulary  Simple examples  Unconstrained optimization  Equality-constrained optimization

  49. - Decrease f: −α∇f - Violate constraint: ±α∇g

  50. Want:

  51. Turns constrained optimization into unconstrained root-finding.
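
A small sketch of that reduction, assuming the toy problem minimize x + y subject to x² + y² = 1: the stationarity conditions ∇f = λ∇g together with g = 0 form a root-finding problem in (x, y, λ).

    import numpy as np
    from scipy.optimize import root

    def kkt(z):
        x, y, lam = z
        return [1 - lam * 2 * x,          # d/dx of f - lam * g
                1 - lam * 2 * y,          # d/dy
                x ** 2 + y ** 2 - 1]      # the constraint g(x, y) = 0

    sol = root(kkt, x0=[-0.7, -0.7, -0.7])
    print(sol.x)   # near (-1/sqrt(2), -1/sqrt(2), -1/sqrt(2)); a second-order check picks the minimum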

  52.  Reparameterization: eliminate constraints to reduce to the unconstrained case  Newton’s method: approximation by a quadratic function with a linearized constraint  Penalty method: augment the objective with a penalty term, e.g. f(x) + ρ|g(x)|
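
A quadratic-penalty sketch (the |g| penalty above swapped for g² so a standard smooth solver applies), on an assumed toy problem min x² + y² subject to x + y = 1; the weight ρ is increased and the previous solution is reused as a warm start.

    import numpy as np
    from scipy.optimize import minimize

    def g(x): return x[0] + x[1] - 1.0                          # equality constraint g = 0

    x = np.zeros(2)
    for rho in [1.0, 10.0, 100.0, 1000.0]:
        obj = lambda x, rho=rho: x[0] ** 2 + x[1] ** 2 + rho * g(x) ** 2
        x = minimize(obj, x, method="BFGS").x                   # warm-start from the previous rho
        print(rho, x, g(x))                                     # approaches (0.5, 0.5) with g -> 0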

  53. (assume A is symmetric and positive definite)

  54. No longer positive definite!
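
A tiny worked instance of why: for min ½xᵀAx − bᵀx subject to Cx = d (toy numbers below), the first-order conditions give a symmetric but indefinite saddle-point system.

    import numpy as np

    A = np.array([[4.0, 1.0], [1.0, 3.0]])      # symmetric positive definite
    b = np.array([1.0, 2.0])
    C = np.array([[1.0, 1.0]])                   # one linear constraint: x0 + x1 = 1
    d = np.array([1.0])

    K = np.block([[A, C.T],
                  [C, np.zeros((1, 1))]])        # KKT matrix
    sol = np.linalg.solve(K, np.concatenate([b, d]))
    x, lam = sol[:2], sol[2:]
    print(x, C @ x)                              # constraint satisfied exactly
    print(np.linalg.eigvalsh(K))                 # mixed-sign eigenvalues: not positive definite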

  55. Fix (or adjust) the damping parameter λ > 0. Example: Levenberg-Marquardt
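
A Levenberg-Marquardt-flavored sketch reusing the toy curve-fit residual from the Gauss-Newton example: the damping weight is shrunk after a successful step and grown after a rejected one (the update schedule here is an arbitrary choice).

    import numpy as np

    t = np.linspace(0, 1, 30)
    y_obs = 2.0 * np.exp(-1.5 * t)

    def residual(x):
        a, b = x
        return a * np.exp(-b * t) - y_obs

    def jacobian(x):
        a, b = x
        e = np.exp(-b * t)
        return np.column_stack([e, -a * t * e])

    x, lam = np.array([1.0, 3.0]), 1e-2
    for _ in range(200):
        r, J = residual(x), jacobian(x)
        p = np.linalg.solve(J.T @ J + lam * np.eye(2), -J.T @ r)   # damped normal equations
        if np.sum(residual(x + p) ** 2) < np.sum(r ** 2):
            x, lam = x + p, lam * 0.5                              # accept: trust the model more
        else:
            lam *= 10.0                                            # reject: damp harder
    print(x)   # approximately (2.0, 1.5)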

  56. Huang et al. “L1-Based Construction of Polycube Maps from Complex Shapes.” TOG 2014.

  57. Align with coordinate axes. Preserve area. Note: the final method includes several more terms!

  58. versus Try lightweight options

  59. versus Try lightweight options

  60. “Geometric median” Repeatedly solve linear systems
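
The classic Weiszfeld / IRLS iteration for the geometric median, sketched on random 2D points: each pass is a (trivial) weighted least-squares solve.

    import numpy as np

    rng = np.random.default_rng(2)
    pts = rng.standard_normal((50, 2))          # made-up point set

    x = pts.mean(axis=0)                        # initialize at the centroid
    for _ in range(100):
        d = np.linalg.norm(pts - x, axis=1)
        w = 1.0 / np.maximum(d, 1e-12)          # IRLS weights (guard against zero distance)
        x_new = (w[:, None] * pts).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < 1e-10:
            break
        x = x_new
    print(x)   # minimizes the sum of Euclidean distances to the points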

  61. d can be a Bregman divergence

  62. Decompose as a sum of a hard part f and an easy part g. https://blogs.princeton.edu/imabandit/2013/04/11/orf523-ista-and-fista/
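
A proximal-gradient (ISTA) sketch on a synthetic sparse-recovery problem: the smooth "hard" part ½‖Ax − b‖² gets a gradient step, and the nonsmooth "easy" part μ‖x‖₁ gets its proximal map, which is just soft-thresholding. All data below is random.

    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100); x_true[:5] = 3.0
    b = A @ x_true
    mu = 0.5
    Lip = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the smooth part's gradient

    x = np.zeros(100)
    for _ in range(2000):
        z = x - A.T @ (A @ x - b) / Lip                          # gradient step on f
        x = np.sign(z) * np.maximum(np.abs(z) - mu / Lip, 0.0)   # prox of (mu/Lip) * ||.||_1
    print(np.round(x[:8], 2))   # entries on the true support approach 3; the rest shrink toward 0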

  63. Does nothing when constraint is satisfied Add constraint to objective

  64. https://web.stanford.edu/~boyd/papers/pdf/admm_slides.pdf

  65. Augmented part Solomon et al. “Earth Mover’s Distances on Discrete Surfaces.” SIGGRAPH 2014. Want two easy subproblems
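
An ADMM sketch on the same synthetic sparse-recovery setup as the ISTA example: the augmented term makes both subproblems easy, one a linear solve and the other a soft-threshold, followed by a dual update (ρ below is an arbitrary augmentation weight).

    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100); x_true[:5] = 3.0
    b = A @ x_true
    mu, rho = 0.5, 1.0

    AtA, Atb = A.T @ A, A.T @ b
    x = z = u = np.zeros(100)
    for _ in range(300):
        x = np.linalg.solve(AtA + rho * np.eye(100), Atb + rho * (z - u))   # subproblem 1: linear solve
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - mu / rho, 0.0)              # subproblem 2: soft-threshold
        u = u + x - z                                                       # dual (scaled multiplier) update
    print(np.round(z[:8], 2))   # the z iterate is exactly sparse; support entries are near 3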

  66. https://en.wikipedia.org/wiki/Frank%E2%80%93Wolfe_algorithm Linearize objective, not constraints
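
A Frank-Wolfe sketch over the probability simplex with a made-up quadratic objective: the objective is linearized at each iterate, the linear model is minimized over the original constraint set (for the simplex that just picks a vertex), and a convex combination keeps the iterate feasible.

    import numpy as np

    rng = np.random.default_rng(4)
    Q = rng.standard_normal((6, 6)); Q = Q.T @ Q + np.eye(6)   # SPD
    c = rng.standard_normal(6)
    grad = lambda x: Q @ x + c                                  # f(x) = 0.5 x^T Q x + c^T x

    x = np.full(6, 1.0 / 6.0)                                   # start inside the simplex
    for k in range(200):
        g = grad(x)
        s = np.zeros(6); s[np.argmin(g)] = 1.0                  # minimize the linear model over the simplex
        gamma = 2.0 / (k + 2.0)                                 # standard step-size schedule
        x = (1 - gamma) * x + gamma * s                         # convex combination stays feasible
    print(x, x.sum())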
