  1. Some Recent Advances in Non-convex Optimization Purushottam Kar IIT KANPUR

  2. Outline of the Talk • Recap of Convex Optimization • Why Non-convex Optimization? • Non-convex Optimization: A Brief Introduction • Robust Regression: A Non-convex Approach • Robust Regression: Application to Face Recognition • Robust PCA: A Sketch and Application to Foreground Extraction in Images

  3. Recap of Convex Optimization

  4. Convex Optimization Convex set Convex function
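Spelled out, the two definitions on this slide are (standard statements, notation mine):

```latex
% A set C is convex if the segment between any two of its points stays inside:
C \text{ convex} \iff \forall\, x, y \in C,\ \forall\, \lambda \in [0,1]:\quad
  \lambda x + (1-\lambda)\, y \in C
% A function f is convex if its graph lies below every chord:
f \text{ convex} \iff \forall\, x, y,\ \forall\, \lambda \in [0,1]:\quad
  f(\lambda x + (1-\lambda)\, y) \,\le\, \lambda f(x) + (1-\lambda) f(y)
```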

  5. Examples Linear Programming Quadratic Programming Semidefinite Programming

  6. Applications Regression Classification Resource Allocation Clustering/Partitioning Signal Processing Dimensionality Reduction

  7. Techniques • Projected (Sub)gradient Methods • Stochastic, mini-batch variants • Primal, dual, primal-dual approaches • Coordinate update techniques • Interior Point Methods • Barrier methods • Annealing methods • Other Methods • Cutting plane methods • Accelerated routines • Proximal methods • Distributed optimization • Derivative-free optimization
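For the first family above, a minimal illustration of a projected (sub)gradient method on a toy box-constrained least-squares problem (all names, sizes, and numbers here are mine, not from the talk):

```python
import numpy as np

def projected_gradient_descent(A, b, lo, hi, step=None, iters=500):
    """Minimize ||Ax - b||^2 over the box lo <= x <= hi.

    Sketch of the template: take a gradient step, then project back
    onto the feasible set (projection onto a box is just clipping).
    """
    x = np.zeros(A.shape[1])
    if step is None:
        # 1/L with L = 2 * sigma_max(A)^2, the gradient's Lipschitz constant
        step = 1.0 / (2 * np.linalg.norm(A, 2) ** 2)
    for _ in range(iters):
        grad = 2 * A.T @ (A @ x - b)
        x = np.clip(x - step * grad, lo, hi)   # projection onto [lo, hi]^n
    return x

# The unconstrained minimizer of ||x - (2, -3)||^2 is (2, -3);
# projecting onto the box [0, 1]^2 gives (1, 0).
A = np.eye(2)
b = np.array([2.0, -3.0])
print(projected_gradient_descent(A, b, 0.0, 1.0))
```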

  8. Why Non-convex Optimization?

  9. Gene Expression Analysis DNA micro-array gene expression data …

  10. Recommender Systems

  11. Image Reconstruction and Robust Face Recognition [figure: test faces expressed as weighted combinations of dictionary faces, with weights such as 0.90 + 0.05 + 0.05 and 0.65 + 0.15 + 0.20]

  12. Image Denoising and Robust Face Recognition [figure: a noisy image written as a sum of component images]

  13. Large Scale Surveillance • Foreground-background separation [figure: each video frame decomposed into background + foreground]

  14. Non-convex Optimization Sparse Recovery Matrix Completion Robust Regression Robust PCA

  15. Non-convex Optimization: A Brief Introduction

  16. Relaxation-based Techniques • “Convexify” the feasible set
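The textbook instance of this idea, for sparse recovery: replace the non-convex ℓ₀ objective by its convex surrogate, the ℓ₁ norm:

```latex
\min_x \;\|x\|_0 \ \text{ s.t. }\ Ax = y
\qquad\rightsquigarrow\qquad
\min_x \;\|x\|_1 \ \text{ s.t. }\ Ax = y
```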

  17. Alternating Minimization Matrix Completion Robust PCA … also Robust Regression, coming up
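As a concrete illustration of Alt-Min on matrix completion, a toy alternating-least-squares sketch (function names, regularization, and problem sizes are mine):

```python
import numpy as np

def altmin_complete(M, mask, rank, iters=50, reg=1e-6):
    """Alternating minimization for matrix completion (sketch).

    Fits M ~ U V^T on the observed entries (mask == True) by
    alternating least squares: fix V and solve for each row of U,
    then fix U and solve for each row of V.
    """
    m, n = M.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    for _ in range(iters):
        for i in range(m):                      # update U with V fixed
            Vo = V[mask[i]]
            U[i] = np.linalg.solve(Vo.T @ Vo + reg * np.eye(rank),
                                   Vo.T @ M[i, mask[i]])
        for j in range(n):                      # update V with U fixed
            Uo = U[mask[:, j]]
            V[j] = np.linalg.solve(Uo.T @ Uo + reg * np.eye(rank),
                                   Uo.T @ M[mask[:, j], j])
    return U @ V.T

# Rank-1 ground truth with roughly 70% of entries observed.
rng = np.random.default_rng(1)
truth = np.outer(rng.standard_normal(8), rng.standard_normal(6))
mask = rng.random(truth.shape) < 0.7
est = altmin_complete(truth * mask, mask, rank=1)
print(np.max(np.abs(est - truth)))  # small reconstruction error
```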

  18. Projected Gradient Descent Keep the top elements by magnitude (hard thresholding); perform a truncated SVD (low-rank projection) Sparse Recovery
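The sparse case of this slide can be illustrated with iterative hard thresholding: a gradient step on the least-squares loss followed by projection onto the set of k-sparse vectors. A minimal sketch (problem sizes, step size, and seed are arbitrary choices of mine):

```python
import numpy as np

def hard_threshold(x, k):
    """Projection onto k-sparse vectors: keep the k largest-magnitude
    entries, zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def iht(A, y, k, step=1.0, iters=200):
    """Iterative hard thresholding (sketch): gradient step on
    ||Ax - y||^2, then project onto k-sparse vectors."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = hard_threshold(x + step * A.T @ (y - A @ x), k)
    return x

# Recover a 3-sparse signal from 60 random Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 100)) / np.sqrt(60)   # ~unit-norm columns
x_true = np.zeros(100)
x_true[[5, 17, 60]] = [3.0, -2.0, 1.5]
x_hat = iht(A, A @ x_true, k=3)
print(np.nonzero(x_hat)[0])
```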

  19. Pursuit and Greedy Methods Set of “atoms” Sparse Recovery
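A standard representative of this family is Orthogonal Matching Pursuit: greedily add the atom most correlated with the current residual, then refit. A minimal sketch over a random dictionary (dimensions and coefficients are mine):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit (sketch): pick the atom (column
    of A) most correlated with the residual, then refit by least
    squares on all selected atoms."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))  # best-matching atom
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

# Recover a 3-sparse combination of 100 random unit-norm atoms.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
A /= np.linalg.norm(A, axis=0)                      # unit-norm atoms
x_true = np.zeros(100)
x_true[[4, 20, 33]] = [3.0, -2.0, 1.5]
x_hat = omp(A, A @ x_true, k=3)
print(np.nonzero(x_hat)[0])
```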

  20. Robust Regression: A Non-convex Approach

  21. Linear Regression


  24. Linear Regression with Noise

  25. Linear Regression with Noise Residual


  29. Linear Regression with Corruptions

  30. Robust Regression Corruptions are adversarial, adaptive, but only on a “few” locations

  31. Robust Regression Corruptions are adversarial, adaptive, but only on a “few” locations Attempt 1


  34. Robust Regression Corruptions are adversarial, adaptive, but only on a “few” locations Attempt 2 [Wright and Ma, 2010*; Nguyen et al., 2013*]

  35. Lessons from History If among these errors are some which appear too large to be admissible, then those equations which produced these errors will be rejected, as coming from too faulty experiments, and the unknowns will be determined by means of the other equations, which will then give much smaller errors Adrien-Marie Legendre, On the Method of Least Squares, 1805

  36. Linear Regression with Corruptions


  39. Linear Regression with Corruptions TORRENT-FC Thresholding Operator-based Robust RegrEssioN meThod [Bhatia et al., 2015]

  40. TORRENT in Action!


  63. Alt-Min in Theory Recovery Guarantees • Robust against adaptive adversaries: the adversary has access to the data, the gold model, and the noise • Requirement: the data needs to satisfy some “nice” properties, and enough data needs to be present • Guarantees: TORRENT will recover the gold model if …


  65. Alt-Min in Theory Convergence Rates • Linear rate of convergence • Suppose each alternation ≡ one step: after T = O(log(1/ε)) time steps, … • Invariant: at time t, the “active set” satisfies …

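The linear-rate claim unpacks as follows (ρ is my name for the per-step contraction factor):

```latex
\|w_{t+1} - w^*\| \le \rho\,\|w_t - w^*\|,\quad 0 < \rho < 1
\;\Longrightarrow\;
\|w_T - w^*\| \le \rho^T \|w_0 - w^*\| \le \epsilon\,\|w_0 - w^*\|
\;\text{ once }\;
T \ge \frac{\log(1/\epsilon)}{\log(1/\rho)} = \mathcal{O}\!\left(\log\tfrac{1}{\epsilon}\right).
```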

  73. Alt-Min in Practice Quality of Recovery [Bhatia et al., 2015]

  74. Alt-Min in Practice Speed of Recovery [Bhatia et al., 2015]

  75. Robust Regression: Application to Face Recognition Extended Yale B dataset, 38 people, 800 images

  76. Face Recognition 10% noise 30% noise 50% noise 70% noise [Bhatia et al., 2015]

  77. Robust PCA: A Sketch and Application to Foreground Extraction in Images

  78. The Alternating Projection Procedure [Netrapalli et al., 2014]
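A toy sketch in the spirit of alternating projections for Robust PCA: alternate a low-rank projection (truncated SVD) with a sparse projection (entrywise hard thresholding). The fixed threshold and all parameter choices are mine; the actual procedure of Netrapalli et al. uses a more careful thresholding schedule:

```python
import numpy as np

def alt_proj_rpca(M, rank, thresh, iters=30):
    """Alternating projections for M ~ L + S (sketch):
      1. L <- best rank-`rank` approximation of M - S (truncated SVD);
      2. S <- entries of M - L larger than `thresh` (hard thresholding).
    """
    S = np.zeros_like(M)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]    # low-rank projection
        R = M - L
        S = np.where(np.abs(R) > thresh, R, 0.0)    # sparse projection
    return L, S

# Toy run: rank-2 "background" plus a few large sparse corruptions.
rng = np.random.default_rng(0)
L_true = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 20))
S_true = np.zeros((30, 20))
S_true.flat[rng.choice(600, size=15, replace=False)] = 10.0
L_hat, S_hat = alt_proj_rpca(L_true + S_true, rank=2, thresh=3.0)
print(np.max(np.abs(L_hat - L_true)))  # small reconstruction error
```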

  79. Concluding Comments Non-convex optimization is an exciting area with widespread applications • Much better modelling of problems • Much more scalable algorithms • Provable guarantees So … • Full of opportunities • Full of challenges

  80. Acknowledgements http://research.microsoft.com/en-us/projects/altmin/default.aspx Portions of this talk were based on joint work with Ambuj Tewari (U. Michigan, Ann Arbor), Kush Bhatia (Microsoft Research), and Prateek Jain (Microsoft Research)

  81. The Data Sciences Gang@IITK Medha Atre, Arnab Bhattacharya, Sumit Ganguly, Purushottam Kar, Harish Karnick, Vinay Namboodiri, Piyush Rai, Indranil Saha, Gaurav Sharma, Sandeep Shukla

  82. Our Strengths Machine Learning • Vision, Image Processing • Databases, Data Mining • Online, Streaming Algorithms • Cyber-physical Systems

  83. Questions?

  84. TORRENT as an Alt-Min Procedure • TORRENT indeed performs Alt-Min • Two variables in TORRENT – the active set and the model • The active set encodes the complement of the corruption support • TORRENT alternates between • Fixing the model and choosing the active set • Fixing the active set and choosing the model • Both steps reduce the residual as much as possible
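The two alternating steps described on this slide can be sketched in code. This is my reading of the fully-corrective variant (TORRENT-FC): keep the points with smallest residuals, then refit by least squares on them. Variable names, the keep fraction, and the toy data are mine:

```python
import numpy as np

def torrent_fc(X, y, keep_frac, iters=30):
    """Alt-Min sketch of TORRENT-FC. Alternates between:
      1. fix the model, choose the active set: keep the keep_frac
         fraction of points with smallest absolute residuals;
      2. fix the active set, choose the model: refit by least
         squares on those points (the fully-corrective step).
    """
    k = int(keep_frac * X.shape[0])
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        residuals = np.abs(y - X @ w)
        active = np.argsort(residuals)[:k]          # smallest residuals
        w, *_ = np.linalg.lstsq(X[active], y[active], rcond=None)
    return w

# Toy run: 200 points, 20% adversarially large corruptions.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
w_gold = np.array([1.0, -2.0, 0.5, 3.0, -1.0])
y = X @ w_gold
bad = rng.choice(200, size=40, replace=False)
y[bad] += 10 * rng.standard_normal(40)              # corrupted responses
w_hat = torrent_fc(X, y, keep_frac=0.7)
print(w_hat)
```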

  85. Linear Regression with Corruptions TORRENT-GD Thresholding Operator-based Robust RegrEssioN meThod [Bhatia et al., 2015]

  86. Linear Regression with Corruptions TORRENT-HYB Thresholding Operator-based Robust RegrEssioN meThod [Bhatia et al., 2015]
