Restricted Isometry Property of Low Dimensional Subspaces and Its Application in Compressed Subspace Clustering - PowerPoint PPT Presentation



  1. Restricted Isometry Property of Low Dimensional Subspaces and Its Application in Compressed Subspace Clustering. Yuantao Gu (谷源涛), Dept. EE, Tsinghua University, May 2018. Co-work with: Gen Li, Yuchen Jiao, Linghang Meng, and Qinghua Liu.

  2. Outline. 1 Background: Dimensionality Reduction and Random Projection; Application of Random Projection. 2 Motivation. 3 Restricted Isometry Property (RIP) for Subspaces: Preliminary; Main Results; Theory; Application and Usage; Related Theories. 4 Numerical Results. 5 Conclusion.

  3. Outline (same as Slide 2).

  4. Outline (same as Slide 2).

  5. High Dimensionality of Big Data. Image sources: http://opticalnanofilter.com/, http://www.thelancet.com/cms/

  6. The Curse of Dimensionality. Reference: Parsons, Haque, and Liu, Subspace Clustering for High Dimensional Data: A Review, 2004.

  7. Principal Components Analysis (PCA). (Figure: scatter plot of the data with the PCA 1st dimension marked.) References: Jolliffe, Principal Component Analysis, 1986; Ghodsi, Dimensionality Reduction: A Short Tutorial, 2006.
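A minimal PCA sketch in Python/NumPy, illustrating the projection onto the leading principal direction shown on the slide. The data-generation step and the function name pca_project are illustrative assumptions, not part of the slides.

    import numpy as np

    # Minimal PCA sketch: project data onto its top principal components.
    # X is an (L, N) array of L samples in R^N; d is the target dimension.
    def pca_project(X, d):
        Xc = X - X.mean(axis=0)                 # center the data
        # Right singular vectors give the principal directions.
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:d].T                    # coordinates along the top-d directions

    # Example: 200 points in R^50 that mostly vary along one direction.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 1)) @ rng.normal(size=(1, 50)) + 0.05 * rng.normal(size=(200, 50))
    Y = pca_project(X, 1)                       # the "PCA 1st dimension" coordinates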

  8. Dimensionality Reduction Methods: Locally Linear Embedding (LLE), Laplacian Eigenmaps (LEM), Isomap, and Semidefinite Embedding (SDE). (Figure: 2-D embeddings produced by each method, plotted against the 1st and 2nd dimensions.) Reference: Ghodsi, Dimensionality Reduction: A Short Tutorial, 2006.

  9. Random Projection. Original data as a vector x ∈ R^N; random matrix Φ ∈ R^{n×N}, n < N; dimension-reduced data y = Φx ∈ R^n. (Figure: matrix diagram of y = Φx.)
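The slide's diagram y = Φx can be sketched in a few lines of NumPy; the dimensions and the 1/√n scaling of the Gaussian entries below are illustrative choices, not taken from the slides.

    import numpy as np

    N, n = 10_000, 200                          # ambient and reduced dimensions, n < N
    rng = np.random.default_rng(0)

    x = rng.normal(size=N)                      # original data as a vector x in R^N
    Phi = rng.normal(size=(n, N)) / np.sqrt(n)  # random Gaussian matrix Phi in R^{n x N}
    y = Phi @ x                                 # dimension-reduced data y in R^n
    print(y.shape)                              # (200,)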

  10. Johnson-Lindenstrauss Lemma (JL Lemma). For any set V of L points in R^N, there exists a map f : R^N → R^n, n < N, such that for all x_1, x_2 ∈ V, (1 − ε)∥x_1 − x_2∥_2^2 ≤ ∥f(x_1) − f(x_2)∥_2^2 ≤ (1 + ε)∥x_1 − x_2∥_2^2, provided n is a positive integer satisfying n ≥ 4 ln L / (ε^2/2 − ε^3/3), where 0 < ε < 1 is a constant. Reference: Johnson and Lindenstrauss, Extensions of Lipschitz Maps into a Hilbert Space, 1984.
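A minimal numerical check of the JL lemma, assuming a Gaussian random projection as the map f; the particular values of L, N, and ε are arbitrary illustrations.

    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(1)
    L, N, eps = 100, 5_000, 0.2
    # Dimension suggested by the JL lemma: n >= 4 ln L / (eps^2/2 - eps^3/3).
    n = int(np.ceil(4 * np.log(L) / (eps**2 / 2 - eps**3 / 3)))

    V = rng.normal(size=(L, N))                    # L points in R^N
    Phi = rng.normal(size=(n, N)) / np.sqrt(n)     # random map f(x) = Phi x
    W = V @ Phi.T

    # Ratio of squared distances after/before projection for every pair of points.
    ratios = [np.sum((W[i] - W[j])**2) / np.sum((V[i] - V[j])**2)
              for i, j in combinations(range(L), 2)]
    print(n, min(ratios), max(ratios))             # ratios should lie in [1-eps, 1+eps]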

  11. Outline (same as Slide 2).

  12. Compressed Sensing. Example systems: the single-pixel camera, sparse MRI, and the Modulated Wideband Converter.

  13. Compressed Sensing (illustration). Source: http://www.web.me.iastate.edu/sbhattac/research_cs.html

  14. Restricted Isometry Property (RIP) for Sparse Signals. The projection matrix Φ ∈ R^{n×N}, n < N, satisfies the RIP with δ as the smallest nonnegative constant such that (1 − δ)∥x_1 − x_2∥_2^2 ≤ ∥Φx_1 − Φx_2∥_2^2 ≤ (1 + δ)∥x_1 − x_2∥_2^2 holds for any two k-sparse vectors x_1, x_2 ∈ R^N. A Gaussian random matrix Φ satisfies the RIP with probability 1 − e^{−c_2 n} provided n ≥ c_1 k ln(N/k), where c_1, c_2 > 0 are constants depending only on δ. References: Candès and Tao, Decoding by linear programming, 2005; Candès, The restricted isometry property and its implications for compressed sensing, 2008.
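A rough Monte-Carlo sketch of the RIP statement above: sample random pairs of k-sparse vectors and record the worst deviation of ∥Φ(x_1 − x_2)∥_2^2 from ∥x_1 − x_2∥_2^2. This only lower-bounds the true δ (computing it exactly would require checking all supports); all parameter values are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(2)
    N, n, k = 1_000, 200, 10
    Phi = rng.normal(size=(n, N)) / np.sqrt(n)     # Gaussian projection matrix

    def random_k_sparse(k, N):
        x = np.zeros(N)
        support = rng.choice(N, size=k, replace=False)
        x[support] = rng.normal(size=k)
        return x

    # Monte-Carlo lower bound on the RIP constant delta: sample pairs of k-sparse
    # vectors and record how far ||Phi(x1 - x2)||^2 strays from ||x1 - x2||^2.
    delta = 0.0
    for _ in range(2_000):
        d = random_k_sparse(k, N) - random_k_sparse(k, N)
        ratio = np.sum((Phi @ d)**2) / np.sum(d**2)
        delta = max(delta, abs(ratio - 1.0))
    print(delta)                                   # empirical estimate (true delta is >= this)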

  15. RIP for Low-Dimensional Signal Models. Reference: Baraniuk, Cevher, and Wakin, Low-Dimensional Models for Dimensionality Reduction and Signal Recovery: A Geometric Perspective, 2010.

  16. Least Squares Approximation. Overconstrained least-squares approximation problem: find a vector x such that Ax ≈ b, where A ∈ R^{m×n}, m ≫ n, i.e. x_opt = arg min_x ∥Ax − b∥_2. Approximating least-squares approximation: randomly sample and rescale r = O(n log n / ε^2) rows of A and b (denoted by S), then solve the induced subproblem x̃_opt = arg min_x ∥SAx − Sb∥_2. Reference: Drineas, Mahoney, and Muthukrishnan, Sampling algorithms for ℓ2 regression and applications, 2006.
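A minimal sketch of the sampled least-squares idea, assuming uniform row sampling with rescaling; the cited work samples rows according to leverage scores, so this is a simplification for illustration only, and the problem sizes are arbitrary.

    import numpy as np

    rng = np.random.default_rng(3)
    m, n_cols, eps = 20_000, 50, 0.5
    A = rng.normal(size=(m, n_cols))
    b = A @ rng.normal(size=n_cols) + 0.1 * rng.normal(size=m)

    # Exact overconstrained least squares: x_opt = argmin_x ||Ax - b||_2.
    x_opt, *_ = np.linalg.lstsq(A, b, rcond=None)

    # Sketched version: sample and rescale r = O(n log n / eps^2) rows of A and b.
    # (Uniform sampling here; the cited algorithm uses leverage-score sampling.)
    r = int(n_cols * np.log(n_cols) / eps**2)
    idx = rng.choice(m, size=r, replace=True)
    scale = np.sqrt(m / r)                         # rescaling so expectations match
    SA, Sb = scale * A[idx], scale * b[idx]
    x_tilde, *_ = np.linalg.lstsq(SA, Sb, rcond=None)

    print(np.linalg.norm(b - A @ x_tilde) / np.linalg.norm(b - A @ x_opt))  # close to 1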

  17. Least Squares Approximation. The relative-error bounds are of the form ∥b − Ax̃_opt∥_2 ≤ (1 + ε)∥b − Ax_opt∥_2 and ∥x_opt − x̃_opt∥_2 ≤ c √ε κ(A) ∥x_opt∥_2, and they fail with a probability δ that is no greater than a constant, where κ(A) denotes the condition number of A. References: Drineas, Mahoney, and Muthukrishnan, Relative-error CUR matrix decompositions, 2008; Drineas, Mahoney, Muthukrishnan, and Sarlós, Faster least squares approximation, 2010.

  18. Support Vector Machine. Normalized margin: a data set S is linearly separable by margin γ if there exists u ∈ R^d such that, for all (x, y) ∈ S, y⟨u, x⟩ / (∥u∥∥x∥) ≥ γ. Figure source: https://www.safaribooksonline.com/library/view/python-machine-learning/9781783555130/graphics/
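A small sketch of the normalized-margin definition on synthetic data; the toy two-cluster data and the choice of u as the difference of class means are illustrative assumptions, not part of the slides.

    import numpy as np

    # Normalized margin of a data set S = {(x_i, y_i)} with respect to a direction u:
    # min_i  y_i <u, x_i> / (||u|| ||x_i||).  S is separable by margin gamma if some u
    # achieves at least gamma.
    def normalized_margin(X, y, u):
        scores = y * (X @ u) / (np.linalg.norm(u) * np.linalg.norm(X, axis=1))
        return scores.min()

    # Toy example: two well-separated Gaussian clouds, u = difference of class means.
    rng = np.random.default_rng(4)
    X = np.vstack([rng.normal(+3, 1, size=(50, 20)), rng.normal(-3, 1, size=(50, 20))])
    y = np.hstack([np.ones(50), -np.ones(50)])
    u = X[y == 1].mean(axis=0) - X[y == -1].mean(axis=0)
    print(normalized_margin(X, y, u))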

  19. Support Vector Machine. Given a random Gaussian matrix R ∈ R^{n×N}, if the data set S = {(x_i ∈ R^N, y_i ∈ {−1, 1})}_{i=1}^M is linearly separable by margin γ ∈ (0, 1], then for any δ, ε ∈ (0, 1) and any n > (12 / (3ε^2 − 2ε^3)) ln(6M/δ), with probability at least 1 − δ the data set S′ = {(Rx_i ∈ R^n, y_i ∈ {−1, 1})}_{i=1}^M is linearly separable by margin γ − 2ε/(1 − ε). Reference: Shi, Shen, Hill, and van den Hengel, Is margin preserved after random projection?, 2012.
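An empirical illustration, under the same toy-data assumptions as the previous sketch, that a random Gaussian projection roughly preserves the normalized margin; this is a sanity check, not a verification of the stated bound.

    import numpy as np

    # Normalized margin of labeled data X, y with respect to direction u.
    def margin(X, y, u):
        return (y * (X @ u) / (np.linalg.norm(u) * np.linalg.norm(X, axis=1))).min()

    rng = np.random.default_rng(5)
    N, n = 50, 15
    X = np.vstack([rng.normal(+2, 1, size=(100, N)), rng.normal(-2, 1, size=(100, N))])
    y = np.hstack([np.ones(100), -np.ones(100)])
    u = X[y == 1].mean(axis=0) - X[y == -1].mean(axis=0)

    R = rng.normal(size=(n, N)) / np.sqrt(n)       # random Gaussian matrix R in R^{n x N}
    Xp = X @ R.T                                   # projected data set S' = {(R x_i, y_i)}
    up = Xp[y == 1].mean(axis=0) - Xp[y == -1].mean(axis=0)

    print(margin(X, y, u), margin(Xp, y, up))      # the two margins should be close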

  20. Review of Background. To handle big-data problems, apply dimensionality reduction, in particular random projection. Theories for dimensionality reduction and its applications: the Johnson-Lindenstrauss (JL) Lemma and the Restricted Isometry Property (RIP) for sparse signals.

  21. Outline (same as Slide 2).

  22. Outline (same as Slide 2).

  23. Large Volume of Big Data.

  24. Subspace Clustering. References: Parsons, Haque, and Liu, Subspace Clustering for High Dimensional Data: A Review, 2004; René Vidal, Subspace Clustering, 2011.

  25. Applications: motion trajectories (Costeira and Kanade, 1998); face images (Basri and Jacobs, 2003); gene expression data (Jiang, Tang, and Zhang, 2004); social graphs (Jalali, Chen, Sanghavi, et al., 2011); network hop counts (Eriksson, Balzano, and Nowak, 2012); movie ratings (Zhang, Fawaz, Ioannidis, et al., 2012); anomaly detection (Mazel, Casas, Fontugne, et al., 2015).

  26. Applications. Face images under different illumination (Extended Yale Face Database B, http://vision.ucsd.edu/content/extended-yale-face-database-b-b). Gene expression data (expression microarray, image courtesy of Affymetrix): the task is to classify novel samples into known disease types (disease diagnosis); the challenge is thousands of genes but few samples; the solution is to apply dimensionality reduction. Reference: Yu, Ye, and Liu, Dimensionality Reduction for Data Mining: Techniques, Applications and Trends, 2007.
