1. Distributed Sensing and Perception via Sparse Representation
Allen Y. Yang, Department of EECS, UC Berkeley (yang@eecs.berkeley.edu)
University of Texas, Austin, 2011
Outline: Introduction | Sparsity-based Classification | Sparse Feature Selection | Low-Rank Texture | Conclusion
http://www.eecs.berkeley.edu/~yang

2. Distributed Sensing and Perception
Centralized Perception: powerful processors, unlimited memory, unlimited bandwidth, but a single modality.
Distributed Perception: mobile processors, limited onboard memory, band-limited communications, but distributed, multi-modal sensing.

3. Distributed Sensing and Perception (cont.)
When the sensing resources are limited or scarce:
1 What is the optimal strategy to deploy these agents?
2 How can the agents effectively take measurements of the events?
3 How should the local observations be tallied to reach a global consensus?
Can an intelligent system over a sensor network perform better than the sum of its parts?

4-6. Challenges
1 Making real-time decisions on mobile devices is difficult.
2 Applications demand extremely high accuracy: 99% precision and 99% recall.
3 Scenarios demand the ability to reconstruct 3-D models.

7. Outline: Robust Face Recognition
1 A sparse representation framework via ℓ1-minimization:
   x* = arg min_x ‖x‖₁ subject to b = A x.
Accelerate ℓ1-min algorithms toward a semi-real-time face recognition system.

8. Outline: Informative Feature Selection for Object Recognition
2 Informative feature selection via sparse PCA:
   x* = arg max_x xᵀ Σ_A x subject to ‖x‖₂ = 1, ‖x‖₁ ≤ k.
Accelerate sparse PCA algorithms.
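One simple way to approximate this program is truncated power iteration, which enforces a cardinality-k constraint as a surrogate for the ℓ1 constraint on the slide (this is an illustrative heuristic, not the acceleration scheme the talk develops; the covariance below is made up):

```python
import numpy as np

def sparse_pca_tpower(Sigma, k, iters=100):
    """Truncated power iteration: each step multiplies by Sigma, keeps
    the k largest-magnitude entries, and renormalizes -- a common
    surrogate for max x' Sigma x s.t. ||x||_2 = 1 with x k-sparse."""
    n = Sigma.shape[0]
    x = np.ones(n) / np.sqrt(n)
    for _ in range(iters):
        y = Sigma @ x
        idx = np.argsort(np.abs(y))[:-k]   # indices of the n-k smallest magnitudes
        y[idx] = 0.0
        x = y / np.linalg.norm(y)
    return x

# Toy covariance whose leading component lives on 3 coordinates.
rng = np.random.default_rng(0)
v = np.zeros(10); v[[0, 1, 2]] = 1 / np.sqrt(3)
Sigma = 5.0 * np.outer(v, v) + 0.1 * np.eye(10)
x = sparse_pca_tpower(Sigma, k=3)
```

For this planted example the iterate locks onto the 3-coordinate support after the first truncation and converges to the sparse leading direction (up to sign).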

9. Outline: Large-Scale 3-D Reconstruction
3 Reconstruct large-scale 3-D objects by large-baseline feature matching.
Extract a new class of low-rank texture regions using robust PCA:
   A* = arg min_{A, E, τ} ‖A‖_* + λ ‖E‖₁ subject to I ∘ τ = A + E.
Complete pipeline from low-rank texture in single views to a 3-D model in multiple views.
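Dropping the transform τ, the inner problem is Principal Component Pursuit, which can be sketched with an alternating singular-value-thresholding / soft-thresholding scheme (a simplified ADMM with fixed μ; the parameter defaults follow common conventions, not the authors' code):

```python
import numpy as np

def rpca_pcp(D, lam=None, mu=None, iters=1000, tol=1e-7):
    """Principal Component Pursuit: min ||A||_* + lam ||E||_1
    s.t. D = A + E, solved by a fixed-mu ADMM that alternates
    singular-value thresholding (A) and soft thresholding (E)."""
    m, n = D.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    mu = mu or 0.25 * m * n / np.abs(D).sum()
    A = np.zeros_like(D); E = np.zeros_like(D); Y = np.zeros_like(D)
    for _ in range(iters):
        # A-step: shrink the singular values of D - E + Y/mu by 1/mu.
        U, s, Vt = np.linalg.svd(D - E + Y / mu, full_matrices=False)
        A = U @ np.diag(np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # E-step: entrywise soft threshold of D - A + Y/mu at lam/mu.
        T = D - A + Y / mu
        E = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0.0)
        R = D - A - E                      # primal residual
        Y += mu * R                        # dual ascent on the multiplier
        if np.linalg.norm(R) <= tol * max(np.linalg.norm(D), 1.0):
            break
    return A, E

# Toy test: rank-1 matrix plus 5% gross sparse corruption.
rng = np.random.default_rng(2)
L = np.outer(rng.standard_normal(20), rng.standard_normal(20))
S = np.zeros((20, 20))
S.flat[rng.choice(400, 20, replace=False)] = 5.0 * rng.standard_normal(20)
A_hat, E_hat = rpca_pcp(L + S)
```

Under the usual incoherence conditions this separation is exact; the sketch above recovers the planted low-rank and sparse parts to within a small numerical error.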

10. Face Recognition

11-13. Classification via Sparse Representation
Face-subspace model [Belhumeur et al. '97; Basri & Jacobs '03]:
1 Assume b belongs to Class i among K classes:
   b = α_{i,1} v_{i,1} + α_{i,2} v_{i,2} + ··· + α_{i,n_i} v_{i,n_i} = A_i α_i.
2 Nevertheless, Class i is the unknown label we need to solve for. Stacking all K class dictionaries gives the sparse representation
   b = [A_1, A_2, ···, A_K] [α_1; α_2; ⋮; α_K] = A x.
3 x* = [0 ··· 0 α_iᵀ 0 ··· 0]ᵀ ∈ ℝⁿ. The sparse representation x* encodes membership!
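The classification rule this implies (sparse coding against the stacked dictionary A, then assigning the class with the smallest reconstruction residual) can be sketched as follows; the "faces" are random vectors, and scikit-learn's Lasso stands in for the slide's equality-constrained ℓ1-min:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
K, n_i, d = 3, 10, 50                    # 3 subjects, 10 training faces each, 50-dim features
A = rng.standard_normal((d, K * n_i))
A /= np.linalg.norm(A, axis=0)           # unit-norm columns, as is customary

# Test sample: a combination of subject 1's training faces plus small noise.
alpha = rng.random(n_i)
b = A[:, n_i:2 * n_i] @ alpha + 0.01 * rng.standard_normal(d)

# l1-regularized coding (Lasso is a stand-in solver, not the talk's algorithm).
x = Lasso(alpha=1e-3, fit_intercept=False, max_iter=100_000).fit(A, b).coef_

# Assign the label of the class whose block best reconstructs b.
residuals = [np.linalg.norm(b - A[:, i * n_i:(i + 1) * n_i] @ x[i * n_i:(i + 1) * n_i])
             for i in range(K)]
label = int(np.argmin(residuals))
```

Because the recovered x concentrates on subject 1's block, the class-wise residual is smallest for class 1.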

14-16. Image Corruption
1 Sparse representation + sparse error:
   b = A x + e.
2 Cross-and-bouquet model [Wright et al. '09, '10]:
   b = [A | I] (x; e) = B w.
When the size of A grows proportionally with the sparsity of x, asymptotically the cross-and-bouquet model can correct 100% of the corruption.

17. Performance on the AR Database
ℓ1-min: min ‖x‖₁ + ‖e‖₁ subject to b = A x + e.
Reference: A. Y. Yang et al. Robust face recognition via sparse representation. IEEE PAMI, 2009.

18. Question: How to Effectively Estimate High-Dimensional Sparse Signals?
"Black gold" age [Claerbout & Muir 1973; Taylor, Banks & McCoy 1979].
Figure: deconvolution of a spike train.
Basis pursuit / ℓ1-minimization [Chen & Donoho 1999]:
   (P₁): x* = arg min ‖x‖₁, subject to b = A x.
The Lasso (least absolute shrinkage and selection operator) [Tibshirani 1996]:
   (P₁,₂): x* = arg min ‖b − A x‖₂, subject to ‖x‖₁ ≤ k.
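(P₁) is itself a linear program via the standard positive/negative split; a minimal sketch on a small synthetic instance (the data and dimensions are illustrative assumptions), using scipy's HiGHS LP solver:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n = 15, 30
A = rng.standard_normal((m, n))
x_true = np.zeros(n); x_true[[3, 11, 20]] = [1.5, -2.0, 0.7]
b = A @ x_true

# Split x = u - v with u, v >= 0, so ||x||_1 = 1'(u + v); (P1) becomes
#   min 1'(u + v)  s.t.  [A, -A][u; v] = b,  u, v >= 0.
res = linprog(c=np.ones(2 * n), A_eq=np.hstack([A, -A]), b_eq=b,
              bounds=[(0, None)] * (2 * n), method="highs")
x_hat = res.x[:n] - res.x[n:]
```

With 15 Gaussian measurements of a 3-sparse signal in 30 dimensions, the LP recovers x exactly, which is the phenomenon the rest of the talk exploits.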

19. ℓ1-Minimization via Linear Programming
Using interior-point methods [Karmarkar '84]:
Log-barrier:
   min_x 1ᵀ x − μ Σ_{i=1}^{n} log x_i, subj. to A x = b, x ≥ 0. (1)
Using the Karush-Kuhn-Tucker (KKT) conditions:
   1 − μ X⁻¹ 1 − Aᵀ y = 0, (2)
where x ≥ 0 are the primal variables, y are the dual variables, and X = diag(x).
Update by solving a linear system with O(n³) complexity [Monteiro & Adler '89]:
   Z⁽ᵏ⁾ Δx + X⁽ᵏ⁾ Δz = μ 1 − X⁽ᵏ⁾ z⁽ᵏ⁾,
   A Δx = 0, (3)
   Aᵀ Δy + Δz = 0.
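As a rough illustration of the log-barrier idea in (1) (not the primal-dual Newton updates of (3)), one can shrink μ over an outer loop and solve each equality-constrained barrier subproblem with a generic solver; here scipy's SLSQP stands in for the Newton system, and the problem data are made up:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
m, n = 3, 8
A = rng.standard_normal((m, n))
b = A @ rng.random(n)               # b chosen so a strictly positive solution exists

x = np.full(n, 1.0)                 # interior starting point
for mu in [1.0, 1e-1, 1e-2, 1e-3]:  # shrink the barrier weight toward the LP
    res = minimize(
        lambda x, mu=mu: x.sum() - mu * np.log(x).sum(),   # objective of (1)
        x,
        jac=lambda x, mu=mu: 1.0 - mu / x,                  # gradient 1 - mu X^{-1} 1
        constraints=[{"type": "eq", "fun": lambda x: A @ x - b,
                      "jac": lambda x: A}],
        bounds=[(1e-9, None)] * n,
        method="SLSQP",
    )
    x = res.x                       # warm-start the next, smaller mu
```

Each outer iteration tracks the central path; as μ → 0 the iterate approaches a feasible minimizer of 1ᵀx over {Ax = b, x ≥ 0}.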
