Sparsity and image processing
Aurélie Boisbunon
INRIA-SAM, AYIN
March 26, 2014

Why sparsity?

Main advantages:
◮ Dimensionality reduction
◮ Fast computation
◮ Better interpretability

Image processing applications:
◮ pattern recognition
◮ denoising / deblurring
◮ compression
◮ super-resolution
◮ source separation
(AYIN) Sparsity & Image Proc. March 26, 2014 2 / 14
A signal x is decomposed on a dictionary D = [d_1, ..., d_p] as x = Σ_{j=1}^{p} α_j d_j = Dα, where only a few coefficients α_j are nonzero. The dictionary can be:
◮ Fixed: Fourier basis, Wavelets
◮ Learned
[Figures omitted. Sources: [Hastie et al., 2008] and [Donoho et al., 1995]]
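The decomposition x = Dα can be sketched in a few lines. A minimal illustration, where the signal length, the number of atoms, the sparsity level, and the Gaussian random dictionary are all assumptions made for the example (the slides instead use fixed or learned dictionaries):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the slides): signal length n,
# dictionary of p atoms, k nonzero coefficients.
n, p, k = 64, 128, 5

# Overcomplete dictionary D with unit-norm columns (atoms).
D = rng.standard_normal((n, p))
D /= np.linalg.norm(D, axis=0)

# Sparse coefficient vector alpha: only k of the p entries are nonzero.
alpha = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
alpha[support] = rng.standard_normal(k)

# The signal is a combination of just k atoms: x = D @ alpha.
x = D @ alpha
print(np.count_nonzero(alpha), x.shape)
```

Only k of the p coefficients are needed to describe x, which is the dimensionality-reduction advantage listed above.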
Sparse estimation: min_α (1/2) ‖x − Dα‖₂² + pen(α)

[Figure: curves labeled S&P, Gauss, Neg; horizontal axis ×10⁴]
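To make the role of pen(α) concrete, here is a toy evaluation of the penalized least-squares objective. The penalty weight λ = 1 and the candidate vectors are invented for illustration:

```python
import numpy as np

def objective(x, D, a, pen):
    """Penalized least squares: 0.5 * ||x - D a||^2 + pen(a)."""
    return 0.5 * np.sum((x - D @ a) ** 2) + pen(a)

# Two common penalties (lam = 1.0 is an arbitrary illustrative weight).
lam = 1.0
l0 = lambda a: lam * np.count_nonzero(a)   # l0 "norm": number of nonzeros
l1 = lambda a: lam * np.sum(np.abs(a))     # l1 norm: its convex surrogate

D = np.eye(3)
x = np.array([2.0, 0.1, 0.0])

dense  = np.array([2.0, 0.1, 0.0])  # exact fit, 2 nonzeros
sparse = np.array([2.0, 0.0, 0.0])  # small residual, 1 nonzero

# Under the l0 penalty the sparser vector wins despite its worse fit:
# 0.005 + 1 = 1.005 versus 0 + 2 = 2.0.
print(objective(x, D, dense, l0), objective(x, D, sparse, l0))
```

The penalty trades data fit against sparsity: a slightly worse residual is accepted in exchange for fewer active coefficients.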
min_α (1/2) ‖x − Dα‖₂² + pen(α)

[Figure: level sets of the penalized objective in the (β1, β2) plane for different penalties¹]
¹ with 0 belonging to the subgradient of pen
min_α (1/2) ‖x − Dα‖₂² + pen(α)

[Figure: threshold functions β̂_j plotted against the least-squares (LS) estimate, for four penalties: ℓ0, Lasso, Adaptive lasso, and FS²]
² with 0 belonging to the subgradient of pen
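Two of the threshold functions in these panels can be written directly: the ℓ0 penalty yields hard thresholding and the Lasso yields soft thresholding, both applied coefficient-wise to the least-squares (LS) estimate. The test grid and threshold value below are arbitrary:

```python
import numpy as np

def hard_threshold(z, t):
    """l0-penalized estimate: keep z where |z| > t, else 0 (discontinuous)."""
    return np.where(np.abs(z) > t, z, 0.0)

def soft_threshold(z, t):
    """Lasso estimate: shrink |z| by t, clip at 0 (continuous but biased)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# Least-squares coefficients on an illustrative grid, as in the plots.
z = np.array([-6.0, -2.0, -0.5, 0.5, 2.0, 6.0])

print(hard_threshold(z, 1.0))   # [-6. -2.  0.  0.  2.  6.]
print(soft_threshold(z, 1.0))   # [-5. -1.  0.  0.  1.  5.]
```

Hard thresholding keeps large coefficients unbiased; soft thresholding shrinks them all by t, which is the bias the adaptive lasso is designed to reduce.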
Greedy atom selection: at each step, pick the atom most correlated with the current residual, j* = argmax_j |d_jᵀ (x − D^(J) α^(J))|, where J indexes the atoms already selected.
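A minimal matching-pursuit sketch built around this selection rule. The problem sizes and the random unit-norm dictionary are assumptions made for the example:

```python
import numpy as np

def matching_pursuit(x, D, n_steps):
    """Greedy sparse coding: at each step select the atom most correlated
    with the current residual, then subtract its contribution."""
    residual = x.copy()
    alpha = np.zeros(D.shape[1])
    for _ in range(n_steps):
        j = np.argmax(np.abs(D.T @ residual))  # the selection rule above
        step = D[:, j] @ residual              # atoms are unit-norm
        alpha[j] += step
        residual -= step * D[:, j]
    return alpha

rng = np.random.default_rng(2)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)
x = 2.0 * D[:, 3] - 1.5 * D[:, 40]      # sparse combination of two atoms

alpha = matching_pursuit(x, D, n_steps=5)
print(np.argsort(np.abs(alpha))[-2:])
```

Greedy selection is fast but only approximate; orthogonal matching pursuit additionally re-fits the coefficients of all selected atoms at each step.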
³ [Candes et al., 2008]
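The footnote cites reweighted ℓ1 minimization [Candes et al., 2008], which solves a sequence of weighted Lasso problems with weights w_j = 1/(|α_j| + ε), so large coefficients are penalized less (closer to ℓ0 behavior than plain ℓ1). A sketch under the simplifying assumption that D is orthonormal, so each weighted subproblem reduces to coordinate-wise soft thresholding; λ, ε, and the toy data are invented:

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def reweighted_l1(x, D, lam=0.5, eps=0.1, n_iter=5):
    """Reweighted l1: iterate weighted lasso with w_j = 1/(|alpha_j| + eps).
    D is assumed orthonormal, so each subproblem separates coordinate-wise
    and is solved exactly by soft thresholding with threshold lam * w_j."""
    z = D.T @ x                        # least-squares coefficients
    w = np.ones_like(z)
    for _ in range(n_iter):
        alpha = soft_threshold(z, lam * w)
        w = 1.0 / (np.abs(alpha) + eps)
    return alpha

D = np.eye(4)
x = np.array([3.0, 0.2, 0.0, -2.0])
alpha = reweighted_l1(x, D)
print(alpha)
```

After a few reweightings the small coefficient is driven to zero while the large ones are shrunk less than a single soft-thresholding pass would shrink them.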
⁴ [Beck and Teboulle, 2009]
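The footnote cites FISTA [Beck and Teboulle, 2009], an accelerated proximal-gradient method for the ℓ1-penalized problem: a gradient step on the smooth data-fit term, a soft-thresholding step on the ℓ1 term, plus Nesterov-style momentum. A sketch; the problem sizes, λ, and the iteration count are assumptions:

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def fista(x, D, lam, n_iter=100):
    """FISTA for min_a 0.5 ||x - D a||^2 + lam ||a||_1."""
    L = np.linalg.norm(D, 2) ** 2           # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    y, t = a.copy(), 1.0
    for _ in range(n_iter):
        grad = D.T @ (D @ y - x)            # gradient of the smooth part at y
        a_new = soft_threshold(y - grad / L, lam / L)   # proximal step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = a_new + ((t - 1) / t_new) * (a_new - a)     # momentum step
        a, t = a_new, t_new
    return a

rng = np.random.default_rng(3)
D = rng.standard_normal((30, 50))
D /= np.linalg.norm(D, axis=0)
x = D @ np.r_[2.0, -3.0, np.zeros(48)]      # noiseless sparse signal

a = fista(x, D, lam=0.1)
print(np.argsort(np.abs(a))[-2:])
```

Dropping the momentum step (y = a_new) gives plain ISTA, which converges at rate O(1/k) instead of FISTA's O(1/k²).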
Dictionary learning: min_{α, D} (1/2) ‖x − Dα‖₂² + pen(α)

Alternating optimization:
◮ Solve the optimization problem for α with D fixed
◮ Solve the optimization problem for D with α fixed
Source: [Bach et al., 2011]
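The alternating scheme above can be sketched end to end. The ISTA-based sparse coding step, the least-squares dictionary update, and all sizes and penalty values below are illustrative choices, not the specific algorithm of [Mairal et al., 2009]:

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def dict_learning(X, p, lam=0.1, n_outer=20, seed=0):
    """Alternate between sparse coding (A with D fixed) and a
    least-squares dictionary update (D with A fixed)."""
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((X.shape[0], p))
    D /= np.linalg.norm(D, axis=0)
    A = np.zeros((p, X.shape[1]))
    for _ in range(n_outer):
        # Step 1: solve for A with D fixed (proximal-gradient iterations).
        L = np.linalg.norm(D, 2) ** 2
        for _ in range(20):
            A = soft_threshold(A - D.T @ (D @ A - X) / L, lam / L)
        # Step 2: solve for D with A fixed (least squares via pseudoinverse),
        # then renormalize atoms and rescale A to preserve the product D @ A.
        D = X @ np.linalg.pinv(A)
        norms = np.maximum(np.linalg.norm(D, axis=0), 1e-12)
        D /= norms
        A *= norms[:, None]
    return D, A

# Toy data: 200 signals that are sparse combinations of 8 hidden atoms.
rng = np.random.default_rng(4)
true_D = rng.standard_normal((16, 8))
true_D /= np.linalg.norm(true_D, axis=0)
codes = rng.standard_normal((8, 200)) * (rng.random((8, 200)) < 0.2)
X = true_D @ codes

D, A = dict_learning(X, p=8)
err = np.linalg.norm(X - D @ A) / np.linalg.norm(X)
print(round(err, 3))
```

Each step decreases (or leaves unchanged) its part of the objective, but the joint problem is non-convex, so the result depends on the initialization of D.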
⁵ [Mairal et al., 2009]
⁶ [Mairal et al., 2008]
References

Bach, F., Jenatton, R., Mairal, J., and Obozinski, G. (2011). Convex optimization with sparsity-inducing norms. In Optimization for Machine Learning, pages 19–54. MIT Press.
Beck, A. and Teboulle, M. (2009). A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM Journal on Imaging Sciences, 2(1):183–202.
Candes, E. J., Wakin, M. B., and Boyd, S. P. (2008). Enhancing sparsity by reweighted ℓ1 minimization. Journal of Fourier Analysis and Applications, 14(5-6):877–905.
Donoho, D. L., Buckheit, J. B., Chen, S., Johnstone, I., and Scargle, J. D. (1995). About WaveLab.
Hastie, T., Tibshirani, R., and Friedman, J. (2008). The Elements of Statistical Learning: Data Mining, Inference and Prediction (2nd edition). Springer Series in Statistics.
Mairal, J., Bach, F., Ponce, J., and Sapiro, G. (2009). Online dictionary learning for sparse coding. In Proceedings of the 26th Annual International Conference on Machine Learning, pages 689–696. ACM.
Mairal, J., Ponce, J., Sapiro, G., Zisserman, A., and Bach, F. R. (2008). Supervised dictionary learning. In Advances in Neural Information Processing Systems, pages 1033–1040.
Zhang, C. (2010). Nearly unbiased variable selection under minimax concave penalty. Annals of Statistics, 38(2):894–942.