  1. Synthesis and Analysis Sparse Representation Models for Image Restoration
     Shuhang Gu (顾舒航), Dept. of Computing, The Hong Kong Polytechnic University

  2. Outline
     • Sparse representation models for image modeling
       - Synthesis based representation model
       - Analysis based representation model
       - Synthesis & analysis models for image modeling
     • Weighted nuclear norm and its applications in low level vision
       - Low rank models
       - Weighted nuclear norm minimization (WNNM)
       - WNNM for image denoising
       - WNNM-RPCA and WNNM-MC and their applications
     • Convolutional sparse coding for single image super-resolution
       - Convolutional sparse coding (CSC)
       - CSC for single image super-resolution

  3. Synthesis and analysis sparse representation models

  4. Synthesis based sparse representation model
     The synthesis based sparse representation model assumes that a signal x can be represented as a linear combination of a small number of atoms chosen from a dictionary D:
       x = Dα,  s.t.  ‖α‖₀ < ε
     [Figure: a sparse solution vs. a dense solution]
     Elad, M., Milanfar, P., Rubinstein, R. Analysis versus synthesis in signal priors. Inverse Problems, 2007.
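As an illustration (not part of the slides), the ℓ₀-constrained synthesis model x = Dα, ‖α‖₀ < ε is commonly approximated with a greedy pursuit. Below is a minimal NumPy sketch of orthogonal matching pursuit; the dictionary size and sparsity level are made-up toy values.

```python
import numpy as np

def omp(D, y, sparsity):
    """Greedy orthogonal matching pursuit: y ≈ D @ alpha with few nonzeros."""
    residual = y.copy()
    support = []
    alpha = np.zeros(D.shape[1])
    for _ in range(sparsity):
        # pick the atom most correlated with the current residual
        idx = int(np.argmax(np.abs(D.T @ residual)))
        support.append(idx)
        # least-squares fit of y on the atoms selected so far
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    alpha[support] = coef
    return alpha

# toy dictionary: 8-dim signal, 16 atoms, 3-sparse ground truth
rng = np.random.default_rng(0)
D = rng.standard_normal((8, 16))
D /= np.linalg.norm(D, axis=0)          # unit-norm atoms
alpha_true = np.zeros(16)
alpha_true[[2, 5, 11]] = [1.0, -2.0, 0.5]
y = D @ alpha_true
alpha_hat = omp(D, y, sparsity=3)       # at most 3 nonzero coefficients
```

The ‖α‖₀ constraint is enforced by construction: the loop selects at most `sparsity` atoms.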

  5. Analysis based sparse representation model
     • The analysis model generates representation coefficients by a simple multiplication, and assumes the coefficients are sparse:
       ‖Px‖₀ < ε
     [Figure: geometric illustration]
     Elad, M., Milanfar, P., Rubinstein, R. Analysis versus synthesis in signal priors. Inverse Problems, 2007.
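A one-line way to see the analysis model in action is to apply a concrete analysis operator P to a signal and check that Px is sparse. The finite-difference (TV-like) operator below is a standard example, not one named on this slide.

```python
import numpy as np

# Analysis operator P: first-order finite differences (a TV-like operator).
n = 10
P = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]   # (n-1) x n difference matrix
x = np.array([2., 2., 2., 5., 5., 5., 5., 1., 1., 1.])  # piecewise-constant signal
coeffs = P @ x
print(np.count_nonzero(coeffs))  # 2 jumps in x -> only 2 nonzero coefficients
```

The sparsity of Px here reflects exactly what the slide describes: the zero coefficients identify the subspace (here, locally constant signals) that x lies in.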

  6. S&A representation models for image modeling
     • A geometry perspective
       - Synthesis model: x = Dα, where α is sparse. The synthesis model emphasizes the non-zero values in the sparse coefficient vector α, because these non-zero values select the vectors in the dictionary that span the space of the input signal.
       - Analysis model: γ = Px, where γ is sparse. The analysis model emphasizes the zero values in the sparse coefficient vector Px, because these zero values select the vectors in the projection matrix that span the complementary space of the input signal (a hyperplane).
     Elad, M., Milanfar, P., Rubinstein, R. Analysis versus synthesis in signal priors. Inverse Problems, 2007.

  7. S&A representation models for image modeling
     • Image restoration/enhancement problems
       - Image denoising:        y = x + n
       - Image super-resolution: y = (k ⊗ x)↓ + n   (blur, then downsample)
       - Image deconvolution:    y = k ⊗ x + n
       - Image inpainting:       y = M ⊙ x + n      (M is a binary mask)
       - …
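The four degradation models can be simulated directly. The NumPy sketch below uses a 1D signal for brevity (the slide's formulas are for images); the blur kernel, noise level, and mask density are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(64)            # latent clean signal
noise = 0.1 * rng.standard_normal(64)  # additive Gaussian noise n

# denoising:       y = x + n
y_denoise = x + noise

# deconvolution:   y = k (*) x + n   (circular convolution via FFT)
k = np.array([0.25, 0.5, 0.25])
blur = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(k, 64)))
y_deconv = blur + noise

# super-resolution: y = (k (*) x)↓ + n  (downsample the blurred signal by 2)
y_sr = blur[::2] + noise[::2]

# inpainting:      y = M ⊙ (x + n)   (random binary mask drops ~30% of samples)
M = rng.random(64) > 0.3
y_inpaint = M * x + M * noise
```

Each restoration task then amounts to inverting one of these forward models with the help of a prior on x.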

  8. S&A representation models for image modeling
     • Priors for image restoration
       - Sparsity priors
       - Non-local similarity priors
       - Color line priors
       - …
     Buades, A., Coll, B., Morel, J. M. A non-local algorithm for image denoising. In CVPR, 2005.

  9. S&A representation models for image modeling
     • Sparsity prior (a MAP view)
       argmax_x p(x|y) ∝ p(y|x) p(x)    [data likelihood × prior]
     Minimizing −log p(x|y) with a Gaussian likelihood assumption gives
       x̂ = argmin_x ½‖x − y‖²_F − log p(x)
     Prior modeling options:
       - Original signal domain: modeling p(x) directly is not discriminative enough.
       - Transformation domain (analysis model): a long-tail distribution leads to a sparse solution, prior term φ(Px).
       - Decomposition domain (synthesis model): a long-tail distribution leads to a sparse solution, prior term ψ(α).

  10. S&A representation models for image modeling
      Synthesis model:  min_α ½‖y − Dα‖²_F + ψ(α),  x = Dα
        • Representative methods: KSVD, BM3D, LSSC, NCSR, et al.
        • Pros
          - the synthesis model can be more sparse
          - easier to embed non-local priors
        • Cons
          - patch prior modeling needs aggregation
          - time consuming
      Analysis model:  min_x ½‖y − x‖²_F + φ(Px)
        • Representative methods: TV, wavelet methods, FRAME, FOE, CSF, TRD, et al.
        • Pros
          - free of patch division
          - efficient in the inference phase
          - easier to learn task-specific priors
        • Cons
          - hard to embed non-local priors
          - not as sparse as the synthesis model
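For a concrete instance of the synthesis objective, take ψ(α) = λ‖α‖₁; the resulting LASSO-type problem is commonly solved by iterative soft-thresholding (ISTA). ISTA is a standard solver, not a method named on this slide; the dictionary and λ below are toy values.

```python
import numpy as np

def ista(D, y, lam, n_iters=200):
    """ISTA for min_a 0.5*||y - D a||^2 + lam*||a||_1."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iters):
        grad = D.T @ (D @ a - y)           # gradient of the data-fidelity term
        z = a - grad / L                   # gradient step
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return a

rng = np.random.default_rng(1)
D = rng.standard_normal((20, 40))
D /= np.linalg.norm(D, axis=0)
a_true = np.zeros(40)
a_true[[3, 17]] = [2.0, -1.5]
y = D @ a_true
a_hat = ista(D, y, lam=0.05)
```

The analysis objective has the same structure with φ(Px) replacing ψ(α); with an ℓ₁ penalty it is solved by analogous proximal splitting methods.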

  11. S&A representation models for image modeling
      Patch based methods (synthesis model): KSVD, BM3D, LSSC, NCSR, etc.
        • Pros: can be more sparse; easier to embed non-local priors
        • Cons: patch prior modeling needs aggregation; time consuming
      Filter based methods (analysis model): TV, wavelet methods, FOE, CSF, TRD, etc.
        • Pros: free of patch division; efficient in the inference phase; easier to learn task-specific priors
        • Cons: hard to embed non-local priors; not as sparse as the synthesis model

  12. S&A representation models for image modeling
      A 2×2 view (patch based vs. filter based × synthesis vs. analysis):
        • Patch based + synthesis: KSVD, BM3D, LSSC, NCSR, etc.
        • Patch based + analysis: Analysis-KSVD, et al.
        • Filter based + analysis: TV, wavelet methods, FOE, CSF, TRD, etc.
        • Filter based + synthesis: ?

  13. S&A representation models for image modeling
      • Patch based + synthesis: embeds the non-local prior; models texture/details better
      • Patch based + analysis: embeds the non-local prior; models structure better
      • Filter based + synthesis: aggregation free; models texture/details better
      • Filter based + analysis: aggregation free; models structure better
      Notes:
        - Aggregation: overlap aggregation may smooth the image or generate ringing artifacts.
        - Non-local prior: the non-local prior helps to generate visually plausible results in highly noisy situations.

  14. S&A representation models for image modeling
      (Same 2×2 comparison as the previous slide.) Takeaway: different applications may be better solved via different models!

  15. S&A representation models for image modeling
      • Weighted nuclear norm minimization denoising model: an analysis model with a patch based implementation
        - non-local prior for denoising
        - the analysis model is good at structure modeling
      • Convolutional sparse coding super-resolution: a synthesis model with a filter based implementation
        - aggregation-free super-resolution
        - the synthesis model is good at texture modeling

  16. Weighted nuclear norm minimization and its applications in low level vision

  17. Low rank models
      • Matrix factorization methods
          min_{U,V} loss(Y − X)   s.t.  X = UV
      The loss function is determined by the noise model:
        - Gaussian noise model: PCA, probabilistic PCA
        - Sparse noise model: robust PCA
        - Partial observations: matrix completion
        - Complex noise model: MoG, etc.
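Under the Gaussian noise model, the rank-r factorization problem min_{U,V} ‖Y − UV‖²_F is solved exactly by the truncated SVD (the Eckart-Young theorem, i.e. PCA). A small NumPy sketch with synthetic rank-2 data:

```python
import numpy as np

rng = np.random.default_rng(0)
# rank-2 ground truth plus small Gaussian noise
X_true = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 20))
Y = X_true + 0.01 * rng.standard_normal((30, 20))

# best rank-r approximation of Y in Frobenius norm: keep the top-r SVD terms
r = 2
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
X_hat = U[:, :r] * s[:r] @ Vt[:r]      # X = UV with U <- U_r diag(s_r), V <- Vt_r
```

The other noise models on the slide (sparse noise, missing entries, mixture noise) change the loss and no longer admit this closed-form SVD solution.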

  18. Low rank models
      • Regularization methods
          min_X loss(Y − X) + R(X)
      A commonly used regularization term is the nuclear norm of the matrix X:
          ‖X‖_* = Σ_i σ_i(X) = ‖σ(X)‖₁
      Pros:
        - exact recovery property (theoretically proved)
        - the nuclear norm proximal problem has a closed-form solution
      Character:
        - regularization methods balance fidelity and low-rankness via a parameter
        - factorization methods set an upper bound on the rank
      Candès, E. J., et al. "Robust principal component analysis?" Journal of the ACM, 2011.
      Candès, E. J., Recht, B. "Exact matrix completion via convex optimization." Foundations of Computational Mathematics, 2009.
      Cai, J. F., Candès, E. J., Shen, Z. "A singular value thresholding algorithm for matrix completion." SIAM Journal on Optimization, 2010.

  19. Low rank models
      • Regularization methods: a 2D analysis sparse perspective
      Analysis sparse model:
          min_x ½‖y − x‖²_F + φ(Px)
      Nuclear norm regularization model:
          min_X ½‖Y − X‖²_F + ‖Uᵀ X V‖₁
      The nuclear norm regularization model can be interpreted as a 2D analysis sparse model!
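The identity behind this interpretation, ‖X‖_* = ‖UᵀXV‖₁ when U and V are the singular bases of X, is easy to verify numerically (the matrix below is an arbitrary random example):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 4))
U, s, Vt = np.linalg.svd(X, full_matrices=False)

nuclear = s.sum()                        # ||X||_* = sum of singular values
analysis = np.abs(U.T @ X @ Vt.T).sum()  # ||U^T X V||_1 in the SVD bases
print(np.isclose(nuclear, analysis))     # True: U^T X V is diag(s)
```

In other words, UᵀXV plays the role of the analysis coefficients Px, and sparsity of those coefficients is exactly low-rankness of X.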

  20. Weighted nuclear norm minimization
      • Nuclear norm proximal:
          min_X ½‖Y − X‖²_F + λ‖X‖_*
          X̂ = U S_λ(Σ_Y) Vᵀ,  where Y = U Σ_Y Vᵀ and S_λ soft-thresholds the singular values
      • Pros
        - tightest convex envelope of rank minimization
        - closed-form solution
      • Cons
        - treats all singular values equally, ignoring the different significance of matrix singular values
      Cai, J. F., Candès, E. J., Shen, Z. A singular value thresholding algorithm for matrix completion. SIAM Journal on Optimization, 2010.
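The closed-form solution quoted above is the singular value thresholding operator of Cai et al.: soft-threshold every singular value of Y by λ. A minimal NumPy sketch, applied to a random toy matrix:

```python
import numpy as np

def svt(Y, lam):
    """Closed-form solution of min_X 0.5*||Y - X||_F^2 + lam*||X||_*."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U * np.maximum(s - lam, 0.0) @ Vt   # soft-threshold singular values

rng = np.random.default_rng(0)
Y = rng.standard_normal((6, 4))
X_hat = svt(Y, lam=1.0)
```

Singular values of Y that fall below λ are set to zero, so the result is typically low rank.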

  21. Weighted nuclear norm minimization
      Weighted nuclear norm:
          ‖X‖_{w,*} = Σ_i w_i σ_i(X)
      Weighted nuclear norm proximal (WNNP):
          X̂ = argmin_X ‖Y − X‖²_F + ‖X‖_{w,*}
      • Difficulties
        - WNNM is not convex for general weight vectors
        - the sub-gradient method cannot be used to analyze its optimization
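Despite the non-convexity for general weights, when the weights are sorted in non-descending order the proximal problem is still solved in closed form by thresholding each singular value by its own weight (the weighted singular value thresholding result of Gu et al.). A sketch under that sorted-weights assumption, with arbitrary toy weights:

```python
import numpy as np

def wsvt(Y, w):
    """Weighted singular value thresholding for
    min_X 0.5*||Y - X||_F^2 + ||X||_{w,*},
    assuming weights w are non-descending (Gu et al.'s condition)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U * np.maximum(s - w, 0.0) @ Vt   # threshold sigma_i by w_i

rng = np.random.default_rng(0)
Y = rng.standard_normal((6, 4))
w = np.array([0.5, 1.0, 1.5, 2.0])   # non-descending: larger sigma, smaller weight
X_hat = wsvt(Y, w)
```

Assigning smaller weights to larger singular values is what lets WNNM preserve the dominant structure of the data while suppressing noise, which is the motivation developed in the following denoising slides.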
