Image Reconstruction Tutorial Part 1: Sparse optimization and learning approaches


  1. Image Reconstruction Tutorial Part 1: Sparse optimization and learning approaches. Aleksandra Pižurica (1) and Bart Goossens (2). (1) Group for Artificial Intelligence and Sparse Modelling (GAIM), (2) Image Processing and Interpretation (IPI) group, TELIN, Ghent University - imec. Yearly workshop FWO–WOG "Turning images into value through statistical parameter estimation", Ghent, Belgium, 20 September 2019.

  2. Outline
     1. Model-based iterative reconstruction algorithms: sparse optimization; solution strategies: greedy methods vs. convex optimization; optimization methods in sparse image reconstruction
     2. Structured sparsity: wavelet-tree sparsity; Markov Random Field (MRF) priors
     3. Machine learning in image reconstruction: main ideas and current trends

  3. A fairly general formulation. Reconstruct a signal (image) $x \in X$ from data $y \in Y$, where $y = T(x) + n$. Here $X$ and $Y$ are Hilbert spaces, $T : X \to Y$ is the forward operator and $n$ is noise. A common model-driven approach is to minimize the negative log-likelihood $L$: $\min_{x \in X} L(T(x), y)$. This is typically ill-posed and leads to over-fitting. Variational regularization, also called model-based iterative reconstruction, instead minimizes a regularized objective function $\min_{x \in X} L(T(x), y) + \tau \phi(x)$, where $\phi : X \to \mathbb{R} \cup \{-\infty, \infty\}$ is a regularization functional and $\tau \geq 0$ governs the influence of the a priori knowledge against the need to fit the data.
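To make the objective concrete, here is a minimal sketch (illustrative, not from the original slides) that evaluates $L(T(x), y) + \tau\phi(x)$ under the assumptions of Gaussian noise, so that $L$ is the squared error, and a quadratic regularizer $\phi(x) = \|x\|_2^2$; the forward operator is passed in as a function `T`.

```python
import numpy as np

def regularized_objective(x, y, T, tau):
    """Evaluate L(T(x), y) + tau * phi(x) for one illustrative choice:
    squared-error data term (Gaussian noise) and phi(x) = ||x||_2^2."""
    data_term = 0.5 * np.sum((T(x) - y) ** 2)   # negative log-likelihood up to constants
    regularizer = np.sum(x ** 2)                # penalizes large-norm solutions
    return data_term + tau * regularizer
```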

  4. Linear inverse problems. Many image reconstruction problems can be formulated as a linear inverse problem. A noisy indirect observation $y$ of the original image $x$ is then $y = Ax + n$. The matrix $A$ is the forward operator, with $x \in \mathbb{R}^n$ and $y, n \in \mathbb{R}^m$ (or $x \in \mathbb{C}^n$ and $y, n \in \mathbb{C}^m$). Here, image pixels are stacked into vectors (raster scanning), and in general $m \neq n$. Some examples: CT: $A$ is the system matrix modeling the X-ray transformation. MRI: $A$ is a (partially sampled) Fourier operator. OCT: $A$ is the first Born approximation for the scattering. Compressed sensing: $A$ is a measurement matrix (dense or sparse).
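As one example of such a forward operator, the sketch below builds an MRI-like $A$ as a randomly subsampled Fourier matrix; the random row selection is an illustrative sampling pattern, not one prescribed by the slides.

```python
import numpy as np

def partial_fourier_operator(n, m, seed=0):
    """Illustrative MRI-style forward operator: an n x n unitary DFT matrix
    with only m < n randomly selected rows kept, so A is in C^{m x n}."""
    rng = np.random.default_rng(seed)
    F = np.fft.fft(np.eye(n)) / np.sqrt(n)        # unitary 1-D DFT matrix
    rows = rng.choice(n, size=m, replace=False)   # random k-space sampling pattern
    return F[rows, :]

# The noisy, partially sampled measurements are then y = A @ x + noise.
```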

  5. Linear inverse problems. For the linear inverse problem $y = Ax + n$, model-based reconstruction seeks to solve $\min_x \frac{1}{2}\|Ax - y\|_2^2 + \tau\phi(x)$ (Tikhonov formulation). Alternatively, $\min_x \phi(x)$ subject to $\|Ax - y\|_2^2 \leq \epsilon$ (Morozov formulation), or $\min_x \|Ax - y\|_2^2$ subject to $\phi(x) \leq \delta$ (Ivanov formulation). Under mild conditions these are all equivalent [Figueiredo and Wright, 2013], and which one is more convenient is problem-dependent.
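For the special case $\phi(x) = \|x\|_2^2$ (classical Tikhonov regularization), the penalized problem has a closed-form solution; a minimal sketch, with the quadratic regularizer being an illustrative choice:

```python
import numpy as np

def tikhonov_solve(A, y, tau):
    """min_x 0.5*||Ax - y||_2^2 + tau*||x||_2^2 has the closed-form solution
    x = (A^T A + 2*tau*I)^{-1} A^T y."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + 2.0 * tau * np.eye(n), A.T @ y)
```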

  6. Outline (repeat of slide 2).

  7. Sparse optimization. A common assumption: $x$ is sparse in a well-chosen transform domain. We use the term wavelet representation for any wavelet-like multiscale representation, including curvelets and shearlets: $x = \Psi\theta$, with $\theta \in \mathbb{R}^d$, $\Psi \in \mathbb{R}^{n \times d}$. The columns of $\Psi$ are the elements of a wavelet frame (an orthogonal basis or an overcomplete dictionary). The main results also hold for learned dictionaries, trained on a set of representative examples to yield optimally sparse representations for a particular class of images.
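For illustration, a small sketch of the synthesis model $x = \Psi\theta$ with an overcomplete dictionary; the truncated DCT atoms below merely stand in for a wavelet frame or a learned dictionary, and the sizes and sparsity level are arbitrary.

```python
import numpy as np
from scipy.fft import dct

def overcomplete_dct_dictionary(n, d):
    """Illustrative Psi in R^{n x d}, d > n: columns are truncated DCT atoms,
    normalized to unit norm (a stand-in for a wavelet frame or learned dictionary)."""
    Psi = dct(np.eye(d), norm='ortho', axis=0)[:n, :]
    return Psi / np.linalg.norm(Psi, axis=0, keepdims=True)

# A K-sparse synthesis x = Psi @ theta:
n, d, K = 64, 128, 5
Psi = overcomplete_dct_dictionary(n, d)
theta = np.zeros(d)
theta[np.random.default_rng(0).choice(d, K, replace=False)] = 1.0
x = Psi @ theta
```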

  8. Compressed sensing. Consider $y = Ax + n$, $x = \Psi\theta$, with $y, n \in \mathbb{R}^m$, $x \in \mathbb{R}^n$, $\theta \in \mathbb{R}^d$, $m < n$. Reconstruct $\hat{x} = \Psi\hat{\theta}$ with $\hat{\theta} = \arg\min_\theta \frac{1}{2}\|A\Psi\theta - y\|_2^2 + \tau\phi(\theta)$. Commonly: $\min_\theta \|\theta\|_0$ s.t. $\|A\Psi\theta - y\|_2^2 \leq \epsilon$, or $\min_\theta \frac{1}{2}\|A\Psi\theta - y\|_2^2 + \tau\|\theta\|_1$. [Candès et al., 2006], [Donoho, 2006], [Lustig et al., 2007]
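A standard way to solve the $\ell_1$-penalized problem is iterative soft-thresholding (ISTA); a minimal sketch, assuming $\Phi = A\Psi$ is formed explicitly and using a fixed step size from the spectral norm:

```python
import numpy as np

def ista(Phi, y, tau, n_iter=200):
    """ISTA for min_theta 0.5*||Phi theta - y||_2^2 + tau*||theta||_1, Phi = A @ Psi.
    Step size 1/L, with L the Lipschitz constant of the data-term gradient."""
    L = np.linalg.norm(Phi, 2) ** 2
    theta = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        z = theta - Phi.T @ (Phi @ theta - y) / L                   # gradient step
        theta = np.sign(z) * np.maximum(np.abs(z) - tau / L, 0.0)   # soft threshold (prox of l1)
    return theta
```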

  9. Compressed sensing: recovery guarantees. Consider $y = Ax + n$, $x = \Psi\theta$, with $y, n \in \mathbb{R}^m$, $x \in \mathbb{R}^n$, $m < n$. The matrix $\Phi = A\Psi$ has the $K$-restricted isometry property (K-RIP) with constant $\epsilon_K < 1$ if for every $K$-sparse $\theta$ (having only $K$ non-zero entries): $(1 - \epsilon_K)\|\theta\|_2^2 \leq \|\Phi\theta\|_2^2 \leq (1 + \epsilon_K)\|\theta\|_2^2$. Suppose the matrix $A \in \mathbb{R}^{m \times n}$ is formed by subsampling a given sampling operator $\bar{A} \in \mathbb{R}^{n \times n}$. The mutual coherence between $\bar{A}$ and $\Psi$ is $\mu(\bar{A}, \Psi) = \max_{i,j} |\bar{a}_i^\top \psi_j|$. If $m > C\,\mu^2(\bar{A}, \Psi)\,K\,n\log(n)$ for some constant $C > 0$, then $\min_\theta \frac{1}{2}\|A\Psi\theta - y\|_2^2 + \tau\|\theta\|_1$ recovers $x$ with high probability, given that the K-RIP holds for $\Phi = A\Psi$.
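The mutual coherence in this condition is straightforward to compute when $\bar{A}$ and $\Psi$ are available as matrices; a minimal sketch, assuming the rows of $\bar{A}$ and the columns of $\Psi$ are unit-norm:

```python
import numpy as np

def mutual_coherence(A_bar, Psi):
    """mu(A_bar, Psi) = max_{i,j} |<a_i, psi_j>|, with a_i the rows of A_bar
    and psi_j the columns of Psi (both assumed unit-norm)."""
    return np.max(np.abs(A_bar @ Psi))
```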

  10. Analysis vs. synthesis formulation. Consider $y = Ax + n$, $x = \Psi\theta$, with $y \in \mathbb{R}^m$, $x \in \mathbb{R}^n$, $\theta \in \mathbb{R}^d$. Synthesis approach: $\min_\theta \frac{1}{2}\|A\Psi\theta - y\|_2^2 + \tau\phi(\theta)$, or in constrained form: $\min_\theta \phi(\theta)$ subject to $\|A\Psi\theta - y\|_2^2 \leq \epsilon$. Analysis approach: $\min_x \frac{1}{2}\|Ax - y\|_2^2 + \tau\phi(x)$, or in constrained form: $\min_x \phi(x)$ subject to $\|Ax - y\|_2^2 \leq \epsilon$.

  11. Analysis vs. synthesis formulation (repeat of slide 10).

  12. Analysis vs. synthesis formulation (build slide; completed on slide 13).

  13. Analysis vs. synthesis formulation. Consider $y = Ax + n$, $x = \Psi\theta$, with $y \in \mathbb{R}^m$, $x \in \mathbb{R}^n$, $\theta \in \mathbb{R}^d$. Synthesis approach: $\min_\theta \frac{1}{2}\|A\Psi\theta - y\|_2^2 + \tau\phi(\theta)$, or in constrained form: $\min_\theta \phi(\theta)$ subject to $\|A\Psi\theta - y\|_2^2 \leq \epsilon$. Analysis approach, which also applies to wavelet regularization: $\min_x \frac{1}{2}\|Ax - y\|_2^2 + \tau\phi(Px)$, or in constrained form: $\min_x \phi(Px)$ subject to $\|Ax - y\|_2^2 \leq \epsilon$, where $P$ is a wavelet transform operator or $P = I$ (standard analysis).
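A minimal sketch of the analysis formulation, assuming $\phi = \|\cdot\|_1$ and a one-level orthogonal Haar transform as the operator $P$ (both illustrative choices, not prescribed by the slides):

```python
import numpy as np

def haar_level1(n):
    """One-level orthogonal Haar analysis operator P (n even): the first n/2 rows
    compute scaled pairwise averages, the last n/2 rows pairwise differences."""
    assert n % 2 == 0
    P = np.zeros((n, n))
    for k in range(n // 2):
        P[k, 2 * k] = P[k, 2 * k + 1] = 1.0 / np.sqrt(2.0)   # approximation coefficients
        P[n // 2 + k, 2 * k] = 1.0 / np.sqrt(2.0)            # detail coefficients
        P[n // 2 + k, 2 * k + 1] = -1.0 / np.sqrt(2.0)
    return P

def analysis_objective(x, A, y, P, tau):
    """0.5*||Ax - y||_2^2 + tau*||Px||_1: the analysis formulation with phi = l1."""
    return 0.5 * np.sum((A @ x - y) ** 2) + tau * np.sum(np.abs(P @ x))
```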

  14. Outline (repeat of slide 2).

  15. Solution strategies: greedy methods vs. convex optimization. The solution strategy is problem-dependent. For non-convex problems like $\min_x \|x\|_0$ subject to $\|Ax - y\|_2^2 \leq \epsilon$, greedy algorithms can be applied, e.g., Matching Pursuit (MP) [Mallat and Zhang, 1993], OMP [Tropp, 2004], CoSaMP [Needell and Tropp, 2009], IHT [Blumensath and Davies, 2009]. Alternatively, convex relaxation leads to $\min_x \|x\|_1$ subject to $\|Ax - y\|_2^2 \leq \epsilon$, or $\min_x \frac{1}{2}\|Ax - y\|_2^2 + \tau\|x\|_1$, known as the LASSO [Tibshirani, 1996] or BPDN [Chen et al., 2001] problem.
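As an example from the greedy/thresholding family, a minimal sketch of Iterative Hard Thresholding (IHT) for the $K$-sparse constrained problem; the fixed step size and iteration count are illustrative, untuned choices.

```python
import numpy as np

def iht(A, y, K, n_iter=100):
    """Iterative Hard Thresholding for min_x ||Ax - y||_2^2 s.t. ||x||_0 <= K
    [Blumensath and Davies, 2009]; step size 1/||A||_2^2."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + step * (A.T @ (y - A @ x))     # gradient step on the data term
        idx = np.argsort(np.abs(x))[:-K]       # all but the K largest-magnitude entries
        x[idx] = 0.0                           # hard threshold to keep x K-sparse
    return x
```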

  16. Greedy methods: OMP. The OMP algorithm for solving $\min_x \|x\|_0$ subject to $Ax = y$:
     Require: $k = 1$, $r^{(1)} = y$, $\Lambda^{(0)} = \emptyset$
     1: repeat
     2:   $\lambda^{(k)} = \arg\max_j |A_j^\top r^{(k)}|$
     3:   $\Lambda^{(k)} = \Lambda^{(k-1)} \cup \{\lambda^{(k)}\}$
     4:   $x^{(k)} = \arg\min_x \|A_{\Lambda^{(k)}} x - y\|_2$
     5:   $\hat{y}^{(k)} = A_{\Lambda^{(k)}} x^{(k)}$
     6:   $r^{(k+1)} = y - \hat{y}^{(k)}$
     7:   $k = k + 1$
     8: until stopping criterion satisfied
     Here $A_j$ is the $j$-th column of $A$, and $A_\Lambda$ is the sub-matrix of $A$ with the columns indicated in $\Lambda$.
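A runnable NumPy sketch of the OMP loop above; the stopping criterion used here (at most $K$ selected atoms or a small residual norm) is one illustrative choice.

```python
import numpy as np

def omp(A, y, K, tol=1e-6):
    """Orthogonal Matching Pursuit following the pseudocode above."""
    m, n = A.shape
    r = y.copy()                                    # r^(1) = y
    support = []                                    # Lambda^(0) = empty set
    x = np.zeros(n)
    for _ in range(K):
        if np.linalg.norm(r) <= tol:                # stopping criterion
            break
        j = int(np.argmax(np.abs(A.T @ r)))         # column most correlated with the residual
        if j not in support:
            support.append(j)                       # grow the support
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)  # least squares on the support
        r = y - A[:, support] @ x_s                 # residual update
        x = np.zeros(n)
        x[support] = x_s
    return x
```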
