BEADS: asymmetric baseline (trend) filtering and denoising for positive signals with sparse derivatives


  1. BEADS: asymmetric baseline (trend) filtering and denoising for positive signals with sparse derivatives (Séminaire ICube). L. Duval, A. Pirayre (IFP Energies nouvelles, 1 et 4 av. de Bois-Préau, 92852 Rueil-Malmaison, France); X. Ning, I. W. Selesnick (Polytechnic School of Engineering, New York University). 19 June 2015.

  2. The fast way. Question: where is the string behind the bead? Smoothness, sparsity, asymmetry.

  3. Outline
  - Introduction: outline, background
  - Modeling: notations, compound sparse derivative modeling
  - BEADS algorithm: Majorize-Minimize
  - Evaluation and results: simulated baseline and noise, Poisson noise, GC×GC
  - Conclusions

  4. Background on background
  - Background affects quantitative evaluation/comparison
  - In some domains: (instrumental) bias, (seasonal) trend
  - In analytical chemistry: drift, continuum, wander, baseline
  - Rare cases of parametric modeling


  8. Background on background. For analytical chemistry data: (figure)

  9. Notations. Morphological decomposition: $(y, x, f, w) \in (\mathbb{R}^N)^4$, with $y = x + f + w$:
  - $y$: observation
  - $x$: clean series of peaks
  - $f$: baseline
  - $w$: noise
  Assumption: in the absence of peaks, the baseline can be approximately recovered from a noise-corrupted observation by low-pass filtering:
  - $\hat{f} = \mathrm{L}(y - \hat{x})$ ($\mathrm{L}$: low-pass filter)
  - residuals formulated as $\|y - \hat{x} - \hat{f}\|_2^2 = \|\mathrm{H}(y - \hat{x})\|_2^2$
  - $\mathrm{H} = \mathrm{I} - \mathrm{L}$: high-pass filter
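A minimal sketch of this modeling assumption in Python. The paper's filter $\mathrm{H} = \mathrm{I} - \mathrm{L}$ is later defined through banded matrices; here a generic zero-phase Butterworth low-pass (an assumption, with an illustrative cutoff) stands in for $\mathrm{L}$, just to show the decomposition.

```python
# Illustration of the assumption H = I - L (not the paper's exact filter):
# on a peak-free, noisy observation, a low-pass filter L approximately
# recovers the baseline, and H(y) = y - L(y) is the high-pass residual.
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(0)
n = np.arange(2000)
baseline = 10 + 5 * np.sin(2 * np.pi * n / 2000)   # slowly varying trend f
noise = 0.3 * rng.standard_normal(n.size)           # noise w
y = baseline + noise                                 # observation without peaks

b, a = butter(2, 0.01)                               # low-pass filter L (illustrative)
f_hat = filtfilt(b, a, y)                            # f_hat = L(y)
h_y = y - f_hat                                      # H(y) = (I - L) y

print(np.abs(f_hat - baseline).mean())               # small: baseline recovered
```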

  10. Compound sparse derivative modeling. An estimate $\hat{x}$ can be obtained (with $D_i$ difference operators) via:
  $\hat{x} = \arg\min_x F(x) = \frac{1}{2}\|\mathrm{H}(y - x)\|_2^2 + \sum_{i=0}^{M} \lambda_i R_i(D_i x)$.

  11. Compound sparse derivative modeling. Examples of (smooth) sparsity-promoting functions for $R_i$:
  - $\phi^A(x) = |x|$
  - $\phi^B(x) = \sqrt{|x|^2 + \epsilon}$
  - $\phi^C(x) = |x| - \epsilon \log(|x| + \epsilon)$
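The three penalties written out in NumPy, as a small reference implementation; eps is the smoothing parameter and its value here is only illustrative.

```python
# Sparsity-promoting penalties from the slide: phi_A is the non-smooth
# absolute value, phi_B and phi_C are its smooth surrogates.
import numpy as np

def phi_A(x):
    return np.abs(x)

def phi_B(x, eps=1e-6):
    return np.sqrt(x**2 + eps)

def phi_C(x, eps=1e-6):
    return np.abs(x) - eps * np.log(np.abs(x) + eps)
```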

  12. Compound sparse derivative modeling. Take the positivity of chromatogram peaks into account:
  $\hat{x} = \arg\min_x F(x) = \frac{1}{2}\|\mathrm{H}(y - x)\|_2^2 + \lambda_0 \sum_{n=0}^{N-1} \theta_\epsilon(x_n; r) + \sum_{i=1}^{M} \lambda_i \sum_{n=0}^{N_i - 1} \phi([D_i x]_n)$.
  Start from the asymmetric penalty:
  $\theta(x; r) = \begin{cases} x, & x \ge 0 \\ -r x, & x < 0. \end{cases}$
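The asymmetric penalty itself is one line of NumPy: a unit cost for positive values and a steeper cost for negative ones (with $r > 1$ penalizing negative amplitudes more strongly).

```python
# Asymmetric penalty theta(x; r) from the slide: x for x >= 0, -r*x for x < 0.
import numpy as np

def theta(x, r):
    return np.where(x >= 0, x, -r * x)
```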

  13. Compound sparse derivative modeling. Take the positivity of chromatogram peaks into account (same objective $F(x)$ as above), and majorize it. (Figure: the majorizer $g(x, v)$ for the penalty function $\theta(x; r)$, $r = 3$, touching $\theta_r$ at the points $(s, \theta_r(s))$ and $(v, \theta_r(v))$.)

  14. Compound sparse derivative modeling (continued): then smooth it. (Figure: the smoothed asymmetric penalty function $\theta_\epsilon(x; r)$, $r = 3$, with the transition points $(-\epsilon, f(-\epsilon))$ and $(\epsilon, f(\epsilon))$ marked.)

  15. Compound sparse derivative modeling (continued): then majorize it:
  $g_0(x, v) = \begin{cases} \dfrac{1+r}{4|v|}\, x^2 + \dfrac{1-r}{2}\, x + \dfrac{1+r}{4}\, |v|, & |v| > \epsilon \\[4pt] \dfrac{1+r}{4\epsilon}\, x^2 + \dfrac{1-r}{2}\, x + \dfrac{1+r}{4}\, \epsilon, & |v| \le \epsilon. \end{cases}$
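A quick numerical sanity check, not from the slides: for $|v| > \epsilon$ the quadratic $g_0(\cdot, v)$ lies on or above the unsmoothed asymmetric penalty $\theta(\cdot; r)$ and touches it at $x = v$ and $x = -|v|$, which is the majorization property the MM iteration relies on.

```python
# Verify numerically that g0(., v) majorizes theta(., r) for |v| > eps and
# is tight at x = v and x = -|v| (illustrative values for r and v).
import numpy as np

def theta(x, r):
    return np.where(x >= 0, x, -r * x)

def g0(x, v, r, eps=1e-6):
    a = max(abs(v), eps)
    return (1 + r) / (4 * a) * x**2 + (1 - r) / 2 * x + (1 + r) / 4 * a

r, v = 3.0, 2.0
x = np.linspace(-10, 10, 10001)
assert np.all(g0(x, v, r) >= theta(x, r) - 1e-9)          # g0 majorizes theta
assert np.isclose(g0(v, v, r), theta(v, r))               # equality at x = v
assert np.isclose(g0(-abs(v), v, r), theta(-abs(v), r))   # and at x = -|v|
```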

  16. BEADS Algorithm. We now have a majorizer for $F$:
  $G(x, v) = \frac{1}{2}\|\mathrm{H}(y - x)\|_2^2 + \lambda_0\, x^T \Gamma(v)\, x + \lambda_0\, b^T x + \sum_{i=1}^{M} \frac{\lambda_i}{2}\, (D_i x)^T [\Lambda(D_i v)]\, (D_i x) + c(v)$.
  Minimizing $G(x, v)$ with respect to $x$ yields
  $x = \Big[ \mathrm{H}^T \mathrm{H} + 2\lambda_0 \Gamma(v) + \sum_{i=1}^{M} \lambda_i D_i^T [\Lambda(D_i v)] D_i \Big]^{-1} \big( \mathrm{H}^T \mathrm{H}\, y - \lambda_0 b \big)$,
  with the notation $c(v) = \sum_n \big( \phi(v_n) - \tfrac{v_n}{2}\, \phi'(v_n) \big)$.
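The terms $\frac{\lambda_i}{2}(D_i x)^T \Lambda(D_i v)(D_i x) + c(v)$ come from majorizing each $\phi([D_i x]_n)$ by a per-coordinate quadratic. A quick numerical check of that quadratic majorizer, purely as an illustration, using the smooth penalty $\phi(x) = \sqrt{x^2 + \epsilon}$ listed earlier:

```python
# Per-coordinate quadratic majorizer of phi used inside G(x, v):
#   q(x, v) = phi'(v)/(2 v) * x^2 + phi(v) - (v / 2) * phi'(v)
# It touches phi at x = +/- v and stays above it elsewhere.
import numpy as np

eps = 1e-6
phi = lambda x: np.sqrt(x**2 + eps)
dphi = lambda x: x / np.sqrt(x**2 + eps)

def q(x, v):
    return dphi(v) / (2 * v) * x**2 + phi(v) - (v / 2) * dphi(v)

v = 1.5
x = np.linspace(-10, 10, 10001)
assert np.all(q(x, v) >= phi(x) - 1e-9)   # majorization
assert np.isclose(q(v, v), phi(v))        # equality at x = v
```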

  17. BEADS Algorithm (continued). Same majorizer $G(x, v)$ and $x$-update as above, with the notation
  $[\Gamma(v)]_{n,n} = \begin{cases} \dfrac{1+r}{4|v_n|}, & |v_n| > \epsilon \\[4pt] \dfrac{1+r}{4\epsilon}, & |v_n| \le \epsilon. \end{cases}$

  18. BEADS Algorithm (continued). Same majorizer and update, with the notation
  $[\Lambda(v)]_{n,n} = \dfrac{\phi'(v_n)}{v_n}$.
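For the smooth penalty $\phi^B(x) = \sqrt{x^2 + \epsilon}$ shown earlier, these diagonal weights have a simple closed form that stays finite at $v_n = 0$; a one-line sketch (the choice of $\phi^B$ and of eps is illustrative):

```python
# Diagonal weights [Lambda(v)]_{n,n} = phi'(v_n)/v_n for phi(x) = sqrt(x^2 + eps):
# phi'(v) = v / sqrt(v^2 + eps), hence phi'(v)/v = 1 / sqrt(v^2 + eps).
import numpy as np

def lambda_diag(v, eps=1e-6):
    return 1.0 / np.sqrt(v**2 + eps)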

  19. BEADS Algorithm (continued). Same majorizer and update, with the notation
  $[b]_n = \dfrac{1-r}{2}$.

  20. BEADS Algorithm. Writing the filter as $\mathrm{H} = A^{-1} B \approx B A^{-1}$ (banded matrices), we have
  $x = A\, Q^{-1} \big( B^T B A^{-1} y - \lambda_0 A^T b \big)$,
  where $Q$ is the banded matrix $Q = B^T B + A^T M A$, and $M$ is the banded matrix
  $M = 2\lambda_0 \Gamma(v) + \sum_{i=1}^{M} \lambda_i D_i^T [\Lambda(D_i v)] D_i$.
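This banded form follows by substituting $\mathrm{H} = B A^{-1}$ into the $x$-update of slide 16 and factoring out $A^{-T}$ and $A^{-1}$; a short derivation (pure algebra, not spelled out on the slide):

$x = \big[ A^{-T} B^T B A^{-1} + M \big]^{-1} \big( A^{-T} B^T B A^{-1} y - \lambda_0 b \big)
   = \big[ A^{-T} (B^T B + A^T M A) A^{-1} \big]^{-1} A^{-T} \big( B^T B A^{-1} y - \lambda_0 A^T b \big)
   = A\, Q^{-1} \big( B^T B A^{-1} y - \lambda_0 A^T b \big),$

so only the banded matrices $A$, $B$ and $Q$ ever need to be formed or factored, never the dense $\mathrm{H}^T \mathrm{H}$.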

  21. BEADS Algorithm. Using the previous equations, the MM iteration takes the form:
  $M^{(k)} = 2\lambda_0 \Gamma(x^{(k)}) + \sum_{i=1}^{M} \lambda_i D_i^T \big[\Lambda(D_i x^{(k)})\big] D_i$
  $Q^{(k)} = B^T B + A^T M^{(k)} A$
  $x^{(k+1)} = A \big[Q^{(k)}\big]^{-1} \big( B^T B A^{-1} y - \lambda_0 A^T b \big)$

  22. BEADS Algorithm.
  Input: $y$, $A$, $B$, $\lambda_i$, $i = 0, \dots, M$
  1. $b = B^T B A^{-1} y$
  2. $x = y$ (initialization)
  Repeat
  3. $[\Lambda_i]_{n,n} = \dfrac{\phi'([D_i x]_n)}{[D_i x]_n}$, $i = 0, \dots, M$
  4. $M = \sum_{i=0}^{M} \lambda_i D_i^T \Lambda_i D_i$
  5. $Q = B^T B + A^T M A$
  6. $x = A Q^{-1} b$
  Until converged
  7. $f = y - x - B A^{-1}(y - x)$
  8. Output: $x$, $f$
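A compact sketch of this loop with SciPy sparse matrices, following the update written on the two previous slides (constant right-hand side $d = B^T B A^{-1} y - \lambda_0 A^T b$ with $[b]_n = \frac{1-r}{2}$). The banded filter matrices A and B are assumed to be given; the penalty $\phi(x) = \sqrt{x^2 + \epsilon}$, the first- and second-order difference operators D1 and D2, and all parameter values are illustrative assumptions, not the paper's exact choices.

```python
# Sketch of the BEADS MM iteration, assuming banded sparse A, B with H = A^{-1} B.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def beads_mm(y, A, B, lam0, lam1, lam2, r=6.0, eps=1e-6, n_iter=30):
    N = y.size
    # First- and second-order difference operators (illustrative choices).
    D1 = sp.diags([-np.ones(N - 1), np.ones(N - 1)], [0, 1], shape=(N - 1, N))
    D2 = sp.diags([np.ones(N - 2), -2 * np.ones(N - 2), np.ones(N - 2)],
                  [0, 1, 2], shape=(N - 2, N))
    diffs = [(lam1, D1), (lam2, D2)]

    BtB = B.T @ B
    b_vec = (1.0 - r) / 2.0 * np.ones(N)              # [b]_n = (1 - r) / 2
    d = BtB @ spsolve(A.tocsc(), y) - lam0 * (A.T @ b_vec)

    x = y.copy()                                       # initialization x = y
    for _ in range(n_iter):
        # [Gamma(x)]_{n,n} = (1 + r) / (4 max(|x_n|, eps))
        gamma = (1.0 + r) / (4.0 * np.maximum(np.abs(x), eps))
        M = 2.0 * lam0 * sp.diags(gamma)
        for lam_i, Di in diffs:
            v = Di @ x
            lam_diag = 1.0 / np.sqrt(v**2 + eps)       # phi'(v)/v for phi_B
            M = M + lam_i * (Di.T @ sp.diags(lam_diag) @ Di)
        Q = BtB + A.T @ M @ A                          # banded system matrix
        x = A @ spsolve(Q.tocsc(), d)                  # x^{(k+1)} = A Q^{-1} d
    f = y - x - B @ spsolve(A.tocsc(), y - x)          # baseline estimate
    return x, f
```

Every iteration only solves one sparse banded system in $Q$, which is what keeps the method fast on long chromatograms.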

  23. Evaluation 1. (Figure: simulated chromatograms with a polynomial + sine baseline; amplitude roughly -10 to 50 over 2000 time samples.)

  24. Evaluation 1, with Gaussian noise.
