Cumulant Signal Processing, Tensors and some Recurring Problems


  1. Cumulant Signal Processing, Tensors and some Recurring Problems
     Phil Regalia
     Department of Electrical Engineering and Computer Science
     Catholic University of America, Washington, DC 20064
     with thanks to E. Kofidis, M. Mboup, and V. S. Grigorascu
     Future Directions in Tensors, NSF, Feb 2009

  2. Outline
     1. Multilinear Product
     2. An “Easy” Problem
     3. Symmetric Case
     4. Rank r Approximation
     5. Displacement Structure

  3. Multilinear Product
     Given N matrices A^{(n)}, the N-way (a.k.a. Tucker) product is
         T_{j_1, j_2, \dots, j_N} = \left[ A^{(1)} \star A^{(2)} \star \cdots \star A^{(N)} \right]_{j_1, \dots, j_N}
                                  = \sum_{k=1}^{K} A^{(1)}_{j_1,k} A^{(2)}_{j_2,k} \cdots A^{(N)}_{j_N,k}
     With a core N-dimensional tensor S, the weighted product is
         T = A^{(1)} \star_S A^{(2)} \star_S \cdots \star_S A^{(N)}
     i.e.,
         T_{j_1, \dots, j_N} = \sum_{k_1=1}^{K_1} \sum_{k_2=1}^{K_2} \cdots \sum_{k_N=1}^{K_N} A^{(1)}_{j_1,k_1} A^{(2)}_{j_2,k_2} \cdots A^{(N)}_{j_N,k_N} \, S_{k_1, k_2, \dots, k_N}
     and is equivalent to
         \mathrm{vec}(T) = \left( A^{(1)} \otimes A^{(2)} \otimes \cdots \otimes A^{(N)} \right) \mathrm{vec}(S)
     This reduces to the unweighted product when S_{k_1, k_2, \dots, k_N} = \delta(k_1, k_2, \dots, k_N).
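The weighted product and the vec identity above can be checked directly in code; here is a minimal NumPy sketch for a 3-way example (sizes and variable names are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes for a 3-way core S and factor matrices A1, A2, A3.
K1, K2, K3 = 2, 3, 4
J1, J2, J3 = 5, 6, 7
S = rng.standard_normal((K1, K2, K3))
A1 = rng.standard_normal((J1, K1))
A2 = rng.standard_normal((J2, K2))
A3 = rng.standard_normal((J3, K3))

# Weighted (Tucker) product:
# T[j1,j2,j3] = sum over k1,k2,k3 of A1[j1,k1] A2[j2,k2] A3[j3,k3] S[k1,k2,k3]
T = np.einsum('ia,jb,kc,abc->ijk', A1, A2, A3, S)

# Equivalent vectorized form: vec(T) = (A1 kron A2 kron A3) vec(S),
# with vec taken in row-major (C) order to match this Kronecker ordering.
vecT = np.kron(np.kron(A1, A2), A3) @ S.reshape(-1)
assert np.allclose(T.reshape(-1), vecT)

# The unweighted N-way product is recovered when S is the diagonal
# tensor delta(k1,k2,k3) (all K's equal):
K = 3
D = np.zeros((K, K, K))
for k in range(K):
    D[k, k, k] = 1.0
B1, B2, B3 = (rng.standard_normal((4, K)) for _ in range(3))
T1 = np.einsum('ia,jb,kc,abc->ijk', B1, B2, B3, D)
T2 = np.einsum('ik,jk,lk->ijl', B1, B2, B3)   # sum over a single index k
assert np.allclose(T1, T2)
```

Note the row-major convention: NumPy's `reshape(-1)` enumerates the last index fastest, which matches the row/column ordering of `np.kron(np.kron(A1, A2), A3)`.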

  4. An “Easy” Approximation Problem
     Let u^{(1)}, \dots, u^{(N)} be a collection of unit-norm column vectors.
     Problem: find unit-norm vectors u^{(1)}, \dots, u^{(N)} and a scalar \sigma to minimize the Frobenius norm of
         S - \sigma \cdot \underbrace{ u^{(1)} \star u^{(2)} \star \cdots \star u^{(N)} }_{\text{rank one}}
     Equivalent problem: find unit-norm vectors to maximize the functional
         f( u^{(1)}, \dots, u^{(N)} ) = ( u^{(1)} )^T \star_S ( u^{(2)} )^T \star_S \cdots \star_S ( u^{(N)} )^T
                                      = \sum_{k_1} \sum_{k_2} \cdots \sum_{k_N} u^{(1)}_{k_1} u^{(2)}_{k_2} \cdots u^{(N)}_{k_N} \, S_{k_1, k_2, \dots, k_N}

  6. Simple iterative scheme
     Since f is linear in each argument, one can write f = \langle \nabla_1 f, u^{(1)} \rangle. Thus, given unit-norm vectors u^{(1,i)}, \dots, u^{(N,i)} at iteration i, update u^{(1)} as
         v = \nabla_1 f = \sum_{k_2} \cdots \sum_{k_N} u^{(2,i)}_{k_2} \cdots u^{(N,i)}_{k_N} \, S_{k_1, k_2, \dots, k_N}
                        = I \star_S ( u^{(2,i)} )^T \star_S \cdots \star_S ( u^{(N,i)} )^T
         u^{(1,i+1)} = v / \| v \|
     and likewise for u^{(2)}, u^{(3)}, \dots, u^{(N)}; then repeat. Each update gives an increase in f( u^{(1)}, \dots, u^{(N)} ), so the functional converges monotonically to a local maximum.
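The alternating update above is a higher-order power iteration; a minimal NumPy sketch for a 3-way tensor follows (the function name `rank1_approx` and the random initialization are my own choices, not from the slides):

```python
import numpy as np

def rank1_approx(S, sweeps=100, seed=0):
    """Rank-one approximation of a real 3-way tensor S by alternating
    normalized-gradient (higher-order power) updates."""
    rng = np.random.default_rng(seed)
    u = [rng.standard_normal(n) for n in S.shape]
    u = [w / np.linalg.norm(w) for w in u]
    f_hist = []
    for _ in range(sweeps):
        # Update each factor in turn: v = grad_n f, then u_n = v / ||v||.
        v = np.einsum('abc,b,c->a', S, u[1], u[2]); u[0] = v / np.linalg.norm(v)
        v = np.einsum('abc,a,c->b', S, u[0], u[2]); u[1] = v / np.linalg.norm(v)
        v = np.einsum('abc,a,b->c', S, u[0], u[1]); u[2] = v / np.linalg.norm(v)
        f_hist.append(float(np.einsum('abc,a,b,c->', S, u[0], u[1], u[2])))
    return f_hist[-1], u, f_hist

rng = np.random.default_rng(1)
S = rng.standard_normal((4, 5, 6))
sigma, u, f_hist = rank1_approx(S)
# Each factor update maximizes f over that factor, so f never decreases:
assert all(b >= a - 1e-10 for a, b in zip(f_hist, f_hist[1:]))
```

Each single-factor update is exact (it maximizes f over that factor with the others held fixed), which is why the monotone increase claimed on the slide holds sweep by sweep.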

  7. Symmetric case
     Applications in blind deconvolution, source separation, or independent component analysis have a symmetric tensor:
         S_{k_1, k_2, \dots, k_N} = \mathrm{cum}( x_{k_1}, x_{k_2}, \dots, x_{k_N} )
     One seeks to maximize the N-th order cumulant of u^T x with \| u \| = 1:
         f(u) = \mathrm{cum}_N( u^T x ) = \underbrace{ u^T \star_S u^T \star_S \cdots }_{N \text{ terms}}
     The symmetric version of the previous algorithm gives
         v_{k_1} = \sum_{k_2} \cdots \sum_{k_N} u^{(i)}_{k_2} \cdots u^{(i)}_{k_N} \, S_{k_1, k_2, \dots, k_N}
         u^{(i+1)} = v / \| v \|
     Sometimes this converges, sometimes not . . .
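The symmetric iteration can be sketched the same way for a fourth-order tensor (NumPy; the symmetrized random tensor below is only a stand-in for an actual cumulant estimate, and convergence of the loop is not guaranteed, as the slide notes):

```python
import math
from itertools import permutations
import numpy as np

rng = np.random.default_rng(2)
n, N = 4, 4

# Stand-in for S[k1..k4] = cum(x_k1, ..., x_k4): symmetrize a random tensor.
A = rng.standard_normal((n,) * N)
S = sum(A.transpose(p) for p in permutations(range(N))) / math.factorial(N)

u = rng.standard_normal(n)
u /= np.linalg.norm(u)
f_hist = []
for _ in range(50):
    # v[k1] = sum over k2,k3,k4 of u[k2] u[k3] u[k4] S[k1,k2,k3,k4]
    v = np.einsum('abcd,b,c,d->a', S, u, u, u)
    u = v / np.linalg.norm(v)
    f_hist.append(float(np.einsum('abcd,a,b,c,d->', S, u, u, u, u)))

# "Sometimes converges, sometimes not": the tail of f_hist may settle to
# one value, or end up alternating between two; inspect it to see which.
tail = f_hist[-4:]
```

Unlike the non-symmetric alternating scheme, forcing all factors equal removes the per-update optimality argument, which is why monotone increase can fail here.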

  9. Relax the unit-norm constraint to the unit ball:
         B = \{ u : \| u \| \le 1 \}
     Special case: f(u) = u^T \star_S u^T \star_S \cdots is convex (or concave) over B.
     Gradient inequality:
         f(r) \ge f(q) + \langle r - q, \nabla f(q) \rangle   for all q, r \in B.
     [Figure: f plotted against the supporting line f(q) + \langle \nabla f(q), r - q \rangle at the point q.]
     This applies, therefore, to successive iterates: q = u^{(i)} and r = u^{(i+1)}.

  10. From the gradient inequality,
          f( u^{(i+1)} ) - f( u^{(i)} ) \ge \langle u^{(i+1)} - u^{(i)}, \nabla f( u^{(i)} ) \rangle
      and since u^{(i+1)} = \nabla f / \| \nabla f \|, we have, if u^{(i+1)} \ne u^{(i)},
          \langle u^{(i+1)}, \nabla f( u^{(i)} ) \rangle = \| \nabla f( u^{(i)} ) \| > \langle u^{(i)}, \nabla f( u^{(i)} ) \rangle,
      so that f( u^{(i+1)} ) > f( u^{(i)} ).
      Remark: the discrepancy
          D_f(r, q) = f(r) - f(q) - \langle r - q, \nabla f(q) \rangle \ge 0
      is the Bregman distance induced by f. If f is strongly convex, i.e.,
          D_f(r, q) \ge \alpha \| r - q \|^2   for all q, r \in B,
      then the local convergence rate is at least linear.

  11. What if f(u) is not convex over B?
      [Figure: functional f(u) versus iteration number (0 to 30), y-axis from -0.020 to 0.005, showing non-monotone behavior of the iterates.]

  12. Gradient interpretation
      Rewrite the functional as a “Rayleigh quotient” over B:
          J(u) = f(u) / \| u \|^N,   u \in B \setminus \{ 0 \}
      and consider a gradient ascent algorithm:
          v = u^{(i)} + \mu_i \nabla J( u^{(i)} )
            = u^{(i)} + \mu_i [ \nabla f( u^{(i)} ) - \beta_i u^{(i)} ]
          u^{(i+1)} = v / \| v \|
      with \beta_i = \langle \nabla f( u^{(i)} ), u^{(i)} \rangle. The previous algorithm is obtained using \mu_i = 1 / \beta_i.
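To see why μ_i = 1/β_i recovers the earlier algorithm: the ascent step then collapses algebraically, u + (1/β)(∇f − βu) = ∇f/β, which normalizes to ±∇f/‖∇f‖. A quick NumPy check on a symmetric 3-way example (random stand-in tensor; names are illustrative):

```python
import math
from itertools import permutations
import numpy as np

rng = np.random.default_rng(3)
n = 5
A = rng.standard_normal((n, n, n))
S = sum(A.transpose(p) for p in permutations(range(3))) / math.factorial(3)

u = rng.standard_normal(n)
u /= np.linalg.norm(u)
grad = 3 * np.einsum('abc,b,c->a', S, u, u)   # grad f for the symmetric cubic f
beta = grad @ u                                # beta_i = <grad f, u>

step = u + (1.0 / beta) * (grad - beta * u)    # gradient step with mu_i = 1/beta_i
assert np.allclose(step, grad / beta)          # collapses to a rescaled gradient
# After normalization this is the plain power update (up to the sign of beta):
assert np.allclose(step / np.linalg.norm(step),
                   np.sign(beta) * grad / np.linalg.norm(grad))
```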

  14. A simple “trick”
      Consider the “regularized” functional
          g(u) = f(u) + (\gamma / 2) ( u^T u - 1 ),   u \in B,
      with \gamma a free parameter, so that g(u) = f(u) for u \in \partial B. Now,
          g(u) is convex over B   \Leftrightarrow   \nabla^2 g(u) \ge 0 over B.
      The Hessians are related as
          \nabla^2 g(u) = \nabla^2 f(u) + \gamma I
      so with
          \lambda_* \triangleq \min_{u \in B} \lambda_{\min}\!\left[ \nabla^2 f(u) \right],
      choose \gamma \ge -\lambda_*.

  15. The gradient ascent algorithm for the “regularized” functional is no different:
          v = u^{(i)} + \mu_i [ \nabla g_i - \alpha_i u^{(i)} ],   \alpha_i = \langle \nabla g_i, u^{(i)} \rangle
            = u^{(i)} + \mu_i [ \nabla f_i + \gamma u^{(i)} - \beta_i u^{(i)} - \gamma u^{(i)} ]
            = u^{(i)} + \mu_i [ \nabla f_i - \beta_i u^{(i)} ]
          u^{(i+1)} = v / \| v \|
      This gives f( u^{(i+1)} ) > f( u^{(i)} ) if \mu_i = 1 / \alpha_i = 1 / ( \beta_i + \gamma ).
      Theorem (R. & Kofidis, 2003). If u^{(i)} is not a stationary point, then f( u^{(i+1)} ) > f( u^{(i)} ) whenever
          0 < \mu_i < \frac{ 2 ( \beta_i - \lambda_* ) }{ \beta_i^2 + ( \beta_i - \lambda_* )^2 - \| \nabla f( u^{(i)} ) \|^2 }
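A sketch of the resulting shifted iteration (NumPy): with μ_i = 1/(β_i + γ) the update simply normalizes ∇f + γu. The shift below uses a crude Frobenius-norm bound on the Hessian to guarantee γ ≥ −λ*; computing λ* exactly is not attempted, and the tensor is again a random symmetric stand-in:

```python
import math
from itertools import permutations
import numpy as np

rng = np.random.default_rng(4)
n, N = 6, 4

# Random symmetric stand-in for a fourth-order cumulant tensor.
A = rng.standard_normal((n,) * N)
S = sum(A.transpose(p) for p in permutations(range(N))) / math.factorial(N)

# Crude bound: ||Hess f(u)||_2 <= N(N-1) ||S||_F on the unit ball,
# so gamma = N(N-1) ||S||_F suffices for gamma >= -lambda_*.
gamma = N * (N - 1) * np.linalg.norm(S)

u = rng.standard_normal(n)
u /= np.linalg.norm(u)
f_prev = -np.inf
for _ in range(100):
    grad = N * np.einsum('abcd,b,c,d->a', S, u, u, u)
    v = grad + gamma * u            # shifted update, i.e. mu_i = 1/(beta_i + gamma)
    u = v / np.linalg.norm(v)
    f = float(np.einsum('abcd,a,b,c,d->', S, u, u, u, u))
    assert f >= f_prev - 1e-8       # monotone increase, per the convexification
    f_prev = f
```

The price of such a conservative γ is a small effective step size and slow progress; the theorem's step-size interval is what justifies more aggressive choices in practice.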
