Numerical methods for the best low multilinear rank approximation of higher-order tensors



  1. Numerical methods for the best low multilinear rank approximation of higher-order tensors
M. Ishteva¹, P.-A. Absil¹, S. Van Huffel², L. De Lathauwer²,³
¹ Department of Mathematical Engineering, Université catholique de Louvain
² Department of Electrical Engineering ESAT/SCD, K.U.Leuven
³ Group Science, Engineering and Technology, K.U.Leuven Campus Kortrijk
CESAME seminar, February 9, 2010

  2. Outline
1. Introduction
2. Some applications (epileptic seizure onset localization; parameter estimation)
3. Algorithms (geometric Newton algorithm; trust-region based algorithm; conjugate gradient based algorithm)
4. Local minima

  3. Outline (repeat of slide 2)

  4. Scalars
Examples of scalars in R: 4, 67, 3.23, −45.8, √2, −14/8, −339/7534, −4.397534

  5. Vectors
Examples of vectors in R^n, with entries such as 15, −45.3, 12.345, 1.3, 1.7, 2.1, 1.4, 0.2, 65, −15, −23.44

  6. Matrices
Example of a matrix in R^{m×n}:
1.3  0.6  1.1  2.0  1.9
1.7  1.0  1.3  2.8  2.6
2.1  2.1  1.6  3.7  2.7
1.4  1.4  1.4  2.2  2.0
0.2  1.3  0.9  1.9  1.7

  7. Tensors
A ∈ R^{m×n×p}

  8. Ranks
Rank-(R1, R2, R3) (multilinear rank) and rank-R (tensor rank).
Rank-1 tensor: outer product of three vectors.
R = min(r) s.t. A = Σ_{i=1}^{r} (rank-1 tensor)_i.
In general, R1 ≠ R2 ≠ R3 ≠ R1, and R1, R2, R3 ≤ R.
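The two rank notions on this slide can be checked numerically: each mode-k rank is the ordinary matrix rank of the tensor's mode-k unfolding. A minimal NumPy sketch (the slides contain no code; the function names here are illustrative):

```python
import numpy as np

def unfold(A, mode):
    """Mode-k matricization: rows indexed by the chosen mode."""
    return np.moveaxis(A, mode, 0).reshape(A.shape[mode], -1)

def multilinear_rank(A):
    """(R1, R2, R3): the matrix ranks of the three unfoldings."""
    return tuple(np.linalg.matrix_rank(unfold(A, k)) for k in range(3))

# A rank-1 tensor (outer product of three vectors) has multilinear rank (1, 1, 1)
rng = np.random.default_rng(0)
a, b, c = rng.standard_normal(4), rng.standard_normal(5), rng.standard_normal(6)
A = np.einsum('i,j,k->ijk', a, b, c)
print(multilinear_rank(A))  # (1, 1, 1)
```

The tensor rank R, by contrast, has no such matrix characterization and is NP-hard to compute in general, which is one reason the multilinear rank is the more tractable target.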

  9. Decompositions
Singular value decomposition (SVD); higher-order SVD (HOSVD); PARAFAC/CANDECOMP.

  10. Main problem
Truncated SVD → best rank-R approximation of a matrix.
Truncated HOSVD → good but not best rank-(R1, R2, R3) approximation of a tensor.
Best rank-(R1, R2, R3) approximation of a tensor:
Given: a third-order tensor A ∈ R^{I1×I2×I3},
minimize ‖A − Â‖² over Â ∈ R^{I1×I2×I3},
subject to rank1(Â) ≤ R1, rank2(Â) ≤ R2, rank3(Â) ≤ R3.
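The truncated HOSVD mentioned on this slide is cheap to compute: take the leading left singular vectors of each unfolding and project. A hedged NumPy sketch (illustrative names, not the authors' code):

```python
import numpy as np

def unfold(A, mode):
    """Mode-k matricization: rows indexed by the chosen mode."""
    return np.moveaxis(A, mode, 0).reshape(A.shape[mode], -1)

def mode_mult(A, M, mode):
    """Mode product A •_mode M: M multiplies the fibers of that mode."""
    return np.moveaxis(np.tensordot(M, A, axes=(1, mode)), 0, mode)

def truncated_hosvd(A, ranks):
    """Good -- but in general not best -- rank-(R1, R2, R3) approximation."""
    # leading left singular vectors of each unfolding
    U = [np.linalg.svd(unfold(A, k))[0][:, :r] for k, r in enumerate(ranks)]
    # core S = A •1 U1^T •2 U2^T •3 U3^T, then expand back to A_hat
    S = A
    for k in range(3):
        S = mode_mult(S, U[k].T, k)
    A_hat = S
    for k in range(3):
        A_hat = mode_mult(A_hat, U[k], k)
    return A_hat, U
```

When A has exact multilinear rank at most (R1, R2, R3) the truncation is exact; otherwise it is only quasi-optimal, which is what motivates the iterative algorithms on the following slides.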

  11. Reformulation of the main problem
Given: a third-order tensor A ∈ R^{I1×I2×I3},
maximize ‖A •1 Uᵀ •2 Vᵀ •3 Wᵀ‖² = ‖Uᵀ A(1) (V ⊗ W)‖²
over U ∈ St(R1, I1), V ∈ St(R2, I2), W ∈ St(R3, I3).
Then Â = A •1 UUᵀ •2 VVᵀ •3 WWᵀ.
Notation: St(R, I) = { X ∈ R^{I×R} : XᵀX = I }.
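The matrix identity on this slide, equating the multilinear-product norm with the norm of Uᵀ A(1) (V ⊗ W), can be checked numerically. A small NumPy verification (random data, not code from the talk; the C-order reshape of A makes the mode-1 unfolding's column ordering match np.kron(V, W)):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 5, 6))
# random elements of the Stiefel manifolds St(2,4), St(3,5), St(2,6)
U = np.linalg.qr(rng.standard_normal((4, 2)))[0]
V = np.linalg.qr(rng.standard_normal((5, 3)))[0]
W = np.linalg.qr(rng.standard_normal((6, 2)))[0]

# A •1 U^T •2 V^T •3 W^T, computed as one contraction
core = np.einsum('ijk,ia,jb,kc->abc', A, U, V, W)
lhs = np.linalg.norm(core)
# U^T A_(1) (V kron W): mode-1 unfolding in C order matches kron(V, W)
rhs = np.linalg.norm(U.T @ A.reshape(4, -1) @ np.kron(V, W))
print(np.isclose(lhs, rhs))  # True
```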

  12. Higher-order orthogonal iteration (HOOI)
Initial values: HOSVD.
To maximize ‖Uᵀ A(1) (V ⊗ W)‖² (∗), iterate until convergence:
- optimize (∗) over U: columns of U ← left dominant subspace of A(1) (V ⊗ W);
- optimize (∗) over V;
- optimize (∗) over W.
→ Linear convergence
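The HOOI sweep above can be sketched in a few lines of NumPy: each step replaces one factor by the dominant left subspace of the tensor contracted with the other two factors. An illustrative sketch (a fixed iteration count stands in for a real convergence test; names are not from the talk):

```python
import numpy as np

def unfold(A, mode):
    """Mode-k matricization: rows indexed by the chosen mode."""
    return np.moveaxis(A, mode, 0).reshape(A.shape[mode], -1)

def hooi(A, ranks, n_iter=50):
    """Higher-order orthogonal iteration, initialized with truncated HOSVD."""
    U = [np.linalg.svd(unfold(A, k))[0][:, :r] for k, r in enumerate(ranks)]
    for _ in range(n_iter):
        for k in range(3):
            # contract A with the other two factors ...
            B = A
            for j in (j for j in range(3) if j != k):
                B = np.moveaxis(np.tensordot(U[j].T, B, axes=(1, j)), 0, j)
            # ... and take the left dominant subspace of the unfolding
            U[k] = np.linalg.svd(unfold(B, k))[0][:, :ranks[k]]
    return U
```

Each inner step solves the slide's subproblem exactly, which is why the overall scheme is an alternating (block coordinate) ascent with linear convergence.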

  13. Goal and motivation
Goal: conceptually faster algorithms
- matrix-based algorithms → tensor-based algorithms
- optimization w.r.t. numerical accuracy & computational efficiency
Applications of the best rank-(R1, R2, R3) approximation:
- chemometrics, biomedical signal processing, telecommunications, etc.
- dimensionality reduction; signal subspace estimation

  14. Outline (repeat of slide 2)

  15. Outline (repeat of slide 2)

  16. Electrodes

  17. Electroencephalogram

  18. Epileptic seizure onset localization: results of PARAFAC with 2 atoms
[Figure: temporal, frequency, and spatial components of the two atoms — the temporal atom (0–10 s) and frequency distribution (0–45 Hz) of seizure activity, and the temporal atom and frequency distribution of eye-blink activity.]
Red atom: related to the seizure; blue atom: related to the eye blinks.

  19. Dimensionality reduction
A → rank-(R1, R2, R3) approx. → rank-R approx.

  20. Outline (repeat of slide 2)

  21. Problem formulation: multi-channel case
[Figure: per channel (ch 1, ch 2, ch 3), the samples x(q)_0, …, x(q)_{N−1} are arranged in Hankel structure (with dimensions L and M), and the Q channel matrices are stacked into a third-order tensor.]
Model:
x(q)_n = Σ_{k=1}^{K} c(q)_k z_k^n + e(q)_n
       = Σ_{k=1}^{K} a(q)_k exp{ j ϕ(q)_k } exp{ (−α_k + 2jπ ν_k) t_n } + e(q)_n.
Given: samples x(q)_n, n = 0, …, N − 1, q = 1, …, Q.
Find: estimates of the complex amplitudes c(q)_k and the poles z_k.

  22. HO-HTLSstack algorithm
- Compute the best rank-(K, K, R3) approximation (R3 ≤ min(K, Q)): Ĥ = S •1 U •2 V •3 W.
- Solve the shift-invariance relation U↑ ≈ U↓ Z.
- Compute the eigenvalues λk of Z; estimate the poles: ẑk = λk.
- Estimate the complex amplitudes: ẑk, x(q)_n → ĉ(q)_k.
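The shift-invariance step (U↑ ≈ U↓ Z) is the same mechanism as in matrix ESPRIT/HTLS. A simplified single-channel sketch in NumPy — least squares in place of total least squares, and a plain Hankel matrix in place of the stacked tensor — so this illustrates only the pole-recovery idea, not the HO-HTLSstack algorithm itself:

```python
import numpy as np

def estimate_poles(x, K, L):
    """Estimate K poles z_k from noiseless samples x_n = sum_k c_k z_k^n."""
    N = len(x)
    # L x (N - L + 1) Hankel matrix: H[i, j] = x[i + j]
    H = np.array([x[i:i + N - L + 1] for i in range(L)])
    U = np.linalg.svd(H)[0][:, :K]          # dominant K-dim column space
    U_down, U_up = U[:-1, :], U[1:, :]      # delete last row / first row
    Z = np.linalg.lstsq(U_down, U_up, rcond=None)[0]
    return np.linalg.eigvals(Z)             # poles = eigenvalues of Z

n = np.arange(40)
x = 0.9**n + (-0.5)**n                      # two real poles, no noise
print(np.sort(estimate_poles(x, K=2, L=10).real))  # ≈ [-0.5, 0.9]
```

Once the poles are known, the amplitudes c(q)_k follow from a linear least-squares fit of the model to the samples, as in the algorithm's last step.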

  23. Observations
The stacked Hankel tensor H factors into Vandermonde matrices built from the poles (columns 1, z_k, z_k², …, with L and M rows respectively), combined with the K × Q matrix of complex amplitudes [c(q)_k].
If the matrix of the amplitudes is ill-conditioned, then reduce the mode-3 rank.

  24. Outline (repeat of slide 2)

  25. Outline (repeat of slide 2)

  26. Reformulation
g = ‖Uᵀ A(1) (V ⊗ W)‖², X = (U, V, W), R1(X) = Uᵀ A(1) (V ⊗ W).
New goal: F(X) ≡ (F1(X), F2(X), F3(X)) = 0, with
F1(X) = U R1(X) R1(X)ᵀ − A(1) (V ⊗ W) R1(X)ᵀ (F2, F3 analogous for the other modes).
Invariance property: F(X) = 0 ⇔ F(XQ) = 0, where XQ = (UQ1, VQ2, WQ3), Qi orthogonal
⇒ the zeros of F are not isolated ⇒ convergence issues
→ work on R^{I1×R1}/O_{R1} × R^{I2×R2}/O_{R2} × R^{I3×R3}/O_{R3}.
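The component F1 can be evaluated directly from the mode-1 unfolding, and it vanishes at an exact solution: when A has multilinear rank exactly (R1, R2, R3) and (U, V, W) are its true factors. A NumPy sanity check of this (illustrative code, not from the talk; the C-order reshape matches V ⊗ W to the mode-1 unfolding):

```python
import numpy as np

def F1(A, U, V, W):
    """F1(X) = U R1 R1^T - A_(1) (V kron W) R1^T, R1 = U^T A_(1) (V kron W)."""
    A1 = A.reshape(A.shape[0], -1)       # mode-1 unfolding
    B = A1 @ np.kron(V, W)               # A_(1) (V ⊗ W)
    R1 = U.T @ B
    return U @ R1 @ R1.T - B @ R1.T

# at the true factors of an exactly rank-(2, 2, 2) tensor, F1 is zero
rng = np.random.default_rng(3)
S = rng.standard_normal((2, 2, 2))
U = np.linalg.qr(rng.standard_normal((5, 2)))[0]
V = np.linalg.qr(rng.standard_normal((6, 2)))[0]
W = np.linalg.qr(rng.standard_normal((7, 2)))[0]
A = np.einsum('abc,ia,jb,kc->ijk', S, U, V, W)
print(np.linalg.norm(F1(A, U, V, W)))   # ~ 0
```

Replacing U by UQ1 with Q1 orthogonal leaves F1's zero set unchanged, which is the invariance that motivates formulating Newton's method on the quotient spaces.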
