
Numerical Results (Quality, Speedup), H²-matrix stored - PowerPoint PPT Presentation



  1. Approximating Gaussian Processes with H²-Matrices
     Steffen Börm¹, Jochen Garcke²
     ¹ Christian-Albrechts-Universität zu Kiel
     ² Universität Bonn and Fraunhofer SCAI

  2. Outline
     1. Gaussian Processes
     2. Hierarchical Matrices
     3. H²-Matrix
     4. Results

  3. Gaussian Processes
     Given a set of data $S = \{(x_i, y_i) \in \mathbb{R}^d \times \mathbb{R}\}_{i=1}^N$, we assume a Gaussian process prior on $f(x)$, i.e. the values $f(x)$ are Gaussian distributed with zero mean and covariance matrix $K$.
     The kernel (or covariance) function $k(\cdot,\cdot)$ defines $K$ via $K_{ij} = k(x_i, x_j)$; a typical kernel is the Gaussian RBF $k(x, y) = e^{-\|x - y\|^2 / w}$.
     The representer theorem gives the solution as $f(x) = \sum_{i=1}^N \alpha_i k(x_i, x)$, where the coefficient vector $\alpha$ solves the linear system $(K + \sigma^2 I)\,\alpha = y$.
     The full $N \times N$ matrix leads to $O(N^2)$ complexity, which is infeasible for large data sets. An approximation is needed: e.g. use a subset of size $M$ in the computational core ($O(M^2 \cdot N)$), or use an iterative solver with an approximation of the matrix-vector product $K\alpha$.
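
A dense baseline makes the scaling concrete. Below is a minimal numpy sketch (not code from the talk; the kernel width, noise level, and the helper name `gaussian_rbf` are illustrative assumptions) that assembles $K$ for the Gaussian RBF kernel and solves $(K + \sigma^2 I)\,\alpha = y$ directly, i.e. exactly the $O(N^2)$-storage step that becomes infeasible for large $N$:

```python
# Minimal dense GP regression sketch: build K and solve (K + sigma^2 I) alpha = y.
import numpy as np

def gaussian_rbf(X, Y, w=1.0):
    # K[i, j] = exp(-||X_i - Y_j||^2 / w)
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / w)

rng = np.random.default_rng(0)
N, d = 500, 2                        # keep N small: dense storage is O(N^2)
X = rng.uniform(size=(N, d))         # training inputs x_i
y = np.sin(4 * X).sum(axis=1) + 0.1 * rng.standard_normal(N)   # noisy targets y_i

sigma2 = 0.01
K = gaussian_rbf(X, X, w=0.5)
alpha = np.linalg.solve(K + sigma2 * np.eye(N), y)   # coefficient vector alpha

# prediction f(x) = sum_i alpha_i k(x_i, x) at new points
X_test = rng.uniform(size=(10, d))
f_test = gaussian_rbf(X_test, X, w=0.5) @ alpha
```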

  4. Hierarchical Matrices
     Data-sparse approximation of the kernel matrix: $O(Nm \log N)$ storage, where the local rank $m$ controls the accuracy.
     Operations like the matrix-vector product, matrix multiplication, or inversion can now be computed efficiently.
     An efficient computation of the H-matrix approximation is needed.
     The H-matrix approach was developed for the efficient treatment of dense matrices arising from the discretization of integral operators; efficient constructions for 2D and 3D problems exist.
     Strongly related to fast multipole methods, panel clustering, and the fast Gauss transform.

  5. 1D Model Problem
     In the following we present the underlying ideas in one dimension.
     We look at blocks in the (permuted) matrix whose corresponding subregions $\tau$ and $\varrho$ have a certain 1D distance.
     We employ a Taylor expansion to approximate the kernel; the Taylor expansion is only used for the explanation, not in the algorithm.

  6. Panel Clustering
     Degenerate approximation: If $k$ is sufficiently smooth on a subdomain $\tau \times \varrho$, we can approximate it by a Taylor series:
         $\tilde{k}(x, y) := \sum_{\nu=0}^{m-1} \frac{(x - x_\tau)^\nu}{\nu!} \, \frac{\partial^\nu k}{\partial x^\nu}(x_\tau, y) \qquad (x \in \tau,\ y \in \varrho)$
     Factorization: For $i, j \in I$ with $x_i \in \tau$ and $x_j \in \varrho$ we find
         $K_{ij} = k(x_i, x_j) \approx \tilde{k}(x_i, x_j) = \sum_{\nu=0}^{m-1} \underbrace{\frac{(x_i - x_\tau)^\nu}{\nu!}}_{=(A_{\tau,\varrho})_{i\nu}} \underbrace{\frac{\partial^\nu k}{\partial x^\nu}(x_\tau, x_j)}_{=(B_{\tau,\varrho})_{j\nu}} = \sum_{\nu=0}^{m-1} (A_{\tau,\varrho})_{i\nu} (B_{\tau,\varrho})_{j\nu}$

  7. Panel Clustering (continued)
     Degenerate approximation: as before, $\tilde{k}(x, y) := \sum_{\nu=0}^{m-1} \frac{(x - x_\tau)^\nu}{\nu!} \, \frac{\partial^\nu k}{\partial x^\nu}(x_\tau, y)$ for $x \in \tau$, $y \in \varrho$.
     Factorization: For the index sets $\hat\tau := \{i : x_i \in \tau\}$ and $\hat\varrho := \{j : x_j \in \varrho\}$ we find
         $K|_{\hat\tau \times \hat\varrho} \approx A_{\tau,\varrho} B_{\tau,\varrho}^\top$
     Storage: $m\,(\#\hat\tau + \#\hat\varrho)$ instead of $(\#\hat\tau)(\#\hat\varrho)$.
     Result: Significant reduction of storage requirements if $m \ll \#\hat\tau, \#\hat\varrho$ (see the sketch below).
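
To make the factorization concrete: a minimal sketch for the 1D model problem with the Gaussian kernel $k(x,y) = e^{-(x-y)^2/w}$ (cluster positions, width $w$, and order $m$ are illustrative assumptions, not values from the talk). The $x$-derivatives of the Gaussian kernel are evaluated with Hermite polynomials, $\partial_x^\nu k(x,y) = (-1)^\nu w^{-\nu/2} H_\nu\!\big((x-y)/\sqrt{w}\big)\, e^{-(x-y)^2/w}$:

```python
# Taylor-based factorization K|_{tau x rho} ~ A B^T for an admissible 1D block.
import numpy as np
from math import factorial
from numpy.polynomial.hermite import hermval

w = 0.5
k = lambda x, y: np.exp(-(x - y) ** 2 / w)

# an admissible pair of 1D clusters: diam(tau) <= 2 dist(tau, rho)
x_tau_pts = np.linspace(0.0, 0.2, 40)   # points x_i in tau
x_rho_pts = np.linspace(0.6, 0.8, 50)   # points x_j in rho
x_tau = x_tau_pts.mean()                # expansion centre of tau

m = 8                                   # expansion order / local rank
A = np.empty((len(x_tau_pts), m))
B = np.empty((len(x_rho_pts), m))
for nu in range(m):
    A[:, nu] = (x_tau_pts - x_tau) ** nu / factorial(nu)        # (A_{tau,rho})_{i nu}
    t = (x_tau - x_rho_pts) / np.sqrt(w)
    herm = hermval(t, [0] * nu + [1])                           # H_nu(t)
    B[:, nu] = (-1) ** nu * w ** (-nu / 2) * herm * k(x_tau, x_rho_pts)  # d^nu k / dx^nu

K_block = k(x_tau_pts[:, None], x_rho_pts[None, :])
err = np.abs(K_block - A @ B.T).max()
print(f"max error of rank-{m} Taylor factorization: {err:.2e}")
```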

  8. Cluster Tree and Block Partition
     Goal: Split $\Omega \times \Omega$ into subdomains satisfying the admissibility condition
         $\mathrm{diam}(\tau) \le 2\,\mathrm{dist}(\tau, \varrho)$
     (the factor 2 is chosen for demonstration purposes).
     Start with $\tau = \varrho = \Omega$: nothing is admissible.

  9. Cluster Tree and Block Partition
     $\tau$ and $\varrho$ are subdivided; still nothing is admissible.

  10. Cluster Tree and Block Partition
      We split the intervals again and find an admissible block: $\mathrm{diam}(\tau) \le 2\,\mathrm{dist}(\tau, \varrho)$ now holds for one pair of subintervals $\tau$, $\varrho$.

  11. Cluster Tree and Block Partition
      We find six admissible blocks on this level.

  12. Cluster Tree and Block Partition
      The procedure is repeated...

  13. Cluster Tree and Block Partition
      The procedure is repeated until only a small remainder of inadmissible blocks is left.
      Result: The domain $\Omega \times \Omega$ is partitioned into blocks $\tau \times \varrho$; the clusters $\tau, \varrho \subseteq \Omega$ are organized in a cluster tree (a construction sketch follows below).
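
The block partition of slides 8-13 can be written down compactly for the 1D model problem. A minimal sketch (bisection of intervals; the admissibility factor 2 and the leaf size are chosen for illustration, not taken from the talk):

```python
# Recursively split tau x rho until diam(tau) <= 2 dist(tau, rho) holds
# (admissible block) or the intervals are small enough (inadmissible leaf).
def dist(tau, rho):
    # distance of two intervals (a, b), (c, d); 0 if they overlap
    return max(0.0, max(tau[0], rho[0]) - min(tau[1], rho[1]))

def diam(tau):
    return tau[1] - tau[0]

def split(tau):
    mid = 0.5 * (tau[0] + tau[1])
    return (tau[0], mid), (mid, tau[1])

def build_blocks(tau, rho, leaf_size, admissible, inadmissible):
    if diam(tau) <= 2.0 * dist(tau, rho):
        admissible.append((tau, rho))             # store later as low-rank block
    elif diam(tau) <= leaf_size and diam(rho) <= leaf_size:
        inadmissible.append((tau, rho))           # store later as dense block
    else:
        for t in split(tau):
            for r in split(rho):
                build_blocks(t, r, leaf_size, admissible, inadmissible)

adm, inadm = [], []
omega = (0.0, 1.0)
build_blocks(omega, omega, leaf_size=1 / 16, admissible=adm, inadmissible=inadm)
print(len(adm), "admissible blocks,", len(inadm), "inadmissible blocks")
```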

  14. Hierarchical Matrix
      Idea: Use a low-rank approximation in all admissible blocks $\hat\tau \times \hat\varrho$.
      The standard representation of the original matrix $K$ requires $N^2$ units of storage.

  15. Hierarchical Matrix
      Replace the admissible block $K|_{\hat\tau \times \hat\varrho}$ by the low-rank approximation $\widetilde{K}|_{\hat\tau \times \hat\varrho} = A_{\tau,\varrho} B_{\tau,\varrho}^\top$.

  16. Hierarchical Matrix
      Replace all admissible blocks by low-rank approximations, leave inadmissible blocks unchanged.
      Result: Hierarchical matrix approximation $\widetilde{K}$ of $K$.
      Storage requirements: One row of $\widetilde{K}$ is represented by only $O(m \log N)$ units of storage, for total storage requirements of $O(Nm \log N)$ (a matrix-vector product sketch follows below).
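
To show where the savings come from, here is a minimal sketch of one possible storage layout and the corresponding matrix-vector product (the `HBlock` container is an illustrative assumption, not a library API): inadmissible blocks are applied densely, admissible blocks through their factors $A$, $B$.

```python
# Blockwise matrix-vector product for a matrix stored as dense and low-rank blocks.
import numpy as np

class HBlock:
    """One block of the partition: either a dense matrix or factors A, B of A @ B.T."""
    def __init__(self, rows, cols, dense=None, A=None, B=None):
        self.rows, self.cols = rows, cols          # index sets tau_hat, rho_hat
        self.dense, self.A, self.B = dense, A, B

def hmatvec(blocks, v):
    """Approximate K @ v from the list of blocks covering the matrix."""
    out = np.zeros(len(v))
    for b in blocks:
        if b.dense is not None:                    # inadmissible: full matrix
            out[b.rows] += b.dense @ v[b.cols]
        else:                                      # admissible: O(m(#tau + #rho)) work
            out[b.rows] += b.A @ (b.B.T @ v[b.cols])
    return out

# toy usage: a 4x4 matrix split into two dense and two rank-1 blocks
rng = np.random.default_rng(1)
I0, I1 = np.arange(2), np.arange(2, 4)
blocks = [
    HBlock(I0, I0, dense=rng.standard_normal((2, 2))),
    HBlock(I0, I1, A=rng.standard_normal((2, 1)), B=rng.standard_normal((2, 1))),
    HBlock(I1, I0, A=rng.standard_normal((2, 1)), B=rng.standard_normal((2, 1))),
    HBlock(I1, I1, dense=rng.standard_normal((2, 2))),
]
v = rng.standard_normal(4)
print(hmatvec(blocks, v))
```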

  17. Second Approach: Cross Approximation
      Observation: If $M$ is a rank-1 matrix and we have pivot indices $i^*, j^*$ with $M_{i^*j^*} \ne 0$, we get the representation
          $M = a b^\top, \qquad a_i := M_{ij^*} / M_{i^*j^*}, \qquad b_j := M_{i^*j}$
      Idea: If $M$ can be approximated by a rank-1 matrix, we can still find $i^*, j^*$ with $M_{i^*j^*} \ne 0$ and $M \approx a b^\top$.
      Higher rank: Repeating the procedure for the error matrix yields a rank-$m$ approximation of arbitrary accuracy.
      Efficiency: If the pivot indices are known, only $m$ rows and columns of $M$ are required to construct a rank-$m$ approximation.
      Problem: Selection of the pivot indices; efficient strategies are needed. Convergence is provable in certain settings; in our case the method works in practice, but we do not yet have a proof.
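
A minimal sketch of the cross-approximation loop (for clarity it searches the full residual for the pivot, so it is not the efficient partial-pivoting variant that only touches $m$ rows and columns of the block):

```python
# Rank-m cross approximation: peel off rank-1 terms from the residual matrix.
import numpy as np

def cross_approximation(M, m):
    """Return factors A, B with M ~ A @ B.T of rank at most m (assumes M != 0)."""
    R = M.copy()                       # current error matrix
    a_cols, b_cols = [], []
    for _ in range(m):
        i, j = np.unravel_index(np.argmax(np.abs(R)), R.shape)   # pivot i*, j*
        if R[i, j] == 0.0:
            break                      # remainder is (numerically) zero
        a = R[:, j] / R[i, j]          # a_i := R_{i j*} / R_{i* j*}
        b = R[i, :].copy()             # b_j := R_{i* j}
        R -= np.outer(a, b)            # repeat the procedure on the error matrix
        a_cols.append(a)
        b_cols.append(b)
    return np.column_stack(a_cols), np.column_stack(b_cols)

# quick check on a smooth (numerically low-rank) kernel block
x = np.linspace(0.0, 0.2, 30)[:, None]
y = np.linspace(0.6, 0.8, 40)[None, :]
M = np.exp(-(x - y) ** 2 / 0.5)
A, B = cross_approximation(M, 6)
print(np.abs(M - A @ B.T).max())
```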

  18. Uniform Hierarchical Matrix
      Goal: Reduce the storage requirements further.
      Approach: A Taylor expansion in both variables,
          $k(x, y) \approx \sum_{\nu + \mu < m} \frac{(x - x_\tau)^\nu}{\nu!} \, \frac{(y - y_\varrho)^\mu}{\mu!} \, \frac{\partial^{\nu+\mu} k}{\partial x^\nu \partial y^\mu}(x_\tau, y_\varrho),$
      yields the low-rank factorization
          $K|_{\hat\tau \times \hat\varrho} \approx V_\tau S_{\tau,\varrho} V_\varrho^\top, \qquad (V_\tau)_{i\nu} := \frac{(x_i - x_\tau)^\nu}{\nu!}, \qquad (S_{\tau,\varrho})_{\nu\mu} := \frac{\partial^{\nu+\mu} k}{\partial x^\nu \partial y^\mu}(x_\tau, y_\varrho)$
      (and analogously $(V_\varrho)_{j\mu} := \frac{(x_j - y_\varrho)^\mu}{\mu!}$).
      Important: $V_\tau$ depends only on one cluster ($\tau$); only the small matrix $S_{\tau,\varrho} \in \mathbb{R}^{m \times m}$ depends on both clusters.
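
Extending the earlier Taylor sketch to both variables gives the three-term factorization directly. A minimal sketch for the 1D Gaussian kernel (cluster positions and order $m$ are illustrative assumptions); the mixed derivatives in $S_{\tau,\varrho}$ reduce to derivatives of $e^{-s^2/w}$ in $s = x - y$, again evaluated via Hermite polynomials:

```python
# Three-term factorization K|_{tau x rho} ~ V_tau S V_rho^T for an admissible block.
import numpy as np
from math import factorial
from numpy.polynomial.hermite import hermval

w, m = 0.5, 8
k = lambda x, y: np.exp(-(x - y) ** 2 / w)

x_pts = np.linspace(0.0, 0.2, 40)       # x_i in tau
y_pts = np.linspace(0.6, 0.8, 50)       # x_j in rho
x_tau, y_rho = x_pts.mean(), y_pts.mean()   # expansion centres

# cluster bases: (V_tau)_{i nu} = (x_i - x_tau)^nu / nu!
V_tau = np.column_stack([(x_pts - x_tau) ** nu / factorial(nu) for nu in range(m)])
V_rho = np.column_stack([(y_pts - y_rho) ** mu / factorial(mu) for mu in range(m)])

# coupling matrix: (S)_{nu mu} = d^{nu+mu} k / dx^nu dy^mu (x_tau, y_rho), nu + mu < m
t = (x_tau - y_rho) / np.sqrt(w)
S = np.zeros((m, m))
for nu in range(m):
    for mu in range(m - nu):
        n = nu + mu
        S[nu, mu] = (-1) ** nu * w ** (-n / 2) * hermval(t, [0] * n + [1]) * np.exp(-t ** 2)

K_block = k(x_pts[:, None], y_pts[None, :])
print(np.abs(K_block - V_tau @ S @ V_rho.T).max())   # small for admissible blocks
```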

  19. H²-Matrix
      Idea: Use the three-term factorization in all admissible blocks $\hat\tau \times \hat\varrho$.
      The standard representation of the original matrix $K$ requires $N^2$ units of storage.

  20. H²-Matrix
      Replace the admissible block $K|_{\hat\tau \times \hat\varrho}$ by the low-rank approximation $\widetilde{K}|_{\hat\tau \times \hat\varrho} = V_\tau S_{\tau,\varrho} V_\varrho^\top$.

  21. H²-Matrix
      Replace all admissible blocks by low-rank approximations, leave inadmissible blocks unchanged.

  22. H²-Matrix
      Use a nested representation for the cluster basis: transfer matrices $T_{\tau'} \in \mathbb{R}^{k \times k}$ with
          $V_\tau|_{\hat\tau' \times k} = V_{\tau'} T_{\tau'}$ for all sons $\tau' \in \mathrm{sons}(\tau)$
      handle the cluster basis $(V_\tau)$ efficiently (a sketch follows below).
      Result: H²-matrix approximation $\widetilde{K}$ of $K$.
      Storage requirements: One row of $\widetilde{K}$ is represented by only $O(m)$ units of storage, for total storage requirements of $O(Nm)$.
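
For the Taylor (monomial) cluster basis of the 1D model problem, the transfer matrices can be written down explicitly: shifting the expansion centre from the parent $\tau$ to a son $\tau'$ is a triangular change of basis, $T_{\tau'}[\mu,\nu] = (x_{\tau'} - x_\tau)^{\nu-\mu}/(\nu-\mu)!$ for $\mu \le \nu$. A minimal sketch (helper names are illustrative assumptions):

```python
# Nested cluster basis: V_tau restricted to a son's indices equals V_son @ T.
import numpy as np
from math import factorial

def cluster_basis(x_pts, centre, m):
    # (V)_{i nu} = (x_i - centre)^nu / nu!
    return np.column_stack([(x_pts - centre) ** nu / factorial(nu) for nu in range(m)])

def transfer_matrix(centre_son, centre_parent, m):
    # T[mu, nu] = (c_son - c_parent)^(nu - mu) / (nu - mu)!  for mu <= nu
    T = np.zeros((m, m))
    d = centre_son - centre_parent
    for nu in range(m):
        for mu in range(nu + 1):
            T[mu, nu] = d ** (nu - mu) / factorial(nu - mu)
    return T

m = 6
x = np.linspace(0.0, 1.0, 64)
x_son = x[:32]                                  # points of the first son tau'
c_parent, c_son = x.mean(), x_son.mean()

V_parent = cluster_basis(x, c_parent, m)
V_son = cluster_basis(x_son, c_son, m)
T = transfer_matrix(c_son, c_parent, m)

# the parent basis never needs to be stored explicitly on the son's indices:
print(np.abs(V_parent[:32] - V_son @ T).max())  # ~ machine precision
```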
