

1. Minkowski Decomposition and Geometric Predicates in Sparse Implicitization
I.Z. Emiris, C. Konaxis and Z. Zafeirakopoulos, University of Athens
Talk given by Ilias Kotsireas, Wilfrid Laurier University
ISSAC'15, Bath, July 2015

2. Outline
1. Implicitization by interpolation.
2. Minkowski decomposition of the predicted polytope.
3. Geometric predicates as matrix operations.

3. Implicitization by interpolation

4. Implicitization
Given a parameterization x_0 = α_0(t), ..., x_n = α_n(t), t := (t_1, ..., t_n), compute the smallest algebraic variety containing the closure of the image of
α : R^n → R^(n+1) : t ↦ α(t), α := (α_0, ..., α_n).
This is contained in the variety defined by the ideal
⟨ p(x_0, ..., x_n) | p(α_0(t), ..., α_n(t)) = 0, ∀ t ⟩.
When this is a principal ideal, we wish to compute its defining polynomial p(x), given its Newton polytope N(p(x)) (the implicit polytope), or a superset of the exponents of its monomials with nonzero coefficient (the implicit support).

5. Approach
Support prediction followed by interpolation.
Superset S of the implicit support: the support of a specialized sparse resultant.
Construct a μ × |S| (μ ≥ |S|) matrix M: columns indexed by the monomials with exponents in S, rows indexed by the values of t at which these monomials are evaluated.
The (ideally) unique kernel vector of M contains the coefficients of these monomials in the implicit equation.
Example (Folium of Descartes): x_0 = 3t^2/(t^3 + 1), x_1 = 3t/(t^3 + 1), S = {(0,3), (3,0), (1,1)}.
M is the 4 × 3 matrix whose k-th row is (x_1^3(τ_k), x_0^3(τ_k), x_0 x_1(τ_k)), k = 1, ..., 4, for generic values τ_k, and its kernel vector yields
p(x_0, x_1) = x_0^3 − 3 x_0 x_1 + x_1^3.
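
A minimal numerical sketch of this pipeline for the folium, not part of the talk: the parameter values and the use of an SVD to extract the kernel are illustrative choices, assuming numpy is available.

```python
import numpy as np

# Folium of Descartes: x0 = 3t^2/(t^3+1), x1 = 3t/(t^3+1).
# Support S = {(0,3), (3,0), (1,1)}: columns are x1^3, x0^3, x0*x1.
def point_on_curve(t):
    d = t**3 + 1.0
    return 3.0 * t**2 / d, 3.0 * t / d

taus = [0.7, 1.3, 2.1, 3.4]             # generic parameter values (mu = 4 >= |S| = 3)
rows = []
for t in taus:
    x0, x1 = point_on_curve(t)
    rows.append([x1**3, x0**3, x0 * x1])
M = np.array(rows)

# The kernel of M holds the coefficients of the implicit equation.
_, s, Vt = np.linalg.svd(M)
kernel = Vt[-1]                          # right singular vector of the smallest singular value
kernel /= kernel[np.argmax(np.abs(kernel))]
print(kernel)   # ~ (-1/3, -1/3, 1), proportional to (1, 1, -3): x1^3 + x0^3 - 3*x0*x1 = 0
```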

6. Previous work
Integration of the matrix S S^T over each parameter t_1, ..., t_n; successively larger supports to capture sparseness [Corless-Giesbrecht-Kotsireas-Watt '00].
Successively larger supports are also used in [Dokken-Thomassen '03] in the setting of approximate implicitization.
Tropical geometry [Sturmfels-Tevelev-Yu '07] leads to algorithms for the polytope of (specialized) resultants [Jensen-Yu '12].
Our method was developed in [Emiris-Kalinka-Konaxis-Luu Ba '13a, '13b].

7. Implicitization reduced to elimination
Setup (for omitted details see the paper): given x_i = α_i(t)/β_i(t), i = 0, ..., n, define polynomials in (R[x_0, ..., x_n])[t]:
f_i := x_i β_i(t) − α_i(t), i = 0, ..., n, with supports A_i ⊂ Z^n.
If the parameterization is generically 1-1, then p(x_0, ..., x_n) = Res(f_0, ..., f_n), provided that Res(f_0, ..., f_n) ≢ 0.
Projection: let F := {F_0, ..., F_n} ⊂ R(c_ij)[t] be generic polynomials wrt the A_i, and let φ be the specialization of their symbolic coefficients c_ij to those of the f_i:
φ : c_ij ↦ a_ij + b_ij x_i.
The φ-projection of the Newton polytope N(Res(F)) of the sparse resultant of the polynomials in F contains (a translate of) the Newton polytope of the implicit polynomial, the implicit polytope.
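
For n = 1 the sparse resultant is essentially the classical Sylvester resultant, so the reduction can be checked directly in a computer algebra system. A sympy sketch for the folium, not from the talk; the hedged expectation is that the implicit cubic appears as a factor of the (possibly larger) resultant.

```python
import sympy as sp

t, x0, x1 = sp.symbols('t x0 x1')

# f_i := x_i * beta_i(t) - alpha_i(t) for the folium x0 = 3t^2/(t^3+1), x1 = 3t/(t^3+1)
f0 = x0 * (t**3 + 1) - 3 * t**2
f1 = x1 * (t**3 + 1) - 3 * t

# Eliminating t: for one parameter, Res(f0, f1) is the univariate (Sylvester) resultant.
R = sp.resultant(f0, f1, t)
print(sp.factor(R))   # the implicit cubic x0**3 - 3*x0*x1 + x1**3 should appear as a factor
```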

8. Example
Given the parameterization (Buchberger '88)
x_0 = t_1 t_2, x_1 = t_1 t_2^2, x_2 = t_1^2,
we define
f_0 := x_0 − t_1 t_2, f_1 := x_1 − t_1 t_2^2, f_2 := x_2 − t_1^2,
F_0 := c_00 − c_01 t_1 t_2, F_1 := c_10 − c_11 t_1 t_2^2, F_2 := c_20 − c_21 t_1^2,
and φ(c_00, c_01, c_10, c_11, c_20, c_21) = (x_0, −1, x_1, −1, x_2, −1).
Res(F) = −c_00^4 c_11^2 c_21 + c_01^4 c_10^2 c_20, so N(Res(F)) = ((4,0,0,2,0,1), (0,4,2,0,1,0)), and projection by φ yields the implicit polytope ((4,0,0), (0,2,1)).
We construct M by evaluating (t_1, t_2) at random τ_1, τ_2, τ_3 ∈ C^2; its k-th row is (x_0^4(τ_k), x_1^2 x_2(τ_k)).
The kernel vector (−1, 1) yields the implicit equation −x_0^4 + x_1^2 x_2.
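
This small example can also be cross-checked by classical Groebner-basis elimination (Buchberger's original route, not the resultant construction above). A sympy sketch, assuming a lex order that eliminates t_1, t_2 first:

```python
import sympy as sp

t1, t2, x0, x1, x2 = sp.symbols('t1 t2 x0 x1 x2')

f0 = x0 - t1 * t2
f1 = x1 - t1 * t2**2
f2 = x2 - t1**2

# Lex order with t1 > t2 > x0 > x1 > x2: basis elements free of t1, t2
# generate the elimination ideal, i.e. the implicit equation.
G = sp.groebner([f0, f1, f2], t1, t2, x0, x1, x2, order='lex')
implicit = [g for g in G.exprs if not g.has(t1, t2)]
print(implicit)   # expected (up to sign): x0**4 - x1**2*x2
```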

9. Implicit support
Input: n + 1 supports A_i ⊂ Z^n, and the projection φ.
Output (predicted implicit polytope): the φ-projection of the Newton polytope of the sparse resultant of A_0, ..., A_n.
ResPol [Emiris-Fisikopoulos-Konaxis-Peñaranda '12] computes the precise polytope of a bicubic surface, with 715 terms, in 1 sec.
The lattice points in the implicit polytope correspond to the exponent vectors of the implicit support.

10. Geometry of the predicted support & higher-dimensional kernel
Assumption: the interpolation matrix M is built with sufficiently generic evaluation points.
Theorem (Emiris-Kalinka-Konaxis-Luu Ba '13): For a projection φ : F_i ↦ f_i such that not all leading coefficients of the f_i vanish, φ(Res(F)) = c(x) · p(x). If P = N(p) is the implicit polytope and Q = φ(N(Res(F))) the predicted polytope, then Q ⊇ E + P for some polytope E (+ : Minkowski addition). In particular, the kernel of M has dimension equal to the number of lattice points in E.
Corollary: If v_1, ..., v_k is a basis of the kernel of M, and g_1, ..., g_k are the corresponding polynomials, i.e. g_i = v_i^T S, then gcd(g_1, ..., g_k) = p(x); factoring can also be used.

11. Geometry of the predicted support & higher-dimensional kernel
Example (Folium of Descartes: x_0 = 3t^2/(t^3 + 1), x_1 = 3t/(t^3 + 1)). Polynomials from the kernel vectors:
g_1 = x_0 (x_0^3 − 3 x_0 x_1 + x_1^3)
g_2 = x_1 (x_0^3 − 3 x_0 x_1 + x_1^3)
[Figure: the predicted polytope Q decomposes as Q = P + E, with P the implicit polytope and E a second summand.]
To extract the implicit polytope we employ Minkowski decomposition of Q.
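
A sketch of the Corollary on this example, assuming sympy; the kernel polynomials g_1, g_2 are taken directly from the slide.

```python
import sympy as sp

x0, x1 = sp.symbols('x0 x1')
p = x0**3 - 3 * x0 * x1 + x1**3          # implicit equation of the folium

# Polynomials built from a basis of the 2-dimensional kernel of M
g1 = sp.expand(x0 * p)
g2 = sp.expand(x1 * p)

# The gcd of the kernel polynomials recovers the implicit equation.
print(sp.gcd(g1, g2))                    # x0**3 - 3*x0*x1 + x1**3
```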

12. Minkowski decomposition of the predicted polytope

13. Minkowski decomposition of the predicted polytope: representation
Input: a list of polygons, the facets of the 3-dimensional predicted polytope.
Primitive edges: V = [v_1, v_2, ..., v_m] is the ordered list of vertices. For every edge (v_i1, v_i2), define the vector v_i2 − v_i1; let ℓ_i be its integer length and e_i the corresponding primitive vector.
[Figure: a cube, flattened, with the orientation shown on some of its facets. Common edges can have different orientations (red) in each facet.]
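
A small helper illustrating this representation, assuming integer vertex coordinates; the square facet used below is only an example, not the talk's figure.

```python
from math import gcd

def primitive_edges(vertices):
    """For an ordered cycle of lattice vertices, return (l_i, e_i):
    the integer length and primitive vector of every edge."""
    out = []
    for v, w in zip(vertices, vertices[1:] + vertices[:1]):
        d = tuple(b - a for a, b in zip(v, w))        # edge vector w - v
        length = gcd(*(abs(c) for c in d))            # integer length l_i
        out.append((length, tuple(c // length for c in d)))
    return out

# A square facet (counter-clockwise) with edges of lattice length 2
facet = [(0, 0), (2, 0), (2, 2), (0, 2)]
print(primitive_edges(facet))   # [(2, (1, 0)), (2, (0, 1)), (2, (-1, 0)), (2, (0, -1))]
```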

14. The Integer Linear Program (ILP)
Conditions for decomposition: every facet F satisfies
Σ_{i : e_i ∈ F} σ_{i,F} ℓ_i e_i = 0,
where σ_{i,F} is the sign of e_i and depends on the orientation of the edge ℓ_i e_i in the facet F.
ILP:
a_i ∈ N, 0 ≤ a_i ≤ ℓ_i,
Σ_{i : e_i ∈ F} σ_{i,F} a_i e_i = 0, for every facet F.
[Figure: a set of sub-edges a_i e_i (red) of the square forming a polygon.]
Solutions that give summands homothetic to the input are excluded by adding two more appropriate equations.
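
A sketch of this ILP for a single 2D polygon (so all σ_{i,F} = +1 under counter-clockwise orientation), using scipy's MILP solver. The polygon, the bounds, and the balance-style objective are illustrative assumptions, and the extra equations that exclude homothetic summands are omitted here.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Counter-clockwise polygon with primitive edges e_i and integer lengths l_i
# (here: a segment + a triangle, i.e. a decomposable quadrilateral).
e = np.array([[1, 0], [-1, 1], [-1, 0], [0, -1]])     # primitive edge vectors
l = np.array([2, 1, 1, 1])                            # integer edge lengths

m = len(l)
closure = LinearConstraint(e.T, 0, 0)                 # sum_i a_i e_i = 0 (summand closes up)
half = int(l.sum()) // 2
balance = LinearConstraint(np.ones((1, m)), 1, half)  # 1 <= sum_i a_i <= floor(L/2)

res = milp(c=-np.ones(m),                             # maximize sum_i a_i ("balanced" summand)
           constraints=[closure, balance],
           integrality=np.ones(m),                    # a_i integer
           bounds=Bounds(0, l))                       # 0 <= a_i <= l_i

a = np.round(res.x).astype(int)
print(a, l - a)    # e.g. [1 0 1 0] and [1 1 0 1]: a segment summand and a triangle summand
```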

15. Recursion and balance
Balancing: we can choose the objective function of the ILP so as to decompose into polytopes with specific characteristics. Due to the nature of the motivating problem, we prefer a decomposition into two summands of almost equal edge-length sum, i.e., a "balanced" one. This is achieved by taking the sum of the edge lengths as the objective function.
Recursion: we recurse over the two summand polytopes obtained until we reach an indecomposable polytope.

16. Example (Eight surface)
x_0 = 4 t_1 (t_1^2 − 1)(t_2^2 − 1) / ((1 + t_1^2)^2 (1 + t_2^2)),
x_1 = −8 t_1 t_2 (t_1^2 − 1) / ((1 + t_1^2)^2 (1 + t_2^2)),
x_2 = 2 t_1 / (1 + t_1^2).
The predicted polytope leads to a 67 × 67 interpolation matrix. Minkowski decomposition gives the true implicit polytope (the second summand) and a 10 × 10 matrix.
[Figure: predicted polytope = first summand + implicit polytope.]

17. Geometric predicates as matrix operations

18. Geometric predicates using interpolation matrices
Setup: S is a superset of the implicit support, and m(x) the vector of monomials (in the variables x_i) with exponents in S. Fix generic distinct values τ_k, k = 1, ..., |S| − 1. Construct the (|S| − 1) × |S| numeric matrix M′ whose k-th row, k = 1, ..., |S| − 1, is the vector m(x) evaluated at τ_k. Construct M(x) by appending the symbolic row m(x) to M′.
Lemma: Assuming M′ is of full rank (equivalently, the predicted polytope Q contains only one translate of P), the determinant of the matrix M(x) equals the implicit polynomial p(x) up to a constant.
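
A sketch of the Lemma on the folium, mixing exact numeric rows with a symbolic last row; sympy is assumed and the two parameter values are arbitrary generic choices, not from the talk.

```python
import sympy as sp

x0, x1 = sp.symbols('x0 x1')
S = [x1**3, x0**3, x0 * x1]              # monomial vector m(x), |S| = 3

def m_at(tau):
    """m(x) evaluated on the folium at parameter value tau (exact arithmetic)."""
    a0 = sp.Rational(3) * tau**2 / (tau**3 + 1)
    a1 = sp.Rational(3) * tau / (tau**3 + 1)
    return [mono.subs({x0: a0, x1: a1}) for mono in S]

# (|S|-1) x |S| numeric matrix M', plus the symbolic row m(x)
M = sp.Matrix([m_at(sp.Integer(1)), m_at(sp.Integer(2)), S])
print(sp.factor(M.det()))   # a constant multiple of the implicit cubic x0**3 - 3*x0*x1 + x1**3
```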

19. Membership predicate
Membership: given x_i = f_i(t)/g_i(t), i = 0, ..., n, and a query point q ∈ R^(n+1), decide whether p(q) = 0, where p(x) is the implicit polynomial, by using the interpolation matrix.
Lemma: Given M(x) and a query point q ∈ (R*)^(n+1), let M(q) denote the matrix M(x) with its last row evaluated at q. Then q lies on the hypersurface defined by p(x) = 0 if and only if corank(M(q)) = corank(M′).
The numeric matrix M′ need not be of full rank.
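
A numeric sketch of the membership test for the folium, assuming numpy; the parameter values, query points, and reliance on numpy's default rank tolerance are illustrative.

```python
import numpy as np

def folium_point(t):
    d = t**3 + 1.0
    return 3.0 * t**2 / d, 3.0 * t / d

def monomial_row(x0, x1):
    return [x1**3, x0**3, x0 * x1]       # exponents in S = {(0,3), (3,0), (1,1)}

def corank(A):
    return A.shape[1] - np.linalg.matrix_rank(A)

# (|S|-1) x |S| numeric matrix M'
Mp = np.array([monomial_row(*folium_point(t)) for t in (0.7, 1.6)])

def is_member(q):
    Mq = np.vstack([Mp, monomial_row(*q)])   # M(q): last row evaluated at the query point
    return corank(Mq) == corank(Mp)

print(is_member(folium_point(2.5)))   # True:  point on the folium
print(is_member((1.0, 2.0)))          # False: generic point off the curve
```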
