
Hierarchical Clustering on Special Manifolds

Angelos Markos (1), George Menexes (2)
(1) Democritus University of ...


Riemannian Manifolds 3/4

The exponential map, exp_X : T_X M → M, maps a tangent vector y in the tangent space T_X M to the point on the manifold reached after unit time along the geodesic that starts at X with initial velocity y.

[Figure: Basic geometry of a manifold and its tangent space at a point]

Riemannian Manifolds 4/4

The inverse exponential map, log_X : M → T_X M, takes a point Y on the manifold and returns the corresponding tangent vector at X. It is uniquely defined only in a neighborhood of the point X.

[Figure: Basic geometry of a manifold and its tangent space at a point]
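The talk shows no code; as a hedged illustration, here is a minimal Python/NumPy sketch of the exponential and inverse-exponential (log) maps on the Grassmann manifold G_{k,m}, with points represented by m × k orthonormal basis matrices and the standard SVD-based formulas (cf. Edelman et al., 1999; Begelfor & Werman, 2006). Function names and the NumPy setting are assumptions, not the authors' implementation.

```python
import numpy as np

def grassmann_exp(X, H):
    """Exponential map on G(k, m): follow the geodesic from span(X)
    with initial tangent vector H (m x k, X'H = 0) for unit time."""
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    Y = X @ Vt.T @ np.diag(np.cos(s)) @ Vt + U @ np.diag(np.sin(s)) @ Vt
    Q, _ = np.linalg.qr(Y)   # re-orthonormalize to control round-off
    return Q

def grassmann_log(X, Y):
    """Inverse exponential (log) map on G(k, m): the tangent vector at span(X)
    pointing towards span(Y). Assumes X'Y is invertible (no angle equals pi/2)."""
    m = X.shape[0]
    T = (np.eye(m) - X @ X.T) @ Y @ np.linalg.inv(X.T @ Y)
    U, s, Vt = np.linalg.svd(T, full_matrices=False)
    return U @ np.diag(np.arctan(s)) @ Vt

# Round trip: exp_X(log_X(Y)) recovers the subspace span(Y) (same projector).
rng = np.random.default_rng(0)
X, _ = np.linalg.qr(rng.standard_normal((10, 3)))
Y, _ = np.linalg.qr(rng.standard_normal((10, 3)))
Z = grassmann_exp(X, grassmann_log(X, Y))
print(np.allclose(Z @ Z.T, Y @ Y.T))   # True: same subspace
```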

Outline

1. Motivation
2. Background
   - Manifolds
   - Special Manifolds
3. HCA on Special Manifolds
   - Distance & Mean
   - Intrinsic Case
   - Extrinsic Case
   - HCA Algorithms
4. Experiments
5. Summary & Future Work
6. References

Special Manifolds

- We consider two differentiable manifolds with well-defined mathematical properties, the Stiefel and Grassmann manifolds. A good introduction to their geometry can be found in Edelman et al. (1999).
- In terms of their (differential) topology, the special manifolds can be described
  a. as embedded submanifolds of the real Euclidean space, or
  b. as quotient spaces of the orthogonal group under different equivalence relations.
- The equivalence classes on special manifolds induce some nice mathematical properties and make the geodesic distance computable.


The Stiefel Manifold

- Let V_{k,m} denote the Stiefel manifold, the space of k orthonormal vectors in R^m, represented by the set of m × k (k ≤ m) matrices X such that X'X = I_k, where I_k is the k × k identity matrix (Chikuse, 2003).
- For m = k, V_{k,m} is the orthogonal group O(m) of m × m orthonormal matrices.
- The Stiefel manifold may be thought of as the quotient space O(m)/O(m − k) with respect to the group of left-orthogonal transformations X → HX for H ∈ O(m).
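As a tiny numerical illustration (not from the talk), a point of V_{k,m} can be produced by orthonormalizing a random Gaussian matrix, and the defining constraint X'X = I_k checked directly:

```python
import numpy as np

m, k = 5, 3
rng = np.random.default_rng(42)

# A point on the Stiefel manifold V_{k,m}: an m x k matrix with orthonormal columns.
X, _ = np.linalg.qr(rng.standard_normal((m, k)))

print(np.allclose(X.T @ X, np.eye(k)))  # True: X satisfies X'X = I_k
```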


The Grassmann Manifold

- The Grassmann manifold G_{k,m} is the space whose points are k-dimensional linear subspaces of R^m (k-planes in R^m containing the origin). To each k-plane ν in G_{k,m} corresponds a unique m × m orthogonal projection matrix P, idempotent of rank k, onto ν. If the columns of an m × k matrix Y span ν, then YY' = P (Mardia & Jupp, 2009).
- The Grassmann manifold can be identified with the quotient representation O(m)/(O(k) × O(m − k)).
- Using the quotient representation of Stiefel manifolds, G_{k,m} = V_{k,m}/O(k) with respect to the group of right-orthogonal transformations X → XH for H ∈ O(k).
- We can represent subspaces by their unique orthogonal projections.
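A small sketch (illustrative code, not the authors') of the projection-matrix representation and its invariance under right-orthogonal transformations:

```python
import numpy as np

m, k = 6, 2
rng = np.random.default_rng(1)
Y, _ = np.linalg.qr(rng.standard_normal((m, k)))

# The subspace span(Y) is identified with the projection matrix P = Y Y'.
P = Y @ Y.T
print(np.allclose(P, P.T), np.allclose(P @ P, P), np.linalg.matrix_rank(P))  # True True 2

# Right-multiplying Y by any H in O(k) changes the basis but not the subspace (nor P).
H, _ = np.linalg.qr(rng.standard_normal((k, k)))
print(np.allclose((Y @ H) @ (Y @ H).T, P))  # True
```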


Statistics on Special Manifolds - Applications

- Statistics on Riemannian manifolds have found wide applicability in shape analysis (Goodall & Mardia, 1999; Patrangenaru & Mardia, 2003).
- The Grassmann manifold structure of the affine shape space was exploited by Begelfor & Werman (2006) to perform affine-invariant clustering of shapes.
- Srivastava & Klassen (2004) exploited the geometry of the Grassmann manifold for subspace tracking in array signal processing applications.
- Turaga & Srivastava (2010) showed how a large class of problems drawn from face, activity, and object recognition can be recast as statistical inference problems on the Stiefel and/or Grassmann manifolds.


Hierarchical Clustering on Special Manifolds

Need to define:
- a distance metric applicable to subspaces (points on G_{k,m}) → single, complete and average linkage hierarchical clustering;
- a suitable notion of the mean on Riemannian manifolds (intrinsic or extrinsic) → centroid linkage, Ward-like clustering (?).


Measuring the Distance on Manifolds

- Traditional distance measures, such as the Euclidean distance, are not reasonable when measuring distances between subspaces. This point has been either missed or ignored in many simulation studies where inappropriate distance measures have been used (Larsson & Villani, 2001).
- The concept of principal angles is fundamental to understanding the closeness or similarity between two subspaces.
- Principal angles reflect the closeness of two subspaces in each individual dimension, while subspace distances reflect the distance of two subspaces along the Grassmann manifold or in the embedding space. Distances on G_{k,m} have a clear geometrical interpretation.


Principal Angles

Let X and Y be two orthonormal matrices of size m × k. The principal (or canonical) angles 0 ≤ θ_1 ≤ ... ≤ θ_k ≤ π/2 between span(X) and span(Y) are defined recursively by

    cos(θ_i) = max_{u_i ∈ span(X)} max_{v_i ∈ span(Y)} u_i' v_i

subject to u_i'u_i = 1, v_i'v_i = 1, and u_i'u_j = 0, v_i'v_j = 0 for j = 1, ..., i − 1.

The principal angles can be computed from the SVD of X'Y (Björck & Golub, 1973):

    X'Y = U (cos Θ) V'

where U = [u_1 ... u_k], V = [v_1 ... v_k], and cos Θ is the diagonal matrix diag(cos θ_1, ..., cos θ_k).
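A small sketch of the Björck & Golub computation (Python/NumPy rather than the talk's Matlab): the cosines of the principal angles are the singular values of X'Y.

```python
import numpy as np

def principal_angles(X, Y):
    """Principal angles between span(X) and span(Y), with X and Y m x k orthonormal.
    Singular values come out in descending order, so the angles are ascending."""
    s = np.linalg.svd(X.T @ Y, compute_uv=False)
    # Clip to [-1, 1] to guard against round-off before taking arccos.
    return np.arccos(np.clip(s, -1.0, 1.0))
```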


The Geodesic Distance

The geodesic distance, or arc length, is derived from the intrinsic geometry of the Grassmann manifold: it is the length of the geodesic curve connecting two subspaces along the Grassmannian surface,

    d_g(X, Y) = ( Σ_{i=1}^{k} θ_i² )^{1/2} = ||θ||_2 .

Instead of using only the first principal angle, the full geometry of the manifold is taken into account. However, this distance is not differentiable everywhere.
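A corresponding sketch (assuming the principal angles are computed via the SVD as above): the geodesic distance is just the 2-norm of the vector of principal angles.

```python
import numpy as np

def geodesic_distance(X, Y):
    """Arc-length (geodesic) distance on the Grassmann manifold: the 2-norm
    of the vector of principal angles between span(X) and span(Y)."""
    s = np.linalg.svd(X.T @ Y, compute_uv=False)
    theta = np.arccos(np.clip(s, -1.0, 1.0))
    return np.linalg.norm(theta)
```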

The Karcher Mean

The Karcher mean (Karcher, 1977) is an intrinsic mean on manifolds which minimizes the sum of squared geodesic distances.

Algorithm (Begelfor & Werman, 2006)
Input: points p_1, ..., p_n ∈ G(k, m); ε (machine zero)
Output: Karcher mean q
1. Set q = p_1.
2. Compute A = (1/n) Σ_{i=1}^{n} Log_q(p_i).
3. If ||A|| < ε, return q; else go to Step 4.
4. Compute the SVD U Σ V' = A and update q → q V cos(Σ) + U sin(Σ). Go to Step 2.

The Karcher mean is unique if the points are clustered close together on the manifold (Berger, 2003).
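A direct transcription of the algorithm above into Python/NumPy (a sketch; the talk's implementation is in Matlab), using the standard SVD-based log map as the Log_q step:

```python
import numpy as np

def grassmann_log(q, p):
    """Log map on G(k, m): tangent vector at span(q) pointing towards span(p).
    Assumes q'p is invertible (no principal angle equals pi/2)."""
    m = q.shape[0]
    T = (np.eye(m) - q @ q.T) @ p @ np.linalg.inv(q.T @ p)
    U, s, Vt = np.linalg.svd(T, full_matrices=False)
    return U @ np.diag(np.arctan(s)) @ Vt

def karcher_mean(points, eps=1e-10, max_iter=200):
    """Karcher (intrinsic) mean of points on G(k, m), following the
    Begelfor & Werman (2006) iteration quoted on the slide."""
    q = points[0].copy()                                              # Step 1
    for _ in range(max_iter):
        A = sum(grassmann_log(q, p) for p in points) / len(points)   # Step 2
        if np.linalg.norm(A) < eps:                                   # Step 3
            return q
        U, s, Vt = np.linalg.svd(A, full_matrices=False)              # Step 4
        q = q @ Vt.T @ np.diag(np.cos(s)) + U @ np.diag(np.sin(s))
        q, _ = np.linalg.qr(q)   # re-orthonormalize for numerical stability
    return q
```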


The Projection Metric 1/2

When G_{k,m} is defined as a submanifold of Euclidean space via the projection embedding, the projection metric on G_{k,m} is given in terms of the principal angles by (Edelman et al., 1999):

    d_P(X, Y) = ||sin θ||_2

The Projection Metric 2/2

- d_P(X, Y) is the distance proposed by Larsson & Villani (2001) as a measure of the distance between two cointegration spaces:

      d_P(X, Y) = ( k − ||X'Y||_F² )^{1/2} = ( k − tr(XX'YY') )^{1/2}

- Note that tr(XX'YY') corresponds to a scalar product between two positive semidefinite matrices.
- d_P²(X, Y) = tr(X⊥X⊥'YY') = COI(X⊥, Y), where COI is the co-inertia criterion, the numerator of both the RV and Tucker's congruence coefficients for positive semidefinite matrices (Dray, 2008).
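A sketch of the projection metric via the trace identity above (Python/NumPy, an assumption); the cross-check against the principal-angle form uses randomly generated subspaces.

```python
import numpy as np

def projection_distance(X, Y):
    """Projection metric on G(k, m): the 2-norm of the sines of the principal
    angles, equivalently sqrt(k - ||X'Y||_F^2)."""
    k = X.shape[1]
    return np.sqrt(max(k - np.linalg.norm(X.T @ Y, "fro") ** 2, 0.0))

# Cross-check against ||sin(theta)||_2.
rng = np.random.default_rng(0)
X, _ = np.linalg.qr(rng.standard_normal((8, 3)))
Y, _ = np.linalg.qr(rng.standard_normal((8, 3)))
theta = np.arccos(np.clip(np.linalg.svd(X.T @ Y, compute_uv=False), -1.0, 1.0))
print(np.isclose(projection_distance(X, Y), np.linalg.norm(np.sin(theta))))  # True
```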


Other Subspace Distances 1/2

As a slight variation of the projection metric, we may also consider the so-called chordal Frobenius distance (Edelman et al., 1999) or Procrustes metric (Chikuse, 2003), given by:

    d_F(X, Y) = ||2 sin(θ/2)||_2

Note that the geodesic, projection and Procrustes metrics are asymptotically equivalent for small principal angles, i.e. these embeddings are isometries (Edelman et al., 1999).
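A corresponding sketch (same assumptions as the earlier snippets):

```python
import numpy as np

def procrustes_distance(X, Y):
    """Chordal Frobenius (Procrustes) metric: ||2 sin(theta / 2)||_2."""
    s = np.linalg.svd(X.T @ Y, compute_uv=False)
    theta = np.arccos(np.clip(s, -1.0, 1.0))
    return np.linalg.norm(2.0 * np.sin(theta / 2.0))
```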

Other Subspace Distances 2/2

- Max correlation (Golub & Van Loan, 1996):

      d_max(X, Y) = ||XX' − YY'||_2 = sin(θ_1)

  d_max(X, Y) is a distance based on only the largest canonical correlation cos θ_1 (or the smallest principal angle θ_1).
- Fubini–Study metric (Edelman et al., 1999):

      d_FS(X, Y) = arccos |det(X'Y)| = arccos( Π_i cos θ_i )

- Binet–Cauchy metric (Wolf & Shashua, 2003):

      d_BC(X, Y) = ( 1 − Π_i cos²θ_i )^{1/2}
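A sketch of these three distances in terms of the principal angles (illustrative Python/NumPy, not the authors' code; the max-correlation distance is written directly from the smallest principal angle):

```python
import numpy as np

def _principal_angles(X, Y):
    s = np.linalg.svd(X.T @ Y, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))   # ascending: theta_1 <= ... <= theta_k

def max_correlation_distance(X, Y):
    """Distance based only on the smallest principal angle theta_1."""
    return np.sin(_principal_angles(X, Y)[0])

def fubini_study_distance(X, Y):
    """arccos of the product of the cosines of the principal angles."""
    return np.arccos(np.clip(np.prod(np.cos(_principal_angles(X, Y))), -1.0, 1.0))

def binet_cauchy_distance(X, Y):
    """sqrt(1 - product of the squared cosines of the principal angles)."""
    return np.sqrt(1.0 - np.prod(np.cos(_principal_angles(X, Y)) ** 2))
```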


An Extrinsic Mean

Chikuse (2003) proposed an extrinsic mean which minimizes the sum of squared projection distances on G_{k,m}:

- Given a set of matrices P_i = X_i X_i' on G_{k,m}, for X_i ∈ V_{k,m}, i = 1, ..., n, a natural mean P̂ ∈ G_{k,m} is defined by minimizing

      Σ_{i=1}^{n} ( tr I_k − tr P_i P̂ ).

- Let the spectral decomposition of S = Σ_{i=1}^{n} P_i be S = H D_s H', where H ∈ O(m), D_s = diag(s_1, ..., s_m) with s_1 > ... > s_m > 0, and partition H = (H_1 H_2) with H_1 of size m × k.
- Then P̂ = H_1 H_1', and the minimum equals kn − Σ_{i=1}^{k} s_i.
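In practice the extrinsic mean is obtained from the top-k eigenvectors of S; a sketch (function name and NumPy setting are assumptions):

```python
import numpy as np

def extrinsic_mean(bases):
    """Extrinsic (projection) mean of subspaces span(X_i) on G(k, m):
    an orthonormal basis H1 of the span of the top-k eigenvectors of
    S = sum_i X_i X_i'. The mean projector is P_hat = H1 @ H1.T."""
    k = bases[0].shape[1]
    S = sum(X @ X.T for X in bases)
    eigvals, eigvecs = np.linalg.eigh(S)   # eigenvalues in ascending order
    H1 = eigvecs[:, -k:]                   # top-k eigenvectors
    return H1
```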


HCA Algorithms 1/2 (Rencher, 2002)

- Single link. Defines the distance between any two clusters r and s as the minimum distance between them:

      d(r, s) = min d(x_ri, x_sj),  i ∈ {1, ..., n_r}, j ∈ {1, ..., n_s}

- Complete link. Defines the distance as the maximum distance between them:

      d(r, s) = max d(x_ri, x_sj),  i ∈ {1, ..., n_r}, j ∈ {1, ..., n_s}
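Single, complete and average linkage only need pairwise distances, so a standard HCA routine can be run on a precomputed Grassmann distance matrix. A sketch using SciPy (an illustrative choice; the talk's experiments used Matlab):

```python
import numpy as np
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage, fcluster

def grassmann_pdist(bases):
    """Condensed matrix of pairwise projection distances between subspaces."""
    n, k = len(bases), bases[0].shape[1]
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d = np.sqrt(max(k - np.linalg.norm(bases[i].T @ bases[j], "fro") ** 2, 0.0))
            D[i, j] = D[j, i] = d
    return squareform(D)

# Single-, complete- and average-linkage HCA on precomputed Grassmann distances.
rng = np.random.default_rng(0)
bases = [np.linalg.qr(rng.standard_normal((10, 3)))[0] for _ in range(8)]
d = grassmann_pdist(bases)
for method in ("single", "complete", "average"):
    Z = linkage(d, method=method)
    print(method, fcluster(Z, t=3, criterion="maxclust"))
```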


HCA Algorithms 2/2

- Average. Uses the average distance between all pairs of objects in any two clusters:

      d(r, s) = (1 / (n_r n_s)) Σ_{i=1}^{n_r} Σ_{j=1}^{n_s} d(x_ri, x_sj)

- Centroid. Uses a Grassmannian distance between the centroids of the two clusters, e.g.
      d(r, s) = d_g²(x̄_r, x̄_s) (squared geodesic), or
      d(r, s) = d_P²(x̄_r, x̄_s) (squared projection metric),
  where x̄ is either the Karcher or the extrinsic mean.
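Centroid linkage additionally needs a manifold mean at every merge. A naive sketch (quadratic search over cluster pairs) using the extrinsic mean and squared projection metric, one of the two combinations listed above:

```python
import numpy as np

def extrinsic_mean(bases):
    """Basis of the span of the top-k eigenvectors of sum_i X_i X_i'."""
    k = bases[0].shape[1]
    S = sum(X @ X.T for X in bases)
    return np.linalg.eigh(S)[1][:, -k:]

def projection_distance(X, Y):
    k = X.shape[1]
    return np.sqrt(max(k - np.linalg.norm(X.T @ Y, "fro") ** 2, 0.0))

def centroid_linkage(bases, n_clusters):
    """Naive centroid-linkage agglomeration on G(k, m): repeatedly merge the
    two clusters whose extrinsic means are closest in squared projection metric."""
    clusters = [[i] for i in range(len(bases))]
    centroids = [b.copy() for b in bases]
    while len(clusters) > n_clusters:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d2 = projection_distance(centroids[a], centroids[b]) ** 2
                if best is None or d2 < best[0]:
                    best = (d2, a, b)
        _, a, b = best
        clusters[a] += clusters.pop(b)
        centroids.pop(b)
        centroids[a] = extrinsic_mean([bases[i] for i in clusters[a]])
    return clusters
```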


A Toy Example

- 15 original matrices of size 14 × 6, orthonormalized via P = X (X'X)^{-1/2}
- 15 geographic areas
- 14 crop farming systems
- 6 outputs and inputs: Height, Fertilizer (MJ/ha), Labor (MJ/ha), Machinery (MJ/ha), Fuel (MJ/ha), Transportation (MJ/ha)
- Three groups with high, medium, and low within-group correlations and low between-group correlations
- Distance and mean: (a) Karcher mean with squared geodesic distance; (b) squared projection metric with the extrinsic mean
- Experiments were performed in Matlab: http://utopia.duth.gr/~amarkos/subspacehca
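A sketch of the symmetric orthonormalization step (Python/NumPy; the toy example itself was run in Matlab, and the function names here are assumptions):

```python
import numpy as np

def inv_sqrt_sym(A):
    """A^{-1/2} for a symmetric positive definite matrix A."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

def orthonormalize(X):
    """Map a raw data matrix X (14 x 6 in the toy example) to a point on the
    Stiefel manifold via the symmetric orthonormalization X (X'X)^{-1/2}."""
    return X @ inv_sqrt_sym(X.T @ X)

rng = np.random.default_rng(0)
X = rng.standard_normal((14, 6))
P = orthonormalize(X)
print(np.allclose(P.T @ P, np.eye(6)))  # True: columns are orthonormal
```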


The 20NG Dataset (1/4)

Settings of the experiment:
- The 20 Newsgroups collection consists of newsgroup documents manually classified into 20 different categories, some closely related (e.g. comp.sys.ibm.pc.hardware / comp.sys.mac.hardware) and some not (e.g. misc.forsale / soc.religion.christian). http://people.csail.mit.edu/jrennie/20Newsgroups/
- Each category includes 1,000 documents, for a total collection size of about 20,000 documents.
- We consider a particular instance of a semantic space, the Hyperspace Analogue to Language (HAL). The HAL space is created through co-occurrence statistics within a corpus of documents (see Lund & Burgess, 1996); a toy sketch of this construction follows below.
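A rough illustration of a HAL-style co-occurrence matrix (the window size and weighting here are illustrative assumptions, not the talk's settings):

```python
import numpy as np

def hal_matrix(tokens, vocab, window=10):
    """Toy sketch of a HAL-style co-occurrence matrix: for each word, accumulate
    distance-weighted co-occurrence counts with the words preceding it within a
    sliding window (closer words receive larger weights)."""
    index = {w: i for i, w in enumerate(vocab)}
    M = np.zeros((len(vocab), len(vocab)))
    for pos, w in enumerate(tokens):
        if w not in index:
            continue
        for d in range(1, window + 1):
            if pos - d < 0:
                break
            c = tokens[pos - d]
            if c in index:
                M[index[w], index[c]] += window - d + 1
    return M

tokens = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(tokens))
M = hal_matrix(tokens, vocab, window=4)
print(M.shape)   # (7, 7): one row/column per vocabulary word
```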

  70. The 20NG Dataset (2/4)
Procedure:
- combine the documents of each category (20 sets)
- compute the Semantic Space (SS) representation of each document set (co-occurrence matrix based on the HAL model)
- Agglomerative HCA (squared projection metric, extrinsic mean, centroid linkage)
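One way to realize the centroid step is to compute the extrinsic mean of a set of subspaces from the averaged projection matrices. The sketch below rests on that assumption (it is not necessarily the exact routine used in the talk); `bases` is a cell array of n × p orthonormal matrices:

    function M = extrinsic_mean(bases)
    % Extrinsic mean of subspaces: average the projection matrices P_i*P_i'
    % and take the leading p eigenvectors of the average.
        [n, p] = size(bases{1});
        S = zeros(n);
        for i = 1:numel(bases)
            S = S + bases{i} * bases{i}';
        end
        S = (S + S') / (2 * numel(bases));     % average and symmetrise
        [V, D] = eig(S);
        [~, idx] = sort(diag(D), 'descend');
        M = V(:, idx(1:p));                    % orthonormal basis of the mean subspace
    end

Centroid linkage then merges, at each step, the two clusters whose extrinsic means are closest under the squared projection metric.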

  71. The 20NG Dataset (3/4)
Results: 6-cluster solution
C1: comp.graphics, comp.os.ms-windows.misc, comp.sys.ibm.pc.hardware, comp.sys.mac.hardware, comp.windows.x
C2: rec.autos, rec.motorcycles, rec.sport.baseball, rec.sport.hockey
C3: sci.crypt, sci.electronics, sci.med, sci.space
C4: misc.forsale
C5: talk.politics.misc, talk.politics.guns, talk.politics.mideast
C6: talk.religion.misc, alt.atheism, soc.religion.christian

  72. The 20NG Dataset (4/4)
Results: 4-cluster solution
C1: comp. & sci. (computers & science)
C2: talk. (politics & religion)
C3: rec. (sports)
C4: misc.forsale

  73. Summary
- Affine invariance can be treated robustly and effectively within a Riemannian framework, by viewing subspaces as points on special manifolds.
- This view offers new geometric insights for designing data analysis algorithms that incorporate the geometry of the manifolds.
- We reviewed the distance measures and notions of the mean naturally available on the Grassmann manifold, defined algorithms for hierarchical clustering on it, and provided empirical results.

  76. Future Directions
- Extend the approach to other methods that rely on distance matrices or on centroids (e.g. MDS, k-means)
- Interesting applications (further experiments)
- Define a non-linear extension using the kernel trick
- Relax the orthonormality condition (the invariance property is lost; a uniform way of choosing the basis is then needed)

  80. References 1/4
Barg, A. & Nogin, D. (2002). Bounds on packings of spheres in the Grassmann manifold. IEEE Trans. Information Theory, 48(9), 2450-2454.
Begelfor, E. & Werman, M. (2006). Affine invariance revisited. In Proc. IEEE Conf. on Computer Vision and Pattern Recognition, New York, NY, 2, 2087-2094.
Berger, M. (2003). A Panoramic View of Riemannian Geometry. Springer, Berlin.
Björck, A. & Golub, G. H. (1973). Numerical methods for computing angles between linear subspaces. Math. Comp., 27, 579-594.
Boothby, W. M. (2002). An Introduction to Differentiable Manifolds and Riemannian Geometry. Academic Press.

  85. References 2/4
Conway, J., Hardin, R. & Sloane, N. (1996). Packing lines, planes, etc.: Packings in Grassmannian spaces. Experimental Mathematics, 5, 139-159.
Edelman, A., Arias, T. & Smith, S. (1998). The geometry of algorithms with orthogonality constraints. SIAM J. Matrix Anal. Appl., 20(2), 303-353.
Goodall, C. R. & Mardia, K. V. (1999). Projective shape analysis. Journal of Computational and Graphical Statistics, 8(2), 143-168.
Karcher, H. (1977). Riemannian center of mass and mollifier smoothing. Communications on Pure and Applied Mathematics, 30, 509-541.
Larsson, R. & Villani, M. (2001). A distance measure between cointegration spaces. Economics Letters, 70(1), 21-27.
