Equivariant Representations for Atomistic Machine Learning


1. Equivariant Representations for Atomistic Machine Learning. Michele Ceriotti (cosmo.epfl.ch). Workshop on Molecular Dynamics and its Applications to Biological Systems, Sept. 2020.

2. The problem of representation. Mapping an atomic structure to a mathematical representation suitable for machine learning is the first, and perhaps the most important, step of atomistic ML. [Figure: schematic of ML workflows built on the representation: training set and inference; dimensionality reduction; classification.] MC, Unsupervised machine learning in atomistic simulations, between predictions and understanding, JCP (2019)

3. A phylogenetic tree of ML representations. [Figure: a tree relating ML representations, from raw Cartesian coordinates through internal coordinates (Z matrix), sorted distances and histograms, atom-centred densities and symmetrized fields, to established schemes such as Behler-Parrinello symmetry functions, DeepMD, g(r), MTP, GTTP, MBTR, SNAP, aPIPs, ACE, SHIP, SOAP, FCHL, NICE, LODE, sorted Coulomb matrix, BoB, PIV, SPRINT, overlap-matrix eigenvalues, wavelets, diffraction fingerprints, 3D voxel densities and molecular graphs; branches indicate how invariance to translations, rotations and permutations is obtained (sorting, histogram blurring, density products, symmetrized fields), and edges mark a family of features, a shared symmetry, or another relation.]

4. What we want from a representation. Structure representations should: 1. reflect the basic physical symmetries; 2. be complete (injective); 3. be smooth and regular; 4. exploit additivity. Cartesian coordinates fulfill only 2 and 3. [Figure: map from structure space to feature space, annotated with the four requirements: symmetry under translations, rotations and permutations; completeness; smoothness; additivity.]
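A minimal Python/NumPy sketch (not from the slides) of the symmetry requirement: raw Cartesian coordinates change when atoms are relabelled, while a feature built from sorted interatomic distances is invariant under permutations, rotations and translations (at the cost of smoothness, which is one reason density-based features are preferred below).

```python
import numpy as np

def sorted_distances(coords):
    # Pairwise distances, upper triangle, sorted: invariant to relabelling the
    # atoms and to rigid rotations/translations of the structure.
    diff = coords[:, None, :] - coords[None, :, :]
    d = np.linalg.norm(diff, axis=-1)
    iu = np.triu_indices(len(coords), k=1)
    return np.sort(d[iu])

rng = np.random.default_rng(0)
coords = rng.random((5, 3))          # a toy 5-atom structure
perm = rng.permutation(5)            # relabel the atoms

print(np.allclose(coords, coords[perm]))                  # generally False
print(np.allclose(sorted_distances(coords),
                  sorted_distances(coords[perm])))        # True
```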

5. Additivity, and locality. A representation of a structure written as a sum over atom-centred terms implies (for a linear model, or for an average kernel) an additive form of the predicted property.

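A minimal sketch of what this means in practice, assuming placeholder atom-centred features x(A_i) and linear weights w (neither taken from the slides): when the structure feature is the sum of environment features, a linear model automatically decomposes the prediction into atomic contributions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_atoms, n_features = 10, 8
X = rng.normal(size=(n_atoms, n_features))   # rows: features of each environment A_i
w = rng.normal(size=n_features)              # linear-model weights

E_sum_of_atomic_terms = sum(X[i] @ w for i in range(n_atoms))   # sum_i w . x(A_i)
E_of_structure_feature = X.sum(axis=0) @ w                      # w . sum_i x(A_i)
assert np.isclose(E_sum_of_atomic_terms, E_of_structure_feature)
```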

6. A Dirac notation for representations. A representation maps a structure A (or one environment A_i) to a vector discretized by a feature index X. The bra-ket notation $\langle X | A; \mathrm{rep.}\rangle$ indicates this mapping in an abstract way, leaving plenty of room to express the details of a representation. [Figure: anatomy of the notation, labelling the feature index (radial and angular indices, chemical channels, centre) and the nature of the representation and target (structure or field, correlation order, rotational symmetry, parity).] The Dirac-like notation naturally reflects a change of basis, the construction of a kernel, or a linear model:
$\langle Y|A\rangle = \int \mathrm{d}X\, \langle Y|X\rangle \langle X|A\rangle$
$k(A, A') = \langle A|A'\rangle \approx \int \mathrm{d}X\, \langle A|X\rangle \langle X|A'\rangle$
$E(A) = \langle E|A\rangle \approx \int \mathrm{d}X\, \langle E|X\rangle \langle X|A\rangle$
Willatt, Musil, MC, JCP (2019); https://tinyurl.com/dirac-rep
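A minimal sketch, with random placeholders standing in for the discretized brakets: once $\langle X|A\rangle$ is stored as a feature vector, the three expressions above become plain linear algebra.

```python
import numpy as np

rng = np.random.default_rng(0)
n_X, n_Y = 16, 12
feat_A  = rng.normal(size=n_X)         # <X|A>
feat_Ap = rng.normal(size=n_X)         # <X|A'>
w_E     = rng.normal(size=n_X)         # <E|X>, weights of a linear property model
U       = rng.normal(size=(n_Y, n_X))  # <Y|X>, a change of basis

feat_A_in_Y = U @ feat_A               # <Y|A>   = int dX <Y|X><X|A>
k_AAp       = feat_A @ feat_Ap         # k(A,A') ~ int dX <A|X><X|A'>
E_A         = w_E @ feat_A             # E(A)    ~ int dX <E|X><X|A>
```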

7. Symmetrized field construction. Start from a non-symmetric representation (Cartesian coordinates). Define a decorated atom density $|\rho\rangle$ (permutation invariant). A translational average of the tensor product $|\rho\rangle \otimes |\rho\rangle$ yields the atom-centred (and $\hat{t}$-invariant) $|\rho_i\rangle$. Willatt, Musil, MC, JCP (2019)

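A minimal sketch of the real-space object this construction produces, assuming Gaussian smearing with an illustrative width sigma and cutoff r_cut, and omitting the chemical-species "decoration" for brevity: the smooth density of the neighbours of atom i, expressed relative to its position so that translation invariance is built in.

```python
import numpy as np

def atom_centred_density(coords, i, points, sigma=0.3, r_cut=4.0):
    """Gaussian-smeared density of the neighbours of atom i, evaluated at
    'points' given relative to atom i (illustrative parameters)."""
    rel = coords - coords[i]                       # positions relative to the centre
    d = np.linalg.norm(rel, axis=1)
    neigh = rel[(d > 0) & (d < r_cut)]             # exclude the centre, apply a cutoff
    diff = points[:, None, :] - neigh[None, :, :]
    return np.exp(-np.sum(diff**2, axis=-1) / (2 * sigma**2)).sum(axis=1)

rng = np.random.default_rng(0)
coords = rng.random((20, 3)) * 5.0                 # a toy 20-atom structure
points = rng.random((200, 3)) * 2.0 - 1.0          # sample points around the centre
rho_i = atom_centred_density(coords, i=0, points=points)
```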

8. A universal feature construction. Rotationally averaged representations are essentially the same n-body correlations that are used in statistical theories of liquids. Linear models built on $|\rho_i^{\otimes\nu}; g\to\delta\rangle$ yield a $(\nu+1)$-body potential expansion,
$V(A_i) = \sum_{j} V^{(2)}(r_{ij}) + \sum_{jk} V^{(3)}(r_{ij}, r_{ik}, \omega_{ijk}) + \dots$
Basically any atom-centred feature can be seen as a projection of $|\rho_i^{\otimes\nu}\rangle$. Willatt, Musil, MC, JCP (2019); Bartók, Kondor, Csányi, PRB (2013); Drautz, PRB (2019); Glielmo, Zeni, De Vita, PRB (2018)
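To make the lowest order concrete, here is a sketch assuming a simple Gaussian radial basis (an illustrative choice, not the slides' basis): the $\nu = 1$ features of an environment are a smoothed pair correlation of its neighbours, so a linear model built on them is a two-body potential $V^{(2)}$.

```python
import numpy as np

def radial_features(coords, i, r_cut=4.0, n_max=6, sigma=0.3):
    """nu = 1 features of environment A_i: the neighbour density projected on a
    set of Gaussian radial basis functions (all parameters are illustrative)."""
    centres = np.linspace(0.5, r_cut, n_max)
    d = np.linalg.norm(coords - coords[i], axis=1)
    d = d[(d > 0) & (d < r_cut)]                    # distances to neighbours of i
    # feature_n = sum_j g_n(r_ij): a smoothed two-body correlation of A_i
    return np.exp(-(d[:, None] - centres[None, :]) ** 2 / (2 * sigma**2)).sum(axis=0)

rng = np.random.default_rng(0)
coords = rng.random((20, 3)) * 5.0
x_i = radial_features(coords, i=0)
# A linear model on these features, V(A_i) ~ w @ x_i, is a pair potential whose
# shape is a linear combination of the radial basis functions.
```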

9. Variations on a theme. Most of the existing density-based representations and kernels emerge as special cases of this framework. Basis-set choice, e.g. a plane-wave basis for $|\rho_i^{\otimes 2}\rangle$ (Ziletti et al., Nat. Commun. 2018):
$\langle \mathbf{k} | A; \rho^{\otimes 2}\rangle = \sum_{ij\in A} e^{\mathrm{i}\,\mathbf{k}\cdot\mathbf{r}_{ij}}$
Projection on symmetry functions (Behler-Parrinello, DeepMD). Willatt, Musil, MC, JCP (2019), https://arxiv.org/pdf/1807.00408
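A minimal sketch of the plane-wave projection above, for an arbitrary set of k vectors (which k points to sample is the basis-set choice the slide refers to):

```python
import numpy as np

def plane_wave_features(coords, k_vectors):
    """<k|A; rho^(x2)> = sum_{ij in A} exp(i k . r_ij), one complex value per k
    (the trivial i = j terms are included and just add a constant N)."""
    r_ij = coords[:, None, :] - coords[None, :, :]       # all pair vectors r_i - r_j
    phases = np.einsum('ijd,kd->ijk', r_ij, k_vectors)   # k . r_ij
    return np.exp(1j * phases).sum(axis=(0, 1))

rng = np.random.default_rng(0)
coords = rng.random((10, 3)) * 5.0          # a toy structure
k_vectors = rng.normal(size=(4, 3))         # a handful of plane-wave vectors
features = plane_wave_features(coords, k_vectors)
```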
