Machine learning, incomputably large data sets, and the string landscape


  1. Machine learning, incomputably large data sets, and the string landscape
     2017 Workshop on Data Science and String Theory, Northeastern University, December 1, 2017
     Washington (Wati) Taylor, MIT
     Based in part on arXiv: 1510.04978, 1511.03209, 1710.11235, written in collaboration with Y. Wang

  2. Outline
     1. Comments on problems for machine learning and in the landscape
     2. The “skeleton” of the F-theory landscape: a very large graph

  3. Some comments on problems for machine learning and in the landscape
     Some personal reflections: ∼30 years ago I worked for a company named “Thinking Machines”.
     Goal: “Build a machine that would be proud of us” (D. Hillis)
     The company built a machine with 64k parallel processors; by 1997, ∼100 Gflops.
     Richard Feynman designed the communication network/routing system.

  4. As a side project, I worked on “evolving” neural-network-like systems; the ultimate goal was to play go, learning from scratch.
     A difficult problem, and there was not enough computer power; I went to grad school to learn string theory and stopped following AI in any detail.
     AlphaGo Zero astonished me!
     1. Computers are much faster, clearly.
     But also, I suspect:
     2. Humans are even less effective at playing go than I thought.
     A bit like physics research: we have some vague idea of our long-term strategy, and some technical and analytic tools that we apply in a fairly limited human fashion . . .

  5. What kinds of problems is machine learning currently best at?
     Classification problems: image/face recognition, speech recognition, . . .
     Generally a large data field, with categorization into a finite set of classes.
     Optimization problems: “pretty good” solutions in high-dimensional spaces with reasonably smooth local structure.
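To make the classification setting concrete, here is a minimal, hypothetical sketch (not from the slides): feature vectors summarizing elements of a large dataset, labeled by whether they exhibit some desired property, fed to an off-the-shelf classifier. The features, the labeling rule, and the threshold are all invented stand-ins for illustration only.

```python
# Hypothetical sketch: binary classification of landscape-like data.
# The "features" (e.g. ranks of gauge factors, topological invariants)
# and the labels are synthetic stand-ins, not real string-landscape data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Each row is a crude feature vector for one geometry/model (invented).
n_samples, n_features = 10_000, 8
X = rng.integers(0, 50, size=(n_samples, n_features)).astype(float)

# Invented label: "has desired property" if a combination of features
# crosses a threshold (a stand-in for e.g. "contains an SU(5) factor").
y = (X[:, 0] + 2 * X[:, 3] - X[:, 5] > 40).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```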

  6. Problems relevant for the string landscape
     Types of problems with issues that go beyond current machine learning:
     1. We don’t know what we’re doing and don’t have a framework
        — Fundamental background-independent formulation of ST/QG
        — Nonperturbative definition of F-theory
     2. We lack mathematical frameworks, even for semi-understood problems
        — Classify non-geometric flux vacua
        — Describe G2 manifolds with nonabelian symmetries
     3. Many things we don’t know how to compute
        — Classify Calabi-Yau threefolds (we can’t even prove there are finitely many)
        — Compute the superpotential and low-energy EFT for F-theory (nonperturbatively)
     4. We don’t know what physics we are looking for
        — SUSY/SUSY breaking?
        — GUT SU(5)? SO(10)? E6, E8? Non-Higgsable SU(3) × SU(2) × U(1)?
     I believe machine learning will not solve any of these problems anytime soon.

  7. Nonetheless, many possible applications of large-scale computation/ML
     Notwithstanding the preceding issues, we are getting a good enough handle on the landscape that we are beginning to have large datasets. Some (see below) are too large to enumerate (e.g. more elements than particles in the observable universe).
     • Many difficult computational problems
     • Need methods for dealing with large, poorly understood datasets (ML?)
       — Identify patterns, suggest hypotheses for theoretical advances
       — Look for elements combining desired features (e.g. SM gauge group, matter content, Yukawas, etc.; cf. Ruehle talk)
       — Statistics and structure of BIG datasets
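As a toy illustration of working with a set too large to enumerate (again not from the slides, and with an invented ensemble): rather than listing all elements, one can draw random samples and estimate the fraction with a desired feature, together with the usual Monte Carlo error bar. Both the "element" and the "feature" below are hypothetical placeholders.

```python
# Hypothetical sketch: Monte Carlo estimation over a combinatorial space
# far too large to enumerate.  The sampled "element" and the predicate
# below are invented stand-ins for actual landscape constructions.
import numpy as np

rng = np.random.default_rng(1)

def sample_element(n_nodes=100):
    """Draw a random 'construction': here just a random 0/1 matrix,
    a stand-in for a random point in a huge discrete landscape."""
    return rng.integers(0, 2, size=(n_nodes, n_nodes))

def has_desired_feature(adj):
    """Invented predicate, standing in for e.g. 'contains the SM gauge group'."""
    return adj.sum() % 7 == 0

n_draws = 20_000
hits = sum(has_desired_feature(sample_element()) for _ in range(n_draws))
p_hat = hits / n_draws
err = np.sqrt(p_hat * (1 - p_hat) / n_draws)  # standard Monte Carlo error
print(f"estimated fraction with feature: {p_hat:.4f} +/- {err:.4f}")
```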
