  1. Lattice calculations & DiRAC facility. Matthew Wingate, DAMTP, University of Cambridge. PPAP Community Meeting, 20-21 July 2017.

  2. Outline
  • Overview
  • Selected physics highlights
    • Flavour physics
    • Muon magnetic moment
    • QCD spectrum
  • DiRAC computing facility

  3. Lattice QCD
  • Use methods of effective field theory and renormalization to turn a quantum physics problem into a statistical physics problem
  • Quarks propagate through strongly interacting QCD glue plus a sea of quark-antiquark bubbles
  • Numerically evaluate path integrals using Monte Carlo methods: importance sampling & correlation functions
  • Numerical challenge: solving M x = b, where M is large and its condition number diverges as am_q → 0 (the product of lattice spacing and light quark mass vanishing)
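The cost of that linear solve can be illustrated with a toy conjugate-gradient solver. This is a sketch only: the random symmetric positive-definite matrix standing in for the lattice Dirac normal operator, and the (am)² shift playing the role of the quark-mass term, are illustrative assumptions, not the real M. The point it demonstrates is that CG iteration counts grow as the mass term shrinks and the condition number blows up.

```python
import numpy as np

def conjugate_gradient(M, b, tol=1e-6, max_iter=10_000):
    """Solve M x = b for symmetric positive-definite M; return (x, iterations)."""
    x = np.zeros_like(b)
    r = b - M @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for it in range(1, max_iter + 1):
        Mp = M @ p
        alpha = rs_old / (p @ Mp)
        x += alpha * p
        r -= alpha * Mp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            return x, it
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x, max_iter

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200))
K = A.T @ A                      # stand-in for the massless "kinetic" part
b = rng.standard_normal(200)
for am in (1.0, 0.1, 0.01):      # shrinking mass term -> worse conditioning
    M = K + (am**2) * np.eye(200)
    x, iters = conjugate_gradient(M, b)
    print(f"am = {am:5.2f}: {iters} CG iterations")
```

Real lattice codes attack exactly this growth with preconditioning, deflation, and multigrid rather than plain CG.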

  4. UKQCD consortium
  • 24 faculty at 8 UK institutions
  • Membership/leadership in several international collaborations (e.g. HPQCD, RBC-UKQCD, HadSpec, QCDSF, FastSum)
  • Broad range of physics: quark flavour, hadron spectrum, hot/dense QCD; BSM theories of EWSB, dark matter
  • Widespread impact: LHC, BES-III, Belle, JLab, J-PARC, FAIR, RHIC, NA62
  [Map of UK member institutions. Image credit: CIA World Factbook]

  5. Selected highlights
  Apologies for all the interesting work not mentioned here due to lack of time.

  6. Quark flavour physics
  The CKM matrix in the Wolfenstein parametrisation:

  V_{\rm CKM} =
  \begin{pmatrix} V_{ud} & V_{us} & V_{ub} \\ V_{cd} & V_{cs} & V_{cb} \\ V_{td} & V_{ts} & V_{tb} \end{pmatrix}
  = \begin{pmatrix}
      1 - \lambda^2/2 & \lambda & A\lambda^3(\rho - i\eta) \\
      -\lambda & 1 - \lambda^2/2 & A\lambda^2 \\
      A\lambda^3(1 - \rho - i\eta) & -A\lambda^2 & 1
    \end{pmatrix}
  + O(\lambda^4)

  Processes studied on the lattice that constrain its elements: K → πℓν, B → πℓν, B_(s) → D*_(s) ℓν, D → πℓν, D → Kℓν, B⁰_(s)-B̄⁰_(s) mixing, B_c → J/ψ ℓν
  [Figure: unitarity-triangle fit (CKMfitter) and W-mediated tree-level decay diagram]
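As a quick numerical check of the parametrisation above, one can verify that the matrix is unitary up to corrections of order λ⁴. The parameter values used here are illustrative round numbers near the fitted ones, not fit results:

```python
import numpy as np

# Illustrative (not fitted) Wolfenstein parameters
lam, A, rho, eta = 0.225, 0.81, 0.14, 0.35

V = np.array([
    [1 - lam**2 / 2,                    lam,             A * lam**3 * (rho - 1j * eta)],
    [-lam,                              1 - lam**2 / 2,  A * lam**2],
    [A * lam**3 * (1 - rho - 1j * eta), -A * lam**2,     1.0],
])

# Deviation from unitarity should be of order lambda^4
deviation = np.abs(V.conj().T @ V - np.eye(3)).max()
print(f"max |V†V - 1| = {deviation:.2e},  lambda^4 = {lam**4:.2e}")
```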

  7. Quark flavour physics: decay constants
  Decay constants parametrise ⟨0|J_μ|H⟩ and govern weak (or EM) annihilation; experiment accesses them through weak and EM decays, while lattice QCD supplies predictions and postdictions.
  [Figure: decay constants in GeV, lattice QCD vs experiment, for π, K, D, D_s, B, B_s, B_c, η_c, η_b, φ, ψ, Υ and excited states]
  Colquhoun et al. (HPQCD), arXiv:1503.05762: B_c → J/ψ ℓν
  Boyle et al. (RBC-UKQCD), arXiv:1701.02644: B → D* ℓν form factor ⟨D*(k)|J_μ|B(p)⟩ at zero recoil (preliminary)
  Harrison et al. (HPQCD), in preparation

  8. Quark flavour physics: flavour-changing neutral decays
  Loop-suppressed (penguin and box) decays: B_s → φ ℓ⁺ℓ⁻, B → K* ℓ⁺ℓ⁻
  [Figure: preliminary form factors f_V, f_A0, f_A1, f_A12, f_T1, f_T2, f_T23 versus E² [GeV²], at several light-quark masses am_l]
  Flynn et al. (RBC-UKQCD), arXiv:1612.05112
  Horgan et al. (HPQCD), arXiv:1310.3722, arXiv:1310.3887
  Also rare K decays:
  • K → π ℓℓ: Christ et al. (RBC-UKQCD), arXiv:1608.07585
  • K → π νν: Bai et al. (RBC-UKQCD), arXiv:1701.02858

  9. μ magnetic moment
  a_μ = (g − 2)/2 for the muon. Standard Model contributions (Blum et al., arXiv:1311.2198), in units of 10⁻¹¹:

    QED (γ + ℓ)      116 584 718.951 ± 0.009 ± 0.019 ± 0.007 ± 0.077_α
    HVP(lo) [20]           6 923 ± 42
    HVP(lo) [21]           6 949 ± 43
    HVP(ho) [21]           −98.4 ± 0.7
    HLbL                     105 ± 26
    EW                       154 ± 1
    Total SM [20]    116 591 802 ± 42_H-LO ± 26_H-HO ± 2_other (± 49 tot)
    Total SM [21]    116 591 828 ± 43_H-LO ± 26_H-HO ± 2_other (± 50 tot)

  The dominant theory uncertainties are hadronic: vacuum polarization (HVP) and light-by-light scattering (HLbL).
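The quoted total can be cross-checked by summing the central values in the table (a sketch; the Ref. [20] value of HVP(lo) is used):

```python
# Central values from the table above, in units of 1e-11
contributions = {
    "QED":     116_584_718.951,
    "HVP(lo)":       6_923.0,   # Ref. [20]
    "HVP(ho)":         -98.4,
    "HLbL":            105.0,
    "EW":              154.0,
}
total = sum(contributions.values())
# Table quotes Total SM [20] = 116 591 802 (± 49 tot)
print(f"Total SM = {total:.1f} x 1e-11")
```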

  10. μ magnetic moment
  Lattice determinations of the HVP contribution now compare directly with R-ratio results; isospin-breaking effects have been studied in an exploratory calculation.
  [Figures: isospin-breaking shift δV Π_u(Q̂²) versus Q̂²; comparison of a_μ^{HVP,LO} × 10¹⁰ from HPQCD (this paper), ETMC (1308.4327), Jegerlehner (1511.04473), Benayoun et al (1507.02943), Hagiwara et al (1105.3149), Jegerlehner et al (1101.2872), and the "no new physics" value]
  Chakraborty et al. (HPQCD), arXiv:1601.03071
  Boyle et al. (RBC-UKQCD), arXiv:1706.05293
  Aim: 1% precision in the lattice HVP within the next couple of years, plus first lattice-QCD efforts to estimate the HLbL and quark-disconnected contributions.

  11. Spectroscopy
  • Experimental discovery of "puzzling" hadronic resonances
  • X, Y, Z states defy the usual quarkonium description (e.g. exotic quantum numbers; some are charged)
  • Scalar D*_s0(2317) and axial-vector D_s1(2460) are much narrower and lighter than expected from the quark model
  • Lattice QCD can be used to study the excited-state spectrum, distinguishing bound states and determining scattering properties
  • Great care must be taken to correctly investigate resonance structure, then control systematic errors

  12. Charmonium (narrow)
  [Figure: charmonium spectrum. Green: good overlap with q̄q operators; red & blue: hybrid mesons; black: experiment. Exotic quantum numbers highlighted.]
  Cheung et al. (HadSpec), arXiv:1610.01073; m_π ≈ 240 MeV, a_s ≈ 0.12 fm, a_s/a_t ≈ 3.5

  13. Scattering amplitudes
  Isospin-1/2 Dπ, Dη, D_s K̄ scattering: 3 coupled channels!
  • Finite volume ⇒ discrete energy levels
  • Need to reconstruct the full scattering amplitude
  • Groundbreaking results, exploring new, sophisticated methods
  • Long programme ahead to control systematic uncertainties
  [Figure: ρ_i ρ_j |t_ij|² versus E_cm [MeV] for Dπ → Dπ, Dη → Dη, D_s K̄ → D_s K̄ and the off-diagonal channels, at m_π = 391 MeV]
  Moir et al. (HadSpec), arXiv:1607.07093

  14. DiRAC computing facility

  15. DiRAC 2
  • 2011: £15M BIS investment in a national distributed HPC facility for particle & nuclear physics, cosmology, & theoretical astrophysics. Recurrent costs funded by STFC
  • 2012: 5 systems deployed:
    • Extreme Scaling: 1.3 Pflop/s Blue Gene/Q (Edinburgh)
    • Data Analytic / Data Centric / Complexity: 3 tightly-coupled clusters with various levels of interconnectivity, memory, and fast I/O (Cambridge, Durham, Leicester)
    • Shared Memory System (SMP) (Cambridge)
  • Service started 1 December 2012

  16. DiRAC 2 outputs
  • 106 lattice publications, with 1,977 citations (as of 20/7/2017)
  • 765 publications across the broad PPAN scientific range, with 35,365 citations (as of 20/7/2017)
    • Gravitational waves, cosmology, galaxy & planet formation, exoplanets, MHD, particle phenomenology, nuclear physics
  • Valuable resource for PDRAs & PhD students
    • Scientific results, training in high-performance computing

  17. DiRAC 3
  • Continued success requires continued investment
  • Seek approx £25M capital investment to upgrade DiRAC-2 ×10, plus running costs for staff and electricity
  • Improve exploitation of research and HPC training impact, with PDRA and PhD support (CDTs)
  • Part of RCUK's e-Infrastructure roadmap
  [Diagram: DiRAC 3 (2016/17, TBC) service classes. Extreme Scaling: maximal computational effort applied to a problem of fixed size. Data Intensive / Data Analytics: tightly coupled compute & storage, confrontation of complex simulations with large data sets; big-data and internet-data analytics, data management, archiving, disaster recovery. Memory Intensive: larger memory footprint per node, problem size grows with increasing machine power; many-core coding, fine tuning, multi-threading, parallel programming.]

  18. DiRAC 2 (2011/12) → stop-gap funding: DiRAC 2.5 (2016/17) and DiRAC 2.5x (2017) → DiRAC 3 (2018/19)

  19. DiRAC 2.5
  After a £1.67M capital injection:
  • Extreme Scaling 2.5: 1.3 Pflop/s Blue Gene/Q
  • Data Analytic 2.5: share of the Peta5 system + continued access to the Sandybridge system
    • Shared EPSRC/DiRAC/Cambridge: 25K Skylake cores + 1.0 Pflop/s GPU + 0.5 Pflop/s KNL service
  • Data Centric 2.5: over 14K cores, 128 GB RAM/node
  • Complexity 2.5: 4.7K large-job cores + 3K small-job cores
  • SMP: 14.8 TB, 1.8K-core shared-memory service

  20. DiRAC 2.5x
  June 2017: £9M capital funding (BEIS), a lifeline to DiRAC 3. Planned investment:
  • Extreme Scaling: 1024-node, 2.5 Pflop/s system
  • Memory Intensive: 144 nodes, 4.6K cores, 110 TB RAM
  • Data Analytic: 128 nodes, 4K cores, 256 GB/node; hierarchy of fat nodes (1-6 TB); NVMe storage for data-intensive workflows
  • Additional storage at all DiRAC sites
  • Procurement procedure: November 2017
  • Target for hardware availability: April 2018
