Travelling securely on the Grid to the origin of the Universe


  1. Travelling securely on the Grid to the origin of the Universe. F-Secure SPECIES 2007 conference. Wolfgang von Rüden, Head, IT Department, CERN, Geneva. 24 January 2007. Zurich – January 2007

  2. CERN stands for over 50 years of
     • fundamental research and discoveries
     • technological innovation
     • training and education
     • bringing the world together
     Timeline: 1954, Rebuilding Europe (first meeting of the CERN Council); 1980, East meets West (visit of a delegation from Beijing); 2004, Global Collaboration (the Large Hadron Collider involves over 80 countries)

  3. CERN’s mission in Science
     • Understand the fundamental laws of nature: we accelerate elementary particles and make them collide, observe the results and compare them with the theory, and try to understand the origin of the Universe.
     • Provide a world-class laboratory to researchers in Europe and beyond
     • New: support world-wide computing using Grid technologies
     A few numbers:
     • 2500 employees: physicists, engineers, technicians, craftsmen, administrators, secretaries, …
     • 8000 visiting scientists (half of the world’s particle physicists), representing 500 universities and over 80 nationalities
     • Budget: ~1 billion Swiss francs per year, with additional contributions by participating institutes

  4.–7. (image-only slides, no text content)

  8. How does the Grid work?
     • It relies on advanced software, called middleware.
     • Middleware automatically finds the data the scientist needs, and the computing power to analyse it.
     • Middleware balances the load on different resources. It also handles security, accounting, monitoring and much more.
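The brokering step described on this slide can be pictured with a toy sketch: given a job that needs a particular dataset, pick the least-loaded site that holds it. The site names, load figures and dataset labels below are illustrative stand-ins, not real WLCG data or middleware APIs.

```python
# Toy sketch of a Grid resource broker: match a job's data needs to the
# sites that host the data, then pick the least-loaded candidate.
# All names and numbers here are illustrative, not real WLCG data.

def pick_site(sites, needed_dataset):
    """Return the least-loaded site that hosts the dataset."""
    candidates = [s for s in sites if needed_dataset in s["datasets"]]
    if not candidates:
        raise LookupError(f"no site hosts {needed_dataset!r}")
    # Load balancing: lowest ratio of running jobs to available CPUs wins.
    return min(candidates, key=lambda s: s["running_jobs"] / s["cpus"])

sites = [
    {"name": "Tier1-A", "cpus": 4000, "running_jobs": 3900, "datasets": {"atlas-raw"}},
    {"name": "Tier1-B", "cpus": 2000, "running_jobs": 1200, "datasets": {"atlas-raw"}},
    {"name": "Tier2-C", "cpus": 500,  "running_jobs": 100,  "datasets": {"cms-aod"}},
]

best = pick_site(sites, "atlas-raw")
print(best["name"])  # Tier1-B: lowest load fraction among sites holding the data
```

Real middleware such as gLite adds authentication, accounting and monitoring around this same matchmaking idea.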

  9. Why does CERN need the Grid?

  10. The LHC accelerator and the four experiments

  11. View of the LHC tunnel

  12. View of the ATLAS detector (under construction): 150 million sensors deliver data 40 million times per second
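The two figures on this slide imply an enormous raw data rate, which is why trigger systems must discard almost all events before anything reaches storage. A back-of-envelope check, where the bytes-per-reading value is an assumption for illustration only:

```python
# Figures from the slide: 150 million ATLAS sensors, read out
# 40 million times per second (the LHC bunch-crossing rate).

sensors = 150_000_000
readings_per_second = 40_000_000

raw_readings_per_s = sensors * readings_per_second
print(f"{raw_readings_per_s:.1e} sensor readings per second")  # 6.0e+15

# Even at a single byte per reading (an assumed value), the raw stream
# would be ~6 petabytes per second, far beyond what can be stored.
bytes_per_reading = 1  # assumption, for illustration only
petabytes_per_s = raw_readings_per_s * bytes_per_reading / 1e15
print(f"~{petabytes_per_s:.0f} PB/s of raw data")  # ~6 PB/s
```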

  13.–19. (image-only slides, no text content)

  20. Today’s installation at CERN: 4000 TB of NAS disk storage on 14’000 drives in 3500 boxes; 8500 CPUs (Linux); 45’000 tape slots installed and 170 high-speed drives (10 PB capacity)

  21.–23. Massive ramp-up during 2006-08 (build-up slides)

  24. Distribution of Computing Services. LCG summary of computing resource requirements, all experiments, 2008 (from the LCG TDR, June 2005):

                               CERN   All Tier-1s   All Tier-2s   Total
     CPU (MSPECint2000s)         25            56            61     142
     Disk (PetaBytes)             7            31            19      57
     Tape (PetaBytes)            18            35             -      53

     Shares: CPU: CERN 18%, Tier-1s 39%, Tier-2s 43%. Disk: CERN 12%, Tier-1s 55%, Tier-2s 33%. Tape: CERN 34%, Tier-1s 66%. (les robertson - cern-it-lcg)
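The pie-chart percentages on this slide follow directly from the table; they can be recomputed as a quick sanity check. (Rounding may differ by a point from the slide's own figures, e.g. the Tier-1 disk share rounds to 54% here versus 55% on the slide.)

```python
# Recompute the resource shares from the 2008 requirements table
# (LCG TDR, June 2005). CPU in MSPECint2000s, disk and tape in PB.

requirements = {
    "CPU":  {"CERN": 25, "Tier-1s": 56, "Tier-2s": 61},
    "Disk": {"CERN": 7,  "Tier-1s": 31, "Tier-2s": 19},
    "Tape": {"CERN": 18, "Tier-1s": 35},
}

for resource, shares in requirements.items():
    total = sum(shares.values())
    pct = {who: round(100 * v / total) for who, v in shares.items()}
    print(resource, "total:", total, pct)
```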

  25. WLCG Collaboration (LCG)
     • The Collaboration: 4 LHC experiments; ~120 computing centres; 12 large centres (Tier-0, Tier-1); 38 federations of smaller “Tier-2” centres; growing to ~40 countries
     • Memorandum of Understanding: agreed in October 2005, now being signed
     • Resources: commitment made each October for the coming year; 5-year forward look
     (les robertson - cern-it-lcg)

  26. Worldwide Grid for science: ~200 sites, some very big, some very small; 60 Virtual Organisations with >25 000 CPUs

  27. The EGEE project (Enabling Grids for E-sciencE)
     • EGEE started in April 2004; now in its 2nd phase, with 91 partners in 32 countries
     • Objectives: a large-scale, production-quality grid infrastructure for e-Science; attracting new resources and users from industry as well as science; maintaining and further improving the “gLite” Grid middleware; improving Grid security
     (EGEE-II INFSO-RI-031688)

  28.–30. Entering the Grid (users, Virtual Organizations, resources; authorization flow)
     • Users at institutes are authenticated using X.509 certificates issued by Certificate Authorities (CAs).
     • The International Grid Trust Federation (IGTF) brings common policies and standards among accredited CAs, maintaining trust between peer Grids (X.509/PKI).
     • Users are then registered in Virtual Organisations (VOs).
     • According to their role in the VO, users are authorized to use the Grid services at the sites.
     CERN – January 2007
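The two-step flow on these slides, authentication with an X.509 certificate from an accredited CA followed by role-based authorization inside a Virtual Organisation, can be sketched as follows. The CA list, VO registry and role map are toy stand-ins, not real IGTF or VOMS data.

```python
# Minimal sketch of Grid authentication and authorization.
# All identities, CAs, VOs and roles below are assumed, for illustration.

ACCREDITED_CAS = {"CERN CA", "UK e-Science CA"}           # IGTF-style trust anchors
VO_MEMBERS = {"atlas": {"alice@cern.ch": "analyst"}}      # VO registry (assumed)
VO_PERMISSIONS = {"analyst": {"submit-job", "read-data"}} # role -> allowed actions

def authenticate(cert):
    """Step 1: a user is trusted only if the certificate issuer is an accredited CA."""
    return cert["issuer"] in ACCREDITED_CAS

def authorize(cert, vo, action):
    """Step 2: authorization depends on the user's registered role within the VO."""
    if not authenticate(cert):
        return False
    role = VO_MEMBERS.get(vo, {}).get(cert["subject"])
    return role is not None and action in VO_PERMISSIONS.get(role, set())

cert = {"subject": "alice@cern.ch", "issuer": "CERN CA"}
print(authorize(cert, "atlas", "submit-job"))   # True
print(authorize(cert, "atlas", "delete-site"))  # False
```

In the real infrastructure, the separation matters: CAs vouch only for identity, while each VO decides independently what its members may do.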

  31.–34. Security Collaboration in the LHC Grid (initial picture by Ake Edlund)
     • International Grid Trust Federation: maintains the global trust relationships between the Certificate Authorities (trust anchor)
     • Joint Security Policy Group: provides a coherent set of security policies to be used by the Grids
     • Middleware Security Group: defines the security framework and architecture of the Grid software
     • Grid Security Vulnerability Group: handles Grid middleware security vulnerabilities
     • Operational Security Coordination Team: deals with operational issues, from best-practice recommendations to multi-site incident response coordination (with CERTs/CSIRTs)

  35. Applications on EGEE (Enabling Grids for E-sciencE): more than 25 applications from an increasing number of domains, including astrophysics, computational chemistry, earth sciences, financial simulation, fusion, geophysics, high energy physics, life sciences, multimedia, material sciences, … (EGEE-II INFSO-RI-031688)

  36. Example: EGEE attacks avian flu
     • EGEE was used to analyse 300,000 potential drug compounds against the bird flu virus, H5N1.
     • 2000 computers at 60 computer centres in Europe, Russia, Asia and the Middle East ran for four weeks in April, the equivalent of 100 years on a single computer.
     • Potential drug compounds are now being identified and ranked.
     (Neuraminidase, one of the two major surface proteins of influenza viruses, facilitates the release of virions from infected cells. Image courtesy Ying-Ta Wu, Academia Sinica.) (EGEE-II INFSO-RI-031688)

  37. Example: ITU project
     • International Telecommunication Union, ITU/BR Radiocommunication Sector: management of the radio-frequency spectrum and satellite orbits for fixed, mobile, broadcasting and other communication services
     • RRC-06 (15 May to 16 June 2006): 120 countries negotiated the new frequency plan for the introduction of digital broadcasting, in UHF (470-862 MHz) and VHF (174-230 MHz)
     • A demanding computing problem with short deadlines: using the EGEE grid, a full planning cycle could be completed in less than 1 hour
     (EGEE-II INFSO-RI-031688)
