ISDD Friday lecture: Bits, Bytes … and certainly more than just Microsoft, an overview of the ESRF computing infrastructure (PowerPoint presentation)
  1. ISDD Friday lecture Bits, Bytes … and certainly more than just Microsoft, an overview of the ESRF computing infrastructure Slide: 1

  2. Organisational
  • Today, many of us are computer experts … or at least computer literate
    • Home computing (PCs, smartphones, tablets, PlayStations, smart TVs, etc.)
    • Desktop computing (office applications, data analysis, etc.)
  • IT (Information Technology) or ICT (Information and Communication Technology) is transforming our lives
  • Two Divisions provide professional computing support at ESRF (ISDD and TID)

  3. Computing Groups/Units
  [Organisation chart: Management; Software; Information Systems & Web; Data Analysis; MIS; Windows; Web; Accelerator Control; Beamline Control; UNIX; Network; Hotline. Group heads include Claude and Jeremy.]

  4. Organisational
  • Monthly Computer Coordination Meetings (CCMs)
    • To discuss cross-divisional computing matters such as standards, support and developments
    • Participants:
      • ISDD: G. Beruyer, JM. Chaize, C. Ferrero, A. Götz
      • TID: R. Dimper, B. Lebayle, D. Porte
      • AF. Maydew (notes)
  [Cartoon caption: “At this point in the meeting we’ll open a discussion of whether or not we needed to have this meeting.”]
  • Bi-monthly Computer Security Working Group (CSWG) meetings
    • To discuss all matters concerning IT security, define policies, follow up incidents
    • Participants: F. Calvelo-Vazquez, R. Dimper, L. Duparchy, B. Dupré, B. Lebayle, AF. Maydew (notes), C. Rolland
  • Many thematic meetings: LINUX, Buffer Storage, etc.

  5. Overview
  This presentation is not about:
  • software design
  • software standards
  • control systems
  • field buses
  • stepper motor controllers
  • programmable logic controllers
  • digital electronics
  • Microsoft Office
  • OpenOffice
  • data analysis software
  • ISDD activities
  • EX2
  • CPER
  • … a million other interesting things

  This presentation is about:
  • the computer rooms
  • the network
  • data storage, data management
  • IT support
  • upcoming projects
  • … computer infrastructure!

  6. Outline
  • Organisational
  • Overview
  • Network
  • Computer rooms
  • Keeping our data safe
  • Analysing data
  • Around the desktop
  • It's all virtual
  • Databases
  • What's on our plate?

  7. Overview
  Today IT (Information Technology) underpins everybody's work.
  • Many systems/computers are critical for everyday work:
    • Desktop PCs, printers, network, Internet, databases, Management Information Systems, smartphones, …
  • … to assure functions like:
    • e-mail, Internet browsing, text editing, order processing, data analysis, vacation requests, …

  8. Overview
  Core mission of the ESRF: produce data (and publications!)
  The data life cycle:
  • Step 1: Generation
  • Step 2: Verification
  • Step 3: Transfer + Storage
  • Step 4: Transformation/Analysis
  • Step 5: Archival
  • Step 6: Publication
  • Step 7: Destruction
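The seven-step cycle above can be sketched as an ordered pipeline. A minimal Python sketch; the enum and helper names are illustrative, not part of any ESRF software:

```python
from enum import IntEnum
from typing import Optional

class DataLifecycle(IntEnum):
    """The seven steps of the data life cycle, in order (illustrative names)."""
    GENERATION = 1
    VERIFICATION = 2
    TRANSFER_STORAGE = 3
    TRANSFORMATION_ANALYSIS = 4
    ARCHIVAL = 5
    PUBLICATION = 6
    DESTRUCTION = 7

def next_step(step: DataLifecycle) -> Optional[DataLifecycle]:
    """Return the step that follows, or None once data is destroyed."""
    if step is DataLifecycle.DESTRUCTION:
        return None
    return DataLifecycle(step + 1)

print(next_step(DataLifecycle.GENERATION).name)  # VERIFICATION
```

Modelling the steps as an ordered enum makes the one-way nature of the cycle explicit: data can only move forward, and after Destruction there is no next step.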

  9. Outline: Organisational · Overview · Network · Computer rooms · Keeping our data safe · Analysing data · Around the desktop · It's all virtual · Databases · What's on our plate?

  10. Network
  IT is everywhere … the network is everywhere!
  • ESRF operates with a class B IP address range: 160.103.a.b (a = subnet, b = host)
  • Network speed: Mbps or Gbps = Mega- or Gigabits per second
  • Network backbone based on Extreme Networks switches:
    • BlackDiamond 8k switches with multiple 10 Gbps backbone links
    • On the beamlines: Extreme Summit X450-48P
    • 398 switches, all with 1 Gbps or 10 Gbps ports
    • Inter-switch links based on up to 8 x 10 Gbps ports
    • Extreme Networks = fast (10 Gbps wire-speed routing, filtering), reliable (dual power, dual management, dual modules), stable
    • First 40 Gbps ports ordered
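The 160.103.a.b addressing convention can be checked with Python's standard `ipaddress` module. A minimal sketch, assuming only the class B range stated above (the helper name is hypothetical):

```python
import ipaddress

# ESRF uses the class B range 160.103.0.0/16: in an address 160.103.a.b,
# the third octet (a) selects the subnet and the fourth (b) the host.
ESRF_NET = ipaddress.ip_network("160.103.0.0/16")

def subnet_and_host(addr: str) -> tuple:
    """Split an ESRF address into its (subnet, host) octets."""
    ip = ipaddress.ip_address(addr)
    if ip not in ESRF_NET:
        raise ValueError(f"{addr} is not in the ESRF range")
    # packed is the 4-byte big-endian form; octets 2 and 3 are a and b
    return ip.packed[2], ip.packed[3]

print(subnet_and_host("160.103.5.42"))  # (5, 42)
```

A /16 allows 256 subnets of up to 254 usable hosts each, which matches the scale of the inventory on the next slide (280 networks, once additional private ranges are counted).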

  11. Network Inventory
  • 280 networks
  • 8 627 nodes
  • 46 routers
  • 398 network switches
  • >15 000 x 1 Gbps-capable copper ports
  • >1 000 x 10 Gbps fibre ports
  • Beamlines with 10 Gbps uplinks: BM5, BM14, ID14, ID15, ID17Sat1, ID19, ID20, BM23, ID23, ID24, ID29, ID30
  • Computers with “private” 10 Gbps links: hexsalsa (ID15), wid15dimax (ID15), id19sat1 (ID19), lid29io (ID29), id29gate (ID29)
  And the network is also:
  • Wi-Fi, SSL gateways, firewall, copper cabling, fibre-optic cabling, network monitoring and …
  • Network standby for the accelerators and beamlines

  12. Network Synopsis
  [Network diagram: Internet uplink, Central Building computer room, Control Room building computer room, offices, standard and high-throughput beamlines; link speeds of 100 Mbps, 1 Gbps, 10 Gbps, 40 Gbps and 80 Gbps, plus backup links.]

  13. Network
  ESRF/ILL/EMBL connected via RENATER

  14. Network – Internet cabling
  [Cabling diagram: dual Metronet/Tigre fibre routes (Tigre 1 and Tigre 2, one active and one passive at each end) linking the ESRF/Central Building, ILL/ILL17 and EMBL to Avenue des Martyrs / INPG and the A480 / Campus St Martin d'Hères; fibre-optic terminations, site routers and active devices at buildings A, B, C, D1, D2, E, H2, Z5, ILL8, the Restaurant, the Site Entrance and the Roundabout.]

  15. Network – Firewall et al.
  [Firewall diagram: dual Tigre1/Tigre2 links to the RENATER router in Grenoble (BGP on both paths), a PacketShaper, a firewall+router and a level-2 switch feeding separate DMZs and the ESRF, ILL and EMBL LANs on the ESRF and ILL premises.]

  16. Outline: Organisational · Overview · Network · Computer rooms · Keeping our data safe · Analysing data · Around the desktop · It's all virtual · Databases · What's on our plate?

  17. Computer rooms / data centres
  • Two computer rooms (data centres):
    • CTRM: 150 kW electrical power, 110 m²
    • Central Building: 370 kW electrical power, 300 m²
  • Why? Never put all eggs into the same basket:
    • keep a copy of all data in the two rooms (disks + tapes)
    • split fault-tolerant systems between the two rooms
  • Many technical rooms, at least one in each building (network hubs)
  Why a new + bigger data centre?
  • Insufficient power
  • Insufficient cooling
  • Insufficient floor space
  • Inadequate infrastructure
  • Instant provisioning required: rack space, network connections, power outlets

  18. Data Centre – Construction
  • Built around the existing computer room; all equipment kept operational during the works
  • Reinforced slab and false floor supporting 1000 kg/m²
  • Fireproof glass windows
  • Noise reduction

  19. Data Centre – Construction
  • 10 months (without preparatory works)
  • Dust minimized
  • Noise minimized
  • Disturbance minimized
  • Cooling kept efficient
  • Computing equipment kept up and running (even when replacing the racks!)

  20. Data Centre – Design
  • 300 m²
  • 370 kW
  • 1000 kg/m²
  • Cold aisle / hot aisle layout
  • Low-density area: 66 racks, 170 kW
  • High-density area: 10 racks, 200 kW
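The two areas differ sharply in power density. A back-of-envelope check of the figures above:

```python
# Average power density per rack for the two data-centre areas,
# using the rack counts and power budgets from the slide.
low_density = 170 / 66    # kW per rack, low-density area (66 racks, 170 kW)
high_density = 200 / 10   # kW per rack, high-density area (10 racks, 200 kW)

print(f"low density:  {low_density:.1f} kW/rack")   # low density:  2.6 kW/rack
print(f"high density: {high_density:.1f} kW/rack")  # high density: 20.0 kW/rack
```

At roughly 20 kW per rack, the high-density area sits well above what free airflow through perforated floor tiles can cool, which is why it gets its own dedicated cooling (see the slide on the high-density area below).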

  21. Data Centre – behind the scenes
  • Dual power supply for all equipment
  • Dual UPS in separate rooms
  • Aerial cable trays for electricity + network
  • Flexible and modular electrical distribution
  • Dual cooling system = chilled water + air exchangers
  • Smoke extraction system (in case of fire)
  • Chilled-water circuit for the high-density area
  • False floor: several fan-equipped tiles

  22. Data Centre – cooling
  [Section view of the cold aisle / hot aisle principle: cold air is injected through the false floor in front of the racks; hot air is extracted behind them.]

  23. Data Centre – cooling
  [Aerial view of the cold aisle / hot aisle principle.]

  24. Why a high-density area?
  • A perforated tile is not sufficient for cooling a single rack full of powerful servers (20-30 kW/rack); free airflow typically allows for 10-15 kW/rack maximum
  • More efficient: cold air does not have to be pushed over 20 metres to the computers
  • More reliable: one of the 6 dedicated AC units can fail without consequence
  [Top view diagram: racks and computers served by 6 dedicated AC units, with access door.]

  25. Data Centre
  [Photos: “The cube” (closed hot aisle, up to 200 kW) and two rows of racks (cold aisle).]

  26. Data Centre – to house what?
  • Network equipment
  • Disk systems
  • Tape libraries
  • Infrastructure servers
