World Wide Computing and the ATLAS Experiment
Taipei, 25th July 2004
Roger Jones, ATLAS International Computing Board Chair
RWL Jones, Lancaster University
We benefit from all developments
We have problems maintaining coherence
This may not be what funders like the EU want to hear!
40 MHz bunch-crossing rate
Real-time selection
Leptons, jets
[Diagram: ATLAS trigger/DAQ dataflow — detector readout (RODs/ROBs/ROS) into LVL1, Region-of-Interest-based LVL2 (L2 processors and supervisor), Event Builder (DFM, SFI/SFO) at ~4 GB/s, and an Event Filter processor farm (~sec per event) in the High-Level Trigger]
1.3 kSI2k ≈ one 3.2 GHz Pentium 4
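The benchmark above lets the capacity figures quoted later in the talk be translated into counts of 2004-era PCs. A minimal sketch, assuming the ~7 MSI2k Event Filter and ~5 MSI2k Tier-0 figures from the facility overview:

```python
# Convert SI2k capacity figures into equivalent 2004-era PC counts,
# using the slide's rule of thumb: one 3.2 GHz Pentium 4 ~ 1.3 kSI2k.
P4_KSI2K = 1.3

def boxes_needed(capacity_msi2k: float) -> int:
    """Number of P4-class boxes providing the given capacity in MSI2k."""
    return round(capacity_msi2k * 1000 / P4_KSI2K)

print(boxes_needed(7.0))  # Event Filter, ~7 MSI2k -> ~5400 boxes
print(boxes_needed(5.0))  # Tier-0, ~5 MSI2k -> ~3800 boxes
```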
Institutes provide this to ATLAS computing through in-kind local capacity, integrated with the ATLAS infrastructure
Data stay in place and are accessed by ATLAS collaborators
Collaborative tools provide the connection to the collaboration
But also drain resources from our core activities
Made clear the need for an integrated system
One external driver is sustainable; two are not!
Slice test of the computing activities in 2007
Express lines, calibration and alignment lines, different output streams
Re-calibration: new calibrations and alignment parameters
Re-processing
Analysis using ATLAS Distributed Analysis in the late phase
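The streaming step above can be pictured as routing each accepted event into one or more output streams. A minimal sketch; the stream names and trigger-tag scheme here are illustrative assumptions, not the actual ATLAS stream model:

```python
# Illustrative routing of events into output streams at the Tier-0:
# a main physics stream plus express and calibration/alignment lines.
def route(event: dict) -> list[str]:
    """Return the list of output streams an event is written to."""
    streams = [event.get("physics_stream", "physics")]
    if event.get("express"):   # fast path for prompt monitoring
        streams.append("express")
    if event.get("calib"):     # feeds calibration/alignment processing
        streams.append("calibration")
    return streams

print(route({"physics_stream": "muons", "express": True}))
```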
Wrap layers around applications to simplify deployment. Very important tools for data management: MAGDA and …
[Diagram: ATLAS production system — supervisors driving LCG, NorduGrid (NG), Grid3 (G3) and LSF executors; communication via Jabber and SOAP; RLS file catalogues]
[Diagram: The LHC Computing Facility — CERN at the centre; regional centres in Germany, Taipei (ASCC), UK, France, Italy, NL, USA (Brookhaven), …; down to physics department and desktop resources]
[Diagram: hierarchical computing model — detector output (~PB/s) into the Event Builder and Event Filter (~7 MSI2k), 450 Mb/s into the Tier-0 (~5 MSI2k, >10 GB/s internally); ~300 MB/s per Tier-1 per experiment to regional centres (RAL for the UK, plus US, French and Asian centres); Tier-2 centres of ~200 kSI2k (e.g. a Northern Tier of Sheffield, Manchester, Liverpool and Lancaster, ~0.25 TIPS) connected at ≥622 Mb/s; 100–1000 MB/s to institute workstations, with monitoring links to institutes]

PC (2004) = ~1 kSpecInt2k
Each Tier-2 has ~25 physicists working on one or more channels
Each Tier-2 should have the full AOD, TAG and relevant physics-group summary data
Tier-2s do the bulk of simulation and act as a physics data cache
N Tier-1s each store 1/N of the raw data, reprocess it and archive the ESD; each holds 2/N of the current ESD for scheduled analysis, plus all AOD and TAG
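The Tier-1 sharing rule above (1/N of raw, 2/N of current ESD, full AOD+TAG per site) can be sketched as a small helper. The choice of N = 6 Tier-1s in the usage line is an assumption for illustration:

```python
# Per-site data fractions under the N-Tier-1 sharing rule described
# above. Exact fractions are kept with the fractions module.
from fractions import Fraction

def t1_fractions(n_tier1: int) -> dict:
    """Fraction of each dataset held at one of n_tier1 Tier-1 centres."""
    return {
        "raw": Fraction(1, n_tier1),                          # 1/N of raw data
        "esd_current": min(Fraction(2, n_tier1), Fraction(1)),  # 2/N of current ESD
        "aod_tag": Fraction(1),                               # all AOD + TAG
    }

print(t1_fractions(6))  # assumed N = 6 external Tier-1s
```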
No assumption of a single site (especially for Tier-2s), but each must present as a single entity in human and response terms
External Tier-1: storage requirement

  Dataset                  Disk (TB)   Tape (TB)   Fraction
  General ESD (curr.)         429         150        1/3
  General ESD (prev.)         214         150        1/6
  AOD                         257         180        1/1
  TAG                           3           2        1/1
  RAW data (sample)             6         533        1/6
  RAW sim                       0.0        33.3      1/6
  ESD sim (curr.)              23.8         8.3      1/3
  ESD sim (prev.)              11.9         8.3      1/6
  AOD sim                      14          10        1/1
  Tag sim                       –           –        1/1
  User data (20 groups)       171         120        1/6
  Total                      1130        1195
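As a quick consistency check, summing the per-dataset rows of the external Tier-1 table reproduces the quoted totals of 1130 TB disk and 1195 TB tape (up to rounding):

```python
# Row data from the external Tier-1 storage table: dataset -> (disk_TB, tape_TB)
rows = {
    "General ESD (curr.)": (429, 150),
    "General ESD (prev.)": (214, 150),
    "AOD":                 (257, 180),
    "TAG":                 (3, 2),
    "RAW data (sample)":   (6, 533),
    "RAW sim":             (0.0, 33.3),
    "ESD sim (curr.)":     (23.8, 8.3),
    "ESD sim (prev.)":     (11.9, 8.3),
    "AOD sim":             (14, 10),
    "User data (20 grp)":  (171, 120),
}

disk = sum(d for d, _ in rows.values())
tape = sum(t for _, t in rows.values())
print(round(disk), round(tape))  # 1130 1195, matching the table totals
```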
External Tier-2: storage requirement

  Dataset                   Disk (TB)   Tape (TB)   Fraction
  General ESD (curr.)          26           –         1/50
  General ESD (prev.)           –          18         1/50
  AOD                          64           –         1/4
  TAG                           3           3         1/1
  ESD sim (curr.)               1.4         –         1/50
  ESD sim (prev.)               1           –         1/50
  AOD sim                      14          10         1/1
  User data (600/6/4=25)       37          26
  Total                       146          57
CERN Tier-1/2: storage requirement

  Dataset                   Disk (TB)   Tape (TB)   Fraction
  General ESD (curr.)          26           –         1/50
  General ESD (prev.)           –          18         1/50
  AOD                         257           –         1/1
  TAG                           3           –         1/1
  ESD sim (curr.)               5.7         –         2/25
  ESD sim (prev.)               5.7         –         2/25
  AOD sim                      14           –         2/25
  Tag sim                       –           –
  User data (100 users)       149         104
  Total                       460         122
  Resource         CERN   All Tier-1   All Tier-2   Total
  Auto tape (PB)    4.4       7.2          1.4       12.9
  Disk (PB)         0.5       6.8          3.5       10.8
  CPU (MSI2k)       4.8      12.7          4.8       22.2
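The Tier-1 column of this summary can be roughly cross-checked against the per-site Tier-1 table: assuming six external Tier-1s (the count is an assumption; the 1130 TB disk and 1195 TB tape figures per site come from that table), the totals come out close to the summary values:

```python
# Cross-check: six external Tier-1s at the per-site storage figures
# should roughly reproduce the Tier-1 column of the resource summary.
N_T1 = 6          # assumed number of external Tier-1 centres
t1_disk_pb = N_T1 * 1130 / 1000  # ~6.8 PB disk (summary: 6.8)
t1_tape_pb = N_T1 * 1195 / 1000  # ~7.2 PB tape (summary: 7.2)
print(round(t1_disk_pb, 1), round(t1_tape_pb, 1))  # 6.8 7.2
```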
[Diagram: distributed-analysis architecture — client tools (ROOT command line, GANGA command line, GANGA GUI, graphical job builder) sit above high-level services exposed through AJDL interfaces (analysis service, GANGA task and job management, dataset splitter and merger, catalogue services), which in turn sit above the middleware service interfaces (CE, WMS, file catalogue, …)]
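The dataset splitter and merger services in this architecture suggest the usual split/process/merge pattern for distributed analysis. A generic sketch of that pattern; the function names are illustrative and this is not the GANGA or AJDL API:

```python
# Generic split/process/merge pattern: partition a dataset's files into
# sub-jobs, run each independently, then merge the per-job outputs.
def split(dataset: list, n_jobs: int) -> list[list]:
    """Partition a dataset's files into n_jobs roughly equal sub-jobs."""
    return [dataset[i::n_jobs] for i in range(n_jobs)]

def merge(partial_results: list[list]) -> list:
    """Merge per-job outputs back into one flat result set."""
    return [item for part in partial_results for item in part]

files = [f"file_{i}.root" for i in range(10)]
jobs = split(files, 3)
outputs = [[f"hist({f})" for f in job] for job in jobs]  # stand-in for running each job
merged = merge(outputs)
print(len(jobs), len(merged))  # 3 10
```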