  1. Status of the ATLAS Experiment at the LHC Accelerator at CERN
FAKT-Tagung, Langenlois, 24.9.2007
E. Kneringer, Institut für Astro- und Teilchenphysik, Universität Innsbruck

  2. Outline
- Introductory comments
  - The Innsbruck group
- Main part
  - ATLAS collaboration
  - ATLAS installation schedule
  - ATLAS cavern
  - Milestones and commissioning
  - Subdetector and TDAQ systems
  - Offline computing, physics analysis on the Grid
  - Detector paper, CSC notes
  - Analysis strategies, statistics forum
  - Physics: QCD
- Something to smile about

  3. ATLAS Collaboration
35 countries, 164 institutions, 1900 scientific authors in total (1550 with a PhD, for the M&O share)
New application for CB decision: Göttingen, Germany
New Expressions of Interest: Santiago/Valparaíso, Chile; Bogotá, Colombia
Albany, Alberta, NIKHEF Amsterdam, Ankara, LAPP Annecy, Argonne NL, Arizona, UT Arlington, Athens, NTU Athens, Baku, IFAE Barcelona, Belgrade, Bergen, Berkeley LBL and UC, HU Berlin, Bern, Birmingham, Bologna, Bonn, Boston, Brandeis, Bratislava/SAS Kosice, Brookhaven NL, Buenos Aires, Bucharest, Cambridge, Carleton, Casablanca/Rabat, CERN, Chinese Cluster, Chicago, Clermont-Ferrand, Columbia, NBI Copenhagen, Cosenza, AGH UST Cracow, IFJ PAN Cracow, DESY, Dortmund, TU Dresden, JINR Dubna, Duke, Frascati, Freiburg, Geneva, Genoa, Giessen, Glasgow, LPSC Grenoble, Technion Haifa, Hampton, Harvard, Heidelberg, Hiroshima, Hiroshima IT, Indiana, Innsbruck, Iowa SU, Irvine UC, Istanbul Bogazici, KEK, Kobe, Kyoto, Kyoto UE, Lancaster, UN La Plata, Lecce, Lisbon LIP, Liverpool, Ljubljana, QMW London, RHBNC London, UC London, Lund, UA Madrid, Mainz, Manchester, Mannheim, CPPM Marseille, Massachusetts, MIT, Melbourne, Michigan, Michigan SU, Milano, Minsk NAS, Minsk NCPHEP, Montreal, McGill Montreal, FIAN Moscow, ITEP Moscow, MEPhI Moscow, MSU Moscow, Munich LMU, MPI Munich, Nagasaki IAS, Nagoya, Naples, New Mexico, New York, Nijmegen, BINP Novosibirsk, Ohio SU, Okayama, Oklahoma, Oklahoma SU, Oregon, LAL Orsay, Osaka, Oslo, Oxford, Paris VI and VII, Pavia, Pennsylvania, Pisa, Pittsburgh, CAS Prague, CU Prague, TU Prague, IHEP Protvino, Regina, Ritsumeikan, UFRJ Rio de Janeiro, Rome I, Rome II, Rome III, Rutherford Appleton Laboratory, DAPNIA Saclay, Santa Cruz UC, Sheffield, Shinshu, Siegen, Simon Fraser Burnaby, SLAC, Southern Methodist Dallas, NPI Petersburg, Stockholm, KTH Stockholm, Stony Brook, Sydney, AS Taipei, Tbilisi, Tel Aviv, Thessaloniki, Tokyo ICEPP, Tokyo MU, Toronto, TRIUMF, Tsukuba, Tufts, Udine, Uppsala, Urbana UI, Valencia, UBC Vancouver, Victoria, Washington, Weizmann Rehovot, FH Wiener Neustadt, Wisconsin, Wuppertal, Yale, Yerevan

  4. The ATLAS detector
[Figure: ATLAS superimposed on the five floors of building 40]
Inner Detector (ID) tracker: Si pixel and strip detectors + transition radiation tracker
  σ(d0) = 15 μm @ 20 GeV; σ/pT ≈ 0.05%·pT ⊕ 1%
Calorimeters: liquid-argon EM calorimeter, Tile hadronic calorimeter
  EM: σE/E = 10%/√E ⊕ 0.7%; Had: σE/E = 50%/√E ⊕ 3%
Muon spectrometer: drift tubes and cathode strip chambers for precision tracking; RPC and TGC for triggering
  σ/pT ≈ 2-7%
Magnets: solenoid (ID) → 2 T; air-core toroids (muon) → up to 4 T
Overall dimensions: diameter 25 m, barrel toroid length 26 m, end-cap end-wall chamber span 46 m
Overall weight 7000 tons, 3000 km of cables, ~10^8 readout channels
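The ⊕ in these parameterizations denotes addition in quadrature. As a quick illustration of how the resolution formulas are evaluated at a given energy or momentum, here is a minimal Python sketch (the function names are our own, not ATLAS software):

```python
import math

def em_resolution(E):
    """Relative EM calorimeter resolution: sigma_E/E = 10%/sqrt(E) (+) 0.7%, E in GeV.
    math.hypot adds the stochastic and constant terms in quadrature."""
    return math.hypot(0.10 / math.sqrt(E), 0.007)

def had_resolution(E):
    """Relative hadronic calorimeter resolution: sigma_E/E = 50%/sqrt(E) (+) 3%."""
    return math.hypot(0.50 / math.sqrt(E), 0.03)

def id_pt_resolution(pt):
    """Relative ID momentum resolution: sigma/pT = 0.05% * pT (+) 1%, pT in GeV."""
    return math.hypot(0.0005 * pt, 0.01)

# Example: at 100 GeV the EM resolution is ~1.2%, the hadronic ~5.8%.
print(f"EM  @ 100 GeV: {em_resolution(100.0):.2%}")
print(f"Had @ 100 GeV: {had_resolution(100.0):.2%}")
print(f"ID  @ 100 GeV: {id_pt_resolution(100.0):.2%}")
```

Note how the constant term dominates the EM resolution at high energy, while the tracker resolution degrades linearly with pT: the two systems are complementary.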

  5. The underground cavern at Pit-1 for the ATLAS detector
[Photo: the surface site with the Globe; Side A (airport side) and Side C labelled]
Length = 55 m, width = 32 m, height = 35 m
"Building a ship in a bottle"

  6. ATLAS cavern (Sept. 26, 2005)
[Photo: the Barrel Toroid in the cavern]

  7. ATLAS cavern (Sept. 23, 2007)
Two years later the cavern is full!
[Photo: the Muon Big Wheel]

  8. End Cap Toroid A: 29 May 2007, on schedule
15 m high, 5 m wide, 240 tons

  9. ATLAS installation schedule v9.2 (14 Sep 2007): now close to the end; installation finishes at the end of the year (muon small wheels last)

  10. Milestones
- Milestone weeks (typically 2 weeks in duration)
  - operating the experiment as a whole, from subdetector to permanent storage
  - an increasing number of subdetectors involved at each stage
  - 12 subdetectors in total + computing + power and cooling infrastructure + detector control systems + safety systems
  - data taking (cosmic runs, triggers at 3 Hz for M3)
- First week: stable running of systems previously integrated
- Second week: integration of new components into the global system

  11. ATLAS detector commissioning

  12. Offline Monitoring and Reconstruction
For the first time, Tier-0 processing was included as part of the commissioning run and ran during most of the M3 data-taking period. The Tier-0 infrastructure picked up the data files written to CASTOR by the DAQ and ran the offline reconstruction (provided by the Offline Commissioning Group). The complete offline reconstruction chain was used to reconstruct cosmic-ray data from part of the inner detector, the calorimeters and the muon system. Tracks were fitted to the data from the inner detector and the muon chambers.
The full monitoring chain, which will be used to check correct performance across the detector, was also used to produce the relevant monitoring histograms for each sub-detector. In a subsequent processing step, the monitoring histograms produced by the individual reconstruction jobs were merged to provide longer-term data quality monitoring. The history plot, taken on Monday, June 18, 2007, is an example from this Tier-0 monitoring. Monitoring tools were also used to check synchronization among the different sub-systems.
[Figures: cosmic-ray tracks recorded in the barrel TRT during the M3 tests; an ATLANTIS event display showing a cosmic track fitted to hits in the ID and muon chambers]
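The merging step described above is, at its core, a bin-by-bin sum of identically binned histograms produced by independent jobs. A minimal sketch of the idea in plain Python (the data layout is invented for illustration; the real chain operates on ROOT histograms inside the ATLAS monitoring framework):

```python
from collections import defaultdict

def merge_histograms(jobs):
    """Merge monitoring histograms from independent reconstruction jobs.

    Each job is a dict mapping a histogram name to a list of bin contents.
    Histograms with the same name are assumed to share the same binning,
    so merging reduces to a bin-by-bin sum.
    """
    merged = defaultdict(list)
    for job in jobs:
        for name, bins in job.items():
            if not merged[name]:
                merged[name] = list(bins)  # first occurrence: take a copy
            else:
                merged[name] = [a + b for a, b in zip(merged[name], bins)]
    return dict(merged)

# Two jobs each filled a 4-bin occupancy histogram; the merged histogram
# provides the longer-term data-quality view.
job1 = {"trt_occupancy": [5, 9, 7, 2]}
job2 = {"trt_occupancy": [3, 4, 6, 1]}
print(merge_histograms([job1, job2]))  # {'trt_occupancy': [8, 13, 13, 3]}
```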

  13. Milestones and integration schedule
M1 (11-19/12 2006, 2 days). New: DAQ R/O. Detector: barrel calorimeters (barrel LAr & Tile). Operations: achieve a combined run; Tile cosmic trigger.
M2 (28/2 to 13/3 2007, 2 x weekends). New: DAQ/EB, DAQ v1.7, monitoring/DQ. Detector: barrel calorimeters, barrel muon (sector 13). Operations: combined and mixed runs; Tile + RPC cosmic trigger; periodic cosmic runs after M2.
M3 (4/6 to 18/6 2007, 1 week). New: barrel SCT, barrel TRT, barrel muon (sectors 5 & 6), end-cap muon (MDT, TGC), Offline. Detector: barrel and end-cap calorimeters, barrel SCT and TRT, end-cap muon MDT and TGC. Operations: first week focused on operations, checklist management, coordination between desks; Tile + muon cosmic trigger (side A).
M4 (23/8 to 3/9 2007, 1 week + 2-day setup + 2 weekends). New: Level-1 Calo, HLT, DAQ 1.8, SCT R/O, Offline, Level-1 Mu and Calo. Detector: barrel & end-cap calorimeters + muon, barrel TRT. Operations: ATLAS-like operations; use of DQ assessment; also try the calorimeter trigger.
M5 (16/10 to 23/10 2007, 1 week). New: ID end-cap (TRT), Pixel. Detector: converging to the full ATLAS detector. Operations: ATLAS-like operations.
M6 (November/December). New: end-cap magnets. Detector: as for M5, magnets on. Operations: ATLAS-like operations; global cosmic run; run during the magnet test.

  14. The ATLAS M4 effort featured in the ISG weekly e-newsletter

  15. Final Dress Rehearsal
A complete test of the final chain of data handling, distribution and analysis, from the last stage of TDAQ to the user's laptop.
- Simulate 1 complete LHC fill (~10 hours of data taking) → ~7·10^6 events
- Mix and filter events at MC generator level to get the correct physics mixture as expected at the HLT output
- Pass events through G4 simulation (realistic "as installed" detector geometry)
- Produce byte streams → emulate the raw data format
- Send "raw data" to Point 1, inject at the Sub-Farm Output (SFO), write out events to separate streams, closing files at luminosity-block boundaries
- Send events from Point 1 to Tier-0; imitate the final file structure and movement
- Perform calibration and alignment at Tier-0/Tier-1s/Tier-2s
- Run reconstruction at Tier-0/Tier-1s → produce ESD, AOD, TAGs
- Distribute ESD, AOD, TAGs to Tier-1s and Tier-2s
- Perform distributed analysis
First test with 3.6 million events. Timescale (4 phases): … Phase 3: 17-21 September 2007; Phase 4: 22-26 October 2007 (1 SFO) (29/10 - 6/11: N SFOs)
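For scale, ~7·10^6 events in ~10 hours corresponds to an average recording rate of roughly 200 Hz (7e6 / 36000 s ≈ 194 Hz). One mechanically interesting step in the chain is closing SFO output files at luminosity-block boundaries, so that each raw file maps to a well-defined slice of data taking. A hedged sketch of that grouping logic, with an invented event representation and file-naming scheme:

```python
from itertools import groupby
from operator import itemgetter

def split_by_lumiblock(events):
    """Group a time-ordered event stream into per-luminosity-block 'files'.

    `events` is an iterable of (lumiblock, event_id) tuples, assumed to be
    ordered in time so that each luminosity block arrives contiguously.
    Returns a dict mapping an output file name to its list of event ids.
    """
    files = {}
    for lb, group in groupby(events, key=itemgetter(0)):
        files[f"stream_physics.lb{lb:04d}.raw"] = [ev for _, ev in group]
    return files

# A toy stream spanning two luminosity blocks:
stream = [(1, 100), (1, 101), (1, 102), (2, 103), (2, 104)]
for fname, evts in split_by_lumiblock(stream).items():
    print(fname, evts)
# stream_physics.lb0001.raw [100, 101, 102]
# stream_physics.lb0002.raw [103, 104]
```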

  16. Computing Coordination
At the beginning of 2007 it became clear that an enhanced level of communication was needed between the ATLAS computing organisation and the Tier-1 centres.
→ Visits to Tier-1 computing centres around the world
Main recurring discussion points during the visits:
- a general Tier-1 description
- hardware and human resources
- storage set-up
- local and remote networking
- databases and the 3D project
- installed middleware and Grid services
- the regional Tier-2/3 organisation
- re-processing and how to organise the recall of data from tape

  17. Analysis strategies
Question: How are the different physics groups planning to organize the analysis of the first data, and what strategies (simple cut-based analyses, multivariate techniques, blind analyses) will be used to perform the analysis?
Issues addressed by the Combined Performance groups:
- How are particle signatures (e, gamma, muons, taus, jets, ET,miss) going to be validated?
- What strategies will be used to validate and understand the object identification (efficiencies, fake rates from early data)?
- What physics processes will be used to do the validation?
Issues addressed by the Physics groups:
- What are the first measurements that will be performed with data corresponding to 10, 100 and 1000 pb^-1?
- What strategies will be used to estimate the backgrounds from data?
- How will the trigger efficiencies be evaluated?
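For the question of what can be measured at 10, 100 and 1000 pb^-1, the relevant back-of-the-envelope quantity is the expected yield N = σ · ε · ∫L dt. A small sketch of that arithmetic (the cross-section and efficiency values are illustrative placeholders, not ATLAS numbers):

```python
def expected_yield(cross_section_pb, lumi_pb_inv, efficiency=1.0):
    """Expected number of selected events: N = sigma * efficiency * integrated luminosity."""
    return cross_section_pb * lumi_pb_inv * efficiency

# Illustrative example: a process with sigma x BR ~ 2000 pb (roughly the
# scale of Z -> ll per lepton flavour at the LHC) and a 50% selection efficiency.
for lumi in (10.0, 100.0, 1000.0):  # pb^-1, the benchmarks in the question above
    n = expected_yield(2000.0, lumi, efficiency=0.5)
    print(f"{lumi:6.0f} pb^-1 -> {n:,.0f} events")
```

Even at 10 pb^-1 such a process yields O(10^4) events, which is why high-rate Standard Model candles are the natural tools for validating object identification and trigger efficiencies with early data.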
