SLIDE 1

Status of the MICE Online Systems

Pierrick Hanlet
6 March 2012

SLIDE 2

Outline

  • I. Introduction
  • II. Data Acquisition
  • III. Online Reconstruction
  • IV. Controls and Monitoring
  • V. Infrastructure
  • VI. Conclusions

SLIDE 3

Outline

  • ► I. Introduction
  • II. Data Acquisition
  • III. Online Reconstruction
  • IV. Controls and Monitoring
  • V. Infrastructure
  • VI. Conclusions

SLIDE 4

Introduction

  • Beamline – create beam of muons
  • Particle ID – verify/tag muons (before/after)
  • Spectrometers – measure ε (before/after; see the formula below)
  • Absorber (LH2 or LiH) – cooling
  • RF – re-establish longitudinal momentum

[Figures: MICE beamline and MICE cooling channel]
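For orientation, a hedged formula (my addition; the 0.1% precision goal is quoted on slide 26): the cooling signal is the relative change in emittance between the upstream and downstream spectrometers.

```latex
% Cooling signal measured by comparing the two spectrometers;
% MICE aims to measure this to 0.1% (slide 26).
\frac{\Delta\varepsilon}{\varepsilon}
  = \frac{\varepsilon_{\mathrm{up}} - \varepsilon_{\mathrm{down}}}
         {\varepsilon_{\mathrm{up}}}
```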

SLIDE 5

Online Responsibilities

The MICE Online Group creates, maintains, and ensures proper use of all tools (hardware, software, and documentation) within the MICE Local Control Room (MLCR) used by the experiment to efficiently record high quality data. We are responsible for:

  • Data Acquisition (DAQ)
  • Online Monitoring and Reconstruction (OnMon/OnRec)
  • Controls and Monitoring (C&M)
  • Data Transfer
  • Networking and MLCR Computing

We also interface closely with systems related to the Online sector, including MICE Operations, Offline Software, and Computing.

SLIDE 6

Online Structure

  • Linda Coney – head of Online Group, Online Reco
  • David Colling – head of Software & Computing, GridPP contact
  • Yordan Karadzhov – head of DAQ, OnMon
  • Pierrick Hanlet – head of C&M, connection to Config DB
  • Daresbury Lab – C&M: Brian Martlew (head of DL group)
  • Paul Hodgson – C&M (target)
  • Matt Robinson – C&M (target, tracker), System Administrator
  • Mike Courthold – Networking
  • Henry Nebrensky – GRID, Data Transfer, MICE Data Manager
  • Janusz Martyniak – MICE Data Mover (data out of the Online system)
  • Paul Kyberd – GRID, contact person for GridPP
  • Craig Macwaters – MLCR network, hardware, computing
  • Antony Wilson – Config DB, MICE PPD IT contact
  • Chris Rogers / Chris Tunnell – link with Software Group

SLIDE 7

Online Structure

SLIDE 8

Online Group

  • New leadership and organization (June '11)
  • Redmine used to record/track issues:
    • prioritize issues and effort
    • searchable
    • remotely accessible
  • Excellent progress; successful Dec '11 run
SLIDE 9

Outline

  • I. Introduction
  • ► II. Data Acquisition
  • III. Online Reconstruction
  • IV. Controls and Monitoring
  • V. Infrastructure
  • VI. Conclusions

SLIDE 10

Data Acquisition

Description:

SLIDE 11

Data Acquisition

DAQ and Trigger requirements:

  • stable over the long term & maintainable
  • usable by non-experts (documentation)
  • 600 particles per 1 ms spill at 1 Hz
  • event size < 60 MB (normally ~30 MB; see the throughput sketch below)
  • flexible:
    • select which FEE to read and trigger on
    • run independently of target and RF
    • interface with C&M
    • interface with OnMon & OnRec
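A quick back-of-envelope check of these numbers (my own sketch; the spill rate and event-size limits are from the slide, the per-particle budget is derived from them):

```python
# Back-of-envelope DAQ throughput from the requirements above.
SPILL_RATE_HZ = 1.0          # one 1 ms spill per second
PARTICLES_PER_SPILL = 600
MAX_EVENT_SIZE_MB = 60.0     # hard limit; ~30 MB is typical

worst_case_mb_s = SPILL_RATE_HZ * MAX_EVENT_SIZE_MB          # MB/s sustained
per_particle_kb = MAX_EVENT_SIZE_MB * 1024 / PARTICLES_PER_SPILL

print(f"worst-case sustained rate: {worst_case_mb_s:.0f} MB/s")  # 60 MB/s
print(f"budget per particle:       {per_particle_kb:.0f} kB")    # ~102 kB
```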
SLIDE 12

Data Acquisition

Description:

[Diagram: DAQ architecture – trigger distribution; scalars and particle trigger; NIM logic (TOF discriminators and trigger); CAMAC logic (interface with target & spill gate); KL shapers; GVA discriminators, KL cosmics trigger; TOF & CKOV readout]

Built on the DATE (ALICE) framework.

SLIDE 13

Data Acquisition

Status:

  • prototype EMR detector and electronics successfully integrated
  • simultaneous readout of both trackers during cosmic-ray data-taking using DATE
  • communication established linking DAQ, C&M, and CDB – allows monitoring of the DAQ status and archiving of DAQ parameters in the CDB
  • new unpacking code
SLIDE 14

Data Acquisition

Efforts:

  • upgrade DAQ – DATE version and OS
  • software trigger selection
  • incorporate new detectors:
    • EMR – spring cosmic run with new DAQ
    • tracker – single tracker station test
  • improve system performance
  • improve error handling
  • incorporate new DAQ computers (LDCs)
  • integrate with C&M and CDB
SLIDE 15

Outline

  • I. Introduction
  • II. Data Acquisition
  • ► III. Online Reconstruction
  • IV. Controls and Monitoring
  • V. Infrastructure
  • VI. Conclusions

SLIDE 16

Online Monitoring and Reconstruction

  • Two components:
    • Monitoring (OnMon) – DAQ raw distributions
    • Reconstruction (OnRec) – same code as offline reconstruction software
  • OnMon and OnRec run over a socket (see the sketch below)
  • Now using the new MAUS software framework
  • Excellent progress; successful Dec '11 run
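A minimal sketch of the "over a socket" arrangement. The host, port, and the 4-byte length-prefixed framing are illustrative assumptions, not the actual DATE/MAUS monitoring interface:

```python
# Monitoring consumer: connect to the DAQ event stream and hand each
# raw event to the unpacking/histogramming code.
import socket
import struct

DAQ_HOST, DAQ_PORT = "mice-daq", 9090   # hypothetical endpoint


def recv_exact(sock, n):
    """Read exactly n bytes, or return None if the stream closes."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            return None
        buf += chunk
    return buf


with socket.create_connection((DAQ_HOST, DAQ_PORT)) as sock:
    while True:
        header = recv_exact(sock, 4)
        if header is None:
            break
        (size,) = struct.unpack("!I", header)   # network-order event length
        event = recv_exact(sock, size)
        if event is None:
            break
        # ... unpack `event` and fill OnMon histograms here ...
```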
SLIDE 17

Online Monitoring

  • new unpacking software

[Plots: CPU monitor, parallelization, TOF hit profiles, TDC scalars, fADC trigger, preliminary MAUS TOF hit profiles]

SLIDE 18

Online Reconstruction

  • real-time physics & detector functionality:
    • TOF, KL, & CKOV detector readout
    • beam dynamics parameters
    • time-of-flight distributions for PID (see the PID sketch below)
  • automated data transfer out of the MLCR (the limit of Online responsibility)
  • archives of all online plots
  • data transferred to public webserver
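The PID idea in the third sub-item, sketched with placeholder numbers (the timing windows below are illustrative, not the calibrated MICE cuts):

```python
# Particle ID from time of flight between two TOF stations: at a fixed
# momentum, electrons arrive first, then muons, then pions.
def pid_from_tof(t0_ns: float, t1_ns: float) -> str:
    dt = t1_ns - t0_ns            # time of flight, TOF0 -> TOF1
    if dt < 26.0:                 # hypothetical electron window
        return "e"
    if dt < 32.0:                 # hypothetical muon window
        return "mu"
    return "pi"

# Example: a hit pair 27.9 ns apart would be tagged as a muon.
print(pid_from_tof(10.0, 37.9))   # -> "mu"
```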
SLIDE 19

Online Reconstruction

[Plots: Δt distributions (t1−t0, t2−t1, t2−t0), TOF 2D profiles, µ momentum, and trace-space distributions (px vs x, py vs y)]

SLIDE 20

Outline

  • ✔ I. Introduction
  • ✔ II. Data Acquisition
  • ✔ III. Online Reconstruction
  • ► IV. Controls and Monitoring
  • V. Infrastructure
  • VI. Conclusions

SLIDE 21

Controls & Monitoring

Purpose:

  • Controls refers to:
    • user interface to equipment
    • proper sequencing of equipment
  • Monitoring serves to:
    • protect equipment (early notification; see the alarm sketch below)
    • protect data quality
    • provide user monitoring
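MICE C&M is EPICS-based; below is a minimal sketch of the "early notification" idea using the pyepics channel-access client. The PV name and trip level are hypothetical placeholders:

```python
# Sketch of "protect equipment (early notification)": subscribe to a
# process variable and raise an alarm when it crosses a limit.
import time

from epics import camonitor   # pyepics channel-access client

COLDMASS_TEMP_PV = "MICE:SS1:COLDMASS:TEMP"   # hypothetical PV name
TRIP_LEVEL_K = 10.0                           # placeholder trip level


def on_update(pvname=None, value=None, **kwargs):
    """Called by pyepics on every PV value change."""
    if value is not None and value > TRIP_LEVEL_K:
        print(f"ALARM: {pvname} = {value:.2f} K (limit {TRIP_LEVEL_K} K)")


camonitor(COLDMASS_TEMP_PV, callback=on_update)

while True:
    time.sleep(1.0)   # keep the process alive so callbacks fire
```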
SLIDE 22

Controls & Monitoring

Status and immediate needs:

  • Step I complete:
    • Beamline
    • Particle ID (PID)
  • Alarms, archiving, external gateway
  • Experimental hall environment
  • SS and FC acceptance testing
  • Run Control
SLIDE 23

Controls & Monitoring

Next focus – cooling channel:

[Diagram: cooling channel – trackers, AFC, RFCC, µ beam]

SLIDE 24

Controls & Monitoring

[Screenshot: standalone C&M display for a cooling-channel magnet – ON/OFF controls for coil circuits CC1–CC5, temperature sensors (SD01–SD27, CX01–CX12) on the vacuum vessel, radiation shield, and cold mass at levels 1–3, pressure readouts, heater power, and fill line]
SLIDE 25

Controls & Monitoring

[Images: power supply (LBNL), quench protection (FNAL), standalone C&M (DL)]

SLIDE 26

Controls & Monitoring

MICE is a precision experiment:

  • measure a muon cooling effect to 0.1%
  • imperative – control all systematic errors
  • ensure that the data-taking parameters of all the apparatus in MICE are carefully recorded to, and restored from, the CDB

To accomplish this, the target DAQ, the experiment DAQ, controls for beamline elements, MICE state machines, and PID have been integrated with the CDB into a single “Run Control” process (see the sketch below).
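A sketch of the record/restore contract Run Control enforces. The `cdb` and subsystem interfaces here are hypothetical stand-ins, not the actual MICE configuration-database client:

```python
# Record/restore of data-taking parameters, keyed by run number.
def record_run_settings(cdb, run_number, subsystems):
    """Snapshot every subsystem's live settings at start of run."""
    snapshot = {name: sub.read_settings() for name, sub in subsystems.items()}
    cdb.store(run_number, snapshot)


def restore_run_settings(cdb, run_number, subsystems):
    """Re-apply a previously recorded run configuration."""
    snapshot = cdb.fetch(run_number)
    for name, sub in subsystems.items():
        sub.apply_settings(snapshot[name])
```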

SLIDE 27

Controls & Monitoring

SLIDE 28

Controls & Monitoring

Other aspects and future:

  • FC – similar to SS (due at the same time!)
  • RF tuners – MTA single 201 MHz RF cavity
  • MICE Hall services control
  • EMR test
  • Target & Tracker controls upgrade
  • LH2
  • RF
SLIDE 29

Outline

  • ✔ I. Introduction
  • ✔ II. Data Acquisition
  • ✔ III. Online Reconstruction
  • ✔ IV. Controls and Monitoring
  • ► V. Infrastructure
  • VI. Conclusions

SLIDE 30

Infrastructure

I. Dedicated system administrator!!!

II. Necessary improvements made to the online system infrastructure:

  • Hardware vulnerabilities were assessed, leading to the replacement of several DAQ crates and the purchase of spares
  • Easily-swapped-in computers have been prepared in the event of key machine failure
  • All hardware damaged during an unexpected power surge in early 2011 has been repaired or replaced

SLIDE 31

Infrastructure

  • An upgrade of the OS for all online computers has been initiated, using two test computers added to the MLCR network to facilitate a controlled migration
  • Operations and Online systems documentation has been reviewed, updated, and posted

SLIDE 32

Online Future

  • March run for online software tests
  • June run for EMR and single-station tracker
  • Present foci:
    • C&M for Step IV
    • DAQ for EMR (June)
    • DAQ for single tracker station (June)
    • continued involvement from the software group for online reconstruction
    • online accelerator physics analysis tools
SLIDE 33

Summary

  • Online group restructured June '11
  • Online Systems in place
  • Step IV work progressing well
  • Evolving needs being recognized and met

SLIDE 34

Backup slides

SLIDE 35

Controls & Monitoring

Procedure for a data-taking run (condensed into the driver sketch after the list):

  • 1. query: run & trigger types, beamline, etc
  • 2. query: C&M for MICE states, verify compatibility with

requested run

  • 3. query: CDB for beamline and particle ID settings for

requested run configuration

  • 4. determine apparatus readiness (states) and initiate

setting beamline and PID parameters

  • 5. query target DAQ for actuation number, target depth and

delay

  • 6. give pertinent information to experiment DAQ and initiate

arming procedure

  • 7. inhibit C&M from allowing any changes
  • 8. user initiates run, Run Control reads experiment

parameters and stores them in CDB

  • 9. end of run, query for final actuation number, sum scalers,

and write end of run values to CDB and to experiment DAQ
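The nine steps, condensed into a single driver as a sketch (every object and method below is a hypothetical placeholder, not the actual Run Control code):

```python
# The nine-step data-taking sequence as one Run Control driver.
def take_run(rc):
    cfg = rc.query_user()                       # 1. run & trigger types, beamline
    rc.cm.verify_states(cfg)                    # 2. MICE states compatible with run
    settings = rc.cdb.fetch_settings(cfg)       # 3. beamline & PID settings from CDB
    rc.apply_settings(settings)                 # 4. readiness; set beamline and PID
    target = rc.target_daq.query_status()       # 5. actuation number, depth, delay
    rc.daq.arm(cfg, target)                     # 6. inform experiment DAQ and arm
    rc.cm.inhibit_changes()                     # 7. no C&M changes during the run
    rc.wait_for_start()                         # 8. user starts the run ...
    rc.cdb.store_start(rc.read_parameters())    #    ... snapshot parameters to CDB
    rc.wait_for_end()                           # 9. end of run: final actuation
    rc.cdb.store_end(rc.end_of_run_summary())   #    number, scalers, end-of-run values
```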