
DUNE Trigger Requirements


  1. DUNE Trigger Requirements
  • Requirements are not “specifications”
  • But best if they are not so generic as to be useless: “System X will do what it needs to…”
  • Should be absolute big-picture goals that the system must meet (and be achievable!). In principle they should be derived from higher-level (e.g. physics) requirements
  • Usually phrased as, “System X shall…”
  • Have contacted various group leaders and experts, but still need more input! (Probably some from Amanda’s talk today)

  2. DUNE Trigger Requirements: Starting Points

  3. Data selection shall:
  • operate on both SP and DP data streams
    Rationale: Collaboration requirement
  • act independently and in concert across all four detector modules
    Rationale: Maintain sensitivity during downtimes as well as the full 40 kt for supernova bursts
  • enable filtering of the data stream so that data on disk can be limited to < 10 PB/year
    Rationale: Reasonable limit for downstream analysis and storage
    Comments: “Enable” means a downstream nearline stage can filter further
  • be deadtimeless
    Rationale: No excuse to be otherwise; maximum livetime for follower events and supernova bursts
  • act on information from the TPC, PDS, timing system, and any auxiliary calibration systems
    Rationale: Provide robustness to variations in noise; needed for detector calibrations
  • include timestamps for all events of interest
    Rationale: Allow tests in offline analysis; coordinate with other experiments; provide unique IDs
  • provide tags for various types of selected data, written as part of the data stream
    Rationale: Provide simple sorting for online and offline analyses; allow systematic checks
  • act as a “master” for calibration systems as necessary
    Rationale: Simplify tagging and the run-control interface
  • provide self-triggering modes (e.g., random triggers, pulsers)
    Rationale: Systematic checks of overall detector health
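The timestamp, unique-ID, and tagging requirements above can be illustrated with a minimal sketch. This is a hypothetical data structure, not the DUNE data format; all field and class names are assumptions for illustration only.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of a tagged trigger record, illustrating the
# "include timestamps for all events of interest" and "provide tags ...
# written as part of the data stream" requirements. Names are illustrative.
@dataclass
class TriggerRecord:
    timestamp_ns: int   # absolute timestamp from the timing system
    module: str         # which of the four detector modules fired
    tags: List[str] = field(default_factory=list)  # e.g. ["beam", "snb-candidate"]

    def unique_id(self) -> str:
        # A module name plus absolute timestamp yields a unique event ID,
        # usable for offline tests and coordination with other experiments.
        return f"{self.module}-{self.timestamp_ns}"

rec = TriggerRecord(timestamp_ns=123456789, module="SP1", tags=["beam"])
print(rec.unique_id())  # SP1-123456789
```

Tags written inline with the data stream allow simple sorting online and offline without a separate lookup database.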

  4. Data selection shall:
  • provide pre-scaling for all trigger types as well as for rejected data
    Rationale: May need to deal with high-rate calibration sources, or low-threshold triggers
  • provide a coarse estimate of the event centroid
    Rationale: Allow zero suppression via location
  • provide statistics for data selection to online monitoring
    Rationale: Detector health feedback
  • allow partitioning for commissioning and debugging, within modules
    Rationale: Commissioning of elements that may come together at different times

  5. Data selection shall:
  • be >99% efficient for selection of neutrino events within beam spills with E_vis > 100 MeV
    Rationale: Best exploitation of beam data
  • be >99% efficient for selection of atmospheric νs and nucleon-decay events with E_vis > 100 MeV
    Rationale: Smaller total flux in the low (<100 MeV) window
    Comments: Change this to 50 MeV? 20 MeV…?
  • be >90% efficient for selection of supernova bursts within the Milky Way
    Rationale: Seems reasonable?
  • be >90% efficient for single supernova events within a burst for E_vis > 5 MeV
    Rationale: Seems reasonable?
  • have a supernova-burst false trigger rate contributing < 10% of the total data bandwidth
    Rationale: “Loose” zero suppression still needs about a ×10 reduction (100 µs × 2 MHz × 12 bits/8 bits × 40 MBq = 12 GB/s for the full detector = 120 PB/year)
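The bandwidth arithmetic in the last rationale can be checked step by step. This sketch just reproduces the slide's own numbers (100 µs window, 2 MHz sampling, 12-bit samples, 40 MBq of ³⁹Ar decays over the full detector); it is a back-of-the-envelope check, not a detector simulation.

```python
# Sanity check of the slide's unsuppressed-readout bandwidth estimate.
window_s  = 100e-6   # readout window per hit
sample_hz = 2e6      # digitization rate
bits      = 12       # ADC bits per sample
rate_bq   = 40e6     # 39Ar decay rate, full detector (40 MBq)

bytes_per_hit = window_s * sample_hz * bits / 8   # 200 samples * 1.5 B = 300 B
gb_per_s = bytes_per_hit * rate_bq / 1e9
print(gb_per_s)  # ~12 GB/s, matching the slide
```

At ~12 GB/s, a ×10 reduction from "loose" zero suppression is the minimum needed to approach the < 10 PB/year disk requirement on slide 3.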

  6. Data Selection Critical Questions
  • Is a simple trigger criterion enough to remove ³⁹Ar “SN bursts”? (Warburton, Rivera)
  [Figure: ntuple trigger rate (Hz) vs. adjacent wire multiplicity, for coincidence windows Δt from 0.25 ms to 5 ms]
  At an adjacent multiplicity of 6 hits and Δt = 750 µs, we get about 1 Hz (1 event each!) in 10 ktonnes. So 10 “SN events” in 10 s. Tiny probability of 100 events in 10 s. (Expect ~250 events/10 ktonnes at 10 kpc.)

  7. Data Selection Critical Questions
  • Is a simple trigger criterion enough to remove ³⁹Ar “SN bursts”? (Warburton, Rivera)
  [Figure: ntuple trigger rate (Hz) vs. adjacent wire multiplicity, for coincidence windows Δt from 20 µs to 500 µs]
  At an adjacent multiplicity of 4 hits and Δt = 50 µs, we get about 0.1 Hz (1 event each!) in 10 ktonnes. So 1 “SN event” in 10 s. Tiny probability of 100 events in 10 s. (Expect ~250 events/10 ktonnes at 10 kpc.)
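The "tiny probability of 100 events in 10 s" claim on slides 6 and 7 can be quantified with a Poisson tail, assuming the post-cut coincidences are uncorrelated. With the ~1 Hz rate of slide 6, the expected count in a 10 s burst window is 10, and the chance of reaching 100 is negligible.

```python
import math

# P(N >= k) for N ~ Poisson(mu), summed in log space to avoid overflow
# in the factorial. Assumes post-cut 39Ar coincidences are independent.
def poisson_sf(k: int, mu: float) -> float:
    total = 0.0
    for n in range(k, k + 200):  # terms beyond this are negligible
        total += math.exp(n * math.log(mu) - mu - math.lgamma(n + 1))
    return total

p = poisson_sf(100, 10.0)  # 100 events when 10 are expected in 10 s
print(p)  # vanishingly small (far below 1e-60)
```

For the slide-7 working point (0.1 Hz, mean 1 event per 10 s), the tail at 100 is smaller still, so a simple multiplicity-plus-Δt criterion comfortably suppresses ³⁹Ar fake bursts under these assumptions.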

  8. Data Selection Critical Questions
  • What are we storing for any given “SN burst”? If we store everything (no zero suppression for 10 s), that is ~450 TB for each burst. We can tolerate just 2 such bursts each year if we stick with our 10%-of-total-data requirement. Zero-suppressing just the collection wires brings this down to ~250 GB, so we could tolerate ~4000 fake bursts, or ~10/day. If we only save zero-suppressed data for everything, data selection only needs a ~1/5 reduction; it basically becomes a tagging scheme.
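The burst-budget arithmetic above follows directly from the 10 PB/year disk limit on slide 3. This check just redoes the slide's own numbers (10% allowance, ~450 TB unsuppressed, ~250 GB collection-wire zero-suppressed).

```python
# How many fake SN bursts per year fit in 10% of the 10 PB/year budget?
budget_bytes = 0.10 * 10e15    # 10% of 10 PB/year -> 1 PB for false bursts

full_readout  = 450e12         # ~450 TB per unsuppressed 10 s burst
zs_collection = 250e9          # ~250 GB with collection wires zero-suppressed

print(budget_bytes / full_readout)          # ~2 bursts/year
print(budget_bytes / zs_collection)         # ~4000 bursts/year
print(budget_bytes / zs_collection / 365)   # ~11/day
```

The three-orders-of-magnitude gap between the two readout modes is what drives the choice between a strict burst trigger and a zero-suppress-everything-plus-tagging scheme.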

  9. Data Selection Critical Questions
  • Is simple adjacent multiplicity enough for high-energy (non-beam) events? The K⁺ in p → K⁺ν̄ can go normal to the wire planes; we lose these if we require high multiplicity, but the acceptance loss is probably small. Nevertheless, we can gain these back with time-over-threshold and/or total charge. What does multiplicity vs. time-over-threshold vs. charge look like for high-E events?

  10. Data Selection Critical Questions
  • What should we assume about coherent noise? Coherent noise may be rejected with a Δt > N µs cut; how big is the acceptance loss? And what does the collection-wire “charge” look like for this?
