SLIDE 1

Vertex Reconstruction at the HL-LHC

Ian Lim September 15, 2017

SLIDE 2

About Me

  • Stanford University, B.S. in Physics with Honors (2017)
  • Interests in cats, baking, and high-energy physics
  • SULI intern, autumn 2017 (August 21-December 8)
  • Working with Maurice, Ben, and Simone on vertex reconstruction for the HL-LHC upgrade

SLIDE 3

How well do existing vertexing algorithms perform at large μ?

And how can they be improved?

SLIDE 4

Plan of Attack

  1. Intro to vertexing
  2. The problem, in more detail
  3. My studies so far
  4. Future work

SLIDE 5

What is Vertexing?

  • Primary vertices are locations of proton-proton collisions in the detector
  • Two main parts: position reconstruction and track association
  • How well can we determine where a collision happened in space?
  • Given the tracks left in our detector by collision products, how well can we associate them to the correct vertex?
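As a toy illustration of the track-association step (a simplification I am adding, not the actual reconstruction code), one can assign each track to the reconstructed vertex nearest its longitudinal impact parameter z0. All positions below are made up:

```python
# Toy track-to-vertex association: each track goes to the vertex
# closest in z (assumed simplification; real algorithms use full
# track parameters and their uncertainties).
def associate_tracks(track_z0s, vertex_zs):
    """Return, for each track, the index of the nearest vertex in z."""
    return [
        min(range(len(vertex_zs)), key=lambda i: abs(z0 - vertex_zs[i]))
        for z0 in track_z0s
    ]

vertices = [-5.0, 0.0, 7.0]      # vertex z positions in mm (made up)
tracks = [-4.8, 0.3, 6.5, 1.2]   # track z0 values in mm (made up)
print(associate_tracks(tracks, vertices))  # → [0, 1, 2, 1]
```

At high pile-up this nearest-in-z assignment becomes ambiguous precisely because vertices crowd together, which is the problem discussed later in the talk.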

SLIDE 6

Definitions: Events, Hard Scatter, and Pile-Up

  • Today, protons are collided in bunches with a period of 25 ns.
  • Each bunch crossing (event) results in about 20-40 actual collisions (μ).
  • One special “hard scatter” vertex and many others (pile-up)
  • We can also categorize events by how well we reconstruct vertices.


Image credit to Ariel Schwartzman, SLAC. Hard scatter tracks in red and pile-up tracks in blue.
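As a toy illustration (my addition, not taken from the slides): the number of pp collisions in each bunch crossing fluctuates event by event around the pile-up parameter μ, commonly modeled as Poisson-distributed.

```python
import math
import random

# Sample the number of collisions per bunch crossing from a Poisson
# distribution with mean mu, using Knuth's multiplication method
# (adequate for moderate mu; mu and n_events here are illustrative).
def sample_collisions_per_crossing(mu, n_events, seed=42):
    rng = random.Random(seed)
    limit = math.exp(-mu)
    samples = []
    for _ in range(n_events):
        k, prod = 0, rng.random()
        while prod > limit:
            k += 1
            prod *= rng.random()
        samples.append(k)
    return samples

counts = sample_collisions_per_crossing(mu=30, n_events=1000)
print(sum(counts) / len(counts))  # close to 30, with event-by-event spread
```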

SLIDE 7


Based on https://arxiv.org/pdf/1611.10235.pdf

SLIDE 8

Central problems of vertexing

  • At the HL-LHC, we expect μ ~ 200, a tenfold increase!
  • With increased vertex density, performing a clean reconstruction becomes significantly harder.
      • Hard scatter is obscured by 10x more pile-up
      • More tracks to assign
      • Greater likelihood of merging
  • But the detector will be upgraded as well (e.g. pseudorapidity |η| < 4, vs. |η| < 2.5 now)
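A rough back-of-the-envelope sketch of why merging gets worse (all numbers are my assumptions, not from the slides): sample vertex z positions along a Gaussian luminous region and count vertices whose nearest neighbor falls within a resolution-sized window, a crude proxy for merge risk.

```python
import random

# Fraction of vertices with a neighbor closer than `window_mm` in z.
# beamspot_sigma_mm ~ 50 mm and window_mm ~ 1 mm are assumed,
# illustrative values only.
def fraction_at_merge_risk(mu, beamspot_sigma_mm, window_mm, seed=1):
    rng = random.Random(seed)
    zs = sorted(rng.gauss(0.0, beamspot_sigma_mm) for _ in range(mu))
    at_risk = sum(
        1 for i, z in enumerate(zs)
        if (i > 0 and z - zs[i - 1] < window_mm)
        or (i < len(zs) - 1 and zs[i + 1] - z < window_mm)
    )
    return at_risk / mu

# Ten times more pile-up packs vertices far more densely in the same region:
print(fraction_at_merge_risk(mu=20, beamspot_sigma_mm=50, window_mm=1.0))
print(fraction_at_merge_risk(mu=200, beamspot_sigma_mm=50, window_mm=1.0))
```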

SLIDE 9

Current approach (my work)

  • First, benchmark existing algorithms on simulated μ=200 data
      • Vertex reconstruction efficiency
      • Event classification and distribution
  • Next, determine new/alternate metrics for measuring performance
      • Bias introduced by misidentified tracks
  • Finally, implement and test new vertexing procedures.
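A minimal sketch of one possible efficiency metric, assuming a simple position-based definition (the real matching criteria used in these studies are more elaborate, typically track-based): a truth vertex counts as reconstructed if some reco vertex lies within a z tolerance, with each reco vertex used at most once.

```python
# Greedy truth-to-reco matching in z; tol_mm = 0.5 is an assumed,
# illustrative tolerance, not a value from the slides.
def vertex_efficiency(truth_zs, reco_zs, tol_mm=0.5):
    unused = list(reco_zs)
    matched = 0
    for zt in sorted(truth_zs):
        best = min(unused, key=lambda zr: abs(zr - zt), default=None)
        if best is not None and abs(best - zt) < tol_mm:
            unused.remove(best)
            matched += 1
    return matched / len(truth_zs)

truth = [0.0, 3.0, 10.0]
reco = [0.1, 9.8]  # the vertex at z = 3 mm was not reconstructed
print(vertex_efficiency(truth, reco))  # → 0.6666666666666666
```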

SLIDE 10

Truth and reco vertices/event

SLIDE 11

Reconstruction efficiency + distribution

SLIDE 12

Event classification vs. μ

SLIDE 13

Vertex density and event classification

SLIDE 14

Hard scatter z-resolution

SLIDE 15

Spread of hard scatter contributions

SLIDE 16

Future work

  • Continuing to develop interesting metrics for vertexing
  • Working on improved vertexing methods
      • Use of high-η tracks from the forward region?
      • Track clustering methods (cf. Meloni 2017, arXiv:1705.00022v1)
      • Vertex candidate substructure (cf. Schwartzman)
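To make the track-clustering idea concrete, here is a deliberately simple stand-in (the cited approaches are far more sophisticated): group tracks into vertex candidates by cutting wherever the gap between consecutive z0 values exceeds a threshold.

```python
# Gap-based 1D clustering of track z0 values; max_gap_mm = 2.0 is an
# assumed, illustrative threshold.
def cluster_tracks_by_gap(track_z0s, max_gap_mm=2.0):
    zs = sorted(track_z0s)
    if not zs:
        return []
    clusters = [[zs[0]]]
    for z in zs[1:]:
        if z - clusters[-1][-1] > max_gap_mm:
            clusters.append([z])   # gap too wide: start a new candidate
        else:
            clusters[-1].append(z)
    return clusters

z0s = [0.1, -0.3, 0.4, 7.2, 7.9, -15.0]
print(cluster_tracks_by_gap(z0s))
# → [[-15.0], [-0.3, 0.1, 0.4], [7.2, 7.9]]
```

At μ ~ 200 a fixed gap cut like this would merge nearby vertices, which is exactly why more refined clustering and substructure techniques are on the future-work list.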

SLIDE 17

Thank you for your time!
