Fun with HiggsCombine, Nick Amin, September 6, 2018

SLIDE 1

Fun with HiggsCombine

Nick Amin September 6, 2018

SLIDE 2

⚫ Take cards associated with the 2016 paper (which were given for the combination) and add

lumiscale rateParam * * 1.
theorySysts group = pdf alphas isrvar fsrvar scale
expSysts group = jes jer isr bb lep lephlt hthlt btag pu
datadrivenSyst group = TTWSF TTZSF rares fakes fakes_EWK flips
backgrounds group = TTH TTVV XG

⚫ Two extrapolation scenarios
  • S2 ("least conservative")

text2workspace.py --channel-masks v0.10_paper_forDenys/card_tttt_srcr.txt --X-nuisance-group-function 'expSysts' 'expr::scaleexpSysts("1/sqrt(@0)",lumiscale[1])' --X-nuisance-group-function 'datadrivenSyst' 'expr::scaledatadrivenSyst("1/sqrt(@0)",lumiscale[1])' --X-nuisance-group-function 'theorySysts' '0.5' --X-nuisance-function 'lumi' '0.4' --X-nuisance-function 'tttt' '0.5' --X-nuisance-group-function 'backgrounds' '0.5'

  • S2NF ("more conservative")

text2workspace.py --channel-masks v0.10_paper_forDenys/card_tttt_srcr.txt --X-nuisance-group-function 'expSysts' 'expr::scaleexpSysts("max(0.5,1/sqrt(@0))",lumiscale[1])' --X-nuisance-group-function 'datadrivenSyst' 'expr::scaledatadrivenSyst("1/sqrt(@0)",lumiscale[1])' --X-nuisance-group-function 'theorySysts' '0.5' --X-nuisance-function 'lumi' '0.4' --X-nuisance-function 'tttt' '0.5' --X-nuisance-group-function 'backgrounds' '0.5'

⚫ Two energies
  • For 14 TeV, scale main backgrounds by the 14/13 TeV k-factor, with the following included in the card before the lumiscale rateParam * * 1. line

scale14tttt rateParam * tttt 1.33 [1.33,1.33]
scale14fakes rateParam * fakes 1.19 [1.19,1.19]
scale14tth rateParam * tth 1.24 [1.24,1.24]
scale14ttw rateParam * ttw 1.16 [1.16,1.16]
scale14ttz rateParam * ttz 1.21 [1.21,1.21]
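The only difference between the two scenarios is how the experimental-systematics group shrinks with lumiscale; a minimal Python sketch of the scale functions passed via --X-nuisance-group-function above (function names are illustrative, not part of combine):

```python
import math

def s2_exp_scale(lumiscale):
    # S2: experimental systematics shrink like 1/sqrt(lumiscale)
    return 1.0 / math.sqrt(lumiscale)

def s2nf_exp_scale(lumiscale):
    # S2NF: same scaling, but floored at 0.5 ("more conservative")
    return max(0.5, 1.0 / math.sqrt(lumiscale))

# At lumiscale=1 the uncertainties are untouched; at large lumiscale
# S2NF saturates at half the original uncertainty while S2 keeps shrinking.
for L in (1.0, 4.1818, 8.3635):
    print(L, round(s2_exp_scale(L), 3), round(s2nf_exp_scale(L), 3))
```

In both scenarios the theory group is simply halved ('0.5') and the lumi uncertainty is multiplied by 0.4, independent of lumiscale.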

Replicating AN numbers


SLIDE 3

[Plot: expected significance (sigma) vs. lumi [ifb], 500-3500, for S2NF (more conservative), S2 (least conservative), and naive (only lumiscale)]

⚫ After setting up a given workspace from previous slide, get expected significances

  • 300/35.87 = 8.3635; 150/35.87 = 4.1818

combine -M ProfileLikelihood v0.10_paper_forDenys/card_tttt_srcr.root --significance -t -1 --expectSignal=1 --setParameters lumiscale=8.3635
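The lumiscale values are just the target luminosity divided by the 2016 dataset's 35.87 ifb; a quick check of the numbers quoted above:

```python
LUMI_2016 = 35.87  # ifb, the dataset the cards were built for

def lumiscale(target_ifb):
    # value to pass via --setParameters lumiscale=...
    return target_ifb / LUMI_2016

print(round(lumiscale(300), 4))  # 8.3635
print(round(lumiscale(150), 4))  # 4.1818
```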

⚫ Values within ~2% except for S2NF@14TeV and 3ab, which are ~5% off
⚫ When plotting my values vs lumi, not sure what happens with jumps…

Replicating AN numbers


lumi [ifb]  sqrt(s) [TeV]  S2     S2NF
150         13             2.029  2.027
            14             2.411  2.408
300         13             2.678  2.636
            14             2.817  2.780
3000        13             3.962  3.770
            14             4.443  4.240

me AN

SLIDE 4

⚫ We can add a lumi scaling rate parameter (at the bottom/end of the card!) and constrain it with the syntax

lumiscale rateParam * * 1.0 [1.0,1.0]

  • This does not change the result of running combine on a card, but doing

lumiscale rateParam * * 2.0 [2.0,2.0]

  • gives us the exact value we would get by (naively) re-running the looper, scaling every event by 2

⚫ Also the range can be explicitly set outside the card during the combine command invocation

  • With

lumiscale rateParam * * 1.0 [1.0,1.0]

  • in the card, doing

combine -M ProfileLikelihood v1.00_2016_75p0_v1_try2/card_tttt_srcr_freya.txt --significance -t -1 --expectSignal=1 --setParameters lumiscale=2 --setParameterRanges lumiscale=2,2

  • will give the same thing, but now we don't need to edit the card. Can just always keep the lumiscale 1 line in the card and modify the command later to scale.

  • Note that the parameter range needs to be fixed, or else lumiscale will be a freely floating nuisance and the significance will be off

⚫ And, this can be restricted to a process like tth

lumiscale rateParam * tth 1.0 [1.0,1.0]

  • Makes it easier to do the yt scan now, for example
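Why a frozen lumiscale of k reproduces a naively re-looped card: the rateParam multiplies every process yield, and for a single counting bin without systematics the Asimov expected significance then scales exactly as sqrt(k). A sketch with the standard Asimov formula and toy yields (numbers are illustrative, not from the card):

```python
import math

def asimov_z(s, b):
    # median expected significance for s signal events on b background (one bin)
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

s, b, k = 5.0, 20.0, 2.0
z1 = asimov_z(s, b)
z2 = asimov_z(k * s, k * b)  # lumiscale=k multiplies every yield by k
print(round(z2 / z1, 6), round(math.sqrt(k), 6))  # the ratio is exactly sqrt(k)
```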

Side note about our lumi scaling


SLIDE 5

Backup


SLIDE 6

⚫ Freya is working on tttt projections for HL-LHC

  • AN: http://cms.cern.ch/iCMS/jsp/openfile.jsp?tp=draft&files=AN2018_209_v3.pdf

⚫ Normally, we re-loop on trees with an overall scale factor on event weights to make new root files → new card .txt → significance with lumi x scaled to y

⚫ Apparently you can do this with

lumiscale rateParam * * 1.

  • at the bottom of the card (note, top doesn't work!)

⚫ Working directory on UAF: ~namin/2018/fourtop/all/FTAnalysis/analysis/limits/extraptest

  • Taking 2016 MC/data cards scaled to 75.0ifb using the re-loop method (v1.00_2016_75p0_v1_try2/card_tttt_srcr.txt) and 150 (v1.00_2016_150p0_v1_try2/card_tttt_srcr.txt), using MC fakes

  • Command

combine -M ProfileLikelihood v1.00_2016_150p0_v1_try2/card_tttt_srcr.txt --significance -t -1 --expectSignal=1

  • Expected significance of 2.025σ with 75ifb, 2.692σ with 150ifb
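For reference, purely statistical scaling would predict 2.025 × sqrt(150/75) ≈ 2.86 at 150 ifb; the re-looped 2.692 comes out about 6% below that, consistent with systematics that do not shrink with luminosity:

```python
import math

z_75, z_150 = 2.025, 2.692  # expected significances from the re-looped cards
naive_z_150 = z_75 * math.sqrt(150.0 / 75.0)  # pure sqrt(L) statistical scaling
print(round(naive_z_150, 3))          # 2.864
print(round(z_150 / naive_z_150, 3))  # ~0.94: systematics cost roughly 6%
```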

Overview


SLIDE 7

⚫ Adding the lumiscale line to the end of the text file and re-running limits should in principle give the exact same significance (because the default value is 1). However,

combine -M ProfileLikelihood v1.00_2016_75p0_v1_try2/card_tttt_srcr_freya.txt --significance -t -1 --expectSignal=1

  • Expected significance of 1.970σ with 75ifb, 2.633σ with 150ifb
  • Numbers are 2.2-2.7% lower than previous slide.

⚫ Can override the parameter with an extra flag like

  • --setParameters lumiscale=1.066
  • Docs: https://cms-hcomb.gitbooks.io/combine/content/part2/settinguptheanalysis.html#beyond-simple-datacards

⚫ Turns out 1.066 is what I need to reproduce the original numbers from the previous slide (found by tuning by hand)

⚫ I don't understand. But OK. Let's say I need a ~7% "correction", so if I use 2*1.066=2.132 with the 75ifb card, do I get close to the 150ifb card?

  • --setParameters lumiscale=2.132
  • results in 2.702σ (only ~0.3% higher than the 150ifb card)
  • Pretty close, especially given I had to apply a ~7% "correction" to lumiscale for some reason
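The ad-hoc correction just multiplies the intended luminosity ratio (helper name is illustrative):

```python
CORRECTION = 1.066  # hand-tuned factor that reproduces the re-looped numbers

def corrected_lumiscale(lumi_ratio):
    # value to pass via --setParameters lumiscale=...
    return lumi_ratio * CORRECTION

print(corrected_lumiscale(2.0))  # 2.132, approximating the 150ifb card
```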


SLIDE 8

⚫ OK, let me restrict the lumiscale rateParam to just one process (ttH) and keep the value of 1. So again, this should give the original number (2.025)

lumiscale rateParam * tth 1.

  • → 1.761σ with 75ifb
  • Though, I guess if this worked, this would provide an easier way to scale ttH (for yt scan)

⚫ Ah, got it. The rateParam is a floating parameter (on top of r!) in the fit, so if we constrain it with the syntax

lumiscale rateParam * * 1.0 [1.0,1.0]

  • we get back 2.025! And now with

lumiscale rateParam * * 2.0 [2.0,2.0]

  • in the 75ifb card, we get 2.692σ, matching exactly the naively-scaled (via looper) 150ifb card!
  • Also the range can be explicitly set outside the card. With

lumiscale rateParam * * 1.0 [1.0,1.0]

  • in the card, doing

combine -M ProfileLikelihood v1.00_2016_75p0_v1_try2/card_tttt_srcr_freya.txt --significance -t -1 --expectSignal=1 --setParameters lumiscale=2 --setParameterRanges lumiscale=2,2

  • will also yield 2.692σ, but now we don't need to edit the card. Can just always keep the lumiscale 1 line in the card and modify the command later to scale.
