Fermi LAT data analysis


SLIDE 1

Fermi LAT data analysis

[Title-slide counts map of the field around PSR J0633+1746 and VER J0648+152; color bar: 50-200 counts]

SLIDE 2

Fermi Gamma-ray Space Telescope

  • Large Area Telescope (LAT)
    ✦ ~20 MeV to >300 GeV
    ✦ FOV: 2.4 sr
  • Gamma-ray Burst Monitor (GBM)
    ✦ ~10 keV to ~25 MeV
    ✦ FOV: >8 sr

Credit: NASA E/PO, Sonoma State University, Aurore Simonnet

SLIDE 3

Fermi Science Support Center (FSSC)

  • http://fermi.gsfc.nasa.gov/ssc/
SLIDE 4

The Cicerone

  • “Cicerone means ‘a person who conducts sightseers; guide’”
  • http://fermi.gsfc.nasa.gov/ssc/data/analysis/documentation/Cicerone/

SLIDE 5

Analysis Threads

  • http://fermi.gsfc.nasa.gov/ssc/data/analysis/scitools/

SLIDE 6

LAT Data Extraction

SLIDE 7
SLIDE 8

LAT Data Extraction

The data server query form asks for:

  • the target source
  • a circular search region centered at the target source
  • the start and end times of the observation
  • the energy range (default: 100 MeV - 300 GeV)

The region of interest (ROI) selected later on should be no bigger than this search region. Plan ahead, or you will have to download the data again!

SLIDE 9

Data Selection

  • gtselect - defines data sub-selection criteria: region of interest (ROI), energy range, time range, ...
  • gtmktime - selection of good-time intervals (GTIs)
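As a sketch of how these two tools are typically invoked (file names, coordinates, and cut values below are placeholders, and the commands require the Fermi Science Tools and a downloaded data set):

```shell
# Sub-select events within 15 deg of the target, 100 MeV - 300 GeV,
# zenith angle < 100 deg (all values illustrative)
gtselect infile=events.fits outfile=events_filtered.fits \
         ra=98.48 dec=17.77 rad=15 \
         emin=100 emax=300000 zmax=100 \
         tmin=INDEF tmax=INDEF

# Keep only good-time intervals; roicut=yes excludes times when the
# zenith buffer region intersects the ROI
gtmktime scfile=spacecraft.fits filter="DATA_QUAL==1 && LAT_CONFIG==1" \
         roicut=yes evfile=events_filtered.fits outfile=events_gti.fits
```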

SLIDE 10

Data Selection - gtselect

  • ROI
  • Time range (“INDEF” allows the parameters to be read from the header keywords)
  • Energy range
  • Max. zenith angle (defines the zenith cone buffer region)
  • Event class (a hidden parameter, associated with the previous Pass 6 IRFs)

SLIDE 11

Data Selection - gtmktime

  • Good-time intervals (GTIs) - time intervals in which the data is good
  • roicut = “yes” excludes time intervals in which the zenith buffer region intersects the ROI

SLIDE 12

Data Exploration - gtbin

  • gtbin - bins the selected event list into an image
  • parameters specify that you are making a counts map and set the dimensions of the counts map
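A sketch of a gtbin call that produces a counts map (file names, map size, and center coordinates are placeholders):

```shell
# Bin the filtered event list into a 200 x 200 pixel counts map
# with 0.1 deg pixels, centered on the (placeholder) target position
gtbin evfile=events_gti.fits scfile=NONE outfile=cmap.fits \
      algorithm=CMAP nxpix=200 nypix=200 binsz=0.1 \
      coordsys=CEL xref=98.48 yref=17.77 axisrot=0 proj=AIT
```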

SLIDE 13

Data Exploration - gtbin

[Resulting counts map of the field around PSR J0633+1746 and VER J0648+152; color bar: 50-200 counts]

SLIDE 14

HEASARC Web Tools

  • http://heasarc.nasa.gov/docs/tools.html
  • Coordinate converter

SLIDE 15

HEASARC Web Tools

  • Time converter

SLIDE 16

Likelihood Analysis

SLIDE 17

Likelihood Analysis

Q: Given an input model (a set of parameters), what is the probability of obtaining/reproducing the observed data from the model?

Our model describes the gamma-ray sources in the sky (spatially + spectrally). The probability is termed the likelihood, L.

SLIDE 18

Likelihood Analysis

To get the model which best describes the data, we wish to maximize L. In other words, we wish to find the set of model parameters for which L is maximized.

Suppose the data are binned according to their positions in the sky and their energies. The number of counts in each bin is characterized by the Poisson distribution. The probability of detecting n_i photons in the i-th bin is given by:

p_i = (m_i^(n_i) / n_i!) e^(-m_i)

where m_i is the number of photons predicted by the model in that bin. The likelihood function L is defined as the product of the probabilities:

L = ∏_i (m_i^(n_i) / n_i!) e^(-m_i)
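The binned Poisson log-likelihood can be sketched in a few lines of Python (a toy illustration of the formula above, not the Science Tools implementation):

```python
import math

def log_likelihood(n, m):
    """Binned Poisson log-likelihood.

    n: observed counts per bin; m: model-predicted counts per bin.
    log L = sum_i [ n_i log(m_i) - m_i - log(n_i!) ]
    """
    return sum(ni * math.log(mi) - mi - math.lgamma(ni + 1)
               for ni, mi in zip(n, m))

# The log-likelihood is largest when the model predictions match
# the observed counts bin by bin
n_obs = [3, 7, 2]
print(log_likelihood(n_obs, [3.0, 7.0, 2.0]) >
      log_likelihood(n_obs, [2.0, 9.0, 1.0]))   # True
```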

SLIDE 19

Unbinned Likelihood Analysis

Re-writing L: if we let the size of each bin become infinitesimally small, then n_i will be 0 or 1, and the product over bins becomes a product over the individual detected photons. This gives the unbinned likelihood. To make life simpler, we deal with the logarithm:

log L = Σ_j log m(x_j) - N_pred

where the sum runs over the detected photons, m(x_j) is the model density evaluated at photon j, and N_pred is the total number of predicted photons.

Good for small data sets (short observation times).
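The unbinned log-likelihood above reduces to a one-liner (again a toy sketch, with the model densities pre-evaluated at each photon):

```python
import math

def unbinned_log_likelihood(model_at_photons, n_pred):
    """Unbinned log-likelihood: log L = sum_j log m(x_j) - N_pred.

    model_at_photons: model density m(x_j) at each detected photon.
    n_pred: total number of photons predicted by the model.
    """
    return sum(math.log(m) for m in model_at_photons) - n_pred

# Two photons with model density 1.0 each, two predicted photons:
print(unbinned_log_likelihood([1.0, 1.0], 2.0))   # -2.0
```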

SLIDE 20

Likelihood Analysis

The Test Statistic (TS) is defined via the likelihood-ratio test:

TS = 2 log(Lmax,1 / Lmax,0)

where:
  Lmax,0 = the maximum likelihood value for a model without the additional source (the ‘null hypothesis’)
  Lmax,1 = the maximum likelihood value for a model with the additional source at a specified location

Wilks’ theorem: if the number of photons is sufficiently large, the TS for the null hypothesis is distributed like a χ²_ν distribution, where ν is the difference in the number of parameters between the models with and without the additional source. The TS therefore translates into a detection significance.
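For one extra degree of freedom, the significance in Gaussian sigma is approximately √TS; a stdlib-only sketch, using the χ²₁ survival function P(χ² > TS) = erfc(√(TS/2)):

```python
import math

def ts_to_sigma(ts):
    """Approximate detection significance in Gaussian sigma for
    one additional degree of freedom: sigma ~ sqrt(TS)."""
    return math.sqrt(ts)

def chi2_1_pvalue(ts):
    """Chance probability under the null hypothesis for 1 degree
    of freedom: P(chi2_1 > TS) = erfc(sqrt(TS / 2))."""
    return math.erfc(math.sqrt(ts / 2.0))

print(ts_to_sigma(25.0))   # 5.0 -> TS = 25 is roughly a 5-sigma detection
```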

SLIDE 21

Likelihood Analysis - The Source Model

Source Model: “describes the gamma-ray sources in the sky (spatially + spectrally)”

  • point sources: a spectral model (e.g. simple power law, power law with exponential cutoff) at a given position in the sky; in ordinary analysis, the source positions are treated as fixed
  • galactic diffuse and isotropic all-sky (extragalactic) emission
  • other sources (diffuse “templates”)

SLIDE 22

Likelihood Analysis - The Instrument Response Functions (IRFs)

The model is folded with the instrument response functions (IRFs) to obtain the number of predicted counts in the measured-quantity space:

  • effective area
  • energy dispersion
  • point-spread function (PSF)

http://www.slac.stanford.edu/exp/glast/groups/canda/lat_Performance.htm
http://arxiv.org/abs/1206.1896

SLIDE 23

Likelihood Analysis

The likelihood will be evaluated many times during model fitting. To save CPU time, an “exposure map (cube)” is computed in advance - the integral of the total instrument response over the ROI data-space:

ε(E, p) = ∫_ROI dE′ dp′ dt R(E′, p′; E, p)

which is independent of the source model. The log-likelihood becomes:

log L = Σ_j log m(x_j) - N_pred, where N_pred = ∫ dE dp S(E, p) ε(E, p)

SLIDE 24

Likelihood Analysis - gtltcube & gtexpmap

Exposure map: the total exposure for a given position in the sky producing counts in the ROI.

Prerequisite: the time that a given position in the sky is observed at a given inclination angle (the “livetime”) has to be known. This “livetime (exposure) cube” is pre-computed by the tool gtltcube.

Parameters specify the grid of the livetime cube and the grid of the exposure map; the livetime cube is then used as an input to compute the exposure map.
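A sketch of the two commands (file names, grid steps, and the IRF name are placeholders appropriate to the Pass 7 era of this deck; check the current recommendations before use):

```shell
# Pre-compute the livetime cube
gtltcube evfile=events_gti.fits scfile=spacecraft.fits \
         outfile=ltcube.fits dcostheta=0.025 binsz=1

# Compute the unbinned exposure map over a region larger than the ROI
gtexpmap evfile=events_gti.fits scfile=spacecraft.fits \
         expcube=ltcube.fits outfile=expmap.fits \
         irfs=P7SOURCE_V6 srcrad=25 nlong=120 nlat=120 nenergies=20
```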

SLIDE 25
Spectral Models

  • Simple power law
  • Power law with exponential cutoff (e.g. pulsars)
  • Log-parabola (e.g. blazars)

The LAT 2-year Point Source Catalog is based on these models.
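The three spectral shapes can be written out explicitly (a toy sketch; parameter names and pivot energies are illustrative, not the XML-model parameterization):

```python
import math

def power_law(E, N0, gamma, E0=1000.0):
    """Simple power law: dN/dE = N0 * (E/E0)**(-gamma)."""
    return N0 * (E / E0) ** (-gamma)

def cutoff_power_law(E, N0, gamma, Ec, E0=1000.0):
    """Power law with exponential cutoff (pulsar-like spectra)."""
    return N0 * (E / E0) ** (-gamma) * math.exp(-E / Ec)

def log_parabola(E, N0, alpha, beta, Eb=1000.0):
    """Log-parabola (blazar-like spectra):
    dN/dE = N0 * (E/Eb)**(-(alpha + beta*log(E/Eb)))."""
    return N0 * (E / Eb) ** (-(alpha + beta * math.log(E / Eb)))
```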
SLIDE 26

Building the Model - make2FGLxml.py

Example use:

  from make2FGLxml import *
  mymodel = srcList('gll_psc_v07.fit', 'eventfile.fits', 'myLATxmlmodel.xml')
  mymodel.makeModel('gal_2yearp7v6_v0.fits', 'gal_2yearp7v6_v0',
                    'iso_p7v6source.txt', 'iso_p7v6source', 'Templates')

The diffuse model files are shipped with the Science Tools, e.g.:

  $FERMI_DIR/refdata/fermi/galdiffuse/gal_2yearp7v6_v0.fits
  $FERMI_DIR/refdata/fermi/galdiffuse/isotropic_iem_v02_P6_V11_DIFFUSE.txt

Match the file names and the model names in the arguments!

SLIDE 27

Building the Model - the XML file

Each source entry in the XML file has (1) a spectral part and (2) a spatial part.
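For illustration, a point source with a simple power-law spectrum might look like this in the XML model (the source name, coordinates, and parameter values are placeholders; note that the physical parameter is value x scale):

```xml
<source name="MySource" type="PointSource">
  <!-- (1) spectral part -->
  <spectrum type="PowerLaw">
    <parameter name="Prefactor" free="1" value="1.0" scale="1e-9" min="1e-5" max="1e5"/>
    <parameter name="Index" free="1" value="2.0" scale="-1.0" min="0.0" max="5.0"/>
    <parameter name="Scale" free="0" value="1000.0" scale="1.0" min="30.0" max="300000.0"/>
  </spectrum>
  <!-- (2) spatial part: a fixed sky position -->
  <spatialModel type="SkyDirFunction">
    <parameter name="RA" free="0" value="98.48" scale="1.0" min="0" max="360"/>
    <parameter name="DEC" free="0" value="17.77" scale="1.0" min="-90" max="90"/>
  </spatialModel>
</source>
```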

SLIDE 28

Building the Model - the XML file

Galactic diffuse and isotropic all-sky emission

SLIDE 29

Likelihood Analysis - gtdiffrsp

“In the likelihood calculations, it is assumed that the spatial and spectral parts of a source model factor in such a way that the integral over the spatial distribution of a source can be performed independently of the spectral part...”

  • Point source: S is a delta function, so the spatial integral is relatively easy
  • Diffuse source: the integral is computationally intensive, so the diffuse responses are pre-computed by gtdiffrsp

Inputs: the source model (to determine (1) whether pre-computed diffuse responses are already present and (2) whether an extended source is present in the model) and the IRFs to use.

SLIDE 30

Likelihood Analysis - gtlike

  • Keep your results! (for reference and for further iterations)
  • “UNBINNED” - we are performing an unbinned likelihood analysis
  • Optimizer used for fitting, in general: (1) DRMNFB, (2) NEWMINUIT
  • Store the screen output into a file: “[command] >& [file]”
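A sketch of an unbinned gtlike run, capturing the screen output as the slide advises (file names and the IRF name are placeholders):

```shell
# Unbinned likelihood fit; redirect the screen output to a log file
gtlike statistic=UNBINNED scfile=spacecraft.fits evfile=events_gti.fits \
       expmap=expmap.fits expcube=ltcube.fits srcmdl=myLATxmlmodel.xml \
       irfs=P7SOURCE_V6 optimizer=NEWMINUIT >& gtlike_output.log
```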

SLIDE 31

Likelihood Analysis - gtlike

[screenshot: gtlike fit output]

SLIDE 32

Likelihood Analysis - gttsmap

Parameters set the dimensions of the output TS map.
SLIDE 33

Likelihood Analysis - gttsmap

Goal: to find sources that are barely detectable. Model the strong, known, well-identified sources, then look for point sources that are not present in the model by moving a putative point source through a grid of locations in the sky and maximizing the likelihood at each location.

[TS map of the field]