Estimation of the long memory parameter using an Infinite Source Poisson model applied to transmission rate measurements

SLIDE 1

A short introduction to Long memory The model Estimation Bibliography

Estimation of the long Memory parameter using an Infinite Source Poisson model applied to transmission rate measurements

François Roueff

École Nat. Sup. des Télécommunications, 46 rue Barrault, 75634 Paris cedex 13, France, http://www.tsi.enst.fr/~roueff/

August 13, 2005

Joint work with Gilles Faÿ (UST de Lille) and Philippe Soulier (Univ. Paris X)

François Roueff, Estimation of the long memory parameter, p. 1

SLIDE 2

Outline

A short introduction to Long memory
The model: Infinite Source Poisson model; Heavy Tails and Long Memory; Asymptotic properties; Some sample paths
Estimation: Observation schemes; Heavy tails vs. long memory; Whittle wavelet estimator; Simulations
Bibliography

SLIDE 3

Second order long memory

A second-order stationary process X(t) has long memory parameter (or Hurst index) H ∈ (1/2, 1) if

cov(X(0), X(t)) = ℓ(t) t^{2H−2}

for some ℓ slowly varying at infinity. A possible extension to H ≥ 1 (X is then non-stationary) is

var( ∫₀ᵗ X(s) ds ) = L(t) t^{2H}

for some L slowly varying at infinity.
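The variance characterization above lends itself to a quick empirical check: block sums of a sampled path should have variance growing roughly like m^{2H} in the block size m. A minimal numpy sketch (the helper name and block sizes are illustrative, not from the talk):

```python
import numpy as np

def variance_time_hurst(x, block_sizes=(4, 8, 16, 32, 64)):
    """Estimate H by regressing log var(block sums) on log block size.

    For a second-order stationary x, var(sum of m values) ~ L(m) m^{2H},
    so the regression slope is approximately 2H.
    """
    x = np.asarray(x, dtype=float)
    log_m, log_v = [], []
    for m in block_sizes:
        n_blocks = len(x) // m
        sums = x[: n_blocks * m].reshape(n_blocks, m).sum(axis=1)
        log_m.append(np.log(m))
        log_v.append(np.log(sums.var()))
    slope, _ = np.polyfit(log_m, log_v, 1)
    return slope / 2.0
```

For i.i.d. noise (no memory) the block-sum variance is proportional to m, so the estimate should come out near H = 1/2.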

SLIDE 4

Examples

1. Linear processes:
   1.1 FGN (Gaussian, [Mandelbrot and Van Ness(1968)]),
   1.2 ARFIMA ([Granger and Joyeux(1980)]);
2. Non-linear processes:
   2.1 Shot noise ([Giraitis et al.(1993)]),
   2.2 Renewal-reward ([Taqqu and Levy(1986)]),
   2.3 On-Off sources ([Taqqu et al.(1997)]),
   2.4 Infinite Source Poisson ([Mikosch et al.(2002)], [Maulik et al.(2002)]).

SLIDE 5

Linear VS Non-linear models

Remark 1

Non-linear long memory models are often derived from point processes, which makes them appropriate for traffic models.


Remark 2

Little is known about the estimation of H in the non-linear case. Here we investigate the non-linear Poisson case.

SLIDE 7

An example : traffic measurements

Here is a 10 ms aggregated traffic trace (obtained from packet counts at some point of the Internet network).

[Figure: number of packets per 10 ms over roughly 450 s, and the empirical autocorrelation of the 10 ms aggregated traffic.]

SLIDE 8

Same example (aggregated again)

The same traffic aggregated at 100 ms (obtained from packet counts at some point of the Internet network).

[Figure: number of packets per 100 ms, and the empirical autocorrelation of the 100 ms aggregated traffic.]

SLIDE 9

Infinite Source Poisson model

1. {S_n}_{n∈N}: points of a unit rate homogeneous Poisson process;
2. {(U_n, η_n)}: i.i.d., independent of {S_n}.

U_n are referred to as transmission rates, η_n are the flow durations; U_n and η_n may be dependent. The n-th flow starts at time S_n, has rate U_n and is transmitting for a duration η_n. We observe the cumulative rate

X(t) := Σ_{n∈N} U_n 1_{[S_n, S_n+η_n)}(t) for t ∈ [0, T].

If E[η] < ∞, this process has a stationary version defined by

X^S(t) := Σ_{n∈Z} U_n 1_{[S_n, S_n+η_n)}(t).
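The model above is straightforward to simulate on a regular grid, which is how sample paths like those shown later in the talk can be produced. A sketch under illustrative assumptions (Pareto(α) durations, Exp(1) rates, independent of each other; names are ours):

```python
import numpy as np

def simulate_isp(T, alpha, rng, dt=1.0):
    """Simulate X(t) = sum_n U_n 1_{[S_n, S_n+eta_n)}(t) on a grid.

    S_n: unit-rate Poisson arrivals on [0, T]; eta_n ~ Pareto(alpha)
    durations (heavy tailed, P(eta > t) ~ t^{-alpha}); U_n ~ Exp(1) rates.
    """
    n = rng.poisson(T)                        # number of arrivals on [0, T]
    S = np.sort(rng.uniform(0.0, T, size=n))  # arrival times
    eta = rng.pareto(alpha, size=n) + 1.0     # heavy-tailed durations
    U = rng.exponential(1.0, size=n)          # transmission rates
    t = np.arange(0.0, T, dt)
    # flow n is active at time t iff S_n <= t < S_n + eta_n
    active = (t[None, :] >= S[:, None]) & (t[None, :] < (S + eta)[:, None])
    return t, active.T.astype(float) @ U      # cumulative rate X(t)
```

With nonnegative rates this produces nonnegative paths; taking U_n centered (as in the later slides) gives the signed paths shown there.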

SLIDE 10

Second order properties

If E[U²] < ∞ then, for all s ≥ 0, E[X²(s)] < ∞, and for s ≤ t,

cov(X(s), X(t)) = E[ U² {s − (t − η)₊}₊ ].

The stationary version is weakly stationary if E[U² η] < ∞. Then

cov(X^S(0), X^S(t)) = E[ U² (η − t)₊ ].

SLIDE 11

Long memory is a consequence of heavy tails

Assume E[U²] < ∞ and define H(t) = E[ U² 1_{η>t} ].

Assumption: H is regularly varying, with

H(t) = ℓ(t) t^{−α}, α ∈ (0, 2),

where ℓ is slowly varying. In words, durations are heavy tailed, with not necessarily independent rates.

SLIDE 12

Examples

Example 1

U and η are independent and η is heavy tailed, e.g. the M/G/∞ queue, when U = 1.


Example 2

In a more general Internet traffic framework: the workload U × η of the flow is heavy tailed and independent of the transmission rate U (and the latter stays away from zero).

SLIDE 14

Consequence on the weak dependence of X

Non-stationary case, α ∈ (0, 2):

var( ∫₀ᵀ X(s) ds ) ∼ c_α ℓ(T) T^{2H} as T → ∞.

Stationary case, E[η] < ∞:

cov(X^S(0), X^S(t)) ∼ (1/(α−1)) ℓ(t) t^{1−α} = (1/(2−2H)) ℓ(t) t^{2H−2}.
SLIDE 15

Consequence on the dependence structure of X

The processes X and X^S have Hurst index H = (3 − α)/2.

Remark

H < 1 ⇔ α ∈ (1, 2); H ≥ 1 ⇔ α ∈ (0, 1].

Remark

α ∈ (1, 2) ⇒ E[η] < ∞ ⇒ α ∈ [1, 2).
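The relation H = (3 − α)/2 and the stable/unstable split from the remarks above are mechanical enough to encode directly; a tiny helper (names are ours):

```python
def hurst_from_alpha(alpha):
    """Hurst index of the Infinite Source Poisson process: H = (3 - alpha)/2."""
    if not 0.0 < alpha < 2.0:
        raise ValueError("tail index alpha must lie in (0, 2)")
    return (3.0 - alpha) / 2.0

def regime(alpha):
    """'stable' for alpha in (1, 2) (H < 1), 'unstable' for alpha in (0, 1] (H >= 1)."""
    return "stable" if alpha > 1.0 else "unstable"
```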

SLIDE 16

Asymptotic properties

The asymptotic behavior of Y(t) := ∫₀ᵗ (X(s) − E[X(s)]) ds, conveniently renormalized, has been studied in various situations by [Mikosch et al.(2002)], [Maulik et al.(2002)] and [Mikosch and Resnick(2004)].

SLIDE 17

Two very different cases

Let T → ∞.

Stable case : 1 < α < 2, i.e. 1/2 < H < 1

T^{−1/α} Y(Tt) converges weakly to an α-stable Lévy process.


Unstable case: 0 < α < 1, i.e. 1 < H < 3/2

H^{−1/2}(T) T^{−H} Y(Tt) converges weakly to the Gaussian process W with auto-covariance function

cov(W(s), W(t)) = (1/(1−α)) ∫₀ˢ ∫₀ᵗ { (u ∨ v)^{1−α} − |u − v|^{1−α} } du dv.

SLIDE 19

M/G/∞ queue

Sample paths for α = 0.3, 0.7, 0.9, 1.1, 1.4 and 1.7.

[Figure: six panels of the sampled process over 4000 time points.]

SLIDE 20

Centered exponential rewards

Here are sample paths for 1 + U_n ∼ Exp(1) and η_n ∼ Pareto(α) with α = 0.3, 0.7, 0.9, 1.1, 1.4 and 1.7.

[Figure: six panels of the sampled process over 4000 time points.]

SLIDE 21

Symmetrized exponential rewards

Here are sample paths for U_n ∼ Exp(1) − Exp(1)′ (difference of two independent Exp(1) variables) and η_n ∼ Pareto(α) with α = 0.3, 0.7, 0.9, 1.1, 1.4 and 1.7.

[Figure: six panels of the sampled process over 4000 time points.]

SLIDE 22

Differentiated asymptotic sample paths

Gaussian (top) and Lévy-stable (bottom) increments corresponding to α = 0.3, 0.7, 0.9, 1.1, 1.4 and 1.7.

[Figure: three simulated Gaussian increment panels (γ = 0.7, 0.3, 0.1) and three Lévy-stable increment panels.]

SLIDE 23

Three possible observation schemes

1. We observe the continuous path X(t) for all t ∈ [0, T].
2. We observe the discrete sample path X(t) for t = 1, 2, …, T.
3. We observe discrete local averages Y_k = ∫_k^{k+1} X(t) dt for k = 0, 1, …, T.

In fact the last two cases can be treated similarly, so we only consider the continuous case and the discrete sample case.

SLIDE 24

α is the parameter of interest: it governs the asymptotic behavior, stability, and (perhaps) queuing performance. A natural approach would be a heavy tail estimator such as the Hill estimator.

Drawbacks:

Hidden information

It is difficult and costly to observe, say, (U_n, η_n)_n (one needs to reconstruct the flows).


Tails of X

In general the marginals of X(t) are not even heavy tailed,


Return times to the empty state

In the stable case only, there are i.i.d. heavy tailed active periods and exponential idle periods. But real data indicate that the model is not suitable at fine scales: the empty state is hard to identify.

SLIDE 28

Here we estimate α through the long memory parameter H, based on empirical second-order properties of the path. Standard approaches are:

Fourier methods (GPH, GSE)

Efficient in practice and in theory for standard time series. The non-linear case is open.

Wavelet methods

Promising results in the context of self-similar Gaussian and stable processes. Easier to adapt in the non-linear case?

SLIDE 29

Mother and father wavelets

Take a function ψ with compact support and such that ∫ ψ = 0. Take a function φ with compact support satisfying

Σ_{k∈Z} φ(t − k) = 1, t ∈ R.

Denote

ψ_{j,k}(t) = 2^{−j/2} ψ(2^{−j} t − k).

In the context of a multi-resolution analysis, one calls ψ a wavelet and φ the corresponding scaling function (or father wavelet). Additional assumptions are possibly required on φ and ψ but will not be mentioned in the following.
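The simplest concrete choice satisfying both conditions is the Haar pair, written here with the same dilation convention ψ_{j,k}(t) = 2^{−j/2} ψ(2^{−j} t − k) as the slide (large j = coarse scale); a sketch, with names of our choosing:

```python
import numpy as np

def haar_psi(t):
    """Haar mother wavelet: +1 on [0, 1/2), -1 on [1/2, 1); integral zero."""
    t = np.asarray(t, dtype=float)
    pos = ((t >= 0.0) & (t < 0.5)).astype(float)
    neg = ((t >= 0.5) & (t < 1.0)).astype(float)
    return pos - neg

def haar_phi(t):
    """Haar scaling function: indicator of [0, 1); its integer shifts sum to 1."""
    t = np.asarray(t, dtype=float)
    return ((t >= 0.0) & (t < 1.0)).astype(float)

def psi_jk(t, j, k):
    """psi_{j,k}(t) = 2^{-j/2} psi(2^{-j} t - k), large j = coarse scale."""
    return 2.0 ** (-j / 2.0) * haar_psi(2.0 ** (-j) * t - k)
```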

SLIDE 30

Continuous or discrete time Wavelet coefficients

In continuous time, one computes

d_{j,k} = ∫ X(t) ψ_{j,k}(t) dt

for any (j, k) such that the support of ψ_{j,k} falls into [0, T]. In discrete time, one first computes a continuous interpolation

I_φ[X](t) := Σ_{k∈Z} X(k) φ(t − k)

and then (this setting includes the usual decimated DWT)

d^D_{j,k} = ∫ I_φ[X](t) ψ_{j,k}(t) dt.
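For the Haar pair, the discrete-time coefficients d^D_{j,k} can be computed by the standard pyramid algorithm (pairwise sums and differences). A sketch, assuming the samples stand in for the piecewise-constant interpolation I_φ[X] (function name ours):

```python
import numpy as np

def haar_details(x):
    """Discrete Haar transform: return detail coefficients per scale.

    d[j] holds the scale-j detail (wavelet) coefficients of the
    piecewise-constant interpolation of the samples, j = 1, 2, ...
    """
    x = np.asarray(x, dtype=float)
    details = {}
    approx = x
    j = 1
    while len(approx) >= 2:
        n = len(approx) // 2
        even, odd = approx[: 2 * n : 2], approx[1 : 2 * n : 2]
        details[j] = (even - odd) / np.sqrt(2.0)  # detail coefficients at scale j
        approx = (even + odd) / np.sqrt(2.0)      # approximation passed to next scale
        j += 1
    return details
```

A constant path has zero detail coefficients at every scale, which is a convenient sanity check of the cascade.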

SLIDE 31

The set of indices

Let J denote the largest j such that d_{j,k} or d^D_{j,k} can be computed from {X(t), t ∈ [0, T]} for all j = 0, …, J and k = 0, …, 2^{J−j}. One has

J = log₂(T) + O(1).

For positive integers J₀ < J₁ ≤ J, define

∆ := {(j, k) : J₀ < j ≤ J₁, 0 ≤ k ≤ 2^{J−j} − 1}.

SLIDE 32

Definition of the wavelet Whittle estimator

Adapting the (Fourier) Whittle estimator to the wavelet domain, define

α̂ := argmin_{α′} [ log( Σ_{(j,k)∈∆} 2^{−(2−α′)j} d²_{j,k} ) + δ log(2) (2 − α′) ],

where

δ = (1/#∆) Σ_{(j,k)∈∆} j.
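The contrast can be minimized by a simple grid search. The sketch below uses the profiled-likelihood form log(Σ 2^{−(2−α′)j} d²_{j,k}) + δ log(2)(2 − α′), consistent with E[d²_{j,k}] ≈ c 2^{(2−α)j}; the function name and grid are our choices:

```python
import numpy as np

def wavelet_whittle(details, j0, j1, grid=None):
    """Grid-search the wavelet Whittle contrast for alpha.

    details: dict {j: array of d_{j,k}}; only scales j0 < j <= j1 are used.
    The contrast is the profiled Gaussian likelihood under the model
    E[d_{j,k}^2] = c 2^{(2 - alpha) j}.
    """
    if grid is None:
        grid = np.linspace(0.05, 1.95, 381)
    js = [j for j in details if j0 < j <= j1]
    jj = np.concatenate([np.full(len(details[j]), j, dtype=float) for j in js])
    d2 = np.concatenate([np.asarray(details[j], dtype=float) ** 2 for j in js])
    delta = jj.mean()                      # (1/#Delta) sum of j over Delta

    def contrast(a):
        return (np.log(np.sum(2.0 ** (-(2.0 - a) * jj) * d2))
                + delta * np.log(2.0) * (2.0 - a))

    vals = np.array([contrast(a) for a in grid])
    return grid[np.argmin(vals)]
```

On synthetic coefficients drawn with exactly the variance scaling 2^{(2−α)j}, the grid search recovers α closely, which is a useful test of the implementation independent of any path simulation.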

SLIDE 33

Consistency

Let T → ∞ with J0 → ∞ and J1 − J0 → ∞. Theorem (Stable case) If α > 1 and lim sup J0/J < 1/α then ˆ α

P

− → α. Theorem (Unstable case) If α ≤ 1 and lim sup J1/J < 1/(2 − α) then ˆ α

P

− → α. In particular we have consistency for all α ∈ (0, 2) for lim sup J1/J < 1/2

SLIDE 34

Proof of consistency

The two main points are to show that

E[d²_{j,k}] ≈ c 2^{(2−α)j}   (1)

for J₀ < j ≤ J₁, and

2^{j−J} Σ_{k=1}^{2^{J−j}} d²_{j,k} / (c 2^{(2−α)j}) − 1 → 0 in probability   (2)

for j ≈ J₀. In the case α > 1, (2) is true only if lim sup J₀/J < 1/α. In the case α ≤ 1, (1) is true only if lim sup J₁/J < 1/(2 − α).

SLIDE 35

Rates (Stable case)

Assume that α > 1. Rates can be established by assuming that

E[U² cos(uη)] = 1 + |u|^α (1 + O(|u|^β)).

Theorem. α̂ achieves the rate T^{−γ/(2γ+α)} for J₀ = J/(2γ + α), with γ = β in continuous time and γ = (2 − α) ∧ β in discrete time.

SLIDE 36

Comments

Discrete observations

We estimate the zero frequency behavior of the spectral density f

  • f X t, t = 1, . . . , T, f (λ) ∼ |λ|α−2. The limitation in discrete time

is due to aliasing (can be made low in simulations but presumably not negligible in practice).


The Pareto case

If η has a Pareto distribution, we find β = 2 − α, hence the same rate in continuous and discrete time.


Further work

Other non-linear models; alternative estimators, e.g. based on tails of busy periods; semiparametric standards: adaptive estimators, minimax rates, ...

SLIDE 39

Simulations

The estimator is computed on simulated discrete observations X₁, …, X_n. The wavelet Whittle estimator is compared to the usual Fourier-domain GPH and GSE estimators (whose theoretical properties are only available for Gaussian and linear processes).

SLIDE 40

Fourier methods

The periodogram ordinates:

I_{n,k} = (2πn)^{−1} | Σ_{t=1}^n X_t e^{i t x_k} |²,   E[I_{n,k}] = c x_k^{α−2} (1 + o(1)).

The GPH and local Whittle (GSE) estimators:

α̂_GPH = argmin_{c′,α′} Σ_{k=1}^m [ log(I_{n,k}) − c′ + (2 − α′) log(k) ]²,

α̂_GSE = argmin_{α′} [ log( Σ_{k=1}^m k^{2−α′} I_{n,k} ) − (2 − α′) (1/m) Σ_{k=1}^m log(k) ].

m is the bandwidth parameter; it plays a role similar to J₀ for the wavelet estimator.
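GPH reduces to an ordinary least-squares fit of log-periodogram ordinates against the log frequency index: since log E[I_{n,k}] ≈ const + (α − 2) log k, the fitted slope estimates α − 2. A numpy sketch (function name ours):

```python
import numpy as np

def gph_alpha(x, m):
    """GPH log-periodogram regression for alpha, where f(lambda) ~ |lambda|^{alpha-2}.

    Regress log I_{n,k} on log k for k = 1..m; the slope estimates
    alpha - 2, so alpha_hat = 2 + slope.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    fft = np.fft.rfft(x - x.mean())
    # periodogram at the first m positive Fourier frequencies, k = 1..m
    I = (np.abs(fft) ** 2 / (2.0 * np.pi * n))[1 : m + 1]
    logk = np.log(np.arange(1, m + 1))
    slope, _ = np.polyfit(logk, np.log(I), 1)
    return 2.0 + slope
```

For white noise the spectral density is flat (α − 2 = 0), so the estimate should sit near α = 2, the short-memory boundary.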

SLIDE 41

Stable case

100 Monte Carlo simulations with α = 1.1 and α = 1.8, centered exponential rates.

[Figure: boxplots of α̂ across scales/bandwidths for the wavelet Whittle estimator (with and without mask), GPH and GSE, together with MSE curves for each estimator.]

SLIDE 42

Unstable case

100 Monte Carlo simulations with α = 0.3 and α = 0.7, centered exponential rewards.

[Figure: boxplots of α̂ across scales/bandwidths for the wavelet Whittle estimator (with and without mask), GPH and GSE, together with MSE curves for each estimator.]

SLIDE 43

Further readings I

L. Giraitis, S. A. Molchanov, and D. Surgailis. Long memory shot noises and limit theorems with application to Burgers' equation. In New directions in time series analysis, Part II, volume 46 of IMA Vol. Math. Appl., pages 153–176. Springer, New York, 1993.

C. W. J. Granger and R. Joyeux. An introduction to long memory time series and fractional differencing. Journal of Time Series Analysis, 1:15–30, 1980.

SLIDE 44

Further readings II

Benoit B. Mandelbrot and John W. Van Ness. Fractional Brownian motions, fractional noises and applications. SIAM Rev., 10:422–437, 1968.

Krishanu Maulik, Sidney Resnick, and Holger Rootzén. Asymptotic independence and a network traffic model. Journal of Applied Probability, 39(4):671–699, 2002.

Thomas Mikosch and Sidney Resnick. Activity rates with very heavy tails. Technical Report 1411, Cornell University, 2004.

SLIDE 45

Further readings III

Thomas Mikosch, Sidney Resnick, Holger Rootzén, and Alwin Stegeman. Is network traffic approximated by stable Lévy motion or fractional Brownian motion? Annals of Applied Probability, 12:23–68, 2002.

M. Taqqu, W. Willinger, and R. Sherman. Proof of a fundamental result in self-similar traffic modeling. Computer Communication Review, 27:5–23, 1997.

M. S. Taqqu and J. M. Levy. Using renewal processes to generate long range dependence and high variability. In E. Eberlein and M. S. Taqqu (eds), Dependence in Probability and Statistics. Boston, Birkhäuser, 1986.