SLIDE 1

Spiking neural models: from point processes to partial differential equations.

Julien Chevallier. Co-workers: M. J. Cáceres, M. Doumic and P. Reynaud-Bouret

LJAD University of Nice INRIA Sophia-Antipolis

2016/06/09

SLIDE 2

Introduction Thinning procedure 1/ Expectation 2/ Mean-field Summary

Outline

1 Introduction
2 A key tool: The thinning procedure
3 First approach: Mathematical expectation
4 Second approach: Mean-field interactions

SLIDE 3

Outline

1 Introduction
   Neurobiologic context
   Microscopic modelling
   Macroscopic modelling
2 A key tool: The thinning procedure
3 First approach: Mathematical expectation
4 Second approach: Mean-field interactions

SLIDE 4

Biological context

[Figure: an action potential. Voltage (mV) against time (ms): stimulus, failed initiations, threshold (about -55 mV), depolarization up to about +40 mV, repolarization, refractory period, return to the resting state (about -70 mV).]

Neurons = electrically excitable cells. Action potential = spike of the electrical potential. Physiological constraint: refractory period.

SLIDE 5–10

[Overlay slides repeating the action-potential figure, building from the microscopic scale (individual spike trains) to the macroscopic scale (the whole population).]

SLIDE 11–13

Microscopic modelling

Microscopic modelling of spike trains. Time point processes = random countable sets of times (points of $\mathbb{R}$ or $\mathbb{R}_+$).

Point process: $N = \{T_i,\ i \in \mathbb{Z}\}$ such that $\cdots < T_0 \le 0 < T_1 < \cdots$.

Point measure: $N(dt) = \sum_{i \in \mathbb{Z}} \delta_{T_i}(dt)$. Hence $\int f(t)\,N(dt) = \sum_{i \in \mathbb{Z}} f(T_i)$.

Age process: $(S_{t-})_{t \ge 0}$. Age = delay since the last spike.

Stochastic intensity (heuristically):
$$\lambda_t = \lim_{\Delta t \to 0} \frac{1}{\Delta t}\, \mathbb{P}\big( N([t, t+\Delta t]) = 1 \;\big|\; \mathcal{F}^N_{t-} \big),$$
where $\mathcal{F}^N_{t-}$ denotes the history of $N$ before time $t$. It describes the local behaviour: the probability of finding a new spike. It may depend on the past (e.g. refractory period, aftershocks).
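As a concrete illustration (not part of the talk), a finite spike train can be stored as a sorted array; integration against the point measure N(dt) is then a plain sum, and the age S_{t-} is read off from the last spike strictly before t. A minimal Python sketch, with arbitrary example spike times:

```python
import numpy as np

# A finite realization of a point process: sorted spike times T_i.
spikes = np.array([0.5, 1.2, 3.0, 3.1])

def integrate_against_N(f, spikes):
    """int f(t) N(dt) = sum_i f(T_i)."""
    return sum(f(T) for T in spikes)

def age(t, spikes):
    """Age S_{t-}: delay since the last spike strictly before t
    (infinite if there is no spike before t)."""
    past = spikes[spikes < t]
    return t - past[-1] if len(past) else np.inf

# Example: sum of squared spike times, and the age just before t = 2.
total = integrate_against_N(lambda t: t**2, spikes)
s2 = age(2.0, spikes)
```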

SLIDE 14–20

Some classical point processes in neuroscience

Poisson process: $\lambda_t = \lambda(t)$ (deterministic; no refractory period).

Renewal process: $\lambda_t = f(S_{t-})$ ⇔ i.i.d. ISIs (refractory period).

Linear Hawkes process:
$$\lambda_t = \mu + \int_{-\infty}^{t-} h(t-z)\,N(dz) = \mu + \sum_{T \in N,\ T < t} h(t-T), \qquad h \ge 0.$$
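For concreteness (an illustration, not from the talk), the three intensities can be written as functions of the observed past; the kernel h, baseline mu, rate f and spike times below are arbitrary assumed choices:

```python
import numpy as np

def poisson_intensity(t, lam):
    # Poisson: lambda_t = lambda(t), deterministic, ignores the past.
    return lam(t)

def renewal_intensity(t, spikes, f):
    # Renewal: lambda_t = f(S_{t-}), a function of the age only.
    past = spikes[spikes < t]
    s = t - past[-1] if len(past) else np.inf
    return f(s)

def linear_hawkes_intensity(t, spikes, mu, h):
    # Linear Hawkes: lambda_t = mu + sum_{T < t} h(t - T), h >= 0.
    past = spikes[spikes < t]
    return mu + sum(h(t - T) for T in past)

spikes = np.array([0.5, 1.2, 3.0])
h = lambda x: np.exp(-x)                # assumed exponential kernel
f = lambda s: 0.0 if s < 0.2 else 1.5   # assumed hard refractory period
```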

slide-21
SLIDE 21

Introduction Thinning procedure 1/ Expectation 2/ Mean-field Summary

Age structured equations (K. Pakdaman, B. Perthame, D. Salort, 2010)

Age = delay since last spike. n(t,s) =

  • probability density of finding a neuron with age s at time t.

ratio of the neural population with age s at time t. mean firing rate →        ∂n(t,s) ∂t + ∂n(t,s) ∂s +p (s,X (t))n(t,s) = 0 n(t,0) =

+∞

p (s,X (t))n(t,s)ds. (PPS)

slide-22
SLIDE 22

Introduction Thinning procedure 1/ Expectation 2/ Mean-field Summary

Age structured equations (K. Pakdaman, B. Perthame, D. Salort, 2010)

Age = delay since last spike. n(t,s) =

  • probability density of finding a neuron with age s at time t.

ratio of the neural population with age s at time t. mean firing rate →        ∂n(t,s) ∂t + ∂n(t,s) ∂s +p (s,X (t))n(t,s) = 0 n(t,0) =

+∞

p (s,X (t))n(t,s)ds. (PPS) Parameters rate function p. For example, p(s,X) = 1{s>σ(X)}. X(t) =

t

0 d(t −x)n(x,0)dx

(global neural activity)

Propagation time. d = delay function. For example, d(x) = e−τx.

slide-23
SLIDE 23

Introduction Thinning procedure 1/ Expectation 2/ Mean-field Summary

Age structured equations (K. Pakdaman, B. Perthame, D. Salort, 2010)

Age = delay since last spike. n(t,s) =

  • probability density of finding a neuron with age s at time t.

ratio of the neural population with age s at time t. mean firing rate →        ∂n(t,s) ∂t + ∂n(t,s) ∂s +p (s,X (t))n(t,s) = 0 n(t,0) =

+∞

p (s,X (t))n(t,s)ds. (PPS) Parameters rate function p. For example, p(s,X) = 1{s>σ(X)}. X(t) =

t

0 d(t −x)n(x,0)dx

(global neural activity)

Propagation time. d = delay function. For example, d(x) = e−τx.

Cornerstone: X(t) ← →

t−

h(t −x)N(dx).
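As a numerical illustration (a sketch only; the rate p, delay kernel d, grids and initial density are all assumptions, not the talk's), the (PPS) system can be discretized with an explicit upwind scheme. Taking dt = ds makes the age transport an exact one-cell shift, and the reinjection at age 0 then balances the spiking loss exactly, up to mass leaving at the far end of the grid:

```python
import numpy as np

ds = dt = 0.01
s = np.arange(0.0, 5.0, ds)                    # age grid
p = lambda age, X: (age > 0.5) * (1.0 + X)     # assumed rate: refractory below 0.5
d = lambda x: np.exp(-x)                       # assumed delay function

n = np.exp(-s)
n /= n.sum() * ds                              # assumed initial age density
boundary = []                                  # n(t, 0): the mean firing rate
for k in range(500):
    t = k * dt
    # global activity X(t) = int_0^t d(t - x) n(x, 0) dx
    X = dt * sum(d(t - j * dt) * b for j, b in enumerate(boundary))
    rate = p(s, X)
    fired = (rate * n).sum() * ds              # boundary value n(t, 0)
    n = n * (1.0 - dt * rate)                  # spiking neurons leave...
    n = np.roll(n, 1)                          # ...the rest age by ds...
    n[0] = fired                               # ...and re-enter at age 0
    boundary.append(fired)
```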


SLIDE 25–26

Lewis and Shedler’s thinning, 1979

[Figure: a unit-rate Poisson process in the (t, x) plane; the candidate points lying below the graph of λ form the thinned process N.]

$\Pi$ is a Poisson process with intensity 1 on the (t, x) plane: $\Pi(dt,dx) = \sum_{X \in \Pi} \delta_X(dt,dx)$ and $\mathbb{E}[\Pi(dt,dx)] = dt\,dx$ (spatial independence). Here λ is deterministic, and N admits λ as an intensity.

SLIDE 27–28

Ogata’s thinning, 1981

[Figure: the same construction, now thinning below a random intensity; the kept points form the point process N.]

$\Pi$ is a Poisson process with intensity 1: $\Pi(dt,dx) = \sum_{X \in \Pi} \delta_X(dt,dx)$ and $\mathbb{E}[\Pi(dt,dx)] = dt\,dx$ (spatial independence). Here λ is random, and N admits λ as an intensity.
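In code, Ogata's thinning is a short loop (a textbook sketch; the bound M and the renewal-type intensity with a hard refractory period are illustrative assumptions). Candidates are drawn from a homogeneous Poisson process of rate M ≥ λt, and each is kept with probability λt / M:

```python
import numpy as np

def ogata_thinning(intensity, M, T, rng):
    """Simulate a point process on [0, T] whose (predictable) intensity
    is bounded by M: thin a rate-M homogeneous Poisson process, keeping
    a candidate t with probability intensity(t, past) / M."""
    past, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / M)          # next Poisson candidate
        if t > T:
            return np.array(past)
        if rng.uniform() * M < intensity(t, past):
            past.append(t)

# Illustrative intensity: renewal with a hard refractory period of 0.2.
def lam(t, past):
    s = t - past[-1] if past else np.inf       # current age
    return 0.0 if s < 0.2 else 1.5

rng = np.random.default_rng(0)
spikes = ogata_thinning(lam, M=1.5, T=100.0, rng=rng)
```

By construction every inter-spike interval is at least 0.2, since the intensity vanishes during the refractory period.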


SLIDE 32

Outline

1 Introduction
2 A key tool: The thinning procedure
3 First approach: Mathematical expectation
   Markovian case
   Non-Markovian case
4 Second approach: Mean-field interactions

SLIDE 33–35

Fokker-Planck equation

Assume $\lambda_t = f(t, S_{t-})$ (Poisson, renewal, ...). Then $(S_{t-})_{t \ge 0}$ is Markovian with generator
$$G_t \varphi(s) := \varphi'(s) + f(t,s)\,\big[\varphi(0) - \varphi(s)\big].$$

The Fokker-Planck equation gives the following PDE system:
$$\begin{cases} \dfrac{\partial}{\partial t} u(t,s) + \dfrac{\partial}{\partial s} u(t,s) + f(t,s)\, u(t,s) = 0, \\[4pt] u(t,0) = \displaystyle\int_{s \in \mathbb{R}_+} f(t,s)\, u(t,s)\, ds, \end{cases}$$
where u(t,·) is the distribution of $S_{t-}$.
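The passage from the generator to the PDE can be spelled out in one line (a standard weak-form computation, added here for completeness):

```latex
% For a smooth test function \varphi,
\frac{d}{dt}\,\mathbb{E}[\varphi(S_{t-})]
   = \mathbb{E}[G_t\varphi(S_{t-})]
   = \int_{\mathbb{R}_+} \Big( \varphi'(s)
       + f(t,s)\,[\varphi(0)-\varphi(s)] \Big)\, u(t,s)\, ds .
% Integrating \varphi'(s)\,u(t,s) by parts and collecting the terms in
% \varphi(s) (interior equation) and \varphi(0) (boundary condition)
% yields the transport equation and u(t,0) = \int f(t,s)\,u(t,s)\,ds.
```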

SLIDE 36–40

System in expectation

Theorem (C., Cáceres, Doumic, Reynaud-Bouret 15). Let $\lambda_t$ be a non-negative predictable process which is $L^1_{loc}$ in expectation. The distribution u(t,·) of $S_{t-}$ satisfies, in the weak sense, the system
$$\begin{cases} \dfrac{\partial}{\partial t} u(t,s) + \dfrac{\partial}{\partial s} u(t,s) + \rho_{\lambda, P_0}(t,s)\, u(t,s) = 0, \\[4pt] u(t,0) = \displaystyle\int_{s \in \mathbb{R}_+} \rho_{\lambda, P_0}(t,s)\, u(t,s)\, ds, \end{cases} \tag{PPS-ρ}$$
where $\rho_{\lambda, P_0}(t,s) = \mathbb{E}[\lambda_t \mid S_{t-} = s]$ for almost every t.

Law of Large Numbers: the empirical measure $\frac{1}{n} \sum_{i=1}^n \delta_{S^i_{t-}}(ds)$ converges to a solution of (PPS-ρ), namely u.

This includes the Markovian case, where $\rho_{\lambda, P_0}(t,s) = f(t,s)$; in the non-Markovian case $\rho_{\lambda, P_0}(t,s)$ is more complex. For the linear Hawkes process one gets a closed system for $v(t,s) := \int_s^{+\infty} u(t,\sigma)\, d\sigma$.
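The Law of Large Numbers above can be illustrated by simulation (all parameters assumed for illustration: a renewal process whose i.i.d. ISIs are a 0.2 refractory delay plus an Exp(1.5) variable). At a large time, the empirical measure of the ages of n i.i.d. neurons should be close to the stationary age distribution, whose mean is E[ISI²] / (2 E[ISI]):

```python
import numpy as np

rng = np.random.default_rng(1)
n, T, delta, rate = 2000, 50.0, 0.2, 1.5       # assumed parameters

# Simulate n i.i.d. renewal neurons up to time T (each starts with a
# spike at time 0) and record the ages S^i_{T-}.
ages = np.empty(n)
for i in range(n):
    t = 0.0
    while True:
        isi = delta + rng.exponential(1.0 / rate)
        if t + isi > T:
            ages[i] = T - t                    # delay since the last spike
            break
        t += isi

# Stationary mean age of a renewal process: E[ISI^2] / (2 E[ISI]).
m1 = delta + 1.0 / rate
m2 = delta**2 + 2.0 * delta / rate + 2.0 / rate**2
stationary_mean_age = m2 / (2.0 * m1)
```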

SLIDE 41

Outline

1 Introduction
2 A key tool: The thinning procedure
3 First approach: Mathematical expectation
4 Second approach: Mean-field interactions
   Generalities
   Actual and limit dynamics
   Coupling of these two dynamics
   Mean-field approximation

SLIDE 42–45

Propagation of chaos: a tool to link the two scales

Mean-field n-neuron system:
   Weak dependence: homogeneous interactions scaled by 1/n.
   Symmetry: the neurons are exchangeable.
   The dynamics is described by a growing system of equations.

Asymptotics as n → +∞:
   The neurons become independent (they are identically distributed).
   Their distribution is described by one non-linear PDE.

Mean field in neuroscience: intrinsic spiking (Stannat et al. 2014), integrate-and-fire (Delarue et al. 2015), point-process models (Galves and Löcherbach 2015). For Hawkes processes: mean-field approximation (Delattre et al. 2015), inference (Delattre et al., Bacry et al. 2016).

Here: age dependent Hawkes processes.

SLIDE 46–48

Multivariate Hawkes processes

Multivariate HP (i = 1, ..., n):
$$\lambda^i_t = \Phi\Big( \int_{-\infty}^{t-} h_{i \to i}(t-x)\, N^i(dx) \;+\; \sum_{j \neq i} \int_{-\infty}^{t-} h_{j \to i}(t-x)\, N^j(dx) \Big).$$

Interaction function $h_{j \to i}$ ↔ synaptic weight of neuron j over neuron i.

SLIDE 49–52

Generalized Hawkes processes

Renewal process: $\lambda_t = f(S_{t-})$. Multivariate HP: $\lambda^i_t = \Phi\big( \sum_{j=1}^n \int_{-\infty}^{t-} h_{j \to i}(t-x)\, N^j(dx) \big)$. Mixing the two gives:

Age dependent Hawkes process (n-neuron system). It is a multivariate point process $(N^i)_{i=1,...,n}$ with intensity given, for all i = 1, ..., n, by
$$\lambda^i_t = \Psi\Big( S^i_{t-},\ \frac{1}{n} \sum_{j=1}^{n} \int_{-\infty}^{t-} h(t-z)\, N^j(dz) \Big),$$
that is, “$h_{j \to i} = \frac{1}{n} h$”. Example: $\Psi(s,x) = \Phi(x)\,\mathbf{1}_{s \ge \delta}$, a strict refractory period of length δ.

How to approximate them as n → +∞? LLN heuristics: they are close to independent copies of a limit process.
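A simulation sketch of the n-neuron system (all concrete choices, a bounded Φ, an exponential kernel h and a refractory length δ, are assumptions for illustration). Thinning works at the network level: candidates arrive at rate n·M, a neuron is picked uniformly, and the exponential kernel lets the mean-field term x be updated recursively instead of being re-summed over all past spikes:

```python
import numpy as np

rng = np.random.default_rng(2)
n, T, delta, M = 200, 20.0, 0.1, 3.0
Phi = lambda x: min(1.0 + x, M)                # assumed bounded rate function

spikes = [[] for _ in range(n)]                # the trains N^1, ..., N^n
last = np.full(n, -np.inf)                     # last spike of each neuron
x = 0.0    # mean-field term (1/n) sum_j int h(t-z) N^j(dz), with h(u) = exp(-u)
t = 0.0
while True:
    w = rng.exponential(1.0 / (n * M))         # candidate from a rate n*M Poisson process
    t += w
    x *= np.exp(-w)                            # exponential kernel decays between events
    if t > T:
        break
    i = rng.integers(n)                        # uniformly chosen candidate neuron
    lam_i = Phi(x) if t - last[i] >= delta else 0.0   # Psi(s, x) = Phi(x) 1_{s >= delta}
    if rng.uniform() * M < lam_i:              # thinning acceptance
        spikes[i].append(t)
        last[i] = t
        x += 1.0 / n                           # a new spike adds h(0)/n = 1/n

counts = np.array([len(tr) for tr in spikes])
```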

SLIDE 53–55

Scheme of the coupling method

Idea of the coupling (Sznitman): find a suitable coupling between the particles of the n-particle system and n i.i.d. copies of a limit process.

1 Find a good candidate for the limit process (LLN heuristics).
1' Use the PDE to find the distribution of the limit process.
2 Show that it is well-defined (McKean-Vlasov fixed-point problem).
3 Couple the dynamics in the right way.
4 Show the convergence.

SLIDE 56–61

1/ Limit process (heuristics)

Recall the intensities of the n-neuron system:
$$\lambda^i_t = \Psi\Big( S^i_{t-},\ \frac{1}{n} \sum_{j=1}^{n} \int_{-\infty}^{t-} h(t-z)\, N^j(dz) \Big).$$

Independence at the limit ⇒ Law of Large Numbers.

Limit process: a point process $\bar N$ with intensity given by
$$\bar\lambda_t = \Psi\Big( \bar S_{t-},\ \int_{-\infty}^{t-} h(t-z)\, \mathbb{E}\big[\bar N(dz)\big] \Big).$$

The empirical mean $\frac{1}{n} \sum_j \int h(t-z)\, N^j(dz)$ and its limit counterpart $\int h(t-z)\, \mathbb{E}[\bar N(dz)]$ should be close to each other.

The process $\bar N$ depends on its own distribution (McKean-Vlasov equation), so its existence is not trivial. The intensity of $\bar N$ depends only on the time and the age ⇒ $\bar S_{t-}$ is Markovian.
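The McKean-Vlasov structure suggests a fixed-point (Picard) iteration, sketched below with illustrative choices of Ψ, h, grids and initial density (all assumptions, not the talk's): guess the deterministic activity X, solve the transport system for u, recompute X(t) = ∫₀ᵗ h(t−z) u(z,0) dz, and repeat until the activity path stabilizes.

```python
import numpy as np

ds = dt = 0.02
s = np.arange(0.0, 4.0, ds)                    # age grid
tgrid = np.arange(0.0, 4.0, dt)                # time grid
Psi = lambda age, x: (age >= 0.1) * min(1.0 + x, 3.0)   # bounded, Lipschitz in x
h = lambda u_: np.exp(-u_)                     # assumed interaction kernel

def solve_transport(X):
    """Upwind solve of the age equation for a prescribed activity path X;
    returns the boundary values u(t, 0)."""
    n = np.exp(-s)
    n /= n.sum() * ds                          # assumed initial density
    boundary = np.empty(len(tgrid))
    for k in range(len(tgrid)):
        rate = Psi(s, X[k])
        fired = (rate * n).sum() * ds
        n = n * (1.0 - dt * rate)
        n = np.roll(n, 1)
        n[0] = fired
        boundary[k] = fired
    return boundary

X = np.zeros(len(tgrid))                       # initial guess: no activity
for _ in range(120):                           # Picard iterations
    b = solve_transport(X)
    X_new = np.array([dt * np.sum(h(t - tgrid[:k]) * b[:k])
                      for k, t in enumerate(tgrid)])
    gap = np.max(np.abs(X_new - X))
    X = X_new
    if gap < 1e-12:
        break
```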

SLIDE 62–64

1'/ Study the associated PDE system

If the limit process $\bar N$ exists, then the distribution of $\bar S_{t-}$, denoted by u(t,·), satisfies the Fokker-Planck equation:
$$\begin{cases} \dfrac{\partial u(t,s)}{\partial t} + \dfrac{\partial u(t,s)}{\partial s} + \Psi\big(s, X(t)\big)\, u(t,s) = 0, \\[4pt] u(t,0) = \displaystyle\int_{s \in \mathbb{R}_+} \Psi\big(s, X(t)\big)\, u(t,s)\, ds, \end{cases} \tag{PPS-NL}$$
where for all t ≥ 0, $X(t) = \int_0^t h(t-z)\, u(z,0)\, dz$.

Main assumption: the rate function Ψ is bounded and uniformly Lipschitz w.r.t. X(t).

Theorem (C. 15). Assume that $h : \mathbb{R}_+ \to \mathbb{R}$ is locally integrable and that $u^{in}$ is a non-negative function such that $\int_0^{+\infty} u^{in}(s)\, ds = 1$ and there exists M > 0 such that $0 \le u^{in}(s) \le M$ for all s ≥ 0. Then there exists a unique weak solution u such that $t \mapsto u(t,\cdot)$ belongs to $BC(\mathbb{R}_+, \mathcal{P}(\mathbb{R}_+))$ (moreover, the solution is in $C(\mathbb{R}_+, L^1(\mathbb{R}_+))$).

SLIDE 65–67

2/ Show that the limit process is well-posed

Recall the intensity of the limit process, $\bar\lambda_t = \Psi\big( \bar S_{t-},\ \int_{-\infty}^{t-} h(t-z)\, \mathbb{E}[\bar N(dz)] \big)$, and the associated system (PPS-NL).

Proposition. The distribution of the age $\bar S_{t-}$ is the unique solution of (PPS-NL).

Hence the intensity of the limit process is given by
$$\bar\lambda_t = \Psi\Big( \bar S_{t-},\ \int_0^t h(t-z)\, u(z,0)\, dz \Big),$$
and the limit process is well-defined.

SLIDE 68–72

3/ The coupling

[Figure: six realizations of a Poisson process with intensity 2 on [0,1].]

[Figures: the coupling built step by step on a common picture; legend: Point process, Poisson process, Limit process.]

SLIDE 73–74

4/ Control / Convergence 1

Theorem (C. 15). The coupling described in the previous slides is such that
$$\mathbb{E}\Big[ \mathrm{Card}\big( (N^i \,\triangle\, \bar N^i) \cap [0,\theta] \big) \Big] = \mathbb{E}\Big[ \int_0^\theta \big| \lambda^i_t - \bar\lambda^i_t \big|\, dt \Big] \lesssim n^{-1/2}.$$
The constant depends on θ, Ψ and h.

Corollary. If the distribution of the initial value of the age is bounded, then the coupling is such that
$$\mathbb{P}\Big( (S^i_t)_{t \in [0,\theta]} \neq (\bar S^i_t)_{t \in [0,\theta]} \Big) \lesssim n^{-1/2}.$$
SLIDE 75–77

4/ Control / Convergence 2

Propagation of chaos. Fix k in $\mathbb{N}$. If the initial conditions are i.i.d., then the processes $N^1, ..., N^k$ of the n-neuron system behave (when n → +∞) as i.i.d. copies of the limit process $\bar N$.

Theorem. If the ages at time 0 are i.i.d. with common density $u^{in}$, then for all t ≥ 0,
$$\frac{1}{n} \sum_{i=1}^{n} \delta_{S^i_{t-}} \xrightarrow[n \to \infty]{} u(t, \cdot),$$
where u is the unique solution of the (PPS-NL) system with initial condition $u^{in}$.

This gives a link between (PPS) and a well-designed microscopic model, and goodness-of-fit tests for renewal and Hawkes processes.

SLIDE 78–83

Summary

First approach:
◮ Link with an i.i.d. network.
◮ Ends up with (PPS) for renewal or Poisson processes.
◮ Ends up with a more intricate system for linear Hawkes processes.

Second approach:
◮ Network of weakly dependent neurons (asymptotically independent).
◮ A refractory period is possible for the limit process; its distribution is given by (PPS).
◮ Remark: the $h_{j \to i}$'s can be i.i.d. random variables.

Outlook:
◮ Study of the system in expectation for linear Hawkes processes.
◮ Fluctuations around the mean limit behaviour (Central Limit Theorem).
◮ Break independence with correlated synaptic weights (cf. Faugeras and Maclaurin, 2014).