PDE models of neural networks


  1. PDE models of neural networks Benoît Perthame

  2. Introduction Electrically active cells are characterized by an action potential. Classical single-cell models: • Hodgkin-Huxley • FitzHugh-Nagumo • Morris-Lecar • Izhikevich • Mitchell-Schaeffer

  3. Introduction [Figure: solutions to the Hodgkin-Huxley model and to the FitzHugh-Nagumo model] These models are accurate but very expensive and difficult to use for large assemblies of neurons.

  4. Introduction The Wilson-Cowan model (1972) describes the firing rates N(x, t) of neuron assemblies located at position x through an integral equation
\[
\frac{d}{dt} N(x,t) = -N(x,t) + \int w(x,y)\,\sigma\big(N(y,t)\big)\,dy + s(x,t)
\]
• σ(·) = sigmoid • w(x, y) = connectivity kernel (a matrix in the discrete setting) • s = source. Feature: multiple steady states and bifurcation theory (Bressloff-Golubitsky, Chossat-Faugeras).

  5. Introduction The same Wilson-Cowan model, with its multiple steady states and bifurcations (Bressloff-Golubitsky, Chossat-Faugeras). Aim: large scale brain activity, visual hallucinations (Klüver, Oster, Siegel...).
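
To make the integral equation concrete, here is a minimal numerical sketch of the Wilson-Cowan rate equation on a 1D ring, assuming a Gaussian connectivity kernel, a logistic sigmoid and explicit Euler time stepping; all parameter values are illustrative, not taken from the lecture.

```python
# Minimal sketch of the Wilson-Cowan rate equation on a 1D ring,
#   d/dt N(x,t) = -N(x,t) + ∫ w(x,y) σ(N(y,t)) dy + s(x,t),
# with a Gaussian connectivity kernel, a logistic sigmoid and explicit Euler
# time stepping. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
M = 200                                            # spatial grid points on the ring
x = np.linspace(0.0, 2 * np.pi, M, endpoint=False)
dx = x[1] - x[0]

def sigma(u):
    """Sigmoidal firing-rate function (illustrative logistic choice)."""
    return 1.0 / (1.0 + np.exp(-4.0 * (u - 0.5)))

# Translation-invariant connectivity w(x, y) = w(|x - y|): local excitation.
d = np.abs(x[:, None] - x[None, :])
d = np.minimum(d, 2 * np.pi - d)                   # distance on the ring
w = np.exp(-d**2 / (2 * 0.3**2)) * dx              # kernel times quadrature weight

s = 0.1 * (1.0 + np.cos(x))                        # weak spatially modulated input
N = 0.01 * rng.random(M)                           # small random initial firing rates

dt = 0.01
for _ in range(5000):                              # explicit Euler in time
    N += dt * (-N + w @ sigma(N) + s)

print("final firing rates: min %.3f, max %.3f" % (N.min(), N.max()))
```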

  6. OUTLINE OF THE LECTURE I. Principle of the Noisy Integrate and Fire model II. The nonlinear Noisy Integrate and Fire model III. The elapsed time approach

  7. Leaky Integrate and Fire The Leaky Integrate & Fire model is simpler:
\[
dV(t) = \big(-V(t) + I(t)\big)\,dt + \sigma\,dW(t), \qquad V(t) < V_F,
\]
\[
V(t^-) = V_F \;\Longrightarrow\; V(t^+) = V_R,
\]
with firing threshold V_F and reset potential V_R. The idea was introduced by L. Lapicque (1907). • I(t) = input current • with or without noise • stochastic firing
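
A short Euler-Maruyama sketch of a single LIF neuron with threshold and reset may help fix ideas; the values of V_F, V_R, I and σ below are illustrative assumptions.

```python
# Euler-Maruyama simulation of one leaky integrate-and-fire neuron:
# dV = (-V + I) dt + sigma dW, with reset V -> V_R when V reaches V_F.
import numpy as np

rng = np.random.default_rng(0)
V_F, V_R = 2.0, 1.0          # firing threshold and reset potential (illustrative)
I, sigma = 1.5, 0.5          # constant input current and noise amplitude
dt, T = 1e-3, 2.0

V, t, spikes = 0.0, 0.0, []
while t < T:
    V += (-V + I) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    t += dt
    if V >= V_F:             # discharge: record a spike and reset the potential
        spikes.append(t)
        V = V_R

print(f"{len(spikes)} spikes in {T} time units")
```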

  8. Leaky Integrate and Fire [Figure: a solution of the LIF model] • N. Brunel, V. Hakim, W. Gerstner and W. Kistler... • Fit to measurements • Explains qualitatively observations on brain activity

  9. Leaky Integrate and Fire Written in terms of PDEs, the probability density n(v, t) of finding a neuron at potential v satisfies
\[
\frac{\partial n(v,t)}{\partial t} + \frac{\partial}{\partial v}\Big[\underbrace{\big(-v + I(t)\big)}_{\text{leak + external currents}} n(v,t)\Big] - \underbrace{a\,\frac{\partial^2 n(v,t)}{\partial v^2}}_{\text{noise}} = \underbrace{\delta(v - V_R)\,N(t)}_{\text{neurons reset}}, \qquad v \le V_F,
\]
\[
n(V_F, t) = 0, \qquad n(-\infty, t) = 0, \qquad N(t) := -a\,\frac{\partial n}{\partial v}(V_F, t) \ge 0 \quad \text{(the total flux of neurons firing at } V_F).
\]
N(t) is also a Lagrange multiplier for the constraint \(\int_{-\infty}^{V_F} n(v,t)\,dv = 1\).

  10. Leaky Integrate and Fire For the same system as above (with \(n(V_F,t) = 0\), \(n(-\infty,t) = 0\) and \(N(t) := -a\,\partial_v n(V_F,t) \ge 0\), the total flux of firing neurons at V_F), Properties (M. Cáceres, J. Carrillo, BP): the solutions satisfy • n ≥ 0 and \(\int_{-\infty}^{V_F} n(v,t)\,dv = 1\) • for I(t) ≡ 0, \(n(v,t) \to P(v)\) as \(t \to \infty\), the unique steady state of integral 1 (desynchronization) • the convergence rate is exponential.
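
The properties above (mass conservation, relaxation to a steady state for I ≡ 0) can be checked on a crude discretization. The following explicit finite-volume sketch uses an upwind drift flux, the Dirichlet condition at V_F, and reinjection of the boundary flux at V_R; the grid, time step and parameters are illustrative choices, not the authors' numerical method.

```python
# Explicit finite-volume sketch of the linear LIF Fokker-Planck equation
# (illustrative grid and parameters; no attempt at optimal accuracy):
#   dn/dt + d/dv[(-v + I) n] - a d^2n/dv^2 = delta(v - V_R) N(t),  n(V_F,t) = 0,
#   N(t) = -a dn/dv(V_F,t).
import numpy as np

a, I = 0.2, 0.0                        # diffusion (noise) and input current
V_F, V_R, v_min = 1.0, 0.0, -4.0       # threshold, reset, truncated left boundary
J = 400
dv = (V_F - v_min) / J
v = v_min + (np.arange(J) + 0.5) * dv  # cell centers
jR = int((V_R - v_min) / dv)           # index of the cell containing V_R

n = np.exp(-(v + 1.0) ** 2 / 0.1)      # initial density, concentrated near v = -1
n /= n.sum() * dv                      # normalize: total mass 1

dt = 0.4 * min(dv / (np.abs(v).max() + abs(I) + 1e-12), dv**2 / (2 * a))
for _ in range(20000):
    drift = -v + I
    # Upwind drift flux and centered diffusion flux at interior interfaces.
    F = np.zeros(J + 1)
    b_half = 0.5 * (drift[:-1] + drift[1:])
    up = np.where(b_half > 0, n[:-1], n[1:])
    F[1:-1] = b_half * up - a * (n[1:] - n[:-1]) / dv
    F[0] = 0.0                                   # no flux at the truncated left end
    # Dirichlet n(V_F) = 0: the flux leaving the last cell is the firing rate N(t).
    N_t = a * n[-1] / (0.5 * dv) + max(drift[-1], 0.0) * n[-1]
    F[-1] = N_t
    n -= dt / dv * (F[1:] - F[:-1])
    n[jR] += dt / dv * N_t                       # reinject the fired mass at V_R

print("mass = %.4f, firing rate N = %.4f" % (n.sum() * dv, N_t))
```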

  11. Leaky Integrate and Fire The proof uses • the Relative Entropy:
\[
\frac{d}{dt} \int_{-\infty}^{V_F} P(v)\, H\Big(\frac{n(v,t)}{P(v)}\Big)\, dv \le 0 \qquad \text{for } H(\cdot) \text{ convex},
\]
• a Hardy/Poincaré inequality:
\[
\int_{-\infty}^{V_F} P(v)\, |u(v)|^2\, dv \le C \int_{-\infty}^{V_F} P(v)\, |\nabla u(v)|^2\, dv \qquad \text{for } \int_{-\infty}^{V_F} P(v)\, u(v)\, dv = 0
\]
[notice P(V_F) = 0]. See Ledoux, Barthe and Roberto.
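
Numerically, the entropy decay can be monitored with a small helper that evaluates ∫ P H(n/P) dv on a grid, for instance with the quadratic convex choice H(u) = (u − 1)²; applied to successive outputs of a solver such as the sketch above, it should return a nonincreasing sequence. The cutoff on P and the default H are illustrative assumptions.

```python
# Relative entropy ∫ P H(n/P) dv for a convex H; here H(u) = (u - 1)^2 is one
# standard choice. P is the steady state and n the current density, both
# sampled on the same grid with spacing dv.
import numpy as np

def relative_entropy(n, P, dv, H=lambda u: (u - 1.0) ** 2):
    mask = P > 1e-14                      # avoid division where P vanishes
    return np.sum(P[mask] * H(n[mask] / P[mask])) * dv
```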

  12. Noisy LIF networks For networks, the current I(t) is related to the total activity of the network:
\[
\frac{\partial n(v,t)}{\partial t} + \frac{\partial}{\partial v}\Big[\big(-v + b N(t)\big)\, n(v,t)\Big] - a\big(N(t)\big)\,\frac{\partial^2 n(v,t)}{\partial v^2} = \delta(v - V_R)\, N(t), \qquad v \le V_F,
\]
\[
n(V_F,t) = 0, \qquad n(-\infty,t) = 0, \qquad N(t) := -a\big(N(t)\big)\,\frac{\partial n}{\partial v}(V_F,t) \ge 0 \quad \text{(total flux of firing neurons at } V_F).
\]
Constitutive laws: • I(t) = bN(t) • b = connectivity • b > 0 for excitatory neurons • b < 0 for inhibitory neurons • a(N) = a_0 + a_1 N
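
One way to see the mean-field coupling I(t) = bN(t) in action is a crude interacting-particle simulation: many LIF neurons share a common input proportional to the instantaneous empirical firing rate. The sketch below uses a moderate excitatory b and a short smoothing window for the empirical rate; pushing b up reproduces the runaway behaviour discussed in the blow-up theorem on the next slides. All parameters are illustrative assumptions.

```python
# Crude interacting-particle (Monte Carlo) version of the noisy LIF network:
# each neuron follows dV = (-V + b*N(t)) dt + sigma dW with reset V_F -> V_R,
# and N(t) is estimated from the empirical firing rate, smoothed over a short
# window to tame sampling noise. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
K = 5000                          # number of neurons
V_F, V_R = 1.0, 0.0
a, b = 0.2, 0.5                   # noise intensity (sigma^2 = 2a) and connectivity
sigma = np.sqrt(2 * a)
dt, T, tau_f = 2e-4, 1.0, 0.01    # time step, horizon, rate-smoothing window

V = rng.normal(-1.0, 0.2, K)      # initial potentials
N = 0.0                           # smoothed network activity
for _ in range(int(T / dt)):
    V += (-V + b * N) * dt + sigma * np.sqrt(dt) * rng.standard_normal(K)
    fired = V >= V_F
    V[fired] = V_R
    inst = fired.sum() / (K * dt)          # instantaneous empirical firing rate
    N += dt / tau_f * (inst - N)           # exponential smoothing of N(t)

print("smoothed firing rate N(T) = %.3f" % N)
```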

  13. Noisy LIF networks Theorem (J. Carrillo, BP, D. Smets). Assume • a = a_0 > 0 and b < 0 (inhibitory) • the initial data is bounded by a supersolution (in a certain sense). Then • there are global solutions • they are uniformly bounded for all t > 0.

  14. Noisy LIF networks Theorem (M. Cáceres, J. Carrillo, BP). Assume • a ≥ a_0 > 0 and b > 0 • the initial data is concentrated enough around v = V_F. Then • there are NO global weak solutions • larger nonlinear diffusion does not help. Possible interpretation: • N(t) → ρ δ(t − t_BU) • partial synchronization

  15. Noisy LIF networks Same statement as above; possible interpretation: N(t) → ρ δ(t − t_BU), i.e. partial synchronization (see S. Ha).

  16. Noisy LIF networks [Figures: numerical solution of the blow-up phenomenon; the probability density n(v) and the total neuronal activity N(t)]

  17. Noisy LIF networks Theorem (Steady states). • For b > 0 small enough, there is a unique steady state. • For b > (V_F − V_R) and 2ab < (V_F − V_R)² V_R, there are at least 2 steady states. • For b > 0 large enough, there are no steady states. [Figure: the steady-state relation for b = 0.5, b = 1.5, b = 3]
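
These steady states can be located numerically. Integrating the stationary equation once (flux equal to N above V_R, zero below) gives the candidate profile P(v) = (N/a) e^{−(v−bN)²/(2a)} ∫_{max(v,V_R)}^{V_F} e^{(w−bN)²/(2a)} dw, and the normalization ∫ P dv = 1 becomes an implicit equation for N whose solutions are the steady states. The sketch below scans this equation for one illustrative parameter set lying in the multiple-steady-state regime of the theorem; the parameters, the scanned window and the use of SciPy's Dawson function for numerical stability are all assumptions of this sketch, not values from the lecture.

```python
# Steady states of the excitatory noisy LIF network. The candidate profile with
# firing rate N >= 0 is
#   P(v) = (N/a) exp(-(v-bN)^2/(2a)) * ∫_{max(v,V_R)}^{V_F} exp((w-bN)^2/(2a)) dw,
# and ∫ P dv = 1 is an implicit equation for N. Illustrative parameters only.
import numpy as np
from scipy.special import dawsn     # dawsn(y) = exp(-y^2) * ∫_0^y exp(t^2) dt

a, V_F, V_R, v_min = 0.1, 2.0, 1.0, -4.0
b = 1.5                             # excitatory connectivity (b > V_F - V_R, 2ab < (V_F-V_R)^2 V_R)
v = np.linspace(v_min, V_F, 2000)
dv = v[1] - v[0]

def total_mass(N):
    """Integral of the candidate steady profile with firing rate N."""
    m = b * N
    sq = np.sqrt(2 * a)
    x = (v - m) / sq                         # rescaled potential
    f = (V_F - m) / sq
    q = (np.maximum(v, V_R) - m) / sq        # lower limit of the inner integral
    # exp(-x^2) * ∫_q^f exp(t^2) dt, written with Dawson functions to avoid overflow
    inner = np.exp(f**2 - x**2) * dawsn(f) - np.exp(q**2 - x**2) * dawsn(q)
    P = N * np.sqrt(2.0 / a) * inner
    return np.sum(P) * dv

# Scan a wide range of firing rates and report where total_mass(N) crosses 1.
Ns = np.logspace(-12, 1, 600)
g = np.array([total_mass(N) - 1.0 for N in Ns])
crossings = np.where(g[:-1] * g[1:] < 0)[0]
print("steady-state brackets:",
      [(float(Ns[i]), float(Ns[i + 1])) for i in crossings])
```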

  18. Noisy LIF networks Similarity with a Keller-Segel type model by V. Calvez and R. Voituriez for microtubule arrangements on the membrane:
\[
\frac{\partial n(z,t)}{\partial t} - \frac{\partial}{\partial z}\big[\mu(t)\, n(z,t)\big] - \frac{\partial^2 n(z,t)}{\partial z^2} = 0, \qquad z \ge 0,
\]
\[
\frac{\partial}{\partial z} n(0,t) + \mu(t)\, n(0,t) = 0, \qquad \frac{d\mu(t)}{dt} = \frac{n(0,t) - \mu(t)}{L}.
\]
• Blow-up for large mass • Smooth solutions for small mass (and stable steady state)

  19. Elapsed time structured model K. Pakdaman, J. Champagnat and J.-F. Vibert have proposed to structure the population by the time elapsed since the last discharge rather than by the membrane potential, which is a possible coding of neuronal information.

  20. Elapsed time structured model • s represents the time elapsed since the last discharge • n(s, t) = probability of finding a neuron in 'state' s at time t • p(s, N) ≤ 1 represents the firing rate of neurons in the 'state' s • N(t) = activity of the network + external signaling
\[
\underbrace{\frac{\partial n(s,t)}{\partial t} + \frac{\partial n(s,t)}{\partial s}}_{\text{elapsed time advances}} + \underbrace{p\big(s, b N(t)\big)\, n(s,t)}_{\text{firing neurons}} = 0,
\qquad
n(s=0, t) = \underbrace{\int_0^{+\infty} p\big(s, b N(t)\big)\, n(s,t)\, ds}_{\text{neurons reset}}.
\]
This model always satisfies \(\int_0^{+\infty} n(s,t)\, ds = 1\).

  21. Elapsed time structured model Being given a total activity N(t), the system closes with N(t) := n(s = 0, t):
\[
\frac{\partial n(s,t)}{\partial t} + \frac{\partial n(s,t)}{\partial s} + p\big(s, b N(t)\big)\, n(s,t) = 0,
\qquad
n(s=0,t) = \int_0^{+\infty} p\big(s, b N(t)\big)\, n(s,t)\, ds,
\qquad
N(t) := n(s=0,t).
\]
• b > 0 is the connectivity of the network • excitatory neurons are represented by \(\partial p(s,N)/\partial N > 0\)
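
A minimal discretization of this elapsed-time model is a first-order upwind scheme with time step equal to the age step, so that transport along characteristics is exact; the firing rate p(s, X) below (a sigmoid in s whose threshold age shrinks with activity, mimicking an excitatory network) and all parameters are illustrative assumptions, with the nonlinearity treated through a one-step lag in N to keep the scheme explicit.

```python
# Explicit upwind sketch of the elapsed-time (age-structured) model
#   dn/dt + dn/ds + p(s, bN(t)) n = 0,   n(0, t) = ∫ p(s, bN(t)) n ds = N(t).
# Illustrative firing rate and parameters; one-step lag in N.
import numpy as np

S, J = 10.0, 1000                     # truncated age interval and grid size
ds = S / J
s = (np.arange(J) + 0.5) * ds
dt = ds                               # CFL = 1: exact transport along characteristics
b = 1.0                               # network connectivity

def p(s, X):
    """Illustrative firing rate: refractory below a threshold age, then close to 1."""
    s_star = np.clip(1.0 - 0.5 * X, 0.2, None)   # higher activity shortens refractoriness
    return 1.0 / (1.0 + np.exp(-(s - s_star) / 0.05))

n = np.exp(-((s - 2.0) / 0.3) ** 2)   # initial age distribution
n /= n.sum() * ds                     # total probability 1
N = np.sum(p(s, 0.0) * n) * ds        # initial activity

for _ in range(3000):
    rate = p(s, b * N)
    survive = n * np.exp(-dt * rate)          # neurons that do not fire in this step
    fired = np.sum(n - survive) * ds / dt     # firing flux, reinjected at age s = 0
    n[1:] = survive[:-1]                      # age advances by ds = dt
    n[0] = fired                              # reset boundary: n(0, t) = N(t)
    N = fired

print("mass = %.4f, activity N = %.4f" % (n.sum() * ds, N))
```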

  22. Elapsed time structured model For the same system, [Figure: the function s ↦ p(s, N) for three activity levels N_1 < N_2 < N_3 (refractory state + fast transition)]

  23. Elapsed time structured model With synaptic integration:
\[
\frac{\partial n(s,t)}{\partial t} + \frac{\partial n(s,t)}{\partial s} + p\big(s, X(t)\big)\, n(s,t) = 0,
\qquad
N(t) := n(s=0,t) = \int_0^{+\infty} p\big(s, X(t)\big)\, n(s,t)\, ds,
\]
\[
X(t) := b \int_0^{t} N(t-u)\, \omega(u)\, du.
\]
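
For an exponential delay kernel ω(u) = e^{−u/τ}/τ (an illustrative choice, not from the lecture), the synaptic variable X(t) = b∫₀ᵗ N(t−u)ω(u)du is equivalent to the ODE dX/dt = (bN − X)/τ, which is cheaper to update inside a time loop than recomputing the convolution; the small check below compares the two on a synthetic activity signal.

```python
# Synaptic integration X(t) = b ∫_0^t N(t-u) ω(u) du. For the exponential kernel
# ω(u) = exp(-u/τ)/τ the convolution is equivalent to dX/dt = (b N - X)/τ.
import numpy as np

b, tau, dt, T = 1.0, 0.2, 1e-3, 2.0
t = np.arange(0.0, T, dt)
N = 0.5 + 0.3 * np.sin(2 * np.pi * t)               # a synthetic activity signal

omega = np.exp(-t / tau) / tau                      # delay kernel
X_conv = b * np.convolve(N, omega)[: len(t)] * dt   # direct convolution

X_ode = np.zeros_like(t)                            # equivalent ODE update
for k in range(1, len(t)):
    X_ode[k] = X_ode[k - 1] + dt * (b * N[k - 1] - X_ode[k - 1]) / tau

print("max difference between the two computations: %.2e"
      % np.abs(X_conv - X_ode).max())
```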

  24. Elapsed time structured model For
\[
\frac{\partial n(s,t)}{\partial t} + \frac{\partial n(s,t)}{\partial s} + p\big(s, b N(t)\big)\, n(s,t) = 0,
\qquad
N(t) := n(s=0,t) = \int_0^{+\infty} p\big(s, b N(t)\big)\, n(s,t)\, ds,
\]
Properties: • n ≥ 0 and \(\int_0^{\infty} n(s,t)\, ds = 1\) • N(t) ≤ 1 and n(s,t) ≤ 1 • there is a unique solution. Linear case: for p ≡ p(s), \(n(s,t) \to P(s)\) as \(t \to \infty\), the unique steady state.

  25. Elapsed time structured model The proof goes through the Generalized Relative Entropy:
\[
\frac{d}{dt} \int_0^{\infty} \Phi(s)\, P(s)\, H\Big(\frac{n(s,t)}{P(s)}\Big)\, ds \le 0 \qquad \text{for } H(\cdot) \text{ convex}.
\]

  26. Elapsed time structured model Properties: • For small or large connectivity (b small or large), desynchronization still holds: \(n(s,t) \to P_b(s)\) as \(t \to \infty\). • There are several periodic solutions (explicit). • These are stable (observed numerically).
