
Coding and computing with balanced spiking networks, Sophie Deneve (PowerPoint presentation)



  1. Coding and computing with balanced spiking networks. Sophie Deneve, École Normale Supérieure, Paris.

  2. Poisson variability in cortex. [Figure: spike rasters for Trials 1-4; variance of spike count vs. mean spike count.]

  3. Cortical spike trains are highly variable (Churchland et al., Nature Neuroscience, 2010).

  4. Cortical spike trains are variable (Churchland et al., Nature Neuroscience, 2010).

  5. Balance between excitation and inhibition. Integrate-and-fire neuron driven by excitatory and inhibitory Poisson inputs: $\tau \dot{V} = -V + I_{exc} - I_{inh}$.

  6. Balance between excitation and inhibition. Integrate-and-fire neuron driven by excitatory and inhibitory Poisson inputs, $\tau \dot{V} = -V + I_{exc} - I_{inh}$: the output is more regular than the input.

  7. Balance between excitation and inhibition. Integrate-and-fire neuron driven by excitatory and inhibitory Poisson inputs, $\tau \dot{V} = -V + I_{exc} - I_{inh}$: the output is more regular than the input. How does Poisson-like variability survive?

  8. Balanced excitation/inhibition, $\tau \dot{V} = -V + I_{exc} - I_{inh}$: when excitation and inhibition are balanced, the membrane potential performs a random walk (Shadlen and Newsome, 1996).

  9. Balanced excitation/inhibition, $\tau \dot{V} = -V + I_{exc} - I_{inh}$: the membrane potential performs a random walk, and variability is conserved when mean excitation equals mean inhibition (Shadlen and Newsome, 1996).
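
A minimal simulation sketch of this balance argument (not from the slides; the parameter values, weights, and input rates below are illustrative choices): a leaky integrate-and-fire neuron receives excitatory and inhibitory Poisson inputs whose mean drives cancel, so only the input fluctuations push the membrane potential, random-walk style, to threshold, and the output spike train stays irregular.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the slides)
dt, T = 1e-4, 5.0              # time step (s), simulation length (s)
tau, V_th, V_reset = 0.02, 0.5, 0.0
N_exc, N_inh = 800, 200        # numbers of input fibres
rate = 10.0                    # Poisson rate per fibre (Hz)
w_exc = 0.02
w_inh = w_exc * N_exc / N_inh  # scaled so mean inhibition = mean excitation

V, spikes = 0.0, []
for step in range(int(T / dt)):
    n_e = rng.poisson(N_exc * rate * dt)   # excitatory input spikes in this bin
    n_i = rng.poisson(N_inh * rate * dt)   # inhibitory input spikes in this bin
    # Leaky integration: the mean drive cancels, only the fluctuations remain
    V += dt / tau * (-V) + w_exc * n_e - w_inh * n_i
    if V >= V_th:
        spikes.append(step * dt)
        V = V_reset

if len(spikes) > 1:
    isi = np.diff(spikes)
    # A CV of the inter-spike intervals near 1 indicates irregular, Poisson-like output
    print(f"{len(spikes)} spikes, ISI CV = {isi.std() / isi.mean():.2f}")
```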

  10. E/I balance: stimulus-driven response (Wehr and Zador, 2003).

  11. E/I balance: spontaneous activity (Okun and Lampl, 2008).

  12. Two types of balanced E/I: recurrent inhibition and feed-forward inhibition [circuit diagrams with E and I populations]. Not random: highly structured.

  13. Balanced neural networks generate their own variability. Weak, sparse random connections; the population rates settle where excitation and inhibition cancel: $I_E^{ext} + J_{EE} \nu_E - J_{EI} \nu_I \approx 0$ and $I_I^{ext} + J_{IE} \nu_E - J_{II} \nu_I \approx 0$ (e.g. van Vreeswijk and Sompolinsky, Science, 1996).

  14. Brunel, 2001

  15. Balanced neural networks generate their own variability: with $I_E^{ext} + J_{EE} \nu_E - J_{EI} \nu_I \approx 0$ and $I_I^{ext} + J_{IE} \nu_E - J_{II} \nu_I \approx 0$, the network settles into an asynchronous irregular regime with low firing rates (e.g. van Vreeswijk and Sompolinsky, Science, 1996).

  16. Same network, asynchronous irregular regime with low firing rates. Perturbation test: shuffle one spike by 0.1 ms.

  17. Same network: shuffling one spike by 0.1 ms reshuffles all later spikes.

  18. Same network: shuffling one spike by 0.1 ms reshuffles all later spikes; the dynamics form a chaotic attractor (e.g. van Vreeswijk and Sompolinsky, Science, 1996).
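
A small sketch of the mean-field balance conditions on slides 13 and 15, with made-up coupling and drive values: solving the two balance equations pins down the population rates, independently of single-neuron details, which is what makes the strongly coupled regime self-organize.

```python
import numpy as np

# Illustrative couplings and external drives (assumed values).
# J_ab is the positive strength of population b onto population a;
# inhibition enters the balance conditions with a minus sign.
J_EE, J_EI = 1.0, 2.0
J_IE, J_II = 1.0, 1.8
I_E_ext, I_I_ext = 1.0, 0.8

# Balance conditions:
#   I_E_ext + J_EE*nu_E - J_EI*nu_I ~ 0
#   I_I_ext + J_IE*nu_E - J_II*nu_I ~ 0
A = np.array([[J_EE, -J_EI],
              [J_IE, -J_II]])
b = -np.array([I_E_ext, I_I_ext])
nu_E, nu_I = np.linalg.solve(A, b)
print(f"balanced rates: nu_E = {nu_E:.2f}, nu_I = {nu_I:.2f}")
```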

  19. Population coding. • Asynchronous, irregular spike trains. • Population coding. • E/I balance. “Requiem for a spike”: spikes = random samples; code = mean firing rates; decoding = summing from large populations.

  20. Continuous variable: population coding (Georgopoulos et al., 1982).

  21. Population codes. [Figures: activity vs. direction (deg) showing tuning curves; average pattern of activity vs. preferred direction (deg).] Tuning curves: the average activity of neuron $i$ is $\langle s_i \rangle = f_i(x)$.

  22. Noisy population codes. [Figure: pattern of activity $\mathbf{s}$ vs. preferred direction (deg).] Poisson noise: $p(s_i \mid x) = \dfrac{f_i(x)^{s_i} e^{-f_i(x)}}{s_i!}$; independent neurons: $p(\mathbf{s} \mid x) = \prod_i p(s_i \mid x)$.
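
A short sketch of drawing one noisy population pattern from such a code; the bell-shaped tuning curve `f`, its gain and width, and all other values here are illustrative assumptions, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 64
preferred = np.linspace(-180, 180, N, endpoint=False)   # preferred directions (deg)

def f(x, pref, gain=50.0, width=40.0):
    """Assumed bell-shaped tuning curve f_i(x), in spikes per trial."""
    d = (x - pref + 180) % 360 - 180                     # circular distance (deg)
    return gain * np.exp(-0.5 * (d / width) ** 2)

x_true = 30.0                       # stimulus direction (deg)
rates = f(x_true, preferred)        # mean pattern <s_i> = f_i(x)
s = rng.poisson(rates)              # independent Poisson spike counts
print(s[:10])
```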

  23. Population decoding: responses $s_j = g_j(y) + \text{noise}$; the decoder returns an estimate $\hat{y}$ chosen to minimize the squared error $\| y - \hat{y} \|^2$.

  24. Population vector: $\hat{x} = \sum_i r_i\, x_i$, where $x_i$ is neuron $i$'s preferred direction. Decoding is easy, but usually suboptimal: it is optimal only when the tuning curves are cosine, the noise is Gaussian, and the preferred directions are uniformly distributed over all orientations.
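
A quick illustration of the population vector, reusing the hypothetical `s` and `preferred` arrays from the sketch above: the estimate is the activity-weighted sum of preferred directions, read out as an angle.

```python
import numpy as np

def population_vector(s, preferred_deg):
    """Population-vector estimate: activity-weighted sum of preferred directions,
    returned as the angle of the summed 2-D vector (degrees)."""
    ang = np.deg2rad(preferred_deg)
    vec = np.array([np.sum(s * np.cos(ang)), np.sum(s * np.sin(ang))])
    return np.rad2deg(np.arctan2(vec[1], vec[0]))

# x_hat_pv = population_vector(s, preferred)   # with s, preferred from the previous sketch
```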

  25. Optimal: maximum likelihood, $\hat{x} = \arg\max_x p(\mathbf{s} \mid x)$. [Figure: likelihood $p(\mathbf{s} \mid x)$ computed from the pattern of activity $\mathbf{s}$, with the maximum-likelihood estimate $\hat{x}$ marked; activity vs. preferred direction (deg).] Decoding is always optimal, but usually hard.
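
A sketch of maximum-likelihood decoding for the independent-Poisson model above, using a brute-force scan over a stimulus grid as a stand-in for the generally harder optimization; `ml_decode` and the grid resolution are assumptions of this sketch, reusing `s`, `preferred`, and `f` from the earlier one.

```python
import numpy as np

def ml_decode(s, preferred_deg, f, grid=np.linspace(-180, 180, 721)):
    """argmax_x log p(s | x) for independent Poisson neurons with tuning curves f."""
    # log p(s|x) = sum_i [ s_i * log f_i(x) - f_i(x) ]   (log s_i! terms do not depend on x)
    loglik = [np.sum(s * np.log(f(x, preferred_deg) + 1e-12) - f(x, preferred_deg))
              for x in grid]
    return grid[int(np.argmax(loglik))]

# x_hat_ml = ml_decode(s, preferred, f)   # with s, preferred, f from the earlier sketch
```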

  26. Optimal population decoding. Decoding = summing from large neural populations: $\hat{x} = \sum_j \Gamma_j r_j$.

  27. Efficient population coding (in collaboration with Christian Machens, Wieland Brendel, Ralph Bourdoukan, and Pietro Vertechi). Decoding = summing from large neural populations, $\hat{x} = \sum_j \Gamma_j r_j$; efficient coding: $\mathbf{r} = \arg\min_{\mathbf{r}} \big[ ( x - \hat{x} )^2 + C(\mathbf{r}) \big]$.

  28. Single neuron. Input signal $x(t)$, output spike train $o(t)$; decoding = post-synaptic integration. [Figure: signal and spikes over time.]

  29. Single neuron. The output spike train $o(t)$ is filtered post-synaptically into $r(t) = (\exp * o)(t)$, a convolution with an exponential kernel $e^{-\lambda t}$, and the readout is $\hat{x}(t) = \Gamma\, r(t)$. Decoding = post-synaptic integration.
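
A sketch of this post-synaptic decoder, assuming an exponential kernel with decay rate `lam` and a single readout weight `Gamma` (both illustrative values): each spike bumps the filtered trace `r`, and the readout is `x_hat = Gamma * r`.

```python
import numpy as np

def decode(spike_times, T, dt=1e-3, lam=10.0, Gamma=1.0):
    """x_hat(t) = Gamma * r(t), where r is the spike train o(t) filtered
    with an exponential kernel: dr/dt = -lam * r + o(t)."""
    n = int(T / dt)
    o = np.zeros(n)
    o[(np.asarray(spike_times) / dt).astype(int)] = 1.0 / dt   # delta-like spikes
    r = np.zeros(n)
    for t in range(1, n):
        r[t] = r[t - 1] + dt * (-lam * r[t - 1] + o[t])
    return Gamma * r

x_hat = decode(spike_times=[0.10, 0.25, 0.30], T=1.0)   # example call
```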

  30. Single neuron. Decoding = post-synaptic integration, $\hat{x}(t) = \Gamma\, r(t)$. Where do we place the spikes?

  31. Single neuron. Minimize the decoding error between the input signal $x(t)$ and the reconstruction: $E = \sum_t \left( x(t) - \hat{x}(t) \right)^2$.

  32. Single neuron. Minimize $E = \sum_t \left( x(t) - \hat{x}(t) \right)^2$. Greedy spike rule: fire at time $t$ whenever $E_t^{\text{spike}} < E_t^{\text{no spike}}$.

  33. Single neuron. The greedy rule $E_t^{\text{spike}} < E_t^{\text{no spike}}$ amounts to $\left( x(t) - \hat{x}(t) - \Gamma \right)^2 - \left( x(t) - \hat{x}(t) \right)^2 < 0$.

  34. Single neuron. Expanding the square, the spike condition becomes $\Gamma \left( x(t) - \hat{x}(t) \right) - \Gamma^2 / 2 > 0$.

  35. Single neuron. Spike whenever $\Gamma \left( x(t) - \hat{x}(t) \right) > \Gamma^2 / 2$.

  36. Single neuron. Define the membrane potential as the projected decoding error, $V(t) \equiv \Gamma \left( x(t) - \hat{x}(t) \right)$; the neuron spikes when $V(t)$ crosses the threshold $\Gamma^2 / 2$.

  37. Single neuron, summary. Minimize $E = \sum_t ( x - \hat{x} )^2$; greedily spike if $E^{\text{spike}} < E^{\text{no spike}}$, i.e. whenever the membrane potential $V = \Gamma ( x - \hat{x} )$ (the projected decoding error) exceeds the threshold $\Gamma^2 / 2$.

  38. Single neuron. Differentiating $V = \Gamma ( x - \hat{x} )$ shows that the rule is implemented by a leaky integrate-and-fire neuron, $\dot{V} = -\lambda V + \Gamma \left( \dot{x} + \lambda x \right) - \Gamma^2 o$, with leak, input, and reset terms, and threshold $\Gamma^2 / 2$.
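
A minimal simulation sketch of this single-neuron coder; the update equations follow the slides, while `lam`, `Gamma`, the time step, and the test signal are illustrative choices. The voltage tracks the projected coding error, a spike is fired whenever it exceeds Gamma^2/2, and the reconstruction follows the signal.

```python
import numpy as np

# Illustrative parameters (assumed)
dt, T = 1e-3, 2.0
lam, Gamma = 10.0, 0.1
t = np.arange(0, T, dt)
x = 1.0 + 0.5 * np.sin(2 * np.pi * t)       # slowly varying input signal x(t)

x_hat = np.zeros_like(t)                    # reconstruction, x_hat = Gamma * r
V = np.zeros_like(t)                        # membrane potential V = Gamma * (x - x_hat)
o = np.zeros_like(t)                        # output spike train
threshold = Gamma ** 2 / 2

for k in range(1, len(t)):
    # Greedy rule: spike whenever the projected error exceeds Gamma^2 / 2
    if V[k - 1] > threshold:
        o[k - 1] = 1.0
    # x_hat decays with rate lam and jumps by Gamma at each spike
    x_hat[k] = x_hat[k - 1] * (1 - lam * dt) + Gamma * o[k - 1]
    # Equivalently, V obeys dV/dt = -lam*V + Gamma*(dx/dt + lam*x) - Gamma^2 * o
    V[k] = Gamma * (x[k] - x_hat[k])

err = np.sqrt(np.mean((x - x_hat) ** 2))
print(f"{int(o.sum())} spikes, RMS coding error = {err:.3f}")
```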

  39. Neural population. Signals $\mathbf{x} = ( x_1, x_2, \ldots, x_J )$, decoder $\hat{x}_i = \sum_j \Gamma_{ij} r_j$ (sum over neurons $j$). Minimize decoding error plus a quadratic cost: $E = \sum_t \left[ \| \mathbf{x}(t) - \hat{\mathbf{x}}(t) \|^2 + \mu \| \mathbf{r}(t) \|^2 \right]$.

  40. Neural population. Same objective; greedy spike rule for each neuron $j$: fire whenever spiking reduces the objective, $E_t^{\text{spike } j} < E_t^{\text{no spike}}$.

  41. Neural population. For neuron $j$ the greedy rule becomes a threshold condition, $V_j \equiv \Gamma_j^{T} \left( \mathbf{x} - \hat{\mathbf{x}} \right) - \mu r_j > \dfrac{\| \Gamma_j \|^2 + \mu}{2}$, with $\hat{x}_i = \sum_j \Gamma_{ij} r_j$.

  42. Neural population. The voltage $V_j = \Gamma_j^{T} ( \mathbf{x} - \hat{\mathbf{x}} ) - \mu r_j$ combines the decoding error and the cost, and is compared to the threshold $( \| \Gamma_j \|^2 + \mu ) / 2$.

  43. Neural population. Differentiating $V_j$ for signals $x_1, \ldots, x_J$ gives the network dynamics $\dot{V}_j = -\lambda V_j + \sum_i \Gamma_{ij} \left( \dot{x}_i + \lambda x_i \right) - \sum_k \big( \sum_i \Gamma_{ij} \Gamma_{ik} \big)\, o_k$: a feed-forward input term plus reset and recurrent connections. Spikes are fired at the thresholds $T_j$, and the readout is $\hat{x}_i = \sum_j \Gamma_{ij} r_j$.
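
A sketch of the resulting network, under several assumptions of this illustration: a 2-D signal, random decoding weights `Gamma`, a small quadratic-cost weight `mu`, and at most one spike per time step (a common discretization choice to avoid simultaneous spikes). The recurrent weights are set to the Gram matrix of the decoding weights plus the self-reset, as in the voltage equation above.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative sizes and parameters (assumed)
M, N = 2, 20                                   # signal dimensions, neurons
dt, T, lam, mu = 1e-3, 1.0, 10.0, 1e-4
Gamma = 0.1 * rng.standard_normal((M, N))      # decoding weights (column j = Gamma_j)

t = np.arange(0, T, dt)
x = np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])   # 2-D signal x(t)
c = np.gradient(x, dt, axis=1) + lam * x                       # input c = dx/dt + lam*x

W = Gamma.T @ Gamma + mu * np.eye(N)            # fast recurrent / self-reset weights
thresh = (np.sum(Gamma ** 2, axis=0) + mu) / 2  # thresholds T_j = (|Gamma_j|^2 + mu) / 2

V = np.zeros(N)
r = np.zeros(N)
x_hat = np.zeros((M, len(t)))
for k in range(1, len(t)):
    o = np.zeros(N)
    j = int(np.argmax(V - thresh))              # most supra-threshold neuron, if any
    if V[j] > thresh[j]:
        o[j] = 1.0
    # dV_j/dt = -lam*V_j + Gamma_j . c(t) - sum_k (Gamma_j . Gamma_k + mu*delta_jk) o_k
    V = V + dt * (-lam * V + Gamma.T @ c[:, k]) - W @ o
    r = r + dt * (-lam * r) + o                 # filtered spike trains
    x_hat[:, k] = Gamma @ r                     # readout

err = np.sqrt(np.mean((x - x_hat) ** 2))
print(f"RMS decoding error = {err:.3f}")
```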

  44. Neural population. [Figure: input signal $x(t)$, spike rasters with thresholds $T$, and reconstruction $\hat{x} = \Gamma r$ over time.]

  45. Homogeneous network: all neurons share the same decoding weight $\Gamma$, so $\hat{x} = \Gamma \sum_j r_j$. [Figure: signal $x$, reconstruction $\hat{x}$, and membrane potentials $V_1$, $V_2$, $V_3$.]

  46. Neural variability = degeneracy: many different spike patterns are equivalent, because they yield the same reconstruction $\hat{x} = \Gamma \sum_j r_j$.

  47. “Chaotic”: shift one spike by 1 ms. [Figure: $x$ and $\hat{x}$ over time, 0 to 2400 ms.]
