CSE/NB 528 Final Lecture: All Good Things Must…
Course Summary
- Where have we been?
- Course Highlights
- Where do we go from here?
- Challenges and Open Problems
- Further Reading
What is the neural code?
- What is the nature of the code?
- Representing the spiking output: single cells vs. populations; rates vs. spike times vs. intervals
- What features of the stimulus does the neural system represent?
- Encoding: building functional models of neurons/neural systems and predicting the spiking output given the stimulus
- Decoding: what can we say about the stimulus given what we observe from the neuron or neural population?
- Spike trains are variable
- Models are probabilistic
- Deviations are close to independent
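This variability is usually captured with a homogeneous Poisson spike model; a minimal NumPy sketch (the rate, trial count, and bin size are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

rate = 20.0        # firing rate (Hz), assumed for illustration
T = 1.0            # trial duration (s)
dt = 1e-3          # time step (s)
n_trials = 2000

# In each small bin a spike occurs independently with probability rate*dt,
# which approximates a Poisson process
spikes = rng.random((n_trials, int(T / dt))) < rate * dt
counts = spikes.sum(axis=1)           # spike count on each trial

# Fano factor (variance/mean of the count): ~1 for a Poisson process
fano = counts.var() / counts.mean()
```

A Fano factor near 1 is the usual benchmark for spike-count deviations being close to independent.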
Highlights: Neural Encoding
Cascade model: stimulus X(t) → spike-triggering stimulus features f1, f2, f3 → filtered values x1, x2, x3 → multidimensional decision function → spiking output r(t)
Spike-triggered average (STA): the mean stimulus preceding a spike, i.e. the difference between the Gaussian prior stimulus distribution and the spike-conditional distribution; spike-triggered covariance can recover additional features.
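A sketch of computing the STA from a white-noise Gaussian stimulus; the filter shape, nonlinearity, and simulation length are assumed for illustration, not taken from the lecture:

```python
import numpy as np

rng = np.random.default_rng(1)

# White-noise Gaussian stimulus driving an LNP-style model neuron
T, lags = 200_000, 20
stim = rng.standard_normal(T)
true_filter = np.exp(-np.arange(lags) / 5.0)   # assumed spike-triggering feature
true_filter /= np.linalg.norm(true_filter)

# Linear filter output passed through a sigmoidal spiking nonlinearity
drive = np.convolve(stim, true_filter)[:T]
p_spike = 0.1 / (1.0 + np.exp(-3.0 * (drive - 1.0)))
spikes = rng.random(T) < p_spike

# STA: average the stimulus segment preceding each spike
spike_times = np.nonzero(spikes)[0]
spike_times = spike_times[spike_times >= lags]
sta = np.mean([stim[t - lags + 1 : t + 1] for t in spike_times], axis=0)[::-1]
sta /= np.linalg.norm(sta)
```

For a Gaussian stimulus and a monotonic nonlinearity, the normalized STA recovers the direction of the underlying filter (their dot product approaches 1).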
Discriminating two conditions from the response distributions p(r|+) and p(r|-) (with means ⟨r⟩+ and ⟨r⟩-): decoding corresponds to comparing the test response to a threshold z.
- α(z) = P[r ≥ z | -]: false alarm rate ("size")
- β(z) = P[r ≥ z | +]: hit rate ("power")
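The false-alarm rate, hit rate, and resulting ROC area can be sketched with assumed Gaussian response distributions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Response distributions p(r|-) and p(r|+), assumed Gaussian for illustration
r_minus = rng.normal(0.0, 1.0, 2000)   # "stimulus absent" responses
r_plus = rng.normal(2.0, 1.0, 2000)    # "stimulus present" responses

def alpha(z):
    # false alarm rate ("size"): P[r >= z | -]
    return float(np.mean(r_minus >= z))

def beta(z):
    # hit rate ("power"): P[r >= z | +]
    return float(np.mean(r_plus >= z))

# Sweeping the threshold z traces out the ROC curve (beta vs. alpha).
# The area under the ROC curve equals the probability that a "present"
# response exceeds an "absent" response (two-alternative forced choice):
auc = float((r_plus[:, None] > r_minus[None, :]).mean())
```

Raising z lowers both α and β; the ROC area summarizes discriminability independent of the particular threshold.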
Population decoding, e.g. with cosine tuning curves as in the cricket cercal system (Theunissen & Miller, 1991); accuracy is measured by the RMS error in the estimate.
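A population-vector decoder for cosine tuning curves can be sketched as follows; the number of neurons, tuning amplitude, and noise level are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Population with rectified cosine tuning: r_a ~ r_max * cos(s - s_a), clipped at 0
pref = np.linspace(0, 2 * np.pi, 16, endpoint=False)   # preferred directions
r_max = 30.0

def responses(s):
    # Noisy, rectified cosine tuning (parameters assumed)
    r = r_max * np.cos(s - pref) + rng.normal(0.0, 1.0, pref.size)
    return np.maximum(r, 0.0)

def population_vector(r):
    # Sum each neuron's preferred-direction vector, weighted by its firing rate
    x = np.sum(r * np.cos(pref))
    y = np.sum(r * np.sin(pref))
    return np.arctan2(y, x) % (2 * np.pi)

s_true = 1.2
estimates = np.array([population_vector(responses(s_true)) for _ in range(500)])
rms_error = float(np.sqrt(np.mean((estimates - s_true) ** 2)))
```

The RMS error of the vector estimate shrinks as more neurons (or higher firing rates) are added, which is the comparison the slide's plot was making.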
- MAP: the s* which maximizes p[s|r]
- ML: the s* which maximizes p[r|s]
- The difference is the role of the prior: the two differ by a factor of p[s]/p[r]
Excitability is due to the properties of ion channels
From Ohm’s law (V = IR) and Kirchhoff’s current law, the passive membrane equation is

c_m dV/dt = -i_m + I_e/A

i.e. capacitive current = -(ionic currents) + externally applied current.
A sequence of neural models of increasing complexity that approach the behavior of real neurons:
- Integrate-and-fire neuron: subthreshold, like a passive membrane; spiking is due to an imposed threshold at VT
- Spike response model: subthreshold, arbitrary kernel; spiking is due to an imposed threshold at VT; postspike, incorporates afterhyperpolarization
- Simple model: complete 2D dynamical system; spiking threshold is intrinsic; have to include a reset potential
Integrate-and-Fire Model:

τ_m dV/dt = E_L - V + R_m I_e

If V > V_threshold: spike, then reset V = V_reset
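A minimal simulation of the leaky integrate-and-fire model, τ_m dV/dt = E_L - V + R_m I_e with threshold and reset; all parameter values are assumed for illustration:

```python
import numpy as np

# Leaky integrate-and-fire parameters (assumed, typical textbook values)
tau_m = 10.0      # membrane time constant (ms)
E_L = -65.0       # resting/leak potential (mV)
R_m = 10.0        # membrane resistance (MOhm)
V_th = -50.0      # spike threshold (mV)
V_reset = -65.0   # reset potential (mV)
I_e = 2.0         # injected current (nA)

dt = 0.1          # Euler time step (ms)
V = E_L
spike_times = []
for step in range(int(500 / dt)):              # simulate 500 ms
    V += dt / tau_m * (E_L - V + R_m * I_e)    # subthreshold: passive membrane
    if V > V_th:                               # imposed threshold => spike
        spike_times.append(step * dt)
        V = V_reset                            # reset after the spike
```

With these values the steady-state voltage E_L + R_m I_e = -45 mV exceeds threshold, so the neuron fires regularly with interspike interval τ_m ln[(V∞ - V_reset)/(V∞ - V_th)] ≈ 13.9 ms.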
Gerstner; Keat et al. 2001
- Filtering
- Shunting
- Delay lines
- Information segregation
- Synaptic scaling
- Direction selectivity
Neuronal structure can be modeled using electrically coupled compartments, linked by coupling conductances.
Presynaptic spikes cause neurotransmitters to cross the cleft and bind to postsynaptic receptors, allowing ions to flow in and change the postsynaptic potential.
- The size of the PSP is a measure of synaptic strength
- It can vary on the short term due to input history
- And on the long term due to synaptic plasticity (LTP/LTD)
Firing-rate network dynamics:

τ dv/dt = -v + W u + M v

where v is the output, -v the decay term, W u the feedforward input, and M v the recurrent feedback.
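A sketch of these rate dynamics, τ dv/dt = -v + W u + M v, relaxing to the linear steady state v = (I - M)⁻¹ W u; network sizes and weights are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Small linear firing-rate network (sizes and weights assumed)
n, m = 5, 3
W = rng.normal(0.0, 0.5, (n, m))          # feedforward input weights
M = 0.4 * (rng.random((n, n)) - 0.5)      # weak recurrent weights (stable)
np.fill_diagonal(M, 0.0)
u = rng.random(m)                          # static input
tau, dt = 10.0, 0.1

# Euler-integrate tau dv/dt = -v (decay) + W u (input) + M v (feedback)
v = np.zeros(n)
for _ in range(int(500 / dt)):
    v += dt / tau * (-v + W @ u + M @ v)

# For a stable M, the fixed point solves v = W u + M v
v_ss = np.linalg.solve(np.eye(n) - M, W @ u)
```

Keeping the recurrent weights small (spectral radius below 1) guarantees the dynamics converge rather than blow up, which is why M is scaled down here.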
Correlation-based (Hebbian) plasticity:

τ_w dw/dt = ⟨v u⟩ = Q w,  where Q = ⟨u uᵀ⟩ is the input correlation matrix

The Hebb rule performs principal component analysis (PCA): the weight vector w aligns with the principal eigenvector of Q.
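Plain Hebbian growth is unstable, so the sketch below uses a standard stabilized variant, Oja's rule, which converges to the leading principal component; the input statistics and learning rate are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Correlated 2-D inputs with a known correlation matrix C
C = np.array([[2.0, 1.2],
              [1.2, 1.0]])
L = np.linalg.cholesky(C)
u_samples = (L @ rng.standard_normal((2, 20_000))).T

# Oja's rule: dw = eta * v * (u - v * w), a normalized Hebb rule
w = rng.standard_normal(2)
eta = 0.005
for u in u_samples:
    v = w @ u                      # linear output v = w . u
    w += eta * v * (u - v * w)     # Hebbian term v*u minus a decay that bounds |w|

# Compare the learned weights with the top eigenvector of C (the first PC)
evals, evecs = np.linalg.eigh(C)
pc1 = evecs[:, -1]
alignment = abs(float(w @ pc1)) / float(np.linalg.norm(w))
```

The decay term keeps the weight vector at unit norm, so w ends up pointing along the principal eigenvector of the input correlations, which is the PCA interpretation of Hebbian learning.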
- Droning lecture
- Mathematical derivations
- Lack of sleep
Unsupervised learning = learning the hidden causes v of input data u
- Generative model: prior p[v; G], data likelihood p[u|v; G], posterior p[v|u; G]
- Examples: causes of clustered data; “causes” of images
- Use the EM algorithm for learning the parameters G
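A toy EM fit of a two-component Gaussian mixture, where the cluster means play the role of the parameters G; unit variances, equal priors, and all numerical values are simplifying assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)

# Data u generated by hidden causes v (cluster labels) with true means +/-3
true_means = np.array([-3.0, 3.0])
labels = rng.integers(0, 2, 1000)
u = true_means[labels] + rng.standard_normal(1000)

# EM for a 1-D two-Gaussian mixture (unit variances, equal priors assumed)
means = np.array([-1.0, 1.0])              # initial guess for the parameters G
for _ in range(50):
    # E step: posterior p[v | u; G] over the hidden cause for each point
    logp = -0.5 * (u[:, None] - means[None, :]) ** 2
    resp = np.exp(logp)
    resp /= resp.sum(axis=1, keepdims=True)
    # M step: re-estimate G as responsibility-weighted means of the data
    means = (resp * u[:, None]).sum(axis=0) / resp.sum(axis=0)

means = np.sort(means)
```

Each iteration increases the data likelihood p[u; G], and with well-separated clusters the estimated means converge close to the true generative means.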
Perceptron:
- Inputs u_j (-1 or +1), output v_i (-1 or +1)
- Output is a thresholded weighted sum of the inputs
- Learning finds a separating hyperplane in the input space (u1, u2, …)
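The perceptron learning rule can be sketched as follows; the target function is an assumed linearly separable rule on ±1 inputs, chosen only for illustration:

```python
import numpy as np

# All 8 binary (+1/-1) input patterns and labels from an assumed separable rule
inputs = np.array([[i, j, k] for i in (-1, 1) for j in (-1, 1) for k in (-1, 1)])
desired = np.where(inputs @ np.array([1.0, 2.0, -1.0]) > 0.5, 1, -1)

w = np.zeros(3)   # weights
b = 0.0           # bias (implements the threshold)

for epoch in range(100):
    errors = 0
    for u, d in zip(inputs, desired):
        v = 1 if w @ u + b > 0 else -1   # thresholded weighted sum
        if v != d:                       # perceptron rule: update only on errors
            w = w + d * u
            b = b + d
            errors += 1
    if errors == 0:                      # hyperplane now separates the data
        break

predictions = np.where(inputs @ w + b > 0, 1, -1)
```

Because the labels are linearly separable, the perceptron convergence theorem guarantees the loop terminates with zero errors.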
Backpropagation for Multilayered Networks

Output of the network:

v_i^m = g( Σ_j W_ij g( Σ_k w_jk u_k^m ) )

Finds W and w that minimize the errors relative to the desired outputs d:

E(W, w) = (1/2) Σ_{m,i} ( d_i^m - v_i^m )²

Example: truck backer-upper
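A toy backpropagation network minimizing E = ½ Σ (d - v)² on XOR; the architecture, learning rate, and multi-restart loop are illustrative choices, not from the lecture:

```python
import numpy as np

# Two-layer network v = g(W . g(w . u)) trained by gradient descent on
# E(W, w) = 1/2 * sum_m (d_m - v_m)^2, with XOR as an assumed toy dataset
g = np.tanh
def g_prime(x):
    return 1.0 - np.tanh(x) ** 2

U = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], float)  # bias input
d = np.array([0.0, 1.0, 1.0, 0.0])                                 # XOR targets

def train(seed, n_hidden=4, eta=0.5, epochs=3000):
    rng = np.random.default_rng(seed)
    w = rng.normal(0.0, 1.0, (n_hidden, 3))   # input -> hidden weights
    W = rng.normal(0.0, 1.0, n_hidden)        # hidden -> output weights
    for _ in range(epochs):
        for u, dm in zip(U, d):
            h_in = w @ u
            h = g(h_in)
            v_in = W @ h
            v = g(v_in)
            # Backpropagate the output error through both layers
            delta_out = (dm - v) * g_prime(v_in)
            delta_hid = delta_out * W * g_prime(h_in)
            W += eta * delta_out * h
            w += eta * np.outer(delta_hid, u)
    preds = np.array([g(W @ g(w @ u)) for u in U])
    return preds, float(np.mean((preds - d) ** 2))

# Gradient descent can get stuck in local minima, so keep the best of a few restarts
preds, err = min((train(s) for s in range(5)), key=lambda t: t[1])
```

XOR is the classic non-linearly-separable problem a single perceptron cannot solve, which is exactly what the hidden layer buys here.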
- Predicting delayed rewards (TD learning)
- Learning the value of each state using TD learning
- Selecting actions based on the value of the next state (using the TD error)
(http://employees.csbsju.edu/tcreed/pb/pdoganim.html)
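TD(0) learning of state values can be sketched on a small random-walk chain; the task and parameters are assumed for illustration, in the style of Sutton & Barto:

```python
import numpy as np

rng = np.random.default_rng(9)

# Random walk over 5 nonterminal states; episodes start in the middle and end
# at either side, with reward 1 only on the right. True values are 1/6 .. 5/6.
n = 5
V = np.full(n, 0.5)   # value estimates, initialized at 0.5
alpha = 0.05          # learning rate

for episode in range(10_000):
    s = n // 2
    while True:
        s_next = s + (1 if rng.random() < 0.5 else -1)
        terminal = s_next < 0 or s_next == n
        r = 1.0 if s_next == n else 0.0
        v_next = 0.0 if terminal else V[s_next]
        # TD error: reward plus bootstrapped value of the next state,
        # minus the current prediction (gamma = 1 here)
        delta = r + v_next - V[s]
        V[s] += alpha * delta
        if terminal:
            break
        s = s_next
```

The TD error δ drives each update: states whose successors look better than predicted have their values raised, propagating reward information backward along the chain.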
Challenges and Open Problems
- … synapses
- … advantages?
- … channels and their density
- … such as efficient coding and Bayesian inference?
- … environment and engage in purposeful behavior?
Further Reading
- … et al., MIT Press, 1997
- … Oxford University Press, 1999
- … et al., MIT Press, 2002
- … 2007
- R. Sutton and A. Barto, Reinforcement Learning: An Introduction, MIT Press, 1998
- 10:30am–12:20pm in the same classroom
- … laptop OR …
- … Tuesday, June 11 (by email to both Adrienne and Raj)
Have a great summer!
Au revoir!