(Towards) Active Measurement for Neuroscience
Ross Boczar (PhD Student)
boczar@berkeley.edu
with Eric Jonas and Ben Recht
Berkeley Center for Computational Imaging
LCCC Focus Period on Large-Scale and Distributed Optimization, June 2017
I have to give a talk in 3 months.
finite experimental time! (can record from an organism for a very short period of time)
control over the neurons: lasers can turn individual cells, or subpopulations of cells, on and off
How do we learn as much about the system as quickly as possible?
Start simple:
Lewi, Butera, and Paninski, "Sequential Optimal Experiment Design for Neurophysiological Experiments," 2009
Pillow and Park, "Adaptive Bayesian Methods for Closed-loop Neurophysiology," 2016
[Figure: the adaptive measurement paradigm at trial t (Pillow and Park 2016, Fig. 1)]
Gaussian tuning curve, with parameters θ = (b, A, µ, σ):
$$f(x;\theta) = b + A\,\exp\!\left(-\frac{1}{2\sigma^2}(x-\mu)^2\right)$$
Poisson spiking with rate λ = f(x):
$$p(r \mid x) = \frac{1}{r!}\,\lambda^r e^{-\lambda}$$
Pillow and Park 2016
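A minimal numerical sketch of this encoding model; the parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def tuning_curve(x, b, A, mu, sigma):
    """Gaussian tuning curve f(x; theta) = b + A * exp(-(x - mu)^2 / (2 sigma^2))."""
    return b + A * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

# Illustrative parameters: baseline b, amplitude A, preferred stimulus mu, width sigma
theta = dict(b=2.0, A=10.0, mu=0.5, sigma=0.3)

x = 0.4                          # stimulus presented on this trial
lam = tuning_curve(x, **theta)   # Poisson rate lambda = f(x)
r = rng.poisson(lam)             # observed spike count r ~ Poisson(lambda)
print(lam, r)
```

At the preferred stimulus x = µ the rate peaks at b + A; away from it the rate decays toward the baseline b.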
Log-likelihood based on observed responses:
$$\mathcal{L}(\lambda_t \mid \mathcal{D}_t) = \log p(R_t \mid \lambda_t) = R_t^\top \log \lambda_t - \mathbf{1}^\top \lambda_t$$
Parameter space is small enough to grid in this case (not typical)
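Since the parameter space is griddable here, the belief update is just accumulating the Poisson log-likelihood over the grid. A rough sketch; the grid, the fixed tuning-curve parameters, and the fake trial data below are my own illustrative choices:

```python
import numpy as np

# Grid over the unknown preferred stimulus mu (other tuning-curve
# parameters held fixed for brevity)
mu_grid = np.linspace(0.0, 1.0, 201)
log_post = np.zeros_like(mu_grid)          # flat prior in log space

def log_lik(r, x, mu, b=2.0, A=10.0, sigma=0.3):
    """Poisson log-likelihood r*log(lambda) - lambda (log r! is constant in mu)."""
    lam = b + A * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))
    return r * np.log(lam) - lam

# Fake stream of (stimulus, spike count) trials, for illustration only
for x, r in [(0.4, 9), (0.6, 12), (0.2, 4)]:
    log_post += log_lik(r, x, mu_grid)     # accumulate log-likelihood on the grid
    log_post -= log_post.max()             # renormalize for numerical stability

post = np.exp(log_post)
post /= post.sum()
print(mu_grid[np.argmax(post)])            # MAP estimate of mu
```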
$$U_{\mathrm{infomax}}(x \mid \mathcal{D}_t) = \mathbb{E}_{r,\theta}\!\left[\log \frac{p(\theta \mid r, x, \mathcal{D}_t)}{p(\theta \mid \mathcal{D}_t)}\right]$$
One of multiple criteria to optimize (MMSE, prediction error, …). Requires integrating over parameter and response spaces; can use MCMC or a bag of samples. In this example we can numerically integrate (not typical)
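On a grid, the infomax utility can be evaluated by direct summation: it is the expected KL divergence from the current posterior to the response-updated posterior, i.e. the mutual information between θ and the next response. A one-step sketch, where the grid, the fixed tuning-curve parameters, and the candidate stimuli are all illustrative:

```python
import numpy as np
from math import lgamma

# Grid posterior over mu (illustrative: flat prior, fixed b, A, sigma)
mu_grid = np.linspace(0.0, 1.0, 101)
prior = np.full(mu_grid.size, 1.0 / mu_grid.size)

def rate(x, mu, b=2.0, A=10.0, sigma=0.3):
    return b + A * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

def infomax_utility(x, prior, r_max=40):
    """Mutual information between the next response r and theta, by summation."""
    lam = rate(x, mu_grid)                              # rate at each grid point
    rs = np.arange(r_max + 1)                           # truncate the response space
    log_pr = (rs[:, None] * np.log(lam)[None, :] - lam[None, :]
              - np.array([lgamma(r + 1) for r in rs])[:, None])
    p_r_given_theta = np.exp(log_pr)                    # shape (response, theta)
    joint = p_r_given_theta * prior[None, :]            # p(r, theta | x)
    p_r = joint.sum(axis=1)                             # predictive p(r | x)
    post = joint / np.maximum(p_r[:, None], 1e-300)     # p(theta | r, x)
    kl = np.where(post > 0,
                  post * np.log(np.maximum(post, 1e-300) / prior[None, :]),
                  0.0).sum(axis=1)                      # KL(posterior || prior) per r
    return float((p_r * kl).sum())                      # expectation over responses

candidates = [0.1, 0.3, 0.5, 0.7, 0.9]
best = max(candidates, key=lambda x: infomax_utility(x, prior))
print(best)
```

The same loop structure works when the prior is a posterior from earlier trials; only the `prior` vector changes.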
[Figure: trial t → trial t+1 (Pillow and Park 2016, Fig. 1)]
Generate the next x_t in a smart, fast way; update your belief state
[Figure: Lewi et al. 2009, Fig. 2]
Challenges:
- Need a compact representation for the parameter distribution
- Integration to solve for the next x: have to grid or sample based on heuristics (i.i.d. sampling is bad!)
- MCMC sampling and other ops can get computationally (and financially!) expensive; would like to deal with more complicated models, …
- Want exploration for these computationally intensive tasks, many of which are "embarrassingly parallel"
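Those embarrassingly parallel pieces (scoring many candidate stimuli, evaluating many sampled futures) reduce to a parallel map. A stand-in sketch using the standard library; pywren's executor exposes a similar `map` interface that ships each call to a serverless worker instead of a local thread, and the scoring function here is a hypothetical placeholder:

```python
from concurrent.futures import ThreadPoolExecutor

def score_candidate(x):
    # Hypothetical expensive, independent evaluation, e.g. the lookahead
    # value of sampling at stimulus x; candidates don't interact, so the
    # work is embarrassingly parallel.
    return -(x - 0.5) ** 2

candidates = [i / 10 for i in range(11)]

# Local threads here; a serverless executor (e.g. pywren over AWS Lambda)
# would fan the same map out across the cloud.
with ThreadPoolExecutor(max_workers=4) as pool:
    scores = list(pool.map(score_candidate, candidates))

best = candidates[max(range(len(scores)), key=scores.__getitem__)]
print(best)  # → 0.5
```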
Why is there no “cloud button”?
“Most wrens are small and rather inconspicuous, except for their loud and often complex songs.”
pywren.io
Powered by Continuum Analytics +
(Leptotyphlops carlae)
[Figure: shrinking the runtime. Start: 1205 MB → delete non-AVX2 MKL: 977 MB → strip shared libs: 946 MB → conda clean: 670 MB → eliminate pkg: 510 MB → delete pyc: 441 MB. Want our runtime to include single-core MKL (AVX2).]
[Figure: the lookahead loop. Given hyperparams and the current parameter distribution, consider possible sample locations x1, x2, x3, x4; "dream about the future" with an n-step evaluation of each candidate; commit to the best (here x2).]
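The "dream about the future" step can be sketched as simulated rollouts: sample a plausible parameter from the current belief, simulate the response each candidate x would produce, and score the resulting updated belief. Everything concrete below (grid, rates, the entropy score, the dream count) is an illustrative choice, and the real n-step version recurses this one-step evaluation:

```python
import numpy as np

rng = np.random.default_rng(1)
mu_grid = np.linspace(0.0, 1.0, 101)

def rate(x, mu):
    # Illustrative Gaussian tuning curve with fixed b, A, sigma
    return 2.0 + 10.0 * np.exp(-(x - mu) ** 2 / (2 * 0.3 ** 2))

def posterior_entropy(log_post):
    p = np.exp(log_post - log_post.max())
    p /= p.sum()
    return -np.sum(np.where(p > 0, p * np.log(np.maximum(p, 1e-300)), 0.0))

def rollout_score(x, log_post, n_dreams=50):
    """Average post-update entropy if we were to sample at x (lower is better)."""
    p = np.exp(log_post - log_post.max())
    p /= p.sum()
    total = 0.0
    for _ in range(n_dreams):
        mu = rng.choice(mu_grid, p=p)        # dream: draw a world from the belief
        r = rng.poisson(rate(x, mu))         # dream: simulate the response
        new_log_post = log_post + r * np.log(rate(x, mu_grid)) - rate(x, mu_grid)
        total += posterior_entropy(new_log_post)
    return total / n_dreams

log_post = np.zeros(mu_grid.size)            # flat initial belief
candidates = [0.25, 0.5, 0.75]
x_next = min(candidates, key=lambda x: rollout_score(x, log_post))
print(x_next)
```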
policies:
- Belief update function: learning / deep learning technique based on problem
- Adaptive measurement function: learning / deep learning technique based on problem
http://www.argmin.net/2017/04/03/evolution/