On the Algorithmic Power of Spiking Neural Networks
Chi-Ning Chou (Harvard University), Kai-Min Chung (Academia Sinica), Chi-Jen Lu (Academia Sinica)
ITCS 2019
Chi-Ning Chou (Harvard University) 1/10
Neurons: nerve cells. Synapses: connections between neurons. Spikes: instantaneous signals.
* Image by Pennstatenews https://www.flickr.com/photos/pennstatelive/37247502805

There are many mathematical models of spiking neurons and their variants [Fit61, Ste65, ML81, HR84, Ger95, KGH97, BL03, FTHVVB03, I+03, TMS14]. [BDM13] connected the firing rate of integrate-and-fire SNNs and an optimization problem.

SNNs seem to have non-trivial computational power. Can we understand them better through the lens of algorithms?
Chi-Ning Chou (Harvard University) 2/10
The integrate-and-fire SNN model: a network of neurons (here, neurons 1-7), each with a membrane potential v_j(t) and a common firing threshold θ.

External charging: at every time step, neuron j receives an external charge I_j.
Spiking effects: when neuron k fires, the potential of each neuron j changes by -C_jk, where C is the connectivity matrix.
Firing Rule: s_j(t) = 1 ⇔ v_j(t) ≥ θ.
Chi-Ning Chou (Harvard University) 3/10
A worked example with two neurons:
C = [ 1  0 ; -0.5  1 ],  I = [ 0.2 ; 0 ],  v(0) = [ 0 ; 0 ],  θ = 1.

Neuron 1 receives external charge 0.2 per step. When neuron 1 fires, its own potential drops by 1 and neuron 2's potential rises by 0.5; when neuron 2 fires, its own potential drops by 1. The firing rate is y_j(t) = (#spikes of neuron j up to time t) / t.

t    | #spikes_1 | y_1(t) | v_1(t)    | #spikes_2 | y_2(t) | v_2(t)
0    | 0         | 0      | 0         | 0         | 0      | 0
1    | 0         | 0      | 0.2       | 0         | 0      | 0
2    | 0         | 0      | 0.4       | 0         | 0      | 0
3    | 0         | 0      | 0.6       | 0         | 0      | 0
4    | 0         | 0      | 0.8       | 0         | 0      | 0
5    | 1         | 0.2    | 1 (fires) | 0         | 0      | 0
6    | 1         | 0.167  | 0.2       | 0         | 0      | 0.5
7    | 1         | 0.143  | 0.4       | 0         | 0      | 0.5
8    | 1         | 0.125  | 0.6       | 0         | 0      | 0.5
9    | 1         | 0.111  | 0.8       | 0         | 0      | 0.5
10   | 2         | 0.2    | 1 (fires) | 0         | 0      | 0.5
11   | 2         | 0.182  | 0.2       | 1         | 0.091  | 1 (fires)
12   | 2         | 0.167  | 0.4       | 1         | 0.083  | 0
...
1000 | 200       | 0.200  | 1         | 99        | 0.099  | 0.5

The firing rates approach y = (0.2, 0.1).
Chi-Ning Chou (Harvard University) 4/10
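The update rule of this example can be replayed in a few lines. A minimal sketch (the function name `simulate_snn` and the loop structure are mine, not from the talk), assuming the dynamics v(t+1) = v(t) - C s(t) + I with firing rule s_j(t) = 1 ⇔ v_j(t) ≥ θ:

```python
def simulate_snn(C, I, theta, T):
    """Run v(t+1) = v(t) - C s(t) + I with firing rule s_j(t) = 1 iff v_j(t) >= theta.

    Returns the firing rates y_j(T) = (#spikes of neuron j in T steps) / T."""
    n = len(I)
    v = [0.0] * n       # membrane potentials, v(0) = 0
    spikes = [0] * n    # spike counts per neuron
    for _ in range(T):
        s = [1 if v[j] >= theta else 0 for j in range(n)]  # read spikes from v(t)
        for j in range(n):
            spikes[j] += s[j]
            v[j] += I[j] - sum(C[j][k] * s[k] for k in range(n))  # then update v
    return [spikes[j] / T for j in range(n)]

C = [[1.0, 0.0], [-0.5, 1.0]]
I = [0.2, 0.0]
rates = simulate_snn(C, I, theta=1.0, T=1000)
print(rates)  # each rate close to (0.2, 0.1)
```

Running it for T = 1000 reproduces the counts in the table above: roughly 200 spikes for neuron 1 and 99 for neuron 2.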
[Barrett-Denève-Machens, NIPS 2013]: using an "optimization problem" to analyze the firing rate of an integrate-and-fire SNN.

Non-negative Least Squares (NNLS):
  min_{y ∈ ℝ^n} ||Cy - I||_2^2   s.t. y_j ≥ 0, ∀ j ∈ [n]

The same pair (C, I) defines both the SNN (whose dynamics yield a firing rate) and an NNLS instance (whose optimum is a solution). [BDM13] use the solution to estimate the firing rate, and the firing rate to estimate the solution. But: no provable analysis!
Chi-Ning Chou (Harvard University) 5/10
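For the two-neuron instance above, the NNLS correspondence can be checked directly. A plain-Python sketch (the solver name, step size, and iteration count are my choices, not from the talk) using standard projected gradient descent on min ||Cy - I||_2^2 s.t. y ≥ 0:

```python
def nnls_pgd(C, I, steps=2000, eta=0.1):
    """Projected gradient descent for min ||Cy - I||_2^2 s.t. y >= 0."""
    n = len(I)
    y = [0.0] * n
    for _ in range(steps):
        # residual r = Cy - I and gradient g = 2 C^T r
        r = [sum(C[j][k] * y[k] for k in range(n)) - I[j] for j in range(n)]
        g = [2 * sum(C[j][k] * r[j] for j in range(n)) for k in range(n)]
        # gradient step, then projection onto the non-negative orthant
        y = [max(0.0, y[k] - eta * g[k]) for k in range(n)]
    return y

C = [[1.0, 0.0], [-0.5, 1.0]]
I = [0.2, 0.0]
y = nnls_pgd(C, I)
print(y)  # approximately [0.2, 0.1], matching the SNN firing rates
```

The optimum here is y = (0.2, 0.1), exactly the firing rates the simulation converges to.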
Our result: the first proof that the firing rate of integrate-and-fire SNNs efficiently solves the non-negative least squares problem. What if there are infinitely many solutions? The firing rate converges to a sparse solution (in the ℓ1 sense), by implementing a primal-dual + projected gradient descent algorithm.
Chi-Ning Chou (Harvard University) 6/10
Theorem (informal). Given A ∈ ℝ^{m×n}, b ∈ ℝ^m, and ϑ > 0, suppose A satisfies some regularity conditions. Set
  C = [ AᵀA  -AᵀA ; -AᵀA  AᵀA ],  I = [ Aᵀb ; -Aᵀb ],
and properly set up the integrate-and-fire SNN. Let y* be the optimal solution to the ℓ1 minimization problem
  min_{y ∈ ℝ^n} ||y||_1   s.t. Ay = b.
When t ≥ Ω(·) (a sufficiently large polynomial bound), the firing rate y(t) satisfies
  (i)  ||b - A y(t)||_2 ≤ ϑ · ||b||_2, and
  (ii) | ||y(t)||_1 - ||y*||_1 | ≤ ϑ · ||y*||_1.
Chi-Ning Chou (Harvard University) 7/10
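The theorem's construction can be exercised end to end on a toy instance. This is a sketch under assumptions: A, b, θ = 2, and the horizon T are my choices; the paper's "properly set" conditions are not spelled out here, and θ = 2 simply happens to work for this instance. With A = [[1,0,1],[0,1,1]] and b = (1,1), the constraint Ay = b admits, e.g., (1,1,0) with ℓ1 norm 2 and the sparser y* = (0,0,1) with ℓ1 norm 1:

```python
def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(len(N))) for j in range(len(N[0]))]
            for i in range(len(M))]

A = [[1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]
b = [1.0, 1.0]
At = transpose(A)
AtA = matmul(At, A)
n = 3

# Theorem's construction: C = [[A^T A, -A^T A], [-A^T A, A^T A]], I = [A^T b; -A^T b].
# Neurons 1..n track the positive part of y, neurons n+1..2n the negative part.
C = [[AtA[i][j] for j in range(n)] + [-AtA[i][j] for j in range(n)] for i in range(n)]
C += [[-AtA[i][j] for j in range(n)] + [AtA[i][j] for j in range(n)] for i in range(n)]
Atb = [sum(At[i][k] * b[k] for k in range(len(b))) for i in range(n)]
I = Atb + [-x for x in Atb]

theta, T = 2.0, 2000
v = [0.0] * (2 * n)
spikes = [0] * (2 * n)
for _ in range(T):
    s = [1 if v[j] >= theta else 0 for j in range(2 * n)]
    for j in range(2 * n):
        spikes[j] += s[j]
        v[j] += I[j] - sum(C[j][k] * s[k] for k in range(2 * n))

# recover y(t) = y_plus - y_minus from the firing rates
y = [spikes[j] / T - spikes[n + j] / T for j in range(n)]
print(y)  # close to the sparse solution (0, 0, 1)
```

In this run only the neuron corresponding to y_3 fires (once per step), so the recovered firing rate lands on the ℓ1-optimal point rather than on one of the denser feasible solutions.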
Proof overview.

Optimization problem:
  (ℓ1 minimization)              min_{y ∈ ℝ^n} ||y||_1   s.t. Ay = b
  (the dual of ℓ1 minimization)  max_{w ∈ ℝ^m} bᵀw       s.t. ||Aᵀw||_∞ ≤ 1

Dynamics:
  v(t+1) = v(t) - [ AᵀA  -AᵀA ; -AᵀA  AᵀA ] s(t) + [ Aᵀb ; -Aᵀb ]
  w(t+1) = w(t) - [ A  -A ] s(t) + b

w(t) is a projected gradient descent algorithm for the dual program with a non-standard projection: external charging ≈ gradient step, spiking effect ≈ projection. Via KKT conditions and perturbation theory, the firing rate of the SNN solves the primal program. (The spikes are non-monotone and difficult to analyze!)
Chi-Ning Chou (Harvard University) 8/10
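The dual-side view can be watched on the same toy instance (again A, b, and θ = 2 are my choices, and the θ-scaling of w below is an observation about this particular instance, not a claim from the talk): running w(t+1) = w(t) - [A, -A] s(t) + b alongside the potential dynamics, w(t)/θ settles at w* = (0.5, 0.5), the optimum of max bᵀw s.t. ||Aᵀw||_∞ ≤ 1, whose value bᵀw* = 1 matches ||y*||_1 = 1:

```python
A = [[1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]
b = [1.0, 1.0]
m, n = 2, 3
At = [list(row) for row in zip(*A)]
AtA = [[sum(At[i][k] * A[k][j] for k in range(m)) for j in range(n)] for i in range(n)]

# primal-side SNN as in the theorem: C = [[AtA, -AtA], [-AtA, AtA]], I = [At b; -At b]
C = [[AtA[i][j] for j in range(n)] + [-AtA[i][j] for j in range(n)] for i in range(n)]
C += [[-AtA[i][j] for j in range(n)] + [AtA[i][j] for j in range(n)] for i in range(n)]
Atb = [sum(At[i][k] * b[k] for k in range(m)) for i in range(n)]
I = Atb + [-x for x in Atb]

theta, T = 2.0, 500
v = [0.0] * (2 * n)
w = [0.0] * m  # dual iterate, driven by the same spike train
for _ in range(T):
    s = [1 if v[j] >= theta else 0 for j in range(2 * n)]
    # w(t+1) = w(t) - [A, -A] s(t) + b: charging b is the gradient step,
    # the spike feedback A (s_plus - s_minus) acts as the projection
    for i in range(m):
        w[i] += b[i] - sum(A[i][k] * (s[k] - s[n + k]) for k in range(n))
    for j in range(2 * n):
        v[j] += I[j] - sum(C[j][k] * s[k] for k in range(2 * n))

w_scaled = [wi / theta for wi in w]
print(w_scaled)  # (0.5, 0.5): dual feasible, with b^T w / theta = 1 = ||y*||_1
```

Here the spike of the third primal neuron at every step is exactly what keeps w from drifting past the dual feasible region, illustrating "spiking effect ≈ projection" on a concrete trajectory.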
"How can algorithmic ideas enrich our understanding of nature?" Through the lens of natural algorithms, can we understand SNNs better via their algorithmic power, and even discover new algorithmic ideas? In this work, we show that the integrate-and-fire SNN uses its firing rate to efficiently solve some optimization problems in a primal-dual way!
Chi-Ning Chou (Harvard University) 9/10
Summary: the firing rate of an integrate-and-fire SNN efficiently solves the non-negative least squares problem and the ℓ1 minimization problem. Related work: ... [Maa99, MB01]; non-negative least squares [BDM13]; ... [JHM14, Maa15, JHM16]. Very few works provide provable analysis.

Thanks for your attention!
Chi-Ning Chou (Harvard University) 10/10