

SLIDE 1

On the Algorithmic Power of Spiking Neural Networks

Chi-Ning Chou (Harvard University), Kai-Min Chung (Academia Sinica), Chi-Jen Lu (Academia Sinica)

ITCS 2019

Chi-Ning Chou (Harvard University) 1/10

SLIDE 2

What Are Spiking Neural Networks (SNNs)?

  • Mathematical models for “biological neural networks”.
  • Neurons: nerve cells. Synapses: connections between neurons. Spikes: instantaneous signals.
  • Various models since the 1900s: integrate-and-fire [Lap07], Hodgkin–Huxley [HH52], and their variants [Fit61, Ste65, ML81, HR84, Ger95, KGH97, BL03, FTHVVB03, I+03, TMS14].
  • Most works study the behaviors/statistics of SNNs, e.g., the firing rate.
  • [Barrett-Denève-Machens 2013] empirically showed a connection between the firing rate of integrate-and-fire SNNs and an optimization problem.

SNNs seem to have non-trivial computational power. Can we understand them better through the lens of algorithms?

* Photo by Pennstatenews https://www.flickr.com/photos/pennstatelive/37247502805

Chi-Ning Chou (Harvard University) 2/10

SLIDE 3

Integrate-and-Fire (IAF) Model [Lapicque 1907]

  • Neurons: [n] = {1, 2, …, n}, each with firing threshold θ.
  • Potential: v(t) ∈ ℝⁿ.
  • External charging: J ∈ ℝⁿ.
  • Spikes: s(t) ∈ {0,1}ⁿ, with firing rule s_i(t) = 1 ⇔ v_i(t) ≥ θ.
  • Connectivity: D ∈ ℝ^{n×n} (a spike of neuron j changes the potential of neuron i by −D_{ij}).
  • Dynamics: v(t+1) = v(t) − D s(t) + J.
  • Firing rate: y(t) = (#spikes before time t) / t.

Chi-Ning Chou (Harvard University) 3/10
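The dynamics above can be sketched as one synchronous update step. This is an illustrative sketch (the function and variable names are choices of this sketch, not the authors' code):

```python
def iaf_step(v, s, D, J, theta):
    """One synchronous IAF update: v(t+1) = v(t) - D s(t) + J,
    followed by the firing rule s_i(t+1) = 1 iff v_i(t+1) >= theta."""
    n = len(v)
    # Integrate: subtract the spiking effects D s(t), add external charging J.
    v_next = [v[i] - sum(D[i][j] * s[j] for j in range(n)) + J[i]
              for i in range(n)]
    # Fire: a neuron spikes exactly when its potential reaches the threshold.
    s_next = [1 if vi >= theta else 0 for vi in v_next]
    return v_next, s_next
```

Iterating `iaf_step` and dividing each neuron's cumulative spike count by the elapsed time gives the firing rate y(t).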

SLIDE 4

Example

  • Setup: n = 2, θ = 1, v(0) = (0, 0), D = [1, 0; −0.5, 1], J = (0.2, 0).
  • Recall: v(t+1) = v(t) − D s(t) + J, and y(t) = (#spikes before time t)/t.

Evolution (per-neuron spike counts, firing rates, and potentials):

  t       #spikes      y(t)               v(t)
  0       (0, 0)       (0, 0)             (0, 0)
  1       (0, 0)       (0, 0)             (0.2, 0)
  2       (0, 0)       (0, 0)             (0.4, 0)
  3       (0, 0)       (0, 0)             (0.6, 0)
  4       (0, 0)       (0, 0)             (0.8, 0)
  5       (1, 0)       (0.2, 0)           (1, 0)       ← neuron 1 fires
  6       (1, 0)       (0.167, 0)         (0.2, 0.5)
  7       (1, 0)       (0.143, 0)         (0.4, 0.5)
  8       (1, 0)       (0.125, 0)         (0.6, 0.5)
  9       (1, 0)       (0.111, 0)         (0.8, 0.5)
  10      (2, 0)       (0.2, 0)           (1, 0.5)     ← neuron 1 fires
  11      (2, 1)       (0.182, 0.091)     (0.2, 1)     ← neuron 2 fires
  12      (2, 1)       (0.167, 0.083)     (0.4, 0)
  …
  1000    (200, 99)    (0.200, 0.099)     (1, 0.5)

The firing rates converge: y(t) → (0.2, 0.1).

Chi-Ning Chou (Harvard University) 4/10
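As a sanity check on the limiting rates in the table: for this instance D is invertible and lower triangular, so the vector y with D y = J can be read off by forward substitution. This is an illustrative sketch:

```python
# The example's connectivity and charging:
D = [[1.0, 0.0], [-0.5, 1.0]]
J = [0.2, 0.0]
# Solve D y = J by forward substitution (D is lower triangular).
y1 = J[0] / D[0][0]
y2 = (J[1] - D[1][0] * y1) / D[1][1]
print((y1, y2))  # (0.2, 0.1)
```

The solution (0.2, 0.1) matches the firing rates the SNN settles on, previewing the connection to an optimization problem on the next slides.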

SLIDE 5

The Algorithmic Power of Integrate-and-Fire SNNs?

[Barrett-Denève-Machens, NIPS 2013] used an “optimization problem” to analyze the firing rate of an integrate-and-fire SNN. An SNN instance is specified by (D, J), and the associated problem is:

  Non-negative Least Squares (NNLS)
    min_{y ∈ ℝⁿ}  ‖D y − J‖₂²
    s.t.  y_j ≥ 0, ∀ j ∈ [n]

  • They used the NNLS solution to estimate the firing rate, and the firing rate to estimate the NNLS solution.
  • But there was no provable analysis!

Chi-Ning Chou (Harvard University) 5/10
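The NNLS problem above can also be attacked directly with projected gradient descent; below is a minimal sketch (this is a generic solver, not the SNN mechanism itself, and the step size and iteration count are choices of this sketch), run on the two-neuron example from earlier:

```python
def nnls_pgd(D, J, steps=500, eta=0.5):
    """Projected gradient descent for min ||D y - J||_2^2 s.t. y >= 0."""
    m, n = len(D), len(D[0])
    y = [0.0] * n
    for _ in range(steps):
        # Residual r = D y - J; descent direction is D^T r (up to a constant).
        r = [sum(D[i][j] * y[j] for j in range(n)) - J[i] for i in range(m)]
        g = [sum(D[i][j] * r[i] for i in range(m)) for j in range(n)]
        # Gradient step, then project onto the non-negative orthant.
        y = [max(0.0, y[j] - eta * g[j]) for j in range(n)]
    return y

y = nnls_pgd([[1.0, 0.0], [-0.5, 1.0]], [0.2, 0.0])
# y converges to (0.2, 0.1): exactly the firing rate the example SNN settles on.
```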

SLIDE 6

Our Contributions

The first proof that the firing rate of integrate-and-fire SNNs efficiently solves the non-negative least squares problem.

  • This confirms the empirical discovery of [Barrett-Denève-Machens 2013].

What if there are infinitely many solutions?

  • We further show that the firing rate of an integrate-and-fire SNN efficiently finds the sparse solution (in the ℓ₁ sense), by implementing a primal-dual + projected gradient descent algorithm.

Chi-Ning Chou (Harvard University) 6/10

SLIDE 7

Theorem (ℓ₁ minimization problem)

Given B ∈ ℝ^{m×n}, c ∈ ℝ^m, and ϑ > 0, suppose B satisfies some regularity conditions. Set

  D = [BᵀB, −BᵀB; −BᵀB, BᵀB],   J = [Bᵀc; −Bᵀc],

and properly set up the integrate-and-fire SNN. Let y* be the optimal solution to the ℓ₁ minimization problem

  min_{y ∈ ℝⁿ} ‖y‖₁   s.t.   B y = c.

When t is sufficiently large, we have
  (i)  ‖c − B y(t)‖₂ ≤ ϑ · ‖c‖₂, and
  (ii) ‖y(t)‖₁ − ‖y*‖₁ ≤ ϑ · ‖y*‖₁.

Chi-Ning Chou (Harvard University) 7/10

SLIDE 8

Key Technique – A Dual View of SNNs

Primal SNN:
  • Optimization problem: min_{y ∈ ℝⁿ} ‖y‖₁ s.t. B y = c   (ℓ₁ minimization)
  • Dynamics: v(t+1) = v(t) − [BᵀB, −BᵀB; −BᵀB, BᵀB] s(t) + [Bᵀc; −Bᵀc]

Dual SNN:
  • Optimization problem: max_{w ∈ ℝ^m} cᵀw s.t. ‖Bᵀw‖∞ ≤ 1   (the dual of ℓ₁ minimization)
  • Dynamics: w(t+1) = w(t) − [B, −B] s(t) + c

  • w(t) is a projected gradient descent algorithm for the dual program, with a non-standard projection: external charging ≈ gradient step, and the spiking effect ≈ projection.
  • Via the KKT conditions and perturbation theory, the firing rate of the primal SNN solves the primal program.
  • The difficulty: spikes are non-monotone and hard to analyze!

Chi-Ning Chou (Harvard University) 8/10
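The two correspondences can be written compactly; this is a sketch using the identification v(t) = [Bᵀw(t); −Bᵀw(t)], which follows by comparing the primal and dual dynamics when the initial states are matched accordingly:

```latex
\[
  v(t) \;=\; \begin{pmatrix} B^\top w(t) \\ -B^\top w(t) \end{pmatrix},
  \qquad
  w(t+1) \;=\; w(t)
  \;\underbrace{+\; c}_{\substack{\text{external charging} \\ \approx\ \text{gradient of } c^\top w}}
  \;\underbrace{-\; \begin{pmatrix} B & -B \end{pmatrix} s(t)}_{\substack{\text{spiking effect} \\ \approx\ \text{projection}}}
\]
```

Under this identification, neuron j fires exactly when ±(Bᵀw(t))_j reaches θ, i.e., when w(t) is about to violate the (θ-scaled) dual constraint ‖Bᵀw‖∞ ≤ 1, and the spike term pushes w(t) back toward the feasible set.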

SLIDE 9

Perspectives – Natural Algorithms

“How can algorithmic ideas enrich our understanding of nature?”
  – Bernard Chazelle

Through the lens of natural algorithms, can we understand SNNs better via their algorithmic power, and even discover new algorithmic ideas?

In this work, we show that integrate-and-fire SNNs use their firing rate to efficiently solve some optimization problems in a primal-dual way!

Chi-Ning Chou (Harvard University) 9/10

SLIDE 10

Conclusions

  • In this work, we give the first proof that the firing rate of integrate-and-fire SNNs efficiently solves the non-negative least squares problem and the ℓ₁ minimization problem.
  • Related works:
    • Universality and computational complexity: SNNs can simulate Turing machines, random access machines (RAM), threshold circuits, etc. [Maa96, Maa97b, Maa99, MB01].
    • Using SNNs to solve computational/optimization problems: sparse coding [ZMD11, Tan16, TLD17], dictionary learning [LT18], pattern recognition [DC15, KGM16, BMF+17], and non-negative least squares [BDM13]; implementing MCMC to solve the traveling salesman problem (TSP) and constraint satisfaction problems (CSP) [BBNM11, JHM14, Maa15, JHM16]; assemblies of neurons and random projection [ADMPSV18, LPVM18, PV19].
    • The efficiency of SNNs in solving computational problems: the Winner-Take-All (WTA) problem, similarity testing, and neural coding [LMP17a, LMP17b, LMP17c, LM18].
  • Very few works provide provable analysis of the efficiency of SNN algorithms!
  • Next step? Give more rigorous analyses of the efficiency of other SNN algorithms!

Thanks for your attention!

Chi-Ning Chou (Harvard University) 10/10