Hebbian Learning Algorithms for Training Convolutional Neural Networks - PowerPoint PPT Presentation



SLIDE 1

Hebbian Learning Algorithms for Training Convolutional Neural Networks

Gabriele Lagani, Computer Science PhD, University of Pisa

SLIDE 2

Hebbian Learning Algorithms

Outline

  • SGD vs Hebbian learning
  • Hebbian learning variants
  • Training CNNs with the Hebbian + WTA approach on image classification tasks (CIFAR-10 dataset)

  • Comparison with CNNs trained with SGD
  • Results and conclusions

SLIDE 3

SGD vs Hebbian Learning

  • SGD training requires both a forward and a backward pass
SLIDE 5

SGD vs Hebbian Learning

  • Hebbian learning rule: Δw = η y(x, w) x
  • A single, local forward pass (no backward pass needed)
  • Advantage: layer-wise parallelizable
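The local update above can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the presentation's code: the function name `hebbian_update` and the dot-product response y(x, w) = w·x are assumptions.

```python
import numpy as np

def hebbian_update(w, x, eta=0.1):
    """One plain Hebbian step: strengthen the weights in proportion to
    pre-synaptic input x and post-synaptic response y = w.x."""
    y = np.dot(w, x)        # local forward pass: the neuron's response
    return w + eta * y * x  # Hebb's rule: delta_w = eta * y * x

# Only local quantities (x, y, w) appear, so no backward pass is needed
# and different layers could in principle be updated in parallel.
w = np.array([0.2, -0.1])
x = np.array([1.0, 0.5])
w_new = hebbian_update(w, x)  # y = 0.15, so w_new = [0.215, -0.0925]
```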
SLIDE 6

Hebbian Learning Variants

  • Hebbian rule with weight decay: Δw = η y(x, w) x − ɣ(x, w)
  • Taking ɣ(x, w) = η y(x, w) w yields Δw = η y(x, w) (x − w)
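Assuming the decay term takes the form above, the update simplifies to a rule that pulls the weight vector toward the input while keeping its norm bounded. A minimal NumPy sketch (names and values are illustrative):

```python
import numpy as np

def hebbian_decay_update(w, x, eta=0.1):
    """Hebbian step with weight decay gamma(x, w) = eta * y * w.
    The update simplifies to delta_w = eta * y * (x - w): the weight
    vector is pulled toward the input, so its norm stays bounded."""
    y = np.dot(w, x)              # local response, as in the plain rule
    return w + eta * y * (x - w)  # Hebbian term minus weight decay

w = np.array([0.2, -0.1])
x = np.array([1.0, 0.5])
w_new = hebbian_decay_update(w, x)  # y = 0.15, w_new = [0.212, -0.091]
```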
SLIDE 7

Lateral Interaction

  • Competitive learning
    ○ Winner-Takes-All (WTA)
    ○ Self-Organizing Maps (SOM)
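A hedged sketch of WTA competition for a whole layer, with one weight row per neuron (the helper `wta_update` is hypothetical): only the best-matching neuron updates, moving its weights toward the input.

```python
import numpy as np

def wta_update(W, x, eta=0.1):
    """Winner-Takes-All step: lateral competition selects the neuron
    whose weights best match the input; only the winner learns."""
    winner = np.argmax(W @ x)           # competition: index of the winner
    W = W.copy()
    W[winner] += eta * (x - W[winner])  # winner's weights move toward x
    return W

W = np.array([[1.0, 0.0],
              [0.0, 1.0]])
x = np.array([0.9, 0.1])
W_new = wta_update(W, x)  # neuron 0 wins; neuron 1 is unchanged
```

A SOM variant would additionally update the winner's neighbors, scaled by a distance-dependent neighborhood function.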

SLIDE 8

Convolutional Layers

  • Sparse connectivity
  • Shared weights
  • Translation invariance

Updates are aggregated by averaging across spatial locations in order to maintain shared weights
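The aggregation scheme can be sketched as follows (an assumed reconstruction, reusing the Δw = η y (x − w) form for the per-location rule; patch extraction is taken as given):

```python
import numpy as np

def conv_hebbian_update(kernel, patches, eta=0.1):
    """Aggregate Hebbian updates for a shared convolutional kernel:
    each input patch where the kernel is applied yields its own delta,
    and the deltas are averaged so the single shared kernel receives
    one consistent update.
    kernel: shape (k,); patches: shape (n_patches, k)."""
    ys = patches @ kernel                            # response at each location
    deltas = eta * ys[:, None] * (patches - kernel)  # per-location delta
    return kernel + deltas.mean(axis=0)              # average keeps weights shared

kernel = np.array([0.5, 0.5])
patches = np.array([[1.0, 0.0],
                    [1.0, 0.2]])
k_new = conv_hebbian_update(kernel, patches)
```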

SLIDE 9

Final Classification Layer

  • Supervised Hebbian learning with teacher neuron
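A minimal sketch of the teacher-neuron idea, under the assumption that during training each class neuron's output is clamped to the teacher signal (1 for the correct class, 0 otherwise) before the Hebbian step is applied:

```python
import numpy as np

def teacher_hebbian_update(W, x, label, eta=0.1):
    """Supervised Hebbian step: the teacher clamps the output of the
    target class neuron to 1 and all others to 0, so only the target
    neuron's weight row moves toward the input."""
    t = np.zeros(W.shape[0])
    t[label] = 1.0                         # teacher signal (one-hot)
    return W + eta * t[:, None] * (x - W)  # Hebbian step with clamped outputs

W = np.zeros((3, 2))  # 3 classes, 2 input features
x = np.array([1.0, 0.4])
W_new = teacher_hebbian_update(W, x, label=2)  # only row 2 is updated
```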

SLIDE 10

Experimental Setup

  • Hebbian + WTA approach applied to deep CNNs
  • Extension of Hebbian rules to convolutional layers with shared kernels: update aggregation

  • Teacher neuron for supervised Hebbian learning
  • Hybrid network architectures (Hebbian + SGD layers)
SLIDE 11

Network Architecture

SLIDE 12

Different Configurations

SLIDE 13

Classifiers on top of Deep Layers

SLIDE 14
Classifiers on Deep Layers Trained with SGD

Considerations on the Hebbian classifier:

  • Pros: performs well on high-level features; fast training (1-2 epochs)
  • Cons: performs poorly on low-level features

SLIDE 15

Classifiers on Hebbian Deep Layers

SLIDE 16

Layer 1 Kernels

SLIDE 17

Hybrid Networks

SLIDE 18

Hybrid Networks: Bottom Hebb. - Top SGD

SLIDE 19

Hybrid Networks: Bottom SGD - Top Hebb.

SLIDE 20

Hybrid Networks: SGD - Hebb. - SGD

SLIDE 21

Hybrid Networks: SGD - Hebb. - SGD

SLIDE 22

Conclusions

  • Pros of Hebbian + WTA:

    ○ Effective for low-level feature extraction
    ○ Effective for training higher network layers, including a classifier on top of high-level features
    ○ Takes fewer epochs than SGD (2 vs 10) → useful for transfer learning

  • Cons of Hebbian + WTA:

    ○ Not effective for training intermediate network layers
    ○ Not effective for training a classifier on top of low-level features

SLIDE 23

Future Works

  • Explore other Hebbian learning variants

    ○ Hebbian PCA
      ■ Can achieve distributed coding at intermediate layers
    ○ Contrastive Hebbian Learning (CHL)
      ■ Free phase + clamped phase
      ■ Update step: Δw = η (y⁺x⁺ − y⁻x⁻), i.e. clamped-phase minus free-phase Hebbian terms
      ■ Equivalent to Gradient Descent
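A hedged sketch of the CHL weight update, following the Xie & Seung formulation cited in the references. The per-phase activity vectors are assumed to come from a separate network-settling procedure, which is omitted here.

```python
import numpy as np

def chl_update(w, pre_free, post_free, pre_clamped, post_clamped, eta=0.1):
    """Contrastive Hebbian Learning step: Hebbian term from the clamped
    phase (outputs fixed to targets) minus the anti-Hebbian term from
    the free phase (network runs on inputs alone)."""
    return w + eta * (np.outer(post_clamped, pre_clamped)
                      - np.outer(post_free, pre_free))

# Toy example: one output neuron, two inputs; the clamped target is 1.0
# while the free-running output was only 0.2, so the weight grows.
w = np.zeros((1, 2))
pre = np.array([1.0, 0.0])
w_new = chl_update(w, pre_free=pre, post_free=np.array([0.2]),
                   pre_clamped=pre, post_clamped=np.array([1.0]))
```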

SLIDE 24

Future Works

  • Switch to Spiking Neural Networks (SNN)

    ○ Spike-Timing-Dependent Plasticity (STDP)
    ○ Higher biological plausibility
    ○ Low power consumption
      ■ Good for neuromorphic hardware implementations
      ■ Ideal for applications on constrained devices

SLIDE 25

References

  • G. Amato, F. Carrara, F. Falchi, C. Gennaro, G. Lagani; Hebbian Learning Meets Deep Convolutional Neural Networks (2019). http://www.nmis.isti.cnr.it/falchi/Draft/2019-ICIAP-HLMSD.pdf
  • S. Haykin; Neural Networks and Learning Machines (2009)
  • W. Gerstner, W. Kistler; Spiking Neuron Models (2002)
  • X. Xie, H. S. Seung; Equivalence of Backpropagation and Contrastive Hebbian Learning in a Layered Network (2003)